Importance of clocking


There is a lot of talk that external clocks don't work because of the distance to the processor. This is the opposite of my experience. While I had already been using an external Antelope rubidium clock on my Etherregen and Zodiac Platinum DAC, I have now added a Lhy Audio UIP, clocked by the same Antelope clock, to reclock the USB stream emanating from the InnuOS Zenith MkIII. The resultant increase in soundstage depth, attack and decay, and overall transparency isn't subtle. While there seems to be a lot of focus on cables, accurate clocking throughout the chain still seems to be deemed unnecessary. I don't understand InnuOS selling separate reclockers for USB and Ethernet without synchronising Ethernet input, DAC conversion and USB output.

antigrunge2

Showing 6 responses by erik_squires

This thread has become a full-time job, with terrible work-life balance and benefits, so I'm checking out after pretty much saying everything that I felt I needed to share.

Best of luck and happy listening.

The issue with S/PDIF or AES "reclockers" is that you have two clocks arguing over what the absolute clock rate should be.

The DAC is forced to take one of two approaches: abandon its internal clock, or attempt to keep its internal metronome and "fix" upstream deviations from its own mechanism. This is exactly what pro clocks do, but only because there are upstream devices manipulating the data stream. They are there to stop an inevitable argument that arises out of a studio's workflow. Home users HAVE no such arguments to solve, but can create them by adding upstream clocks.

Maybe the best of these situations is to use an Asynchronous Sample Rate Converter, like in the Schiits, but then you've got to deal with the fact that your DAC is no longer being fed bit-perfect data.
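To illustrate why an ASRC can't be bit-perfect, here's a minimal sketch. It uses plain linear interpolation as a deliberately crude stand-in for a real ASRC's filtering, and the 100 ppm rate offset is a made-up number: the point is only that every output sample is recomputed from its neighbours, so the values no longer match the source bits.

```python
# Minimal illustration of why ASRC output isn't bit-perfect.
# Linear interpolation stands in for a real ASRC's much better filtering.

def asrc_linear(samples, ratio):
    """Resample 'samples' by 'ratio' (output rate / input rate)."""
    out = []
    n_out = int(len(samples) * ratio)
    for i in range(n_out):
        pos = i / ratio                      # fractional position in the input
        j = int(pos)
        frac = pos - j
        a = samples[j]
        b = samples[min(j + 1, len(samples) - 1)]
        out.append(a * (1.0 - frac) + b * frac)   # recomputed sample value
    return out

original = [0, 16384, 32767, 16384, 0, -16384, -32767, -16384]
# Correct a hypothetical 100 ppm clock mismatch between source and DAC:
resampled = asrc_linear(original, 1.0001)
print(original)
print([round(x) for x in resampled])   # values no longer match bit-for-bit
```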

In measurements I've seen, the DAC's original jitter performance actually gets degraded, and the signal looks like the upstream jitter PLUS the DAC's original jitter signature.

Either an integrated streamer/DAC, or a streamer with a multi-second buffer plus asynchronous USB communication with the DAC, is the way to go IMHO.

I should point out: use whatever sounds good to you, but so far all I'm reading is a misunderstanding of how and why studio clocks work. I'm going to go with the documentation from Benchmark and Mytek and say it's a bad idea.

A recording studio uses an external clock as much for synchronization as for anything else, since devices are often daisy-chained together and must work together correctly at every clock beat. The multiple pieces of equipment can't "hear" each other the way an orchestra does, so the master clock is there to make sure all the different pieces are in sync.

It's the daisy-chaining that CAUSES jitter unless mitigated by the master clock.

On the other hand, with Internet streaming, there's no such synchronization. 

PS - Error correction is taken care of by the TCP portion of TCP/IP, which handles requests for retransmission when packets are corrupted. Another reason for big buffers: having enough time to ask for a retransmission when an error occurs.
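As a rough back-of-envelope sketch of why buffer depth buys you retransmission time. All the figures below are assumptions chosen for illustration, not measurements of any real streamer or network:

```python
# Back-of-envelope: how much playback headroom a buffer gives for TCP retransmits.
# All figures are illustrative assumptions, not measurements.

buffer_seconds = 5.0     # audio held in the streamer's buffer
rtt_seconds    = 0.08    # assumed round trip to the streaming server
retransmits    = 3       # assumed retries before the packet finally arrives

worst_case_stall = retransmits * rtt_seconds   # time spent re-requesting data
print(f"Stall to cover: {worst_case_stall:.2f}s, buffer holds: {buffer_seconds:.1f}s")
print("Playback survives" if buffer_seconds > worst_case_stall else "Dropout")
```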

What about the clock accuracy releasing the buffer, then?

 

I specifically addressed this in my first message. You literally can't avoid jitter without buffering, but in the case of a network stream from outside the home, adding more "clocking" at upstream devices without buffers doesn't help you, because the original stream has huge amounts of packet-to-packet variation (huge relative to audio or video playback). In fact, you can end up in a situation where your DAC's jitter is worse, because there's an upstream "reclocker" with so much jitter that it forces the downstream DAC to misbehave.

The best possible place to put a fancy clock is less than an inch away from the DAC. External clocks are used in music production when multiple DACs or streams are running at once, as with multi-track recordings and mix-downs. Those clocks do NOT, however, dejitter anything.

OP:

Um, yeah, OK. I’m going to sit here and wait for you to explain to me how on earth you would even reclock a TCP/IP stream without actually buffering it.

Just lemme know when you work out that mathmagic.

 

Erik

Everything upstream of a streamer is what we call asynchronous. The timing is always rather loose, as an Internet-based connection does not have guaranteed latency or inter-packet arrival time. Attempting to add picosecond clocking to events that vary by hundreds of milliseconds is madness. For reference:

0.1 seconds is 100 milliseconds, which is 100,000,000,000 picoseconds and 100,000,000,000,000 femtoseconds.

This is why TVs and streaming devices have relatively large buffers. Everything goes into a bucket that the device tries to keep full from one side, while data is carefully doled out by a metronome on the other. That's where the clock matters.
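Here's a toy sketch of that bucket-and-metronome arrangement, a jitter buffer in miniature. The block sizes and timings are invented for illustration, but the shape is the point: irregular arrivals on one side, a steadily clocked output on the other.

```python
# Toy jitter buffer: bursty network-like producer, clocked consumer.
# Timings and sizes are invented purely for illustration.
import queue, threading, time, random

bucket = queue.Queue()

def network_side():
    """Fill the bucket in irregular bursts, like packets arriving off the Internet."""
    for _ in range(50):
        bucket.put("audio-block")
        time.sleep(random.uniform(0.0, 0.04))   # arrival timing varies wildly

def dac_side():
    """Dole blocks out on a steady metronome - this is where the clock matters."""
    period = 0.02                                # fixed playback tick
    next_tick = time.monotonic()
    for _ in range(50):
        block = bucket.get()                     # waits if the bucket runs dry
        next_tick += period
        time.sleep(max(0.0, next_tick - time.monotonic()))
        # ...hand 'block' to the DAC here, exactly one block per tick

threading.Thread(target=network_side, daemon=True).start()
dac_side()
print("Done: irregular arrivals in, evenly clocked blocks out.")
```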

Everything else, IMHO, is about keeping noise out of the AC lines and interconnects.

As an example, I've watched my Internet fail and fail over to a backup Internet provider. The process took around 70 seconds, during which I had no Internet. My music never stopped playing.