Digital Coax Cable - Does the Length Make a Difference?


Someone (I don't remember who) recently posted something here stating:
"Always get 2 meter length digital cables because I have experienced reflection
problems on short cables, resulting in smeared sound and loss of resolution."

With all due respect to the member that posted this, I'm not trying to start a
controversy, just wondering if others have experienced this.  

I will be looking for a digital coax cable soon to run from my Node 2i to a DAC.
I really only need 1/2 meter. Not sure if a short cable like this is a problem or 
just a case of Audio Nervosa.  

ericsch

Showing 4 responses by kijanki

It might sound counterintuitive, but if you cannot use a short cable (<1 ft), then you have to use a long one (>1.5 m).  Most of the time 1.5 m is the minimum, so 2 m would be safer.
@ericsch  We want to avoid reflections from the end of the cable.  These reflections appear at characteristic impedance boundaries (changes).  Characteristic impedance is set by the geometry and dielectric of the cable and is roughly equal to SQRT(L/C).
We want smooth, fast transitions of the digital signal.  A reflection can alter them by coming back and adding to the signal.  A typical transition in an average CDP takes about 25-30 ns, and the threshold (the point of level recognition) lies at about the mid-voltage point, roughly halfway through the transition.  So we don't want the reflection back within 30 ns / 2 = 15 ns.  The reflection starts at the beginning of the transition; it has to travel to the end of the cable, reflect, and come back.  15 ns is equivalent to about 3 m at a 5 ns/m signal speed, meaning 1.5 m each way.
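The round-trip arithmetic above can be sketched in a few lines. The 30 ns transition time and the 5 ns/m propagation speed are the figures assumed in the post, not measured values:

```python
# Back-of-envelope math for the "long cable" rule above.
transition_ns = 30.0    # assumed S/PDIF transition (rise) time
speed_ns_per_m = 5.0    # assumed signal speed in coax, ~5 ns per meter

# The receiver recognizes the level near mid-transition, so a reflection
# is harmless only if it arrives back after about half the transition time.
safe_return_ns = transition_ns / 2              # 15 ns

# The reflection travels to the far end and back: round trip = 2 * length.
round_trip_m = safe_return_ns / speed_ns_per_m  # 3 m of total travel
min_cable_m = round_trip_m / 2                  # 1.5 m each way

print(min_cable_m)  # 1.5
```

With a 1.5 m minimum, the extra margin of a 2 m cable is why that length keeps coming up.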

Now, the case for the short cable.  Reflections start appearing when the travel time exceeds 1/8 of the transition time (the cable becomes a transmission line).  In our case that is 30 ns / 8 = 3.75 ns, equivalent to about 0.75 m.  For a 25 ns transition it would be closer to 0.5 m.  This distance includes the internal wiring of the transport and the receiver (DAC), so I would not go beyond a 1 ft cable.  The 1/8 figure is a rule of thumb; strictly, you would find the fastest transition, calculate the corresponding frequency, then the wavelength, and keep the run under 1/10 of it. 
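The 1/8 rule of thumb above works out like this (again assuming the post's 5 ns/m propagation speed):

```python
# Back-of-envelope math for the "short cable" rule above.
speed_ns_per_m = 5.0   # assumed signal speed in coax, ~5 ns per meter

def max_short_run_m(transition_ns):
    """Longest total run (cable plus internal wiring) before the link
    starts acting as a transmission line: travel time < transition/8."""
    max_travel_ns = transition_ns / 8
    return max_travel_ns / speed_ns_per_m

print(max_short_run_m(30.0))  # 0.75 (m)
print(max_short_run_m(25.0))  # 0.625 (m)
```

Since several inches of that budget are eaten by wiring inside the transport and the DAC, the usable cable ends up well under the computed figure, hence the "no more than 1 ft" advice.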
I would only add that it seems logical for the D/A converter clock to be synchronized with the incoming data samples (to avoid losing data).  With S/PDIF, the D/A clock is adjusted to the average data rate of the incoming stream, while with asynchronous USB the D/A converter runs at the rate of its own fixed, stable clock.  The DAC receives data in packets (frames), places them in a buffer, and signals back "give me more (or less) in the next frame."  Synchronous USB is pretty bad, since the D/A conversion rate depends on the average rate of data coming from the computer, a rate that is very uneven.  As Jaytor mentioned, a USB connection can inject either computer noise or ambient electrical noise picked up by the cable.

Optical cable might seem better than coax, since it injects no electrical noise and has no transmission-line effects (reflections).  The problem is that the transitions produced by the transmitting optical diode are slow.  When a slow transition crosses the threshold point in a noisy system (on either end), the moment of level recognition varies from edge to edge, which shows up as jitter.