Why do digital cables sound different?


I have been talking to a few e-mail buddies and have a question that isn't being satisfactorily answered thus far. So...I'm asking the experts on the forum to pitch in. This has probably been asked before but I can't find any references for it. Can someone explain why one DIGITAL cable (coaxial, BNC, etc.) can sound different from another? There are also similar claims for Toslink. In my mind, we're just trying to move bits from one place to another. Doesn't the digital stream get reconstituted and re-clocked on the receiving end anyway? Please enlighten me and maybe send along some URLs for my edification. Thanks, Dan
danielho
Original post on 12-01-00: Why do digital cables sound different? After much discussion, may I venture that, as of 07-02-09, the question has not been answered?
Could it be possible that no one knows? 8^)
Rja

The question has been answered many times. Digital cables introduce jitter. Jitter creates sidebands at very low levels that are not harmonically related to the root frequency. It is basically noise in the time domain. That noise reduces resolution, imaging, clarity, etc.

Jitter is induced in coaxial cables by external noise or characteristic impedance mismatch (the signal reflects at impedance boundaries). In Toslink, jitter is induced by system noise in the presence of slow rise/fall times (slow transmitters and receivers).
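The sideband claim can be sketched numerically. Below is a minimal Python simulation (all values hypothetical, chosen only to make the effect visible above the FFT floor): sampling a 1 kHz tone with 100 Hz sinusoidal clock jitter produces spurs at 900 Hz and 1100 Hz, which are not harmonically related to 1 kHz.

```python
import numpy as np

# Hypothetical illustration: sample a 1 kHz tone at instants perturbed by
# sinusoidal clock jitter, then inspect the spectrum for sidebands.
fs = 48_000          # nominal sample rate, Hz
f0 = 1_000           # audio tone, Hz
fj = 100             # jitter modulation frequency, Hz
tau = 2e-9           # peak timing error, s (2 ns: large, chosen for visibility)
n = 1 << 16

t_ideal = np.arange(n) / fs
t_jittered = t_ideal + tau * np.sin(2 * np.pi * fj * t_ideal)
x = np.sin(2 * np.pi * f0 * t_jittered)   # tone sampled at the wrong instants

spectrum = np.abs(np.fft.rfft(x * np.hanning(n)))
freqs = np.fft.rfftfreq(n, 1 / fs)

def level_db(f):
    """Peak spectrum level near frequency f, relative to the strongest bin."""
    i = int(np.argmin(np.abs(freqs - f)))
    peak = spectrum[max(i - 3, 0):i + 4].max()
    return 20 * np.log10(peak / spectrum.max())

# Sidebands appear at f0 +/- fj, not harmonically related to f0:
for f in (f0, f0 - fj, f0 + fj):
    print(f"{f:5d} Hz: {level_db(f):7.1f} dB")
```

Halving the jitter amplitude drops the sidebands by 6 dB. Real cable-induced jitter is broadband rather than a single tone, so instead of discrete spurs it smears into a noise skirt around every audio component, which fits the "noise in the time domain" description above.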

It is possible that we could summarize a few points:

If your DAC re-clocks really, really well (and most modern ones now focus on that to varying degrees)

and

If your SPDIF cable is 2 meters or so (not less)

and

it actually maintains a 75 ohm characteristic impedance within very tight tolerances along its entire length

and

its connectors and termination also maintain that 75 ohm impedance

then the difference in cables should be pretty minimal.
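The length and impedance points above are usually justified by reflection timing: a mismatch at the DAC input reflects part of the edge back to the transport, where it re-reflects and arrives at the receiver delayed by one cable round trip. A back-of-envelope sketch, assuming a typical coax velocity factor of 0.66 and a 25 ns transition detected near its midpoint (assumed figures, not measurements from this thread):

```python
# Back-of-envelope sketch of the common "why ~1.5-2 m" argument for S/PDIF
# coax.  A reflection from a mismatched DAC input re-reflects at the
# transport and returns delayed by one cable round trip; if that delay puts
# it past the point where the receiver detects the edge (roughly
# mid-transition), it cannot shift the recovered clock.
C = 3.0e8                      # speed of light, m/s
velocity_factor = 0.66         # typical coax dielectric (assumed)
rise_time = 25e-9              # typical transport rise time, s (assumed)
detect_point = rise_time / 2   # edge detected near mid-transition

v = velocity_factor * C        # propagation speed in the cable

for length in (0.5, 1.0, 1.5, 2.0):
    round_trip = 2 * length / v
    verdict = "misses the edge" if round_trip > detect_point else "hits the edge"
    print(f"{length:4.1f} m: reflection returns after {round_trip*1e9:5.1f} ns -> {verdict}")
```

On these assumed numbers, a 0.5 m or 1 m cable folds the reflection back into the transition itself, while 1.5 m or more pushes it past the detection point, which is the usual rationale for "2 meters or so (not less)".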

In that sense are we getting somewhere over the 8 / 9 years?
Lightminer - If the DAC re-clocks like the Benchmark (asynchronous upsampling), then the quality of the cable does not matter. Benchmark tested it with a thousand feet of CAT5 network cable and found no audible effect. The reason is the Benchmark's jitter bandwidth of a few Hz, which provides suppression on the order of -100dB at the frequencies of interest (kHz), on top of an already low jitter level (on the order of -80dB).
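The quoted suppression can be sanity-checked with a simple jitter-transfer model. Assuming a first-order low-pass with a hypothetical 3 Hz corner (my assumption for illustration, not a published Benchmark figure), attenuation at kHz jitter rates comes out around -60 dB; reaching the -100 dB mentioned above implies a steeper (higher-order) loop or an even lower corner:

```python
import math

# Sketch of why a few-Hz jitter bandwidth helps.  Model: incoming jitter is
# attenuated by a first-order low-pass; the 3 Hz corner is an illustrative
# assumption, not a measured value.
def jitter_attenuation_db(f_jitter_hz, corner_hz=3.0):
    """Attenuation of incoming jitter at f_jitter for a 1st-order low-pass."""
    ratio = f_jitter_hz / corner_hz
    return -10 * math.log10(1 + ratio * ratio)

for f in (10, 100, 1_000, 10_000):
    print(f"{f:6d} Hz jitter: {jitter_attenuation_db(f):7.1f} dB")
```

The attenuation grows 20 dB per decade above the corner, so a narrower jitter bandwidth buys suppression exactly where jitter sidebands would land in the audio band.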

For traditional non-upsampling DACs I would look for a 1.5m coax cable (Toslink gives about 2x the jitter), double or even triple shielded, with high-quality connectors. I would also look for a transport with a rise time on the order of 5ns instead of the typical 25ns, and a very good power supply (to minimize system-noise-induced jitter).
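The rise-time recommendation follows from how noise converts to jitter at the receiver threshold: to first order, the timing error is the noise voltage divided by the slew rate through the transition. A sketch with assumed numbers (nominal 0.5 V S/PDIF swing into 75 ohms, a hypothetical 10 mV of noise at the threshold):

```python
# Sketch of why faster edges reduce noise-induced jitter.  The swing is the
# nominal consumer S/PDIF level; the noise figure is an assumption chosen
# only to illustrate the scaling.
swing = 0.5        # S/PDIF swing, V
noise = 0.010      # assumed noise at the receiver threshold, V

for rise_time in (25e-9, 5e-9):          # typical vs fast transport
    slew = swing / rise_time             # V/s through the transition
    jitter = noise / slew                # first-order timing error, s
    print(f"rise {rise_time*1e9:4.1f} ns -> ~{jitter*1e12:5.0f} ps jitter per 10 mV noise")
```

A 5x faster edge gives 5x less jitter for the same noise, which is the point of preferring a ~5 ns transport over a ~25 ns one.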
So may I assume that the less jitter a digital cable induces, the "better" that cable will "sound"?

Is this induced jitter measurable from cable to cable? Could an absolute "best" be established using the lowest-jitter criterion? And if this is true, why are so many digital cables (hundreds or more) available?

Or, is this purely a theoretical discussion?