Why do digital cables sound different?


I have been talking to a few e-mail buddies and have a question that isn't being satisfactorily answered thus far. So...I'm asking the experts on the forum to pitch in. This has probably been asked before, but I can't find any references for it. Can someone explain why one DIGITAL cable (coaxial, BNC, etc.) can sound different from another? There are also similar claims for Toslink. In my mind, we're just trying to move bits from one place to another. Doesn't the digital stream get reconstituted and re-clocked on the receiving end anyway? Please enlighten me, and maybe send along some URLs for my edification. Thanks, Dan
danielho

Showing 4 responses by knownothing

Jstropp - read my user name. Please improve the situation - how exactly is music represented digitally?
Mapman, are you saying digital signals in a coax cable, for example, are harder to mess with than pure analog signals, and so theoretically the cable construction should have less impact on the resulting sound? I.e., it is harder to mess up encoded 1s and 0s in transit through a wire than the analog waveforms of a bass drum or a cymbal. I have found differences in sound performance between different USB and HDMI cables, but those are different animals than coax.

kn
It seems to me that this is more of an issue with data loss than with timing errors, although I have read in other threads regarding transport quality that re-clocking can only go so far, and that the more conservative the initial translation and transport of the digital signal, the better the outcome, even with the most successful implementation of re-clocking on the DAC end of things.
Kijanki, let me make sure I am tracking with you. You are saying that jitter is important, that jitter can also result from cable-induced errors, and that re-clocking at the DAC does not necessarily correct all (or any) errors related to jitter that occur during delivery of the raw digital signal through a cable. Does it follow that some digital cables are better at delivering digital signals free of, or with less, added jitter?
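To put rough numbers on why jitter matters at all, here is a toy sketch (my own illustration, not anything from this thread): it samples a 1 kHz test tone once with an ideal clock and once with clock edges smeared by 2 ns RMS of random timing error, then measures the resulting amplitude error. The tone frequency, jitter figure, and sample count are all made-up values for illustration.

```python
import math
import random

# Hypothetical parameters, chosen only for illustration.
fs = 44_100.0   # sample rate, Hz
f = 1_000.0     # test tone frequency, Hz
n = 4_096       # number of samples
sigma = 2e-9    # 2 ns RMS clock jitter

random.seed(0)

err_sq = 0.0
for i in range(n):
    t = i / fs
    jitter = random.gauss(0.0, sigma)          # random timing error on this edge
    ideal = math.sin(2 * math.pi * f * t)      # sample taken at the nominal instant
    smeared = math.sin(2 * math.pi * f * (t + jitter))  # sample taken at the jittered instant
    err_sq += (ideal - smeared) ** 2

# The DAC plays the jittered samples back at the nominal instants,
# so the timing error shows up as an amplitude error in the output.
err_rms = math.sqrt(err_sq / n)
print(f"RMS amplitude error from 2 ns of clock jitter: {err_rms:.2e}")
```

The point of the sketch is only that jitter converts timing uncertainty into a small amplitude error (it grows with signal frequency), which is distortion the bits themselves never show; the bits can all arrive intact and the analog output can still differ.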

On a related note, in theory or in measurement, can a digital signal be corrupted in a cable, say due to exposure to a strong EMF, to the point where 1s and 0s are actually dropped or unreadable at the DAC, i.e., outright data loss?
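As a footnote to that question, here is a toy sketch (my own, with arbitrary numbers) of what a single corrupted bit would do to one 16-bit PCM sample if it did get through unrepaired: the damage depends entirely on which bit flips, from inaudible to a loud click.

```python
# Hypothetical 16-bit PCM sample value, chosen arbitrarily.
sample = 12_345

# XOR with a single-bit mask flips exactly that one bit.
lsb_flipped = sample ^ (1 << 0)    # flip the least significant bit
msb_flipped = sample ^ (1 << 14)   # flip a high-order magnitude bit

print(sample, lsb_flipped, msb_flipped)
print("LSB error:", abs(lsb_flipped - sample))   # changes the value by 1
print("high-bit error:", abs(msb_flipped - sample))  # changes the value by 16_384
```

So a bit error in the audio data is not a subtle "different sound" in the cable-flavor sense; it is either negligible or an obvious tick/dropout, which is one reason the jitter explanation above is usually offered instead.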