Why do digital cables sound different?


I have been talking to a few e-mail buddies and have a question that isn't being satisfactorily answered thus far. So...I'm asking the experts on the forum to pitch in. This has probably been asked before, but I can't find any references for it. Can someone explain why one DIGITAL cable (coaxial, BNC, etc.) can sound different from another? There are also similar claims for Toslink. In my mind, we're just trying to move bits from one place to another. Doesn't the digital stream get reconstituted and re-clocked on the receiving end anyway? Please enlighten me, and maybe send along some URLs for my edification. Thanks, Dan
danielho
Sean,

Can I still have your tax guy's number? Just kidding.
I agree...audio and high speed are two different worlds
and I do get them confused sometimes...

Forever,

I think you brought up some very good points. The more
I think about it, the more I'm thinking a perfect length
of wire might not be ideal for audio. Hmmm... So what do
you say, guys, let's sell different-length MATCHED cables
and get rich?

Best regards
Forever, are you saying the reason the systems you cite as sounding lousy was the attitudes of the owners, or that they chose bad cables? I doubt in either case they were using lamp cord.

steve.
I am not well read on auditory theory, but I seem to remember reading somewhere that auditory memory is good for a few seconds at most. Also that non-blind tests are statistically worthless, since visual clues, foreknowledge, and so forth demonstrably invalidate the test. Also, that expectation ("I am now changing the cable") leads to perceived differences in sound, even when the cable is not actually changed. Or do we dismiss this overwhelming scientific evidence under the illogical rubric that music is so very complex and objective understanding so very limited that subjective experience outweighs it, and hence cables must make a difference?
"Yes" is the short answer. I don't necessarily agree, but that seems to be the point of view of this community.
I'm jumping into this late but here goes.

First, don't assume that people who can't hear the difference in digital cables have a bad system or bad ears. Maybe they are just fortunate that their system components match well enough that the cable isn't much of an issue. Having a system that is very sensitive to cable changes may not always be such a good thing.

Second, I've tested a few ADCs (similar to DACs), and clock jitter is extremely important in reducing harmonic distortion at audio accuracy levels. As mentioned above, errors in the time position of the sine wave samples will distort the sine wave, just as voltage errors will.
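To put a rough number on that, here's a little Python sketch (my own illustration, with made-up tone frequency and jitter figures) that samples a sine wave with an ideal clock and with a clock perturbed by Gaussian timing jitter, then measures the error between the two:

```python
import math
import random

def sample_sine(freq_hz, fs_hz, n, jitter_rms_s=0.0, seed=0):
    """Sample a unit-amplitude sine at nominal rate fs_hz; each sample
    instant is perturbed by Gaussian timing jitter (rms jitter_rms_s
    seconds), modeling an imperfect recovered clock."""
    rng = random.Random(seed)
    return [math.sin(2 * math.pi * freq_hz *
                     (k / fs_hz + rng.gauss(0.0, jitter_rms_s)))
            for k in range(n)]

def rms_error(a, b):
    """RMS difference between two equal-length sample sequences."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# 10 kHz tone sampled at 44.1 kHz: ideal clock vs. 1 ns rms jitter.
ideal = sample_sine(10_000, 44_100, 4096)
jittered = sample_sine(10_000, 44_100, 4096, jitter_rms_s=1e-9)
err = rms_error(ideal, jittered)
# Small-signal estimate: err ~ 2*pi*f*sigma / sqrt(2), around 4e-5 here,
# so timing errors corrupt the waveform just as voltage errors would.
```

Note the scaling: the error grows with signal frequency, which is why jitter hurts most at the top of the audio band.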

We're really talking about two issues: recovering the data from a serial bit stream, and recovering the clock from that same stream.
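For what it's worth, S/PDIF makes the "same stream" trick work with biphase-mark coding: every bit cell starts with a transition (that carries the clock), and a '1' gets an extra mid-cell transition (that carries the data). A toy Python encode/decode of that line code, just my own sketch of the idea:

```python
def bmc_encode(bits, level=0):
    """Biphase-mark code: emit two half-cell levels per bit. A
    transition at every cell boundary carries the clock; an extra
    mid-cell transition marks a 1."""
    out = []
    for b in bits:
        level ^= 1            # boundary transition (always present)
        out.append(level)
        if b:
            level ^= 1        # mid-cell transition only for a 1
        out.append(level)
    return out

def bmc_decode(halves):
    """A bit is 1 if its two half-cell levels differ, else 0."""
    return [1 if halves[i] != halves[i + 1] else 0
            for i in range(0, len(halves), 2)]

data = [1, 0, 1, 1, 0, 0, 1]
assert bmc_decode(bmc_encode(data)) == data
```

Because there is at least one transition per bit cell regardless of the data, the receiver always has edges to lock its recovered clock onto.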

Although data bit errors should be rare in a well-designed and well-matched system, they cause serious problems when they do occur. As noted above, there is no error detection or recovery mechanism for data errors. Every serial data transmission scheme misses data occasionally, and the measure of that is the "bit-error rate".
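Here's a back-of-the-envelope Python model of bit-error rate (the random-flip channel and the 1e-4 flip probability are made up purely for illustration; real links are far better than this):

```python
import random

def flip_bits(bits, p, seed=0):
    """Toy channel: flip each bit independently with probability p."""
    rng = random.Random(seed)
    return [b ^ 1 if rng.random() < p else b for b in bits]

def bit_error_rate(sent, received):
    """Fraction of bits that differ between two equal-length streams."""
    errors = sum(1 for a, b in zip(sent, received) if a != b)
    return errors / len(sent)

rng = random.Random(1)
sent = [rng.randint(0, 1) for _ in range(1_000_000)]
received = flip_bits(sent, p=1e-4, seed=2)
ber = bit_error_rate(sent, received)
# The measured ber lands near the injected flip probability of 1e-4.
```

At 16 bits x 44,100 samples/s x 2 channels, even a BER this "low" would mean over a hundred corrupted samples per second, which is why real links need a BER many orders of magnitude smaller.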

Clock recovery circuit design is almost an art. I'm not familiar with the CD standards; I assume the DAC's clock recovery circuit uses a PLL. Is this correct? The PLL will filter out a lot of the high-frequency jitter, and better PLL designs will filter out more.
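I'm no PLL designer, but a crude first-order loop model in Python shows the filtering idea (the loop gain of 0.05 is arbitrary, chosen only to make the effect visible):

```python
import math
import random

def pll_track(phase_in, loop_gain):
    """First-order PLL model: the output phase chases the input phase.
    The loop acts as a low-pass filter, so wideband input jitter above
    the loop bandwidth is attenuated in the recovered clock."""
    out = 0.0
    result = []
    for p in phase_in:
        out += loop_gain * (p - out)   # step toward the input phase
        result.append(out)
    return result

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

rng = random.Random(0)
# White phase jitter on the incoming edges (arbitrary units).
jitter_in = [rng.gauss(0.0, 1.0) for _ in range(10_000)]
jitter_out = pll_track(jitter_in, loop_gain=0.05)
# The narrow loop passes only a small fraction of the wideband jitter.
```

The trade-off is that a narrower loop rejects more incoming jitter but locks more slowly and tracks legitimate clock drift less well, which is one reason the design is "almost an art".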

We have equipment to measure jitter where I work. The accuracy is in the tens of picoseconds, and the inputs are all 50 ohm. This equipment should be usable for testing transport/cable jitter. Does anyone have ideas for some easy experiments using it? It might be interesting to measure several types of transports.