Does the quality of a digital signal matter?


I recently heard a demonstration where a CD player was played with and without the support of three Nordost Sort Kones. The difference was audible to me, but it did not blow me away.

I was discussing the Sort Kones with a friend of mine who is an electrical engineer and also a musical audio guy. It was his opinion that these items could certainly make an improvement in an analogue signal, but shouldn't do anything for a digital signal. He said that as long as the component receiving the digital signal can recognize a 1 or a 0, the signal is successful. It's a pass/fail situation and doesn't rely on levels of quality.

An example that he gave me was that we think nothing of using a cheap CD-RW drive to duplicate a CD, with no worry about the quality being reduced. If the signal isn't read in full, an error is reported, so we know that the entire signal has been sent.
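His CD-RW example can be made concrete: a bit-perfect copy has exactly the same checksum as the original, so a copy is either identical or it isn't; there is no "slightly worse" copy. A minimal sketch using Python's standard library (the file names in the comment are hypothetical):

```python
import hashlib

def file_hash(path):
    """Return the SHA-256 hex digest of a file, read in 64 KiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical rips of the same track made on two different drives.
# If both drives read every bit correctly, the digests match exactly:
# file_hash("rip_cheap_drive.wav") == file_hash("rip_expensive_drive.wav")
```

This is the pass/fail situation he describes: the digests either match or they don't.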

I believe he said that it's possible to show that a more expensive digital cable measures better than a cheaper one, but the end product doesn't change.

There was a test done with HDMI cables that tested cables of different prices. The only difference in picture quality was noted when a cable was defective and there was an obvious problem on the display.

I realize that most setups use analogue signals, but for those of us who use a receiver for our D/A conversion, does the CD player's quality matter? Any thoughts?
mceljo
Keep in mind that while digital info is just 0's and 1's, there also has to be a timing element that ensures the bits are in the correct place. It's easy to get them in order, but if your disc is spinning too fast then your read is too fast, and so on. Mind you, I'm not advocating anything, as I am suspicious of so many things here, but I could see timing perhaps being affected by cheap stuff. One problem with isolation is that there is a delay between setups and the listener often knows which is which. Are there any blind tests out there with multiple listeners and under statistical or sampling control?
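The timing concern above can be illustrated numerically. Even when every sample value is read perfectly, converting the samples at slightly wrong instants (clock jitter) changes the reconstructed waveform. The figures below (a 10 kHz tone at a 44.1 kHz rate, with made-up jitter amounts) are purely illustrative:

```python
import math
import random

def jitter_error(freq_hz=10_000.0, rate_hz=44_100.0, jitter_s=1e-9, n=1000):
    """Peak amplitude error from sampling a unit sine at jittered
    instants instead of the ideal sample clock."""
    random.seed(0)  # fixed seed so the sketch is repeatable
    worst = 0.0
    for i in range(n):
        t_ideal = i / rate_hz
        t_real = t_ideal + random.uniform(-jitter_s, jitter_s)
        err = abs(math.sin(2 * math.pi * freq_hz * t_real)
                  - math.sin(2 * math.pi * freq_hz * t_ideal))
        worst = max(worst, err)
    return worst

# The bits (sample values) are unchanged; only the conversion clock moved.
# The error grows with signal frequency and with the amount of jitter.
```

Whether nanosecond-scale jitter is audible is exactly the kind of question that needs the blind tests asked about above; the sketch only shows that timing error produces a nonzero analog error even with perfect bits.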
I'm glad to see some objective insight on this subject. Keep in mind that I'm making a clear distinction between the two cases of having analog vs. digital being output from the CD player.

In my mind, a clean analog system would be the following:

(1) turntable - pre-amp - amp - speakers

In the digital world it would look like one of the following:

(1) CD player (DAC) - pre-amp - amp - speakers

(2) CD player - separate DAC - pre-amp - amp - speakers

(3) CD player - integrated DAC/amp / receiver

I suspect that having an analog signal go through my home theater receiver would probably cause enough degradation of the signal quality to nullify any advantage of an audiophile-grade CD player.

I do not think that timing would be an issue for CDs, as most drives can read a disc much faster than is required; the extra data is held in what is known as a buffer, and we all know what happens on YouTube when the buffer isn't adequate.
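That buffering idea can be sketched as a toy model: the drive delivers data in irregular bursts, while playback drains the buffer at a fixed rate, so the output clock, not the disc, sets the timing. All numbers here are arbitrary illustration, not real CD parameters:

```python
def simulate_playback(reads, drain_per_tick=4, capacity=64):
    """Model a drive feeding a playback buffer.
    `reads` lists how many samples arrive each tick (bursty, irregular);
    the DAC drains a fixed amount every tick. Returns the number of
    underruns -- ticks where the buffer could not supply the DAC."""
    level = 0
    underruns = 0
    for burst in reads:
        level = min(capacity, level + burst)  # the drive can't overfill the buffer
        if level >= drain_per_tick:
            level -= drain_per_tick           # steady playback clock
        else:
            underruns += 1                    # audible skip
            level = 0
    return underruns

# Bursty but fast-enough reads never underrun, so read-timing
# irregularity never reaches the output:
# simulate_playback([8, 0, 8, 0, 8, 0]) -> 0
```

As long as the drive stays ahead on average, the irregular reads are invisible; only when it falls behind (an underrun) does anything change at the output.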

My thought is that someone who will be using a home theater receiver (or other options, depending on budget) would do better to put their money toward a better receiver and speakers than to invest in an expensive CD player and increase the number of things in the analog signal stream.

The issues associated with degradation begin wherever the D/A conversion happens.
Ever hear a CD "skip"? That's when the quality matters.

The distortion from a bad reading of 0's and 1's will most certainly not be euphonic.

So if your tone-deaf buddy/brother/significant other/etc. who thinks you're crazy to spend all that time and $$ on your stereo listens to your system and immediately tells you, "dude, your CD is broken," then you've got a problem with the quality of your digital signal. If such a person does not immediately point the finger at your CD player, then you don't have a problem with your digital signal.

Don't worry, you will find other problems with your stereo--I know I have. :)
The problem with the "just 1s and 0s" argument is that it simply doesn't hold up in practice. To repeat a story I have alluded to before: years ago a large Japanese CD pressing firm sent [I think] HiFi News some different pressings of the same CD, some made with standard material and some with a mix of materials that would cost slightly more, which they were attempting, unsuccessfully, to get the record companies to adopt. There was such a huge difference in sound between them that they had to load them into a computer to see if the data was the same. It was exactly the same. If digital is so foolproof, what made the difference?

The laser system is a mechanical one and constitutes a change from analog to digital. The pits are not ones and zeros but REPRESENT ones and zeros, in the same way the grooves in an LP represent sound waves. Many mechanical factors can interfere with the ability to correctly read the pits and translate them into a digital signal.