Does the quality of a digital signal matter?


I recently heard a demonstration where a CD player was played with and without being supported by three Nordost Sort Kones. The difference was audible to me, but it did not blow me away.

I was discussing the Sort Kones with a friend of mine who is an electrical engineer and also a musical audio guy. It was his opinion that these items could certainly make an improvement in an analogue signal, but shouldn't do anything for a digital signal. He said that as long as the component receiving the digital signal can recognize a 1 or a 0, then the signal is successful. It's a pass/fail situation and doesn't rely on levels of quality.

An example that he gave me was that we think nothing of using a cheap CD-RW drive to duplicate a CD, with no worry about the quality being reduced. If the signal isn't read in full, an error is reported, so we know that the entire signal has been sent.

I believe he said that it's possible to show that a more expensive digital cable measures better than another, but the end product doesn't change.

There was a test done with HDMI cables at a range of prices. The only difference in picture quality was noted when a cable was defective and there was an obvious problem on the display.

I realize that most people use analogue signals, but for those of us who use a receiver for our D/A conversion, does the CD player's quality matter? Any thoughts?
mceljo
From How Stuff Works:

"In analog technology, a wave is recorded or used in its original form. So, for example, in an analog tape recorder, a signal is taken straight from the microphone and laid onto tape. The wave from the microphone is an analog wave, and therefore the wave on the tape is analog as well. That wave on the tape can be read, amplified and sent to a speaker to produce the sound.

In digital technology, the analog wave is sampled at some interval, and then turned into numbers that are stored in the digital device. On a CD, the sampling rate is 44,000 samples per second. So on a CD, there are 44,000 numbers stored per second of music. To hear the music, the numbers are turned into a voltage wave that approximates the original wave."

What this means is that until the numbers of the digital signal are converted back to an analog voltage wave via a D/A converter, the only thing that matters is that the signal be transferred.
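For anyone curious, here's a rough Python sketch of that sampling idea. It's purely illustrative (the tone, duration, and names are made up, not how a real player works), but it shows that once the wave becomes numbers, all that matters is delivering those numbers intact until the D/A step:

    import math

    SAMPLE_RATE = 44_100   # a real CD uses 44,100 samples per second, 16 bits per sample
    FREQ = 1_000.0         # a made-up 1 kHz test tone
    DURATION = 0.001       # one millisecond of audio

    # "Record": turn the analog wave into a list of 16-bit integers.
    samples = [
        round(32767 * math.sin(2 * math.pi * FREQ * n / SAMPLE_RATE))
        for n in range(int(SAMPLE_RATE * DURATION))
    ]

    # Until playback these are just numbers; any transfer that delivers
    # them unchanged is a "perfect" transfer.
    copied = list(samples)
    assert copied == samples

    # "Play back": the D/A converter turns each number back into a voltage.
    voltages = [s / 32767 for s in copied]   # normalized to the range -1..+1
    print(voltages[:5])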

Maybe this is why my CD player recommends using a digital connection to my receiver rather than analog. At my equipment level, the preservation of the analog signal isn't able to match bypassing the component altogether and "shortening" the path of the analog signal.
This is something laypeople never seem to grasp. The entire advantage of digital is just that: signal quality matters much less than with analog.

This is a FACT.

The idea of representing information as 1's and 0's means that information can be stored and transmitted with no loss - something that is IMPOSSIBLE to achieve with analog.

So I would say YES - signal quality is much less of a factor in digital than in analog.

In fact, the biggest source of quality differences with digital audio is the conversion back to analog - this is where differences are audible, in the quality of the D-to-A converter.

You can copy a CD with a cheap drive 1,000 times (a copy of a copy) and it will be the same; however, it will sound better played through a dedicated high-quality DAC or a good-quality CD player.
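Here's a toy Python illustration of that copy-of-a-copy point. The data is just a made-up stand-in for what's on a disc, and real ripping involves error correction, but the principle is the same: once the bits are read correctly, every generation is identical.

    import hashlib

    original = bytes(range(256)) * 1000       # stand-in for audio data read from a disc

    copy = original
    for _ in range(1000):                     # 1,000 copy generations, each made from the last
        copy = bytes(bytearray(copy))         # an explicit byte-for-byte duplicate

    # The checksum of the 1,000th-generation copy matches the original exactly.
    assert hashlib.sha256(copy).digest() == hashlib.sha256(original).digest()
    print("1,000th-generation copy is bit-identical to the original")
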
CDs are not 0s and 1s. They are pits burnt into the metal layer and are measured for length by the laser and then converted into a digital format. Then they are run through a digital-to-analog converter.
If Shadorne's argument is correct, then why should the quality of the DAC or CD player matter? Even cheap ones seem to measure very well. "Signal quality much less of a factor than in analog"? No wonder my LPs sound so much better. Why do transports make such a difference? A friend of mine didn't believe they would until he heard different ones on his system. Why do the best CD playback systems cost so much unless they are susceptible to degradation just as analog is?
CDs are not 0s and 1s. They are pits burnt into the metal layer and are measured for length by the laser and then converted into a digital format.

These pits (or the transitions between them) represent bits: a 1 or a 0. All digital information must be stored in some analog, physical form, including what is on your computer's hard drive. However, the digital approach allows the use of a threshold level or clear demarcation between a 1 and a 0 that does not exist in analog approaches.

Example of a digital scheme (not from a CD):

Signal level between -0.5 and +0.5 volts = 0. Signal level between +0.51 and +1.5 volts = 1.

This means you can have a lot of analog error or noise in the medium and still get a perfect translation of the data into exactly what it should be - a 1 or a 0.
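If it helps, here is a tiny Python sketch of that thresholding idea, using made-up levels (0 V for a 0, 1 V for a 1, decided at a 0.5 V threshold): the received voltages are all over the place, yet the decoded bits come out exactly right.

    import random

    random.seed(1)                    # repeatable "noise" for the demo
    bits = [1, 0, 1, 1, 0, 0, 1, 0]

    # "Transmit": ideal levels of 0.0 V or 1.0 V, plus up to +/-0.4 V of analog noise.
    received = [b * 1.0 + random.uniform(-0.4, 0.4) for b in bits]

    # "Receive": anything above the 0.5 V threshold is read as a 1.
    decoded = [1 if v > 0.5 else 0 for v in received]

    print(received)              # the analog levels are never exact...
    assert decoded == bits       # ...but the recovered bits are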

If you add parity bits or cyclic redundancy check (CRC) bits to the data, you can improve the robustness further, allowing detection of data errors or even recovery of completely missing data.
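A single even-parity bit is the simplest version of that idea; here's a short Python sketch (real CDs actually use much stronger cross-interleaved Reed-Solomon coding, which can correct errors as well as detect them):

    def add_parity(bits):
        """Append an even-parity bit so the total count of 1s is even."""
        return bits + [sum(bits) % 2]

    def check_parity(word):
        """True if the word still has even parity (no single-bit error)."""
        return sum(word) % 2 == 0

    word = add_parity([1, 0, 1, 1, 0, 0, 1])
    assert check_parity(word)        # arrives intact: the check passes

    word[3] ^= 1                     # one bit gets flipped in transit
    assert not check_parity(word)    # the check now fails, so the error is detected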

Using the same example, compare this to an entirely analog approach, where the difference between 0.0 and 0.4 volts may be significant.