Does the quality of a digital signal matter?


I recently heard a demonstration in which a CD player was played with and without the support of three Nordost Sort Kones. The difference was audible to me, but did not blow me away.

I was discussing the Sort Kones with a friend of mine who is an electrical engineer and also a musical audio guy. It was his opinion that these items could certainly make an improvement in an analogue signal, but shouldn't do anything for a digital signal. He said that as long as the component receiving the digital signal can recognize a 1 or a 0, the signal is successful. It's a pass/fail situation and doesn't rely on levels of quality.

An example that he gave me was that we think nothing of using a cheap CD-RW drive to duplicate a CD, with no worry about the quality being reduced. If the signal isn't read in full, an error is reported, so we know that the entire signal has been sent.

I believe he said that it's possible to show that a more expensive digital cable is better than another, but the end product doesn't change.

There was a test done with HDMI cables that tested cables of different prices. The only difference in picture quality was noted when a cable was defective and there was an obvious problem on the display.

I realize that most people use analogue outputs, but for those of us who use a receiver for our D/A conversion, does the CD player's quality matter? Any thoughts?
mceljo

Showing 10 responses by shadorne

This is something laypeople never seem to grasp. The entire advantage of digital is just that: signal quality matters much less than with analog.

This is a FACT.

The idea of representing information as 1's and 0's means that information can be stored and transmitted with no loss - something that is IMPOSSIBLE to achieve with analog.

So I would say YES - signal quality is much less of a factor in digital than in analog.

In fact the biggest source of quality differences with digital audio is the conversion to analog - this is where differences are audible - in the quality of the D to A converter.

You can copy a CD with a cheap drive 1000 times (a copy of a copy) and it will be the same; however, it will sound better with a dedicated high-quality DAC or a good-quality CD player.
Jitter is not a problem with the "digital" part of digital (the robust part). Jitter is part of the analog problem with digital and can be regarded as a D-to-A problem (or, in the studio, an A-to-D problem). It is an analog timing problem whereby distortion can be introduced at the DAC/ADC stage because of drift in the clock. Accurately converting digital to analog, or analog to digital, requires an extremely accurate clock.

I stand by my statement that you can copy a copy of a digital signal, and repeat the copy with each subsequent copy thousands of times, with no degradation.

You cannot do this with any analog medium - within ten to twenty copies of a copy, the degradation becomes extremely audible (or visible, in the case of a VHS cassette).

The evidence is that digital signals are extremely robust compared to analog.
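
To make the copy-of-a-copy point concrete, here is a minimal Python sketch. The noise level, threshold, and number of generations are made-up illustrative numbers, not CD specifics; the point is only that digital copying re-decides each bit and re-transmits a clean level, while analog copying passes the accumulated noise along.

```python
import random

random.seed(42)

def noisy_channel(levels, noise=0.1):
    """Every copy pass adds a little random analog noise."""
    return [v + random.uniform(-noise, noise) for v in levels]

def digital_copy(levels):
    """Digital copying re-decides each bit against a threshold,
    then re-transmits clean nominal levels (0.0 or 1.0)."""
    return [1.0 if v > 0.5 else 0.0 for v in noisy_channel(levels)]

def analog_copy(levels):
    """Analog copying just passes the noisy signal along."""
    return noisy_channel(levels)

original = [1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0]

digital = original[:]
analog = original[:]
for _ in range(1000):            # a copy of a copy, 1000 generations
    digital = digital_copy(digital)
    analog = analog_copy(analog)

print(digital == original)       # True: the digital chain survives intact
print(analog == original)        # False: the analog chain has drifted
```

Because the per-copy noise (±0.1) can never push a clean 0.0 or 1.0 across the 0.5 threshold, the digital chain is bit-perfect after any number of generations, while the analog values random-walk away from the original.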
CDs are not literally 0s and 1s. They are pits pressed (or, on a CD-R, burnt) into a reflective layer, whose lengths are measured by the laser and then converted into a digital format.

These pits (or the transition) represent bits: a 1 or a 0. All digital information must be stored in analog form including what is on your computer hard drive. However, the digital approach allows the use of a threshold level or clear demarcation between a 1 and a 0 that does not exist in analog approaches.

An example of a digital scheme (not from a CD):

A signal level between -0.5 and +0.5 volts = 0. A signal level between +0.51 and +1.5 volts = 1.

This means you can have a lot of analog error or noise in the media and still get a perfect translation of the data as exactly what it should be - a 1 or 0.
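
A tiny Python sketch of that exact scheme (the noisy voltage values are made up for illustration):

```python
def decode_bit(volts):
    """Decode one bit using the threshold scheme described above:
    anything up to +0.5 V reads as 0, anything above it as 1."""
    return 0 if volts <= 0.5 else 1

# Nominal levels are 0.0 V and 1.0 V; the received voltages carry
# up to 0.4 V of analog noise per bit:
noisy = [0.3, 0.7, 1.4, -0.4, 0.9]

decoded = [decode_bit(v) for v in noisy]
print(decoded)   # [0, 1, 1, 0, 1] - perfect recovery despite the noise
```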

If you add parity bits or polynomial redundancy-check bits to the data, you can improve the robustness further (allowing detection of data errors, or even recovery of completely missing data).
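
As the simplest possible illustration of redundancy bits, here is a single even-parity bit in Python. (Real CDs use the far more powerful Reed-Solomon CIRC code; this sketch only shows the principle of detecting an error.)

```python
def add_parity(bits):
    """Append an even-parity bit so the total count of 1s is even."""
    return bits + [sum(bits) % 2]

def check_parity(word):
    """True if the word passes the parity check (no single-bit error)."""
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])   # -> [1, 0, 1, 1, 1]
print(check_parity(word))          # True: the clean word passes

corrupted = word[:]
corrupted[2] ^= 1                  # flip one bit in transit
print(check_parity(corrupted))     # False: the error is detected
```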

Using the same example, compare this to an entirely analog approach, where the difference between 0.0 and 0.4 volts may be significant.
If Shadorne's argument is correct then why should the quality of the DAC or CD player matter?

I think I covered this:

In fact the biggest source of quality differences with digital audio is the conversion to analog - this is where differences are audible - in the quality of the D to A converter.

Clock & converter accuracy as well as the quality of the analog circuitry in the output stage can still make a difference.

However, digital eliminates the problems of media storage degradation and analog read errors from media (dust, feedback, surface noise, pressing imperfections, pre-echo, poor channel separation, the lack of dynamic range of analog storage methods, etc.).
It seems to me that the bit-stream speed is independent of the bit content. If this is correct, then shouldn't the jitter be either constant or possibly a function of the disc itself (like radial position or burn/pressing quality)?

That was assumed when CD players were first invented. However, many things can affect the accuracy of the clock signal in the DAC. And even the bitstream is variable - error bursts and misreads may be cyclical, and perhaps only the digital "preamble" is fairly consistent - so the data may vary in certain repeating patterns.

Provided jitter is random, it is in general a negligible problem. However, when patterns occur - such as power-supply oscillations caused by the cyclical laser-servo movements that track the pits on the rotating disc - then we can get non-random jitter. Another major cause of non-random jitter may be the phase-locked loop between the master and slave clocks - in this case, the very act of trying to keep the slave clock in time with the master causes oscillatory patterns as the slave hunts back and forth trying to stay in time.

These repetitive patterns in clock-timing errors cause new oscillatory signals to appear in the analog music coming out of the DAC - sometimes called sidebands - non-harmonically related signals. It is these very small (-40 dB) but correlated sounds that become audible - usually as hash or a lack of clarity in the upper midrange and HF (although this may significantly affect the perceived sound of percussive instruments with low frequencies - like piano or drums - due to the way we "hear").

Anyway - jitter is an analog problem - it only appears upon conversion to analog or, up front, when converting analog to digital.

If you have a perfect clock then you will not have jitter.
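
This sideband effect can be simulated. The sketch below, in plain Python, samples a 1 kHz tone with a deliberately periodic (non-random) 100 Hz timing error and then measures individual DFT bins; the jitter amplitude is wildly exaggerated versus any real player, purely so the spurious tone is easy to see.

```python
import cmath, math

fs = 48_000        # sample rate (Hz)
f0 = 1_000         # audio tone (Hz)
fj = 100           # frequency of the periodic clock error (Hz)
amp_j = 10e-6      # 10 microseconds of peak timing error (exaggerated)
N = 4_800          # 0.1 s of audio; 10 Hz per DFT bin

# Ideal sampling instants, perturbed by a *periodic* clock error:
samples = []
for n in range(N):
    t = n / fs + amp_j * math.sin(2 * math.pi * fj * n / fs)
    samples.append(math.sin(2 * math.pi * f0 * t))

def bin_mag(freq):
    """Magnitude of one DFT bin at the given frequency."""
    k = round(freq * N / fs)
    return abs(sum(x * cmath.exp(-2j * math.pi * k * n / N)
                   for n, x in enumerate(samples)))

carrier = bin_mag(f0)
sideband = bin_mag(f0 + fj)   # spurious, non-harmonic tone at 1100 Hz
print(20 * math.log10(sideband / carrier))   # roughly -30 dB
```

The periodic timing error phase-modulates the tone, so sidebands appear at 1000 ± 100 Hz - frequencies that are not harmonically related to anything in the music, which is exactly why correlated jitter is audible as hash rather than masked as ordinary distortion.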
DACs have evolved to have better clocks. Early designs like Meitner's used patterns in the digital data called the "preamble" to try to achieve a more accurate clock. Others, like Lavry's, used algorithms to maintain a very slow correction pattern on the slave clock that could be filtered out. Since about 2002 the problem has been substantially addressed by "asynchronous" DACs - basically, these DACs ignore the master clock altogether, and in these designs the jitter is determined entirely by the quality of the clock in the DAC alone, and nothing upstream of it.
Not to open up another can of worms, but there are differences in the sound heard from a blank CD-R burned on a home computer and one burned on a stand-alone CD-R recorder. And yes, even when using "Exact Copy".

Since the CD copy should normally be a bit-perfect copy (you can confirm this easily using a computer), you may want to invest in a better CD player or DAC. What you are experiencing are differences in sound quality due to small differences in the disc itself, such as weight, color, coating, central-hole alignment, balance of the disc, etc. Normally a good player will be immune to such differences - it should read the bits correctly without the disc affecting the built-in DAC and low-jitter clock: it should result in identical sound.
Above, when I say "not sure", what I mean is that the software can tell the drive what to do, and I don't know if the MAX software is telling the drive not to "interpolate" uncorrectable errors. Some software will do this, and then you get an error report during ripping, so you know for sure that the original CD was not readable and your rip is not "bit perfect".
Kijanki,
The Reed-Solomon interleave is actually quite robust; however, CD players will indeed "interpolate" as a last resort when data is missing.

It is true that you might not know when your CD player is interpolating unless the disc is quite badly damaged and you get pops or clicks. Normally you should be well aware of errors when music is copied on a PC with good software (sometimes you need to set the software to warn you about read errors).

I have had some CDs that suffered CD rot - they played on a CD player but could not be copied without error on a PC. To me this means that they are beyond repair and the data cannot be recovered - but this problem is possible with any digital format that gets really badly corrupted or damaged. Under normal use with good-quality discs, one should not normally run into problems.
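
For a feel of what "interpolate as a last resort" means, here is a crude Python stand-in (not the actual CIRC concealment algorithm a player uses): an unreadable sample is replaced by the average of its readable neighbours, which hides the glitch but does not recover the true data.

```python
def conceal(samples):
    """Replace unreadable samples (None) with the average of their
    neighbours - a crude stand-in for a player's error concealment,
    used only after error correction has already failed."""
    out = list(samples)
    for i, s in enumerate(out):
        if s is None:
            left = out[i - 1] if i > 0 else 0
            nxt = samples[i + 1] if i + 1 < len(samples) else None
            right = nxt if nxt is not None else left
            out[i] = (left + right) / 2
    return out

# A short run of PCM samples with one uncorrectable read error:
damaged = [100, 120, None, 160, 150]
print(conceal(damaged))   # [100, 120, 140.0, 160, 150]
```

The concealed value (140.0) is merely plausible, not correct - which is why a rip that reports "interpolated" sectors is not bit-perfect even though it may sound fine.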
I knew when I posted it,... I would get that answer.... All I can say to you is, have you tried it? I suggest you try it for yourself.

The science is pretty clear on this. If you hear differences, then the difference in the disc material is causing poor D-to-A conversion. (The machine gets the digital info off the disc but is unable to convert it reliably without audible distortion.)

I am not surprised that there were differences - there is a lot of inadequate equipment out there, much of it at the high end. Jitter only became well understood in the mid-'90s, and it only takes poor isolation between the servo motor driving the CD lens and the clock driving the DAC to get distortion due to jitter. Since the lens servo and motor will be acting in a cyclical pattern (highly likely, since they are reading a rotating disc), these patterns can mess up the sound quality of the CD player - if you replace the disc and it behaves a little differently when rotating, then bingo: you get a slightly better or worse sound.

The solution is to get a better CD player that will read the data without affecting the quality of the D-to-A conversion.
Shadorne - I'm not sure how this interpolation works. Is it happening also when I use a program that rips a CD as data (like MAX)? I hope not. Do you know?

Not sure if the software or drive will conceal an error or not, but from what I understand, only damaged (scratched) CDs are likely to require interpolation, as the error correction (although less rigorous than on data CDs) should be more than enough for well-looked-after CDs. The CD error rate is very low - certainly much lower than the number of glitches coming out of the studio and onto the CD master. Only a few out of a thousand CDs should require interpolation in a few places (when in new and unscratched condition).