Which is more accurate: digital or vinyl?


More accurate, mind you, not better sounding. We've all agreed on that one already, right?

How about more precise?

Any metrics or quantitative facts to support your case are appreciated.
mapman

Showing 14 responses by atmasphere

Out of curiosity, what was the analog set-up that you had to compare to the high dollar digital playback?

The analog setup was a re-tipped Grasshopper 2, mounted on a Triplanar arm, which was in turn mounted on a Kuzma Stabi Reference. The arm was set up with a balanced connection driving the balanced phono section of the Atma-Sphere MP-1 preamp.

The difference between that and the Staltek system (easily the best digital I have heard so far, regardless of the digital source file) was readily audible as an increased smoothness and level of detail on the part of the LP. For the most part ticks and pops did not give away the vinyl either; if things are set up correctly the vinyl rig will not enhance ticks and pops (although in my opinion many rigs do have trouble with this, which is one important reason digital has done as well as it has).
It's a matter of degree though. You could say that the record is made of individual molecules, but they fit together perfectly and occur at a scale that makes it insignificant. Similar with digital. It all depends on sampling frequency, sample size (and accuracy of the device that creates the samples). Gets back to the Nyquist Theorem or similar models, assuming the minority opinion perhaps that Nyquist does not cut it as the basis for the CD format. I think it is an extremely close call in theory, especially for younger, better ears (although older ears are better trained perhaps, even if not able to hear above 12-14kHz or so in general), but a good one in practice. Plus, as time goes on and technology improves and becomes more affordable, the bar can be raised further if needed until it finally becomes clearly insignificant, like those molecules.

Digital is clearly improving all the time. The vinyl format stopped getting better probably almost 50 years ago now. The conclusion down the road seems inevitable if not already the case.

We had a $72,000 digital system at a recent show. I had heard an earlier version several years earlier, and back then it was easily the best I had heard; this new version was even better. The designer was in the room with us; he has an LP system (a good sign), and upon hearing our analog system he turned to me, said 'digital has such a long way to go', and sighed...

Molecules are no comparison to bits; it's really not an acceptable analogy.

Analog also continues to improve :) It did not stop at some sort of roadblock 50 years ago. I guess you could say it's my opinion that analog is much more accurate. Also, I try to be careful not to assume that one example of the medium, whatever it is, is representative of the whole, just as one playback of such is not either.
I am very used to hearing things that are hard to see on the oscilloscope. Interestingly, instrument manufacturers use a different rule for scanning a signal. The rule of thumb is 10x the highest frequency to be displayed.

This is quite different from what we see in audio, where Redbook only asks for 2x the highest frequency to be reproduced.

The instrument manufacturers use higher scan frequencies in order to maintain waveform fidelity. This is not the case in audio; it seems to me that audio reproduction has been treated as the poorer cousin.
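To put rough numbers on those two rules of thumb (this is just arithmetic, not a claim about any particular converter), here is a quick sketch of how many raw samples actually land on each cycle of a tone at the Redbook rate versus a 10x instrumentation rate:

```python
# Rough arithmetic behind the 2x (Redbook) vs 10x (instrumentation) rules of
# thumb: how many raw samples land on each cycle of a given tone.

def samples_per_cycle(sample_rate_hz: float, tone_hz: float) -> float:
    """Raw samples captured per period of the tone."""
    return sample_rate_hz / tone_hz

for tone in (1_000, 7_000, 15_000, 20_000):
    redbook = samples_per_cycle(44_100, tone)      # CD audio
    scope   = samples_per_cycle(tone * 10, tone)   # 10x instrumentation rule
    print(f"{tone:>6} Hz tone: {redbook:5.1f} samples/cycle at 44.1 kHz, "
          f"{scope:4.1f} at a 10x scope rate")
```

A proper reconstruction filter can in principle rebuild a bandwidth-limited sine from anything over 2 samples per cycle, but the raw record at 2-3 samples per cycle looks nothing like the waveform you would want to see on a scope.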
When you bake a tape, it takes a few years for it to regain the moisture chased out while baked. IOW, you have plenty of time to work with the tape- certainly more than 48 hours.

The reason you have to bake them has nothing to do with whale oil :) Modern tapes are made with polyesters, which can absorb moisture at the ends of broken molecular strands. The water molecule allows the magnetic substrate to come unglued. Baking chases out the moisture so the substrate can function normally.

Older tapes from the 1950s were made with acetate. Acetate does not have the moisture issue, so although they have lower performance and break easily, they do store much better.
Yes. Nyquist assumes an analog sample of unlimited resolution, not a 16-bit sample. Its application to digital audio is thus not exact. Ah, people don't like to talk about this! Or they do but it just turns into a ridiculous argument. But I suggest anyone look into the life of Nyquist:
http://en.wikipedia.org/wiki/Harry_Nyquist

(you will note that Nyquist had no concept of digital audio back when he proposed his sampling theorem)

and

http://en.wikipedia.org/wiki/Nyquist%E2%80%93Shannon_sampling_theorem#The_sampling_process

If you read carefully, you will note that the samples are not defined as '16 bit', instead they are samples of the 'bandwidth-limited' signal, which have an analog value.

Now 16 bits can define a fairly precise value, but that is by no means the same as saying it can define the exact value. Further, the significance of 'bandwidth limited' should not be ignored. Current Redbook specs put the sampling frequency at 44.1kHz; if you think about it, the significance is that anything above about 19-20kHz is ignored. It is not so much that Nyquist is out to lunch as that the Redbook specs are poorly applied.
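For reference, the textbook numbers behind those Redbook specs look like this (the 2 V full scale is an arbitrary choice, just to give the quantization step a physical size):

```python
# Back-of-envelope numbers for the Redbook format discussed above
# (assumed: 16-bit samples, 44.1 kHz sample rate, 2 V peak-to-peak full scale
#  chosen only for illustration).

bits = 16
sample_rate = 44_100          # Hz
full_scale_vpp = 2.0          # illustrative full-scale voltage

levels = 2 ** bits                    # 65,536 discrete amplitude values
step = full_scale_vpp / levels        # smallest representable difference
nyquist = sample_rate / 2             # nothing above this survives
ideal_snr_db = 6.02 * bits + 1.76     # textbook quantization SNR for a sine

print(f"amplitude levels : {levels}")
print(f"quantization step: {step * 1e6:.1f} microvolts (per 2 Vpp full scale)")
print(f"Nyquist frequency: {nyquist / 1000:.2f} kHz (anti-alias filter sits just below)")
print(f"ideal 16-bit SNR : {ideal_snr_db:.1f} dB")
```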

The Redbook specs were created in the late 1970s and early 1980s. Seems to me I heard one of the first CD players about 1981. Back then, the IBM PC was king; a $10 cell phone has *considerably* more computing power! IOW, Redbook was **intentionally** limited in order to cope with the limitations of the hardware of the day. It is quite anachronistic that we still take it seriously today...
Just wondering, how good is your hearing at 19-20kHz?

Certainly not as good as it was...

Bandwidth problems can be heard even without 20kHz response in the ear, though. A cutoff at 20kHz has artifacts that extend down to 2kHz. This is why amps and preamps endeavor to have wide bandwidth: to reduce audible phase shift components well within the audio passband.
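As a rough sketch of how far down the phase behavior of a 20kHz cutoff reaches, assuming a simple first-order rolloff (a Redbook brickwall filter is much steeper, but the general point that the cutoff's influence extends well into the passband is the same):

```python
# Phase shift that a simple first-order 20 kHz rolloff imposes well inside
# the audio band.  First-order is an assumption for illustration only.

import math

cutoff_hz = 20_000.0

def phase_deg(freq_hz: float, fc: float = cutoff_hz) -> float:
    """Phase lag of a first-order lowpass at freq_hz, in degrees."""
    return -math.degrees(math.atan(freq_hz / fc))

for f in (1_000, 2_000, 5_000, 10_000, 20_000):
    print(f"{f / 1000:>5.0f} kHz: {phase_deg(f):6.2f} degrees of phase shift")
```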

Most high quality analog formats can extend well past 20kHz (remember CD-4 from the 1970s?); my 1/2" tape machine can do 30kHz passably well at 30 ips. Even though you can't hear that high, you sure can hear how much more resolution it has!

One of the keys to superior CDs these days is higher scan frequencies during record mode. We use 88.2kHz/24 bits as a backup of our analog recordings. 88.2kHz is nice because no awkward sample-rate conversion algorithm is required to produce a Redbook file (it's a simple 2:1 relationship), and you don't need a brickwall filter during record mode either. My point here is that Redbook is intentionally limited and compromised, and as long as it is around digital will *never* (which is a very long time) be better than analog- to do so would violate the laws of physics.
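A minimal sketch of what that 2:1 relationship looks like in practice, using a standard decimation routine and a made-up 1kHz test tone (a sketch only; the rate and tone are illustrative, not an actual studio workflow):

```python
# Why 88.2 kHz is a convenient capture rate: getting to Redbook's 44.1 kHz is
# a simple 2:1 decimation (lowpass, then keep every other sample) rather than
# an asynchronous resample like 96 kHz -> 44.1 kHz.

import numpy as np
from scipy.signal import decimate

capture_rate = 88_200   # Hz
redbook_rate = 44_100   # Hz

t = np.arange(0, 1.0, 1.0 / capture_rate)
tone = 0.5 * np.sin(2 * np.pi * 1_000 * t)      # 1 kHz test tone at -6 dBFS

# FIR lowpass + keep every 2nd sample; zero_phase avoids smearing transients.
redbook_tone = decimate(tone, 2, ftype="fir", zero_phase=True)

print(len(tone), "samples at 88.2 kHz ->", len(redbook_tone), "samples at 44.1 kHz")
```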

I don't think anyone should blame consumers for rejecting other digital formats like SACD... they already got talked into selling a perfectly good LP collection to be replaced by bright-sounding soulless junk. Collectively I don't think they trust the audio industry anymore.
I'm stumped on that one. Why is that?

Re-read my previous posts. The Nyquist theorem is poorly applied.

I will add that until human perceptual rules are understood and kept in mind during the design of the 'next' digital codec, digital will continue to display the same colorations that it does now.

One last point is also obvious- digital audio showed up in the early 1980s, about 3 decades ago. Yet analog is still very much alive, with 1993 being the year of the least vinyl production. If digital was really 'more accurate', 'better' or anything like that, it would have been able to supplant the prior art in that time. I can name plenty of examples wherein that has happened in other fields. It's not happened in audio because digital has failed to bring home the promise. I don't think anyone takes 'perfect sound forever' seriously anymore :)
Ralph, what do you feel is the more significant limiting factor for redbook, sample rate or bits per sample? Just curious.

Well, the Nyquist theorem is looking for an exact sample (IOW with no limitation of resolution) in order to work; by definition, it is the number of bits that is the real problem. When you think about it, this can only really be done in the analog domain...

My guess though is that when we can do 64-bit DACs on a regular basis, digital will start demonstrating the promise that it's been showing.
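For a sense of scale, here is how the smallest representable amplitude step shrinks as the word length grows (the 1 V full scale is arbitrary, and a true 64-bit converter is, as conceded below, fantasy):

```python
# How the smallest representable amplitude step shrinks with word length.
# Full-scale voltage is illustrative only; 64-bit converters are hypothetical.

full_scale = 1.0   # volts, illustrative

for bits in (16, 24, 32, 64):
    step = full_scale / (2 ** bits)
    ideal_snr = 6.02 * bits + 1.76    # textbook quantization SNR for a sine
    print(f"{bits:>2} bits: step = {step:.3e} V, ideal SNR ~ {ideal_snr:6.1f} dB")
```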

I was wrong about the IBM PC being king when Redbook was devised. It was more like the Commodore 64 :)
Al is right, my 64-bit idea is mostly fantasy :)

Here is the problem, in a nutshell, the problem that seems to plague **all** aspects of audio:

There are the Rules of Human Hearing/Perception, and there are the specs on paper. They are not the same- the specs on paper for the most part don't give a damn about human hearing rules.

Now I have gone off about this a lot in the Amps/Preamps forum and won't belabor those issues as examples. Instead, let's look at how the ear treats noise, specifically analog hiss: Normally, our ears employ a masking feature, IOW the presence of a louder sound will block the presence of a quieter sound to our perception. Hiss is the one exception to that rule. I suspect it's an evolutionary thing myself- the idea that hiss is similar to the effects of wind in the environment is not that far-fetched to me.... Anyway, we have the ability to hear about **20 dB** into the noise floor of an analog system.

(if the noise floor is not composed of hiss, but instead is harmonic or in-harmonic noise related to the signal, our ears will not penetrate that, and so that type of noise floor will define the limit of low level detail that can be retrieved.)

With modern tape, 1/2" format, this means that you have the possibility of a 110 dB dynamic range, if you include the range above 0 VU, a range that digital does not have.

This simple fact explains why an analog system of rather modest noise specs can have more low level detail than the best digital systems -*even though it appears to be noisier*. Add to that the fact that digital systems use fewer bits to resolve lower level signals (IOW, they lose resolution as signal level decreases, which is why the normalization process is so important in the production of a CD) and you have a great part of why digital systems **as they are** can't keep up with analog.
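To put rough numbers on the 'fewer bits at low levels' point (the signal levels below are arbitrary examples; the 6 dB-per-bit figure is the standard rule of thumb):

```python
# Rough illustration of the "fewer bits at low levels" point: a fixed-point
# 16-bit system spends most of its resolution on loud passages.

BITS = 16
DB_PER_BIT = 6.02   # each bit of word length is worth about 6 dB

for level_dbfs in (0, -20, -40, -60, -80):
    effective_bits = max(0.0, BITS + level_dbfs / DB_PER_BIT)
    print(f"signal peaking at {level_dbfs:>4} dBFS uses roughly "
          f"{effective_bits:4.1f} bits of resolution")
```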

IOW, part of it has to do with how we hear, and for the most part digital audio has ignored that, which has been a common problem with audio in general in the last 45 years or so :( Put another way, analog just happens to work better with the way our ears work.
The 'how we actually hear stuff' part is fine, but I don't see how what you say applies to analog only. Both analog and digital are shooting for similar results as best I can tell, and I have heard both do quite well despite the inherent limitations of each.

Mapman, not trying to change the subject, just pointing out that in the development of Redbook, human perceptual issues were largely ignored. To give a little more depth, we humans tend to not make things perfect, try as we might. In the case of digital, there tend to be in-harmonic distortions that are related to the scan frequency rather than harmonically related to the signal like we have in analog systems.

The problem is that the human ear takes higher-ordered content like this and interprets it as brightness. (The significance of this is that when you measure the digital system on the bench, it will appear to be ruler-flat in frequency response; it is our human perceptual rules that assign the brightness.) Had this problem been addressed properly from the get-go, I suspect that about 90% of the D vs A debate in the last 30 years simply would not have occurred.

This is an issue that has nothing to do with the misapplication of Nyquist, BTW. Now I have been accused of many things over the years, bias being one of them, but in this matter of digital, all I can say is I would really like it to work! I would much prefer to not have to provide space for all the vinyl I own, to fit it on a RAID array would be awesome! But my system is too revealing and the failings of digital are very obvious on it. Mind you, I've done no 'tuning' or 'voicing' or any particular treatments to somehow favor analog over digital. And I can put on a Redbook CD and enjoy it, but even my girlfriend who has no interest in audio at all comments on the obvious improvement that vinyl demonstrates over digital. I think too many people have not been exposed to decent analog playback (for example, improper setup of the equalization circuits in a phono preamp can exacerbate ticks and pops) and so the debate rages.
Mapman, the in-harmonic issue is easily demonstrated. I have to admit I was shocked the first time it was demonstrated to me. You take a sine wave sweep tone and play it back- listen for the 'birdies'.
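If you want to roll your own test file, something like this will do it; the sweep range, level, and file name are arbitrary choices, and truncating to 16 bits with no dither just makes the artifacts easier to catch:

```python
# Generate a slow logarithmic sine sweep, truncated to 16 bits with no
# dither, as a simple "birdie" listening test.  All parameters are
# illustrative choices.

import math
import struct
import wave

RATE = 44_100
SECONDS = 30
F_START, F_END = 200.0, 18_000.0

frames = bytearray()
phase = 0.0
n_total = RATE * SECONDS
for n in range(n_total):
    frac = n / n_total
    freq = F_START * (F_END / F_START) ** frac        # logarithmic sweep
    phase += 2 * math.pi * freq / RATE
    sample = 0.1 * math.sin(phase)                    # about -20 dBFS
    frames += struct.pack("<h", int(sample * 32767))  # truncate: no dither

with wave.open("sweep_no_dither.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)        # 16-bit samples
    w.setframerate(RATE)
    w.writeframes(bytes(frames))
```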

If you're not hearing the brightness of digital there are really only three explanations: either you have a high frequency rolloff in the system that compensates for the digital artifact, or you have a high frequency rolloff in your ears. The third explanation is that you have not heard a good analog playback on your system. Of the three, the last is the most common; most people can hear digital artifacts just fine. I know someone who was deaf in one ear and 50% in the other, and he had no problems discerning digital vs analog.

It is very common to hear really excellent digital recordings these days that make you wonder if they are finally there. The real question is, no matter how good the digital, 'what would this have sounded like if an analog recording system was used?' For that I recommend to anyone to try it themselves, of course they might have a bit of a time chasing down the analog equipment :)

I run a recording studio so we see this sort of comparison all the time. I get asked, 'why do you have all this old analog crap?' all the time. I just sit them down and play the difference. They never leave with any questions.
Duomike, even with the best turntable, if there are problems in the design of the phono section, ticks and pops will be abundant! This may have nothing to do with the actual LP, BTW.

Mapman, the birdies I am referring to can be heard by anybody when you employ a sweep tone to ferret them out. Here is someone who discovered this phenomenon by accident:

http://www.diyaudio.com/forums/digital-source/34329-cd-frequency-sweep-can-hear-birdies.html

It seems that the slower the sweep, the easier they are to hear. Now consider that this sort of thing (in-harmonic distortion) is going on all the time within the context of music during playback. The ear will interpret this as a brightness, even though some of the 'birdies' content can be low frequency.

To be more precise, the birdie tone is a non-linear manifestation of intermodulation between the scan frequency and the actual tone. As the tone changes frequency, so does the birdie tone. It is caused by poor dithering technique, poor monotonicity in the DAC, and other inter-modulations in the conversion process. I'm pretty sure the industry could have avoided a good bit of this had they been paying attention, but the assumption was that if the digital system had super low THD that it was therefore free of distortion. They simply didn't *look* for any other forms until much later...
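One simple way to picture part of this (just the aliasing piece, not the dither or monotonicity issues): if the converter's non-linearity generates, say, a 3rd harmonic of the swept tone, anything above Nyquist folds back into the audio band and moves *against* the sweep, which is part of what makes the birdie so easy to pick out. A small arithmetic sketch, with the 3rd harmonic chosen purely for illustration:

```python
# Where aliased distortion products land for a swept tone at 44.1 kHz.
# The 3rd harmonic is an illustrative choice of spurious product.

FS = 44_100.0
NYQUIST = FS / 2

def folded(freq_hz: float) -> float:
    """Fold a frequency back into the 0..Nyquist band (simple mirror folding)."""
    f = freq_hz % FS
    return FS - f if f > NYQUIST else f

for tone in (8_000, 9_000, 10_000, 11_000, 12_000):
    third = 3 * tone
    print(f"tone {tone / 1000:4.1f} kHz -> 3rd harmonic {third / 1000:4.1f} kHz "
          f"aliases to {folded(third) / 1000:5.2f} kHz")
```

The folded products are not harmonically related to the tone, which is exactly the in-harmonic character described above.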
A bit off topic here, but it seems to me that a lot of classical musicians spend a heck of a lot of time working on their "tone".

Electric guitarists do too! Like crazy- very picky as well.
I sincerely don't get you when you say...

"My personal timbre (again, this is a separate thing from style) will come through, no matter which instrument I am playing on, despite the difference in the timbres of each individual horn. " ????

I can answer this one- I am often amazed how different one of my flutes sounds when someone else plays it- it's like it's a different instrument! IOW 'personal timbre' is maybe not the best expression but it is very real and easy to hear!