It isn't the bits, it's the hardware


I have been completely vindicated!

Well, at least there is an AES paper that leaves the door open to my observations. As some of you who follow me know (and some of you follow me far too closely), I’ve said for a while that the performance of DACs has gotten remarkably better over the last ~15 years. Specifically, Redbook/CD playback is much better than it was in the past, so much so that high-resolution music and playback no longer make the economic sense they used to.

My belief about why high-resolution music sounded better has completely changed. I used to believe we needed the data; now I believe WE don’t need the data, the DACs needed it. That is, the problem was never that we needed 30 kHz performance. The problem was always that the DAC chips themselves performed differently at different resolutions. Here is at least some evidence supporting this possibility.

Stereophile published a link to a meta-analysis of high-resolution playback, and while the authors propose a number of issues and solutions, two things stood out to me: the section on hardware improvement and the one on new filters (which are, in my mind, the same topic):



4.2
The question of whether hardware performance factors, possibly unidentified, as a function of sample rate selectively contribute to greater transparency at higher resolutions cannot be entirely eliminated.

Numerous advances of the last 15 years in the design of hardware and processing improve quality at all resolutions. A few, of many, examples: improvements to the modulators used in data conversion affecting timing jitter, bit depths (for headroom), dither availability, noise shaping and noise floors; improved asynchronous sample rate conversion (which involves separate clocks and conversion of rates that are not integer multiples); and improved digital interfaces and networks that isolate computer noise from sensitive DAC clocks, enabling better workstation monitoring as well as computer-based players. Converters currently list dynamic ranges up to ∼122 dB (A/D) and 126–130 dB (D/A), which can benefit 24b signals.

Now if I hear "DAC X performs so much better with 192/24 signals!" I don't get excited. I think the DAC is flawed.
erik_squires

Showing 12 responses by heaudio123

It's not the accuracy, it is the lack of jitter, and that has been possible and relatively cheap for some time. If you are feeding an async data stream, e.g. USB, wired Ethernet, WiFi, hard drive, etc., then a basic oscillator with a decent power supply is effectively jitter-free and rather inexpensive, practically free by audiophile standards.
It is when you start feeding synchronous data with varying data rates and trying to sync up two clock domains that it gets harder and a lot more expensive: you enter the realm of PLLs, and/or techniques such as ASRC, where you are beholden to the underlying math (and resolution) used to convert between the two sample-rate domains, and performance gets far more variable (as does cost).
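A minimal toy sketch of why the async case is cheap, in Python; the FIFO model, timing numbers, and read-ahead depth are all illustrative, not any real driver or DAC firmware:

```python
# Toy model: samples arrive over an async link with jittery arrival times,
# land in a FIFO, and are drained by a fixed local oscillator. Output
# timing depends only on the local clock, not on the arrival jitter.
import random
from collections import deque

random.seed(0)

fifo = deque()
t_arrive = 0.0
for n in range(1000):
    t_arrive += 1.0 + random.uniform(-0.3, 0.3)  # jittery arrival intervals
    fifo.append((n, t_arrive))

# Drain with the local clock: read ahead a little, then one sample per tick.
tick, out_times = 50.0, []
while fifo:
    fifo.popleft()
    out_times.append(tick)
    tick += 1.0  # period fixed by the local oscillator, not by the link

gaps = [b - a for a, b in zip(out_times, out_times[1:])]
print(min(gaps), max(gaps))  # 1.0 1.0 -> zero output jitter
```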
People still use CDs for live playback in 2020?


On a serious note, I doubt there are many audiophile CD players from the last decade (or two) that are not reading ahead, if not retrying, and buffering on digital output, rendering jitter at the player level nonexistent. And if you treat your CDs in any half-decent fashion, as I assume most audiophiles do, then uncorrectable errors are rare; as per the previous statement, since the data is reclocked, any timing variation from pipelined processing of the error correction isn't an issue.


Oh, and this thread is not about CDs.

>>>>>One assumes that is pure speculation or maybe wishful thinking.


Or, contrary to the post this was a reply to, it was factual knowledge, not an opinion or an unproven, evidence-lacking hypothesis.
The CD can flop around like a beached whale in a tsunami, but unless there are unrecoverable errors, the output of a buffered and reclocked modern audiophile player is not going to be affected.
This isn't the '80s and '90s, when, due to cost and the limited functionality of player mechanisms, you were beholden to data recovery impacting the clock PLL and to jitter from the variable error-recovery pipeline.
Buffering doesn't stop any errors; it provides the mechanism to eliminate all jitter.

No one ever said there are no uncorrectable errors, though when discs are properly manufactured, based on many industry tests, uncorrectable errors are quite rare. These are tests pretty easily recreated on any computer with a CD-ROM drive as well, so it is no "secret".

From a practical standpoint, if you treat your CDs the way the average audiophile does, uncorrected errors are not going to impact your listening experience. However, if you are just going to make stuff up, then I am not sure why you are participating in the conversation.
It's 2020, not 1990. I would get out and listen to some systems. Things have progressed considerably on the CD front.


This still is not a thread about CD players.
No. I am using the proper definition of jitter as it applies to the output of the CD player, not to what comes off the disc, which is meaningless in a buffered and reclocked player, i.e. a modern audiophile player. I have no idea what definition you are using.
I think you have a fundamental lack of knowledge of the architecture of a modern audiophile CD player, hence your confusion about what happens after the data is read off the disc: pretty much every error is corrected in the error-correction block before the data hits the buffer and re-clocker (roughly: read → demodulate → error correction → buffer → re-clock → DAC). Unfortunately I can't post images here, and it would be easier to explain with pictures.
Not really curve-fitting, but okay to think about it that way. In a digital representation, the spectrum is mirrored around 0 Hz and around the sampling rate. Oversampling shifts the effective sample rate so that the images of the base spectrum (which does not change) move from being centered around 44.1 kHz to being centered around 352.8 kHz (at 8x). With the base spectrum being, say, only 20 kHz wide, a digital filter can easily remove most artifacts above 20 kHz, with a simple analog filter taking out the rest.
The other benefit is that oversampling spreads the quantization noise over a wider bandwidth, lowering the in-band noise floor.
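If you want to see it, here is a minimal numpy/scipy sketch of the image-shifting idea; the tone frequency, oversampling factor, and filter length are arbitrary illustrative choices:

```python
# Zero-stuff a 44.1 kHz stream by 8x: the baseband is untouched, but images
# appear around multiples of 44.1 kHz; a digital lowpass removes them.
import numpy as np
from scipy import signal

fs = 44_100
L = 8                                   # 8x oversampling -> 352.8 kHz
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 10_000 * t)      # 10 kHz tone, inside the audio band

up = np.zeros(len(x) * L)
up[::L] = x                             # insert L-1 zeros between samples

# Lowpass near the original Nyquist kills the images at 44.1k +/- 10k,
# 88.2k +/- 10k, ...; only the baseband tone survives.
lp = signal.firwin(255, cutoff=20_000, fs=fs * L)
y = signal.lfilter(lp, 1.0, up) * L     # gain of L restores the amplitude

spec = np.abs(np.fft.rfft(y * np.hanning(len(y))))
freqs = np.fft.rfftfreq(len(y), d=1 / (fs * L))
print(freqs[np.argmax(spec)])           # ~10 kHz: images filtered out
```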
From a purely technical standpoint, oversampling can apply to D/A conversion, not just A/D, so from that standpoint you can use upsampling, oversampling, or sample rate conversion to a higher frequency all interchangeably. Feel free to validate that with DAC data sheets that discuss oversampling.


But looking more at the (poorly) written paper linked, which attempts to contrast a widely used term, oversampling, with one practically made up, at least for this case (upsampling), and then never really defines upsampling except as pretty much exactly asynchronous sample rate conversion, another well-understood term, I am not surprised by the confusion.

erik_squires: "Actually that’s exactly how it works for upsampling, but different upsampling algorithms work differently. With the advent of cheap compute, Bezier curves are cheap and easy to do. "

I think you are missing a key element of how a typical asynchronous sample rate converter with inherent oversampling works, namely that the first step is an implementation of oversampling (typically fractional-delay filters), which provides a smoother curve for the curve-fit, which then only has to work over a smaller number of samples. Doing this keeps the spurious frequency components higher up, allowing for easier final filtering.
In this limited industry, "upsample" does somewhat exclusively mean resampling with an asynchronous sample rate converter (and it was pretty much a made-up term), which most people know. What they don't know is that the underlying technology is almost always some form of synchronous oversampling using fractional-delay filters, which shifts the spectral images up from the original sample rate, coupled with a time-compensated curve-fit that produces the final sample rate and provides the jitter attenuation (something not needed with async streaming sources, of course). So when claiming an advantage for upsampling to a higher frequency, is it the inherent oversampling, the jitter reduction, the choice of final sample rate, or some combination?
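To make the fractional-delay step concrete, here is a rough Python sketch; the tap count, window, and sample rates are arbitrary, and no real ASRC chip is being modeled:

```python
# Windowed-sinc fractional-delay interpolation evaluated at arbitrary
# output instants -- the core trick; a real ASRC adds polyphase tables
# and a rate estimator on top of this.
import numpy as np

def frac_delay_taps(mu, n_taps=32):
    # With the input slice x[i-15 : i+17], the dot product below
    # approximates x evaluated at position i + mu, for 0 <= mu < 1.
    k = np.arange(n_taps)
    h = np.sinc(k - 15 - mu) * np.hamming(n_taps)
    return h / h.sum()

fs_in, fs_out = 44_100.0, 96_000.0      # deliberately not an integer ratio
x = np.sin(2 * np.pi * 1000 * np.arange(2000) / fs_in)

ratio = fs_in / fs_out
out, pos = [], 16.0                     # start past the filter's edge region
while pos < len(x) - 16:
    i, mu = int(pos), pos - int(pos)
    out.append(np.dot(frac_delay_taps(mu), x[i - 15 : i + 17]))
    pos += ratio                        # output instants fall between inputs

print(len(x), "->", len(out))           # the 1 kHz tone, now at ~96 kHz
```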


As Cleeds pointed out, you can’t make a generalization to all cases based on one example.

That Benchmark found "performance", based on data-sheet numbers and some simple metrics, was better at 24/96 than at 24/192 is not at all surprising. At lower speeds you have less contribution to the output from the switching CMOS switches, less dynamic power (and less glitch energy) contributing to a quieter environment, and simply more time to settle to the final value. Perhaps in Benchmark’s specific case, decoupling the output frequency from the input also removed sources of synchronous noise. Of course, they are running in 2x mode anyway, so technically the DAC is running close to 192 kHz internally; what exactly does that 96 kHz even mean in their argument? Not to mention that it then runs into a much-higher-rate sigma-delta modulator. Does their product match the data, or does the data match the product?


However, the items they cited w.r.t. the data sheets and their tests are not guarantees of excellent perceived sonic performance, which would trade off high sample rate and system noise against analog filtering. The article is also 10 years old, so what was best 10 years ago may have shifted up 2x or more in terms of what is best today.

I don’t think you have made your point well, simply because tests have been done with 24/96 native versus 24/96 downsampled mathematically to 16/44.1 and then upsampled back to 24/96, so that the playback path was identical; all that changed was the information rate.
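A sketch of that test chain in Python; the actual resampler and dither used in those tests are unknown to me, so this is only an illustration of the method:

```python
# Chain: 24/96 source -> 44.1 kHz -> 16-bit TPDF dither -> back to 96 kHz,
# so both versions share the exact same 96 kHz playback path.
import numpy as np
from scipy.signal import resample_poly

def to_16bit_tpdf(x):
    # Quantize to 16-bit steps with triangular (TPDF) dither.
    d = np.random.uniform(-0.5, 0.5, x.shape) + np.random.uniform(-0.5, 0.5, x.shape)
    return np.clip(np.round(x * 32767 + d), -32768, 32767) / 32767

fs_hi = 96_000
t = np.arange(fs_hi) / fs_hi
hires = 0.5 * np.sin(2 * np.pi * 1000 * t)     # stand-in for a 24/96 track

down = resample_poly(hires, up=147, down=320)  # 96k -> 44.1k (147/320)
cd = to_16bit_tpdf(down)                       # the 16/44.1 "Redbook" copy
back = resample_poly(cd, up=320, down=147)     # 44.1k -> 96k for playback

# 'hires' and 'back' now differ only in information content; feed both
# through the same DAC at 96 kHz and compare.
```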
... if only we had things like dynamometers, or error-rate detectors, so we would not have to guess whether these were issues or not.
What is with the scattered light psychosis? Is this an attempt at humor?


Any CD player, with error correction, can pretty much extract a bit-perfect stream. If you don't buffer, you have jitter; buffer and reclock, and jitter disappears. No scattered-light psychosis to worry about. Check the calendar: it's 2020, not 1999. You missed the millennium and the 20 years after.


Are you trying to intentionally mislead people?


That's directed at geoffkait. But fleschler, mechanical vibration and its impact on jitter disappeared from audiophile CD players over a decade ago. Modern players buffer and reclock; what happens at the mechanism is almost meaningless.