Analogue clipping from digital sources


Given the high output of digital sources (typically >2V via RCA, >4V balanced), there is in my experience a significant risk of overloading both the analogue stage of the DAC and any pre- or power amp downstream. Fighting that with low volume settings on the attenuator only aggravates the issue further. In my case I have to run the InnuOS Zenith Mk3's output at 85% to deal with audible overloading of the DAC/amp and the distortion that comes with it. Anyone with similar experience?
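For a sense of scale, here is a rough sketch of what that 85% setting works out to, assuming (and this is only an assumption) that the Sense volume control acts as a simple linear multiplier on the samples:

```python
import math

# Assumption: the server's volume control is a plain linear multiplier
# on the sample values. Real implementations may differ.
dac_full_scale_v = 2.0      # typical single-ended (RCA) output at 0 dBFS
volume_fraction = 0.85      # the 85% setting mentioned above

attenuation_db = 20 * math.log10(volume_fraction)
effective_output_v = dac_full_scale_v * volume_fraction

print(f"85% volume is roughly {attenuation_db:.1f} dB of attenuation")
print(f"Effective full-scale output is roughly {effective_output_v:.2f} V")
# -> about -1.4 dB, i.e. about 1.70 V instead of 2.0 V
```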


@ltmandella If I am not mistaken, you are referring to a problem resulting from 'capping' peaks during digital recording, which leads to distortion during the D/A conversion, i.e. not clipping as such, although it sounds like it. This seems to be yet another casualty of the loudness wars; thanks for pointing it out.

That is a mastering problem which, to my mind, cannot be addressed by audiophile consumers through any means (unless they decide to use low-resolution equipment), so I don't yet understand what your point is in the context of this discussion.

And for everyone else: you can research "intersample peaks" and "intersample overs". It is absolutely a known phenomenon among mastering engineers, discussed regularly and demonstrated in various tests.

The consensus on audibility is that it is hardware (DAC) dependent. It can be very objectionable on very accurate hardware but is generally not audible on a lossy or low-end reproduction chain. Avoiding it requires headroom in the encoding, which is not always allowed due to the loudness wars.
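For anyone who wants to see the arithmetic, here is a minimal sketch of the textbook worst case: a tone at a quarter of the sample rate with a 45-degree phase offset. The stored samples never exceed 0 dBFS, yet the waveform the DAC must reconstruct between them peaks about 3 dB higher. (Illustrative numbers only.)

```python
import numpy as np

# Worst-case inter-sample peak demo: a sine at fs/4 with a 45-degree
# phase offset samples at +/-0.707. Once those samples are normalised
# to digital full scale, the continuous waveform between them peaks
# at sqrt(2), i.e. about +3 dB over 0 dBFS.
fs = 44100
f = fs / 4
n = np.arange(8)
samples = np.sin(2 * np.pi * f * n / fs + np.pi / 4)

samples /= np.abs(samples).max()   # recorded samples now just touch 0 dBFS
true_peak = np.sqrt(2)             # analytic peak of the reconstructed sine

print("sample peak:", np.abs(samples).max())   # 1.0 (0 dBFS)
print(f"reconstructed peak: ~{true_peak:.3f} (+{20 * np.log10(true_peak):.1f} dB over)")
```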

Probably why I prefer DSD. I am unfortunately very annoyed by any high-frequency or peak glitches in digital. Makes me want to immediately throw the offending component right out the window...

 

 

 

@ltmandella: care to elaborate, or point to relevant sources? Please don't make unsubstantiated assertions when posting. Thanks.

Are you referring to clipping? I think it has been mathematically demonstrated that it is possible for PCM to clip during playback even though no clipping was ever detected during encoding...

 

I guess what it boils down to is that in 2024, if you tell me your recent vintage preamp is clipping with 2V of input I am going to have trouble believing it is not malfunctioning.

OP: 

 

With modern amps, input impedance is usually quite high, 25 kOhm or more, and there is plenty of voltage gain, so with many combinations of modern DACs and solid-state amps a separate preamp is no longer technically needed.
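As a rough back-of-the-envelope (made-up but typical figures, not any specific amp):

```python
import math

# Hypothetical solid-state power amp, for illustration only
rated_power_w = 100       # rated output into the load below
load_ohms = 8
voltage_gain = 20         # ~26 dB, a common figure

v_out_rms = math.sqrt(rated_power_w * load_ohms)   # ~28.3 V at rated power
input_sensitivity_v = v_out_rms / voltage_gain     # ~1.4 V input for full power

print(f"Output at rated power: {v_out_rms:.1f} V RMS")
print(f"Input needed for full power: {input_sensitivity_v:.2f} V RMS")
# A 2 V RCA-level DAC can drive such an amp to (and past) full power
# with no extra gain stage in between.
```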

 

Best,

 

E

“but phono stages have additional gain just for them. It's true that you almost don't need any gain for CDs, but tape decks and tuners were often near that, no?”

@erik_squires No. Quite often a tuner or phono section will make 1 Volt. A cassette machine makes 1 Volt (at 0 VU), as do consumer (not pro) reel-to-reels (also at 0 VU).

The way I read it, a good DAC analogue stage may only need an additional preamp where switching to analogue sources is required. Absent that, the pre just adds distortion.

Thank you to @atmasphere and @erik_squires for explaining an obviously important issue that hasn't been sufficiently discussed.

@atmasphere but phono stages have additional gain just for them. It's true that you almost don't need any gain for CDs, but tape decks and tuners were often near that, no?

The 2V peak outputs should not be a surprise anymore.

Still, that's not to say any particular maker or model doesn't overload more easily than others.


@erik_squires FWIW the high output digital problem is one of the issues that any preamp manufacturer has to find a way to deal with. With a phono section or tuner, you might need 15dB of gain to work with most power amps, but you (most of the time) don't need any for digital.

We've been lucky in that the patented direct-coupled output section of our preamps is neutral enough, and controls coloration from the interconnect well enough, that there is still a benefit to using our preamps with a digital source.

“IMO Philips and Sony made a stupid mistake when they set the Redbook spec to 2V output with digital gear, more than many amps need to overload.”

The CD was introduced in 1982, with full knowledge of this. The reasoning is that a PREAMP could/can easily handle 2V input. Amps may not, but since the idea was never to directly connect a CD player at full output to an amp, I'm not sure why this is an issue. Also, higher voltage = less noise (it's complicated) and less need for additional gain downstream.

Also, Ralph is right that for an AMP, 2V might be overload, but preamp designers have been 100% aware of the CD standard ever since, and preamps are built for it, so I disagree. Preamps (and preamp stages) today can easily handle 2V input and put out whatever fraction of that you need for an amp.

Older gear, though, had far too much gain or too low a supply rail, which could cause an issue. By too much I mean it had a lot more gain than we can use, which contributed to noise. Better to have lower gain and wider use of the volume knob.

Here is @atmasphere on a different thread addressing my topic:

“IMO Philips and Sony made a stupid mistake when they set the Redbook spec to 2V output with digital gear, more than many amps need to overload. I think their reasoning must have been that once you hear digital, you'll never want to hear any other source. Obviously if that was the thinking, it was grossly incorrect.

A smarter thing to do would have been to allow for a lower level DAC output in addition to the regular line section that's built into all DACs and CD players. This way if you happen to have a phono, tuner, tape machine or other source (perhaps video) you could use a regular preamp and get maximum fidelity....”

So, in addition to @erik_squires' insightful comment, there is an overall issue along the chain.

I am on InnuOS Sense feeding into the Antelope Zodiac Platinum DAC and believe I can only use the Sense embedded attenuator. Any other suggestions? Many thanks for your help, much appreciated!

BTW: this subject should find a wider audience.

Technically, the best solution is to lower the output after bit depth conversion (16 to 24 or 32 bits) but before upsampling.

Roon calls this setting Headroom Management.

The end result of this two-step dance is that you can reduce the maximum output without losing resolution, while minimizing how much you need to know about the original recording in order to avoid clipping.
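Here is a minimal sketch of that ordering in Python, using generic numpy/scipy calls rather than whatever Roon actually does internally; the -3 dB figure is just an example value:

```python
import numpy as np
from scipy.signal import resample_poly

def headroom_then_upsample(pcm16, headroom_db=-3.0, up=2, down=1):
    """Widen the samples, apply headroom, then upsample.

    pcm16: int16 samples as delivered (e.g. 44.1 kHz / 16 bit).
    Because the attenuation happens in higher precision, no 16-bit
    resolution is lost, and the upsampler's interpolated peaks get
    room to land below full scale.
    """
    x = pcm16.astype(np.float64) / 32768.0   # step 1: 16 bit -> wide float
    x *= 10 ** (headroom_db / 20.0)          # step 2: lower the level, e.g. -3 dB
    return resample_poly(x, up, down)        # step 3: upsample, e.g. 44.1k -> 88.2k
```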

Thank you for this: I am using upsampling, clearly heard clipping-like noise, and the cure is to lower the output from the server. Learned something new!

I forgot one important point. Most preamps can take significant overload on the inputs.

Amplifier output is tied to input, usually at about 20x the input voltage. So long as you don't exceed the maximum output voltage, they will take more than 2V in. Stereophile may state the "maximum input voltage before clipping" or something like that.
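To put rough numbers on it (illustrative figures only, not any specific product):

```python
def max_input_before_clipping(max_output_v_rms, voltage_gain):
    """Input level at which a gain stage runs out of output swing."""
    return max_output_v_rms / voltage_gain

# A power amp with 20x gain that clips at 40 V RMS out:
print(max_input_before_clipping(40.0, 20.0))   # 2.0 V -> a 2 V source just reaches clipping
# A line stage with modest 4x gain and 20 V RMS of available swing:
print(max_input_before_clipping(20.0, 4.0))    # 5.0 V -> plenty of margin for a 2 V source
```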

Hey,

Have you actually heard this happening, or are you just theorizing?

There is a risk of digital clipping, but it's not from the output voltage; it's from upsampling. There's a good paper from Benchmark Media somewhere about what happens.

Imagine a 44.1kHz/16-bit recording. Let's say it's a simple sine wave and the original samples happen to reach maximum. Near the top, you could have two samples at peak output. Not an uncommon thing, as many mastering engineers push the loudest sample right to digital full scale to make sure the recording uses the full available range. Depending on the precise timing, you could have two adjacent samples at the peak, or one at the peak and another very close to it.

Anyway, if you convert from 44.1/16 to 88.2/16 with linear interpolation there will be no issue, but most upsamplers (thanks to cheap compute power) use something more like a French curve to interpolate. Now the new samples can exceed the maximum digital output. That is true digital clipping. The solution is to slightly reduce the signal when upsampling, which is not a horrible thing when you also increase the bit depth before doing so.
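Here is a small sketch of that effect using a generic polyphase resampler (not what any particular player uses internally). The 44.1 kHz samples never exceed full scale, but the interpolated 88.2 kHz samples do, unless some headroom is applied first:

```python
import numpy as np
from scipy.signal import resample_poly

fs = 44100
n = np.arange(4096)
# A tone at fs/4 whose *sampled* values just touch digital full scale
x = np.sin(2 * np.pi * (fs / 4) * n / fs + np.pi / 4)
x /= np.abs(x).max()                          # sample peak == 1.0 (0 dBFS)

up = resample_poly(x, 2, 1)                   # 44.1 kHz -> 88.2 kHz
print("peak after upsampling:", np.abs(up).max())            # ~1.41 -> would clip

headroom = 10 ** (-3.5 / 20)                  # ~3.5 dB applied *before* upsampling
up_safe = resample_poly(x * headroom, 2, 1)
print("peak with headroom applied:", np.abs(up_safe).max())  # stays below 1.0
```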