Running a CD player directly into power amps,


good, deleterious, dangerous or simply stupid? Since I never listen to my tuner, am too lazy to bother with vinyl anymore and never got that tape deck (thank God!), can I go the direct route? An "audio consultant" (a.k.a. "salesperson") told me it was unthinkable because of some mismatch between the output of one and the input of the other... He was trying to sell me a preamp. Since, long ago and far away in a different audio galaxy, it was believed that the shortest signal route (all other factors being equal) would provide the best, least degraded signal, I thought, and still think for that matter, that my idea is swell. A better CD player + a better power amp + new earthshaking speakers and voilà! Am I missing some great truth here?
pbb

Showing 1 response by croese87bb

Amazing how many people talk nonsense.
Every CD player has an output impedance below 300 ohms, and every power amp has an input impedance of several tens of thousands of ohms, save for the two or so esoteric designs from the Far East that you would not pay for anyway. Equal (matched) impedances only matter in high-frequency domains (radio/TV/digital signal processing), where power has to be transferred without loss and signal reflections have to be avoided.
In audio, only signal VOLTAGES are transferred; essentially no power is involved.
So remarks like "the CD player's output has to have enough power to drive the input of the power amp" or "beware of impedance mismatches" are complete bullshit.
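
If you want to put a number on it, here is a quick back-of-the-envelope sketch (Python; the 300 ohm and 47 kohm figures are assumed typical values, not measurements of any particular gear):

    import math

    z_out = 300.0    # assumed CD player output impedance, ohms (typical upper bound)
    z_in = 47_000.0  # assumed power amp input impedance, ohms (a common value)

    # Voltage divider: fraction of the source voltage that reaches the amp input
    transfer = z_in / (z_out + z_in)
    loss_db = 20 * math.log10(transfer)

    print(f"voltage transfer: {transfer:.4f} ({loss_db:.3f} dB)")
    # prints: voltage transfer: 0.9937 (-0.055 dB)

A twentieth of a dB of level loss. That is why nobody needs to "match" anything here.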

Be sure that your CD player has an analog volume control. Why?
Because digital volume controls in CD players are usually implemented in the digital filter circuit preceding the D/A converter chip, and in that configuration every 6.02 dB of attenuation from full volume costs 6.02 dB (= 1 bit) of resolution. Normal listening level is around -20 to -30 dB below maximum volume, meaning that your highly paid-for 20 bits of resolution, 100+ dB of dynamic range (including the noise floor), goes 20-30 dB down the drain (while the noise floor does NOT move), leaving you with 70-80 dB of real digital resolution, a mere 12-13 bits.
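
The arithmetic, as a sketch (Python; the 100 dB real-world dynamic range is just the example figure above, not a spec of any player):

    REAL_RANGE_DB = 100.0  # assumed measured dynamic range of a good "20-bit" player

    # The noise floor stays put, so every dB of digital attenuation
    # comes straight out of the usable range (~6.02 dB per bit).
    for att_db in (20.0, 30.0):
        left_db = REAL_RANGE_DB - att_db
        print(f"-{att_db:.0f} dB volume: {left_db:.0f} dB usable, ~{left_db / 6.02:.1f} bits")
    # prints: -20 dB volume: 80 dB usable, ~13.3 bits
    #         -30 dB volume: 70 dB usable, ~11.6 bits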

The whole issue of compatibility has nothing to do with imaginary output or input impedances, or with output levels (as long as your player has an analog volume control built in), but with signal-ground pollution, spurious HF noise on the signal lines, and the phase behaviour of the CD player's output stage, i.e. how well that stage can drive the cable capacitance. This is sometimes very difficult, especially for a player with budget opamps in its output stages that are heavily compensated for stability. Their output impedance can be low enough, but that is only the result of lots of negative feedback: a sure recipe for phase errors and instability.
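
To see what "driving the cable capacitance" means in numbers, here is a sketch (Python; the 300 ohm and 1 nF values are assumptions for a few metres of interconnect, not data about any player). Note that the static RC figures look harmless on paper; the real trouble is how a heavily compensated feedback loop behaves into that capacitance, which this simple first-order model does not capture:

    import math

    z_out = 300.0   # assumed output impedance, ohms
    c_cable = 1e-9  # assumed interconnect capacitance, ~1 nF

    # First-order RC low-pass formed by the output stage and the cable
    f_pole = 1 / (2 * math.pi * z_out * c_cable)
    phase_20k = math.degrees(math.atan(20_000 / f_pole))  # phase lag at 20 kHz

    print(f"corner: {f_pole / 1000:.0f} kHz, phase lag at 20 kHz: {phase_20k:.1f} deg")
    # prints: corner: 531 kHz, phase lag at 20 kHz: 2.2 deg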

A good preamplifier can vastly improve the sound of such a CD player because it acts as a 'corrector' for all those small disturbances, buffering the unstable outputs of otherwise fine CD players and presenting a much cleaner signal to the power amp.

And with a good preamp you can leave the CD player's internal digital volume control at maximum output, preserving your 20 bits of resolution.

Only a CD player with an analog volume control and a decent output stage (they mostly go hand in hand) will show its full potential when connected directly to a power amp.

Do not listen to self-educated charlatans with distorted technical notions; check with reliable retailers and listen to the equipment yourself.
No technical explanation can replace a good listening experience.