Running a CD player directly into power amps,


good, deleterious, dangerous or simply stupid? Since I never listen to my tuner, am too lazy to bother with vinyl anymore and never got that tape deck (thank God!), can I go the direct route? An "audio consultant" (a.k.a. "salesperson") told me it was unthinkable because of some mismatch between the output of one and the input of the other... He was trying to sell me a preamp. Since, long ago and far away in a different audio galaxy, it was believed that the shortest signal route (all other factors being otherwise equal) would provide the best, least degraded signal, I thought, and still think for that matter, that my idea is swell. A better CD player + a better power amp + new earthshaking speakers and voilà! Am I missing some great truth here?
pbb

Showing 3 responses by sean

I see no problem with doing this so long as the input impedance of the amp is at least equal to, and preferably much higher than, the output impedance of the CD player. For example, the CD player might have a 50 ohm output; the input impedance of the amp should then be AT LEAST 50 ohms and preferably much higher. If you were to use an amp with a lower input impedance than the CD player's output impedance, sonics might be severely compromised along with circuit stability. Something else to take into account: the smaller the margin between the two figures, the bigger the sonic differences you may hear between the interconnects used. Sean
>

PS....The above numbers are strictly "gibberish" and were used for explanation purposes only.
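
To put rough numbers on the loading idea above, here is a minimal sketch; the 100 ohm source and the amp input figures are just as made-up as the 50 ohm example.

```python
# Minimal sketch: how much signal is lost when a source drives an amp input.
# The source output impedance (Zout) and the amp input impedance (Zin) form
# a simple voltage divider; the fraction of the signal that survives is
# Zin / (Zout + Zin).  All figures are illustrative only.
import math

def loading_loss_db(zout_ohms, zin_ohms):
    """Return the level drop in dB caused by the Zout/Zin voltage divider."""
    fraction = zin_ohms / (zout_ohms + zin_ohms)
    return 20 * math.log10(fraction)

# Hypothetical CD player with a 100 ohm output into various amp inputs:
for zin in (100, 1_000, 10_000, 100_000):
    print(f"Zin = {zin:>7} ohms -> {loading_loss_db(100, zin):6.2f} dB")

# Approximate output:
#   Zin =     100 ohms ->  -6.02 dB  (equal impedances: half the voltage is gone)
#   Zin =    1000 ohms ->  -0.83 dB  (ten times higher: loss becomes negligible)
#   Zin =   10000 ohms ->  -0.09 dB
#   Zin =  100000 ohms ->  -0.01 dB
```

Once the input impedance is roughly ten times the output impedance, the loss is well under a dB, which is where the usual rule of thumb comes from.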
Yeah, i guess i should have commented that you will need either a variable output on the CD or input attenuators on the amp. Running line level out of the CD into most amps will blow either the speakers or the amp. So much for "assuming" : ) Sean
>
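
To illustrate why, with hypothetical figures: a 2 V RMS full-scale CD output and 26 dB of power amp gain are plausible but by no means universal, so check your own gear.

```python
# Quick sketch of why line level straight into a power amp is dangerous.
# Assumed, illustrative figures only.
import math

cd_output_vrms = 2.0    # full-scale analogue output of the player (assumed)
amp_gain_db = 26.0      # power amp voltage gain (assumed)
speaker_ohms = 8.0

gain = 10 ** (amp_gain_db / 20)        # 26 dB -> about 20x voltage gain
out_vrms = cd_output_vrms * gain       # ~40 V RMS at the speaker terminals
watts = out_vrms ** 2 / speaker_ohms   # ~200 W into 8 ohms

print(f"{out_vrms:.1f} V RMS -> {watts:.0f} W into {speaker_ohms:.0f} ohms")
# Roughly 40 V RMS and 200 W on the loudest passages -- i.e. flat out --
# unless something (a variable output, input attenuators, a passive pot)
# turns the level down first.
```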
Croese, i'm assuming that your commentary pertaining to "bullshit" and "charlatans" was aimed at me or possibly at one other post previous to yours. While i can't speak for the other participants, I was simply trying to cover all the bases in a very "general" statement.

Since i am not familiar with ALL of the equipment out there, nor do i think that anyone else is, i used an example that those not electronically inclined could follow along with. I was trying to make sure that the analogue output of the CD player or DAC would not be "loaded down" under ANY circumstances. As such, i chose to post a "worst case scenario" using generic figures and figured that ANYTHING above that would perform acceptably. While i'm sorry if my use of 50 ohms threw you for a loop or led to a misunderstanding, re-reading my post should clarify things. I specifically stated that it was preferred that the input impedance of the amp be higher than the output impedance of the source. If i had used non-RF impedance figures, or figures in the thousands or tens of thousands of ohms range, would that have made you happier ???

As Chris stated, it is common practice in the audio field to shoot for a "x10" difference in impedance from the output of one piece to the next gain stage. As such, i was not suggesting that one need worry about VSWR in audio gear, although i do not doubt / deny that this factor MIGHT come into play in different ways. This is a subject that still has a LOT of "grey areas" to it and is subject to much debate by folks FAR more knowledgeable and experienced than i will ever be.
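
One possible (and debatable) mechanism behind the interconnect comment earlier: the source's output impedance and the cable's capacitance form a simple RC low-pass filter, so the higher the output impedance, the more the cable itself can matter. A rough sketch with made-up figures:

```python
# Rough sketch: source output impedance plus interconnect capacitance form an
# RC low-pass filter.  The higher the output impedance, the lower the corner
# frequency and the more the cable matters.  Figures are illustrative only;
# cable capacitance varies a lot between designs.
import math

def corner_hz(zout_ohms, cable_pf):
    """-3 dB point of the RC filter formed by Zout and the cable capacitance."""
    return 1 / (2 * math.pi * zout_ohms * cable_pf * 1e-12)

cable_pf = 300  # e.g. a couple of metres of fairly capacitive interconnect
for zout in (50, 1_000, 10_000):
    print(f"Zout = {zout:>5} ohms -> -3 dB at {corner_hz(zout, cable_pf)/1e3:,.0f} kHz")

# Approximate output:
#   Zout =    50 ohms -> -3 dB at 10,610 kHz  (far above audio; cable barely matters)
#   Zout =  1000 ohms -> -3 dB at 531 kHz
#   Zout = 10000 ohms -> -3 dB at 53 kHz      (close enough to the audio band to care)
```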

As far as that goes, i know my limitations and try to work within those confines. If i am making suggestions or comments based on anything but verifiable data or first-hand experience, i typically state such. There is nothing wrong with presenting one's opinion, but it should not be presented as "fact" unless one has done all of the necessary testing under various conditions to verify it. Sean
>