At what point (min or max value) does AV preamp output voltage not make a difference?


Hoping my question is not viewed as 'stupid'. I will attempt to phrase it so it makes sense. I have a basic understanding of EE and AV electronics, but I'm still trying to understand preamp output specifications and at what value the output voltage of an AV preamp's output stage actually matters...

Meaning: is a preamp with a 4V max output 'better' suited to driving an external amplifier than one with 2V? Is an 8V output 'better' than 4V in real-world use?

I understand that higher output voltage is better... but at what point does it stop translating into better performance?
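To make the question concrete, here is a rough back-of-the-envelope sketch I put together (the gain, power, and load figures are made-up assumptions for illustration, not the specs of any amp I'm actually looking at):

```python
import math

# Rough illustration only -- these numbers are assumptions, not the specs
# of any particular amplifier.
amp_gain_db = 29.0     # a common voltage gain figure for home power amps
rated_power_w = 200.0  # amp's rated output power into the load below
load_ohms = 8.0        # nominal speaker impedance

# Voltage at the speaker terminals at rated power: P = V^2 / R  ->  V = sqrt(P * R)
v_at_full_power = math.sqrt(rated_power_w * load_ohms)

# Convert the amp's gain from dB to a plain voltage ratio: 10^(dB / 20)
gain_ratio = 10 ** (amp_gain_db / 20)

# Preamp voltage needed to drive the amp to full output (its "input sensitivity")
v_needed = v_at_full_power / gain_ratio

print(f"Speaker voltage at rated power: {v_at_full_power:.1f} V")  # ~40.0 V
print(f"Amp voltage gain as a ratio:    {gain_ratio:.1f}x")        # ~28.2x
print(f"Preamp output needed:           {v_needed:.2f} V")         # ~1.42 V
```

If that arithmetic is right, then in this example anything much above about 2V of clean preamp output is just unused headroom, which is exactly the point I'm trying to confirm or refute.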

Reason I ask: I'm looking to replace an aging AVR that has an 8V preamp output, and I have yet to find one (under $3k for an AVR or pre/pro) with anything close to this. I am looking at the Anthem AVRs, the Marantz 770x pre/pro, etc., and none have preamp outputs remotely close to the 8V of my 20-year-old Denon.

Hope someone can shed some light on this.  

I am looking at using Parasound Halo, ATI, Rotel, Musical Fidelity, etc. for external amplification (if that makes a difference).

Thanks in advance.


lightfighter2018

Showing 1 response by tomcy6

Hi lightfighter,

I am not an electronics person at all, but from what I read, the two important specs for preamps are output impedance (lower is better, and usually only an issue with tube preamps) and gain. A high-gain preamp, 20dB and up, may force you to keep the volume control below its optimal level. Preamp gain of 6dB should be plenty in most applications.
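To put rough numbers on those gain figures (just a sketch on my part; the 2V source level is an assumption, typical of many CD player/DAC outputs, not anything from your gear):

```python
# Convert the gain figures above from dB to plain voltage ratios.
def db_to_ratio(db):
    """Convert a voltage gain in dB to a plain multiplication factor."""
    return 10 ** (db / 20)

source_volts = 2.0  # assumed source output level

for gain_db in (6, 20, 26):
    ratio = db_to_ratio(gain_db)
    print(f"{gain_db:>2} dB gain -> x{ratio:.1f} "
          f"({source_volts:.0f} V in becomes {source_volts * ratio:.0f} V out at full volume)")

# 6 dB is roughly 2x, 20 dB is 10x, 26 dB is roughly 20x.  The more gain the
# preamp stacks on top of the source, the lower you end up keeping the
# volume control for normal listening levels.
```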

I know that's not what you asked, but somebody will probably respond to tell me I'm full of it and maybe a discussion will ensue.