Sensitivity Ratings on Preamplifiers or Integrateds vs. Amplifiers


Ok, so I am interested in a better understanding of gain issues. As I understand it, the sensitivity value for an amplifier is the input voltage needed to achieve rated output. I am not sure about the sensitivity rating for preamps; my thinking is that it is the minimum input required to drive the preamp to its rated output. In the case of an integrated, I have seen this spec provided for both the included preamp and amplifier stages, though for many it is not.

Specifically, my Line Magnetic 211ai is specified at 200mV, and I assume this is for the preamp stage. This seems low, and it may be why, when driven by my DAC with its 2V output, I need not turn the volume past 10 o'clock for sufficient loudness. I understand that the volume position is no measure of the wattage one is asking the amplifier stage to deliver.
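For a rough sanity check (assuming the 200mV spec is the input voltage needed to drive the unit to full rated output, which may not be how Line Magnetic means it), the 2V DAC output sits about 20dB above that figure, which would explain why the volume control never needs to go far:

```python
import math

dac_output_v = 2.0    # DAC full-scale output, V RMS
sensitivity_v = 0.2   # 200mV sensitivity spec (assumed: input for rated output)

# How far above the "full rated output" voltage the DAC can swing
excess_db = 20 * math.log10(dac_output_v / sensitivity_v)
print(f"DAC full-scale output exceeds the sensitivity spec by {excess_db:.1f} dB")
# -> about 20.0 dB, which the volume control has to attenuate before the amp
#    stage is being asked for more than its rated output
```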

I am seeking clarification/correction on my thinking. Thanks in advance!
mesch

Showing 1 response by kijanki

If there is any standard for sensitivity it might be 1.23V (+4dBu), but I remember that a few decades ago 0.316V (-10dBV) was common.  By using higher sensitivity they tried to save money: more gain in the amp, but fewer gain stages in all the sources.

http://www.harmoniccycle.com/hc/music-26-+4dBu-10dBV.htm
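For reference, here is a minimal sketch of the conversions behind those two figures, using the standard references of 0.7746V RMS for 0dBu and 1V RMS for 0dBV:

```python
DBU_REF = 0.7746  # 0 dBu reference voltage (V RMS), i.e. 1 mW into 600 ohms
DBV_REF = 1.0     # 0 dBV reference voltage (V RMS)

def dbu_to_volts(dbu):
    return DBU_REF * 10 ** (dbu / 20)

def dbv_to_volts(dbv):
    return DBV_REF * 10 ** (dbv / 20)

print(f"+4 dBu  = {dbu_to_volts(4):.3f} V")    # ~1.228 V, the 'pro' nominal level
print(f"-10 dBV = {dbv_to_volts(-10):.3f} V")  # ~0.316 V, the 'consumer' nominal level
```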

I'm not sure 1.23V (+4dBu) applies to power amps, since my Benchmark AHB2 offers three selectable input levels: 2V (8.2dBu), 4V (14.2dBu) and 9.8V (22dBu).  I use 9.8V, which they recommend.  At a higher input level, electrical noise picked up by the cable is lower relative to the signal, and the gain stages are effectively moved from a noisier environment (the power amp) to a cleaner one (the preamp).
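Those three figures all follow from the same dBu formula, and the jump from 2V to 9.8V sensitivity shows roughly how much gain gets moved out of the power amp and into the preamp/DAC (a sketch of the arithmetic, not Benchmark's own numbers):

```python
import math

def volts_to_dbu(v):
    return 20 * math.log10(v / 0.7746)  # 0 dBu = 0.7746 V RMS

for v in (2.0, 4.0, 9.8):  # AHB2 selectable input sensitivities, V RMS
    print(f"{v:4.1f} V = {volts_to_dbu(v):4.1f} dBu")
# -> ~8.2, ~14.3 and ~22.0 dBu, matching the quoted figures within rounding

# Selecting 9.8 V instead of 2 V means the power amp needs this much less gain,
# which has to be made up earlier in the chain (preamp or DAC output stage):
print(f"Gain moved upstream: {20 * math.log10(9.8 / 2.0):.1f} dB")  # ~13.8 dB
```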

My Benchmark DAC3 HGC has a nominal 2V (8.2dBu) analog input sensitivity.  Benchmark usually knows what they're doing, so perhaps the "standard" is 2V and not 1.23V.  Can anybody clarify?