If there is any standard for sensitivity it might be 1.23V (+4dBu), but I remember that a few decades ago 0.316V (-10dBV) was common. Higher sensitivity was a way to save money: more gain in the amp, but fewer gain stages in all the sources.
http://www.harmoniccycle.com/hc/music-26-+4dBu-10dBV.htm
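For anyone who wants to check these figures, here's a quick Python sketch of the conversions (0 dBu is referenced to 0.7746 V RMS, i.e. 1 mW into 600 ohms; 0 dBV to 1 V RMS):

```python
import math

DBU_REF = 0.7746  # volts RMS at 0 dBu (1 mW into 600 ohms)
DBV_REF = 1.0     # volts RMS at 0 dBV

def volts_to_dbu(v):
    """RMS voltage -> level in dBu."""
    return 20 * math.log10(v / DBU_REF)

def volts_to_dbv(v):
    """RMS voltage -> level in dBV."""
    return 20 * math.log10(v / DBV_REF)

for v in (1.228, 0.316, 2.0, 4.0, 9.8):
    print(f"{v:6.3f} V = {volts_to_dbu(v):+6.1f} dBu = {volts_to_dbv(v):+6.1f} dBV")
```

Running it confirms 1.228V is +4dBu, 0.316V is -10dBV, and the 2V / 4V / 9.8V levels come out at +8.2 / +14.3 / +22dBu.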
I'm not sure 1.23V (+4dBu) applies to power amps, since my Benchmark AHB2 offers three selectable input levels: 2V (8.2dBu), 4V (14.2dBu) and 9.8V (22dBu). I use 9.8V, which they recommend. At the higher input level, electrical noise picked up by the cable is lower relative to the signal, and the gain stages move from the noisier environment (the power amp) to the cleaner one (the preamp).
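To illustrate that point with some made-up numbers (the 1 mV of cable pickup is purely hypothetical; only its level-independence matters):

```python
import math

# Hypothetical fixed noise picked up on the interconnect (RMS volts).
# Its absolute value is arbitrary; it doesn't scale with signal level.
noise_v = 0.001

for signal_v in (2.0, 4.0, 9.8):
    snr_db = 20 * math.log10(signal_v / noise_v)
    print(f"{signal_v:4.1f} V signal: SNR vs cable pickup = {snr_db:.1f} dB")

# Going from 2 V to 9.8 V buys 20*log10(9.8/2) ~ 13.8 dB of margin
# against whatever the cable picks up.
```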
My Benchmark DAC3 HGC has a nominal 2V (8.2dBu) analog input sensitivity. Benchmark usually knows what they're doing, so perhaps the "standard" is 2V and not 1.23V. Can anybody clarify?