What’s the relationship between gain (dB) and power (watts)?


Is there one?  My new used 300+ epic Bryston amp has a gain switch on the back toggling between 23 and 29 dB of gain.  
redwoodaudio

Showing 1 response by rols

I am interested in this because I have just upgraded from one power amp to two monoblocks - the gain has not changed but the available power has gone from 200W per channel to 600W...

... and yes, the monoblocks are no louder than the stereo unit. 

Here is my attempt at understanding the issue, typed out more to get feedback and have my working checked than to tell anyone else what to think.

OK, so gain tells you how much bigger the output voltage is than the input voltage, i.e. the ratio of output to input. In dB terms the voltage gain is 20 * log10(Vout/Vin), so the Bryston's 23 dB and 29 dB settings work out to output voltages of roughly 14x and 28x the input.
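A quick Python sketch of my own, just to check the dB arithmetic (nothing amp-specific beyond the two gain figures on the back panel):

    def db_to_voltage_ratio(gain_db):
        # Voltage gain in dB is 20 * log10(Vout / Vin), so invert that
        return 10 ** (gain_db / 20)

    for gain_db in (23, 29):
        ratio = db_to_voltage_ratio(gain_db)
        print(f"{gain_db} dB of gain -> output voltage about {ratio:.0f}x the input")

    # 23 dB works out to roughly 14x and 29 dB to roughly 28x;
    # a 100x voltage ratio would be 40 dB, higher than typical power amp gain.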

Power considerations appear to come into play when you connect the amplifier to a load, like a resistor or a speaker. To hold the correct output voltage, the amp has to supply current. If it runs out of current, the voltage sags and the relationship between output voltage and input voltage no longer holds.
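To put rough numbers on that, here is a sketch assuming an idealized, purely resistive 8 ohm load and my old amp's 200 W per channel rating; real speakers will behave differently:

    import math

    def max_rms_voltage(power_watts, load_ohms):
        # P = V^2 / R, so V = sqrt(P * R)
        return math.sqrt(power_watts * load_ohms)

    def rms_current(voltage, load_ohms):
        # Ohm's law: I = V / R
        return voltage / load_ohms

    v = max_rms_voltage(200, 8)   # 200 W per channel into an assumed 8 ohm load
    i = rms_current(v, 8)
    print(f"200 W into 8 ohms: about {v:.0f} V RMS and {i:.1f} A RMS")
    # Roughly 40 V and 5 A. If the amp cannot source that current,
    # the output voltage sags and the gain relationship stops holding.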

With something simple like a resistor, one could easily work out how much power is needed from P = V^2 / R. But speakers are not resistors, and music is not steady state. I think that when you suddenly ask a speaker's driver to move, its impedance momentarily dips, and only an amplifier with plenty of power in reserve can hold its output voltage steady. As the impedance dips on a rim shot or a square-wave-like transient, the amplifier has to dump a lot of current into the speaker.
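Carrying on with the same idealized-resistor assumption, here is what the current and power demand look like if the load impedance momentarily dips while the amp tries to hold the same output voltage (the 4 ohm and 2 ohm figures are just illustrative, not measurements of any real speaker):

    def demand_at(voltage, load_ohms):
        # Holding the same voltage into a lower impedance needs more current and power:
        # I = V / R, P = V^2 / R
        current = voltage / load_ohms
        power = voltage ** 2 / load_ohms
        return current, power

    v = 40  # the ~40 V RMS a 200 W / 8 ohm amp puts out at full rated output
    for load in (8, 4, 2):
        i, p = demand_at(v, load)
        print(f"{load} ohm load at {v} V: {i:.0f} A, {p:.0f} W")

    # 8 ohms: 5 A, 200 W; 4 ohms: 10 A, 400 W; 2 ohms: 20 A, 800 W.
    # Same voltage, so the same loudness, but far more current on demand.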

In my mind big power amps sound powerful because they are capable of making the speaker follow the music. They sound more exciting, not louder.