Heat/Efficiency of Speakers


What % of power sent to the speakers is turned to waste heat? That's the short version of my question.

I'm looking to minimize waste heat across my stereo, as my listening room is unforgiving come summer: no cooling, and a computer system which cannot be relocated. I understand amplifier efficiency and the amplifier classes, as well as speaker sensitivity measured in dB/W, but the interplay between them eludes me.

Take two hypothetical amplifiers: a Class A amplifier outputting 10W while drawing 100W from the wall, and a Class D outputting 200W on a 220W draw. I understand the D will be the cooler operator, but this is where the discussion tends to end: the D wastes only 20W versus the Class A's 90W. Assuming appropriate speaker matches for each amp (say a standard high-efficiency speaker at 95 dB/W), how do I determine the wattage converted to sound and the watts spent as heat?
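To show where I get stuck, here's the bookkeeping as I currently picture it, in Python; the 2% speaker efficiency is a placeholder rather than a spec, since converting 95 dB/W into a percentage is exactly the step I don't know how to do:

```python
# Rough heat bookkeeping for the two hypothetical amps. The 2% speaker
# efficiency is a placeholder, not a spec -- converting 95 dB/W into a
# percentage is the step I'm unsure of.

def heat_split(wall_draw_w, amp_out_w, speaker_eff=0.02):
    amp_heat = wall_draw_w - amp_out_w       # dissipated in the amplifier
    sound = amp_out_w * speaker_eff          # radiated as acoustic power
    speaker_heat = amp_out_w - sound         # dissipated in the speaker
    return amp_heat, sound, speaker_heat

for name, draw, out in [("Class A", 100, 10), ("Class D", 220, 200)]:
    a, s, k = heat_split(draw, out)
    print(f"{name}: amp heat {a} W, sound {s:.1f} W, speaker heat {k:.1f} W")
```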

I'm asking because I was previously running a 10W tube amplifier in this room (4x EL84 tubes) with 96 dB speakers. That was bearable in two-hour doses this past summer. My friend assures me any Class D amplifier, and many AB amps, would have no such heating problems; he says it's class, not wattage, that is my issue. Before I move to a different amplifier technology (and swap speakers, since these are voiced for SE tube partnering), I want to understand this issue fully. I'm unconcerned with power usage and only care about the heat.
redfuneral
Speakers typically turn between 90% and 99% of the power fed to them into heat. Horns are the most efficient, at about 90% of power ending up as heat.
shadorne
> Speakers typically turn between 90% and 99% of the power fed to them into heat.
You might want to check your math - most speakers are much less efficient than that. A speaker with 92 dB sensitivity is only about 1 percent efficient.

As for the OP, the heat in his room is overwhelmingly from the amplifier - not the speaker. His hypothetical 95 dB sensitive speaker is about 2 percent efficient.
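For what it's worth, here's the conversion I'm using: a 100-percent-efficient loudspeaker would produce roughly 112 dB SPL at 1 m from 1 W radiating into full space (that reference figure is the standard free-field approximation), so efficiency falls out of the sensitivity rating directly:

```python
def efficiency(sensitivity_db):
    # 112 dB is roughly the 1 W / 1 m SPL a 100%-efficient speaker
    # would produce radiating into full space; exact figures vary
    # slightly with the reference conditions used.
    return 10 ** ((sensitivity_db - 112) / 10)

for s in (85, 92, 95, 102):
    print(f"{s} dB/W -> {efficiency(s):.1%} efficient, "
          f"{1 - efficiency(s):.1%} of amp power becomes heat")
```

At 102 dB/W a horn lands at about 10 percent efficient, which lines up with the 90 percent heat figure quoted above.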
So I should assume all power (90+%) drawn from the wall will end up as heat. I should be looking to minimize the wattage at the speaker terminals. I should clarify that I'm looking at room temperature exclusively: an amplifier's ability to cool itself has no impact on its thermal contribution (as I've learned from upgrading heatsinks in my computer).

I'd say I don't understand how to calculate the power draw of amplifiers outside of Class A. I was looking at an average-power Class AB amplifier last night with power consumption rated at 25-250W. Am I to understand it idles at 25W and pulls 250W when in use? Or does the power usage scale with the volume and the demands of the speaker? How about Class D? Can a 50-100W AB or a 250W D amp play music while drawing less power than a 5W Class A? Do they all output the same wattage to the speaker, all else being equal?
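Here's the mental model I'm hoping someone can confirm or correct; the idle figures and efficiencies are guesses for illustration, not specs from any real amplifier:

```python
# Draw models I'm trying to confirm (idle figures and efficiencies are
# guesses for illustration, not specs):
#   Class A:  draws roughly its full bias power no matter the volume
#   Class AB: modest idle draw, then draw scales with output
#   Class D:  small idle draw, then draw scales with output

def draw_class_a(out_w):
    return 100.0                      # ~constant, independent of output

def draw_class_ab(out_w, idle=25.0, eff=0.6):
    return idle + out_w / eff         # scales with the music

def draw_class_d(out_w, idle=10.0, eff=0.9):
    return idle + out_w / eff

avg_out = 2.0  # average watts at the speaker during normal listening
for name, fn in [("10W Class A", draw_class_a),
                 ("100W Class AB", draw_class_ab),
                 ("250W Class D", draw_class_d)]:
    print(f"{name}: ~{fn(avg_out):.0f} W from the wall at {avg_out} W out")
```

If that model holds, the big Class D playing at a couple of watts would draw roughly a tenth of what the little Class A does, rated output notwithstanding.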
redfuneral
> So I should assume all power (90+%) drawn from the wall will end up as heat. I should be looking to minimize the wattage at the speaker terminals.
No, not at all. You can achieve a substantial increase in efficiency by choosing an efficient amplifier. As your friend suggested, you should be looking at your amplifier's class of operation.

Can you explain how that's the case? As I understand it now, amplifier efficiency only covers wall -> speaker-terminal efficiency, and the wattage delivered to the speaker will still almost all be turned to heat. From where I stand, Class A appears to be the answer only because there are 2-25W amplifiers all over the market, whereas the alternatives all have higher output.
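On the speaker side, this is the working formula I'd use to gauge how a swap changes the output needed for the same volume; the 95 dB target at 1 m is arbitrary, and it ignores listening distance and room gain:

```python
def watts_for_spl(target_db, sensitivity_db):
    # Every +3 dB of sensitivity roughly halves the power required.
    return 10 ** ((target_db - sensitivity_db) / 10)

for sens in (96, 90, 86):
    print(f"{sens} dB/W speaker: ~{watts_for_spl(95, sens):.1f} W for 95 dB at 1 m")
```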