Do amps have a sweet spot?


What I mean is: do amps have an output range at which they sound better? The reason I'm asking is that I'm now running some very small speakers (Minuet Supreme Plus). They're probably the least demanding speakers I've had, but I've found that my setup sounds better when I have the volume turned up.

Out of curiosity, I took my Minuets to my local shop and hooked them up to an NAD C326BEE. I thought it sounded pretty darned good at "normal" listening levels. I almost bought it, but then I decided to start cranking it up to what I would call "rockin" levels and the amp started to clip. If it could have played louder, I would have bought it.

So...is it usual for an amp not to open up until you start pushing it?

My current amp is an Aragon 2004.
tonyangel

Showing 2 responses by bombaywalla

remember that 100wpc will only get you an additional 3 decibels of volume over the 50 wpc
Runnin, just wanted to point out that 100WPC is 3dB more power (& not volume) over 50WPC.
How much more volume/SPL you will get from a 100WPC amp really depends on its design - specifically its power supply design & the current-handling capacity of its transformer.
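To put a rough number on that 3dB point, here is a quick back-of-the-envelope Python sketch using the standard 10*log10 power-ratio formula. The 50W & 100W figures are just the ones from the quote above; nothing here is specific to any particular amp.

import math

def power_gain_db(p_new_watts, p_old_watts):
    # Gain in dB when moving from p_old_watts to p_new_watts
    return 10 * math.log10(p_new_watts / p_old_watts)

print(power_gain_db(100, 50))   # ~3.0 dB - double the power
print(power_gain_db(200, 50))   # ~6.0 dB - you need 4x the power for +6 dB

And that extra 3dB only shows up at the speaker if the amp can actually deliver the extra current into the load, which is the point about the power supply & transformer.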

IMHO the issue you are facing is the classic amplifier-speaker interface issue. You are having to turn up the volume to overcome the marginal amp-speaker interface by sheer grunt power from the amp.
We discussed this in quite a bit of detail back in early May 2013. Lots of good info in that thread from some very knowledgeable members. Here is the link to that thread -
http://forum.audiogon.com/cgi-bin/fr.pl?aamps&1367644336&&&/Current-amp-vs-Voltage-amp

I know it's a lot to read, but do take the time to read & digest the info.
I must be either reading this wrong or missing something.

I thought increasing the voltage across a speaker's input will increase its output in SPL based on the speaker's sensitivity. I don't see how that's related to the voltage source (the amp) in any way.

The increase in SPL by increasing the voltage by 3 dB will always be less than 3 dB due to the speaker not being 100% efficient (it has losses).

What am I missing?
Bob_reynolds
Firstly, thanks Almarg for jumping in & clarifying for me. :-)

Yes, Bob_reynolds, you are missing something. It's called "current".
If there's voltage across the speaker terminals, then current must flow from one speaker terminal to the other. Where does this current come from? From the power amp. THAT'S how it is related to the voltage source (amp) in every way. ;-)
You must remember that voltage & current are duals - if there is one, the other must be present. Ohm's Law.
What if the voltage source/amp is incapable of supplying the current (the transformer does not have the current capacity, the power supply is not robust enough)? Will we be able to sustain that voltage across the speaker terminals? No!
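Just to illustrate with made-up numbers (these are not measurements of the OP's Aragon or the Minuets), Ohm's Law in a few lines of Python:

def current_and_power(v_rms, impedance_ohms):
    i_rms = v_rms / impedance_ohms      # current the amp must source (Ohm's Law)
    p_watts = v_rms * i_rms             # power delivered into the load
    return i_rms, p_watts

for z_ohms in (8, 4):
    i, p = current_and_power(28.3, z_ohms)   # 28.3V rms is roughly 100W into 8 ohms
    print(z_ohms, "ohms:", round(i, 1), "A,", round(p), "W")
# 8 ohms: 3.5 A, 100 W
# 4 ohms: 7.1 A, 200 W

Same voltage, half the impedance, twice the current & twice the power. If the transformer & power supply cannot source it, the voltage sags & so does the SPL.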
The criteria that I cited & those that Almarg added (thanks!) all come into play when you put the 100W amp (or any amp) into the signal chain. Simply buying any 100W amp, swapping out your 50W/ch amp, & expecting an increase in SPL proportional to the increase on your preamp's volume knob will be a roll of the dice if you have not thought it thru - particularly if you have a hard-to-drive speaker.
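As a rough illustration of why the 50W-to-100W swap buys so little, here is a simple SPL estimate in Python. The 85dB/W/m sensitivity & 3m listening distance are made-up illustrative numbers, not the Minuet's published spec, & it ignores room gain & the impedance/phase behaviour of the load:

import math

def rough_max_spl(sensitivity_db_1w_1m, amp_watts, distance_m):
    gain_from_power = 10 * math.log10(amp_watts)      # dB above the 1W reference
    distance_loss = 20 * math.log10(distance_m)       # free-field inverse-square loss
    return sensitivity_db_1w_1m + gain_from_power - distance_loss

print(round(rough_max_spl(85, 50, 3), 1))    # ~92.4 dB
print(round(rough_max_spl(85, 100, 3), 1))   # ~95.5 dB - only ~3 dB more

Doubling power again to 200W would only add another ~3dB.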
The amp-speaker interface is important. Hence the link to that thread. Almarg, along with other members, has taken the time to write some lucid notes explaining the matter. Do take the time to read.
Thanks.