Watts up with that?


I was concerned that my Belles 30 watt Class A amp (SA-30) was not powerful enough for my Montana XP speakers (seven drivers, 92 dB sensitivity at 2 watts because of the 4 ohm load). Since voltage squared divided by impedance gives you watts, I hooked up my Wavetek digital multimeter across the speaker posts to read AC volts. The meter has a “max” feature, so it keeps displaying the highest voltage reading until reset. My speakers have a very flat impedance curve, with a low of 3 and a high of 5 ohms, so I feel pretty safe using the average of 4 ohms.

Playing some music at my average listening levels, I got a max voltage reading of 2.13 volts, which calculates to just over 1 watt. I then turned the volume up much louder than I will usually listen and got a max reading of 3.28 volts after a few songs. So with the volume higher than normal, and at the loudest part of the track, just under 3 watts is being drawn. I still have a lot of watts left!

Are my calculations correct? Is this an OK way to measure power? I was thinking I needed a few hundred watts of available power, but it seems I’ve got all I need with just the 60 watts (into a 4 ohm load) my current amp can deliver. Your thoughts please.
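For anyone who wants to double-check me, here is the same arithmetic as a quick Python sketch (the 4 ohm figure is just my averaged nominal impedance, not a measurement at every frequency):

```python
# Power estimate from the voltage measured across the speaker terminals:
# P = V^2 / R, with the 4 ohm average impedance taken as an assumption.
def watts(v_rms, impedance_ohms=4.0):
    return v_rms ** 2 / impedance_ohms

print(watts(2.13))  # ~1.13 W at my normal listening level
print(watts(3.28))  # ~2.69 W at louder-than-normal level
```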
koestner

Showing 1 response by rrog

Whenever I had an amplifier with meters, I was always amazed at how few watts were actually being used. That said, not all amplifiers are created equal. I have owned 60 watt amplifiers that sounded more powerful than 200 watt amplifiers. When I owned Dunlavy SC-IVs I had incredibly good sound with 60 watt tube monos. The best way to find out whether an amplifier is powerful enough is to listen.