Question about high current amps versus "not high current amps"


Recently I read a reply to a post about a certain speaker, and the person who replied typed that (and I am going to paraphrase somewhat) the speaker required a high current amp to perform well and it wasn’t the WPC that was important.

Sorry, as I am afraid these are probably going to be "audio electrical questions for dummies," but here goes:

I vaguely remember being taught the PIE formula, so I looked it up online for a quick review and if I am understanding it correctly,

P (power/watts) = I (current/amps) x E (electromotive force/voltage) .
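Just to make sure I have the arithmetic of that formula right (these numbers are made up purely for illustration):

```python
# PIE formula: P = I * E, with made-up numbers
I = 5.0    # current in amps
E = 20.0   # voltage in volts
P = I * E  # power in watts
print(P)   # 100.0 watts
```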

My first question would be: if I am understanding that correctly, how can WPC NOT matter, since watts are the product of current and voltage? I mean, if you have so many WPC, don’t you then HAVE to have so much current?

My next question would be: if I am understanding PIE correctly, is E/voltage a fixed 110 VAC out of the wall, or is that number (E) determined by the transformer (so it would vary by manufacturer)? And is that (the different transformers used in different amps) going to be the difference between a high and a lower current amp?

Or am I completely off base thinking that P is WPC, and P is actually the spec in my owner's manual that lists "power consumption: 420 watts operate, 10 watts standby"?

And lastly, what would be an example of a high current amp and what would be an example of a low current amp?

Thanks.

 

immatthewj

Excellent and informative discussion. Thanks to all even if it is a bit tough for dummies like me to understand

How does any of this inform a future purchasing decision or help us enjoy what we have now?

 

Amplifiers are not about power. Power is consumed by the load, whether it's a resistor, a motor or a speaker driver. The amplifier is an energy source and does not necessarily follow Ohm's Law. As an example, if you short a 9-volt battery with a 1/2-ohm resistor, Ohm's Law dutifully informs us that the "power" delivered is 162 watts (9V x 9V / 0.5) and the current is 18 amps. Of course that is ridiculous; a small 9V battery can only deliver about 500 milliamps, and only when the circuit limits the current to less than 500mA does the simple Ohm's Law calculation apply. On the other hand, that 150kVA utility transformer feeding your street can easily slam thousands of amps into a bolted fault. It's all about energy.
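To put rough numbers on that battery example (the ~17-ohm internal resistance below is an assumed ballpark figure for a small 9V battery, not a measured spec), here is a quick Python sketch:

```python
# Naive Ohm's Law numbers for shorting a 9V battery with 0.5 ohms,
# ignoring the battery's own internal resistance
V = 9.0
R_load = 0.5
print(V**2 / R_load)  # 162.0 W "on paper"
print(V / R_load)     # 18.0 A "on paper"

# More realistic: assume ~17 ohms of internal resistance (ballpark
# assumption), which limits a small 9V battery to roughly half an amp
R_internal = 17.0
I_real = V / (R_internal + R_load)
P_load = I_real**2 * R_load
print(round(I_real, 2))  # ~0.51 A, nowhere near 18 A
print(round(P_load, 2))  # ~0.13 W actually reaches the load, not 162 W
```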

An amplifier has an energy source: the power supply's transformer and filter capacitors. If the power supply can maintain its secondary voltage into any load the speaker presents, you can call it a "high current amplifier," and the relationships established by Ohm's Law apply. But if the output voltage sags when the speaker impedance drops, then the current delivered to the speaker will be less than Ohm's Law predicts, which means the power delivered decreases.

If an amplifier's specs say it can maintain voltage down to 4 and/or 2 ohms, you will see the power double as the impedance halves (i.e. 100W into 8 ohms, 200W into 4 ohms and 400W into 2 ohms), and that is practically a true voltage source. But if you see 100W into 8 ohms and only 150W into 4 ohms, then that amplifier is not as good an energy source.
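A quick sketch of that doubling pattern (28.28 V is just the voltage that works out to 100W into 8 ohms):

```python
# An ideal voltage source holds its output voltage constant, so
# P = V^2 / R doubles every time the load impedance halves
V = 28.28  # roughly the voltage for 100W into 8 ohms
for R in (8, 4, 2):
    print(R, "ohms:", round(V**2 / R), "W")  # 100, 200, 400
```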

Uhm - Ohm’s law applies regardless (or its generalisation using impedance). It’s just that at some point the internal impedance of the generator (amp or battery) becomes relevant and has to be included into the circuit being analysed explicitly, rather than neglected.

Ohm’s law is one of the simplest laws of physics, yet seems to be so hard to apply properly.

What matters is the voltage drop across the load. Multiply this voltage drop by the current flowing, and you have the power (or wattage) produced at the load.

The current that flows is just given by the relationship voltage drop = current times resistance.

Amplifiers might be considered to be ’straight wires with gain’ to quote Peter Walker of Quad. They try to produce an output voltage which is a multiplier (the gain) of the input signal voltage. So the voltage drop is given by the input signal and the gain, and the resistance is also fixed (at least nominally).

If the rated maximum power into 8-Ohms is say 100-Watts, that must be the voltage drop times the current. The current is the voltage drop divided by the resistance. So 100 = V * V / 8, or V = sqrt(800), or about 28-Volts. Note that this is derived just from the power and the load resistance.
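That calculation in a few lines of Python, in case it helps make it concrete:

```python
import math

P = 100.0  # rated power in watts
R = 8.0    # load resistance in ohms
V = math.sqrt(P * R)  # from P = V * V / R
I = V / R             # Ohm's Law: current = voltage / resistance
print(round(V, 1))   # 28.3 volts
print(round(I, 2))   # 3.54 amps
print(round(V * I))  # 100 watts, back where we started
```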

This is true for direct current. When we measure alternating currents, it is conventional to think of them as equivalent to the direct current that would produce the same average power. It turns out this is the square root of the average (mean) of the instantaneous value squared, abbreviated to RMS (root mean square).
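For a sine wave, that RMS value works out to the peak divided by the square root of 2. A small numerical check (the 40 V peak is an arbitrary example value):

```python
import math

peak = 40.0  # arbitrary example peak voltage
n = 100000   # samples over one full cycle
samples = [peak * math.sin(2 * math.pi * k / n) for k in range(n)]
# root of the mean of the squares
rms = math.sqrt(sum(s * s for s in samples) / n)
print(round(rms, 2))                  # 28.28
print(round(peak / math.sqrt(2), 2))  # 28.28 -- same value
```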

I feel better now ...