Explain amp classes & how they affect sound?


I see class A, B, AB, class T, etc. on amplifier specifications. Can someone explain (in simple terms) how each class (even class T) affects sound quality?
aberyclark
Also, class D has NOTHING to do with digital. Many people wrongly presume that the D stands for digital. This is just not true.

There are amps being sold as class A, but in truth some of these are class AB with a high class A bias. But how many watts do you really need? I just spoke with someone who owns 1000-watt mono amps. He never saw the needles move beyond 15 watts, and normally not beyond 5 to 10 watts.
Mordante's comment about class A marketed amps really being class A/B is important. IME there is a notable difference with many of these amps when they drop down into class A/B, in terms of performance and sound characteristics. Personally, I am a believer in pure class A amps (i.e., those that remain in class A at all times), but I think this is a matter of opinion. Also, I am not suggesting that all class A amps are better than all class A/B or D amps (for that matter).

How much power does one really need? That is always the question and there are so many factors that come into play. Obviously the speakers, the room, listening levels, even types of music and of course sound preferences.

I own a pair of speakers that are rated at a solid 8 ohms, and the reviewer-quoted and manufacturer-published recommended power is 60 wpc at 8 ohms.

However, based on direct communications with the designer of the speakers, he clearly suggests that the speakers operate in a range of 8-16 ohms and based on their design principles are relatively hard to drive. He suggests that for best performance, 200-300 wpc (rated at 8 ohms) will produce the best results. I would agree with his assessment as I have run the speakers with various amps that ranged from a low power SET design, mid power push pull design, mid power 100% class A design and now a higher power 100% class A design (rated at only 125 wpc into 8 ohms). Each power increase proved to deliver better performance in specific scenarios.

Can one have too much power? No, I have never experienced that problem. But I suggest that there needs to be a balance between just a lot of power and the quality of power. Are 1000 wpc monoblocks too much power? Depends on the speakers, room, etc. Just because the proverbial needle doesn't ever "seem" to go past 15 watts does not mean for a second that all that is needed is a 15 wpc amplifier, or that the 1000 wpc amp is a waste.

I can pretty confidently say that if you have two identically designed amps (with the exception of power output), then in the referenced case (above here and by Mordante), the 15 wpc amp will suffer versus the 1000 wpc amp and very, very likely run out of headroom and produce notably different sonic attributes.
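To put rough numbers on the headroom point, here is a minimal sketch (assumed example figures only, not measurements from the system described above): music with peaks 10-15 dB above an average level that reads 15 watts on the meter needs roughly ten to thirty times that power on the peaks if the amp is not to clip.

```python
# Rough headroom arithmetic - a sketch with assumed example numbers,
# not data about any particular amp or speaker.

def db_to_power_ratio(db: float) -> float:
    """Convert a decibel difference into a power ratio."""
    return 10 ** (db / 10.0)

average_watts = 15.0                     # what the "needle" appears to show
for peak_headroom_db in (3, 10, 15):     # typical crest factors, compressed vs. dynamic music
    peak_watts = average_watts * db_to_power_ratio(peak_headroom_db)
    print(f"{peak_headroom_db:2d} dB peaks over a 15 W average -> ~{peak_watts:6.1f} W needed on the peak")
```

Running it gives roughly 30 W, 150 W, and 475 W respectively, which is why a meter hovering at 15 W says very little about how big an amp the peaks actually demand.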
What about class B? The Nakamichi 620 was (is) a 'pure' class B amp, delivering 100 W/ch, and it sounds mighty good. I think the Quad 405 is also class B, albeit called a 'current dumping' design, whatever that is!

Salut, Bob p.
Most transistor amplifiers do operate pretty close to class B; at low power levels most of the power comes from the driver transistors rather than the outputs. But in order to keep the amplifier from making significant distortion (notch and crossover) at the zero crossing point, they have to have a certain minimum amount of bias current to make that happen. As a result they are considered class AB.
Maybe you mean that most of the power dissipated is in the drivers at small signal levels? This is indeed true in some designs, notably emitter-follower output stages with one set of output transistors. But in a conventional solid-state amplifier, virtually all the loudspeaker current flows through the output transistors, NOT the drivers. In any case, I'd agree that what constitutes "Class A" and "Class AB" is indeed widely used imprecisely . . . even though in a solid-state amplifier, they are distinctly different operating points.

The "cutoff" region in a tube or transistor is defined as the area where its transconductance drops sharply as the current through the device is reduced. In true Class B operation, the bias point is chosen so that the cutoff region of both halves of the output stage occur inversely at the same time . . . that is, the upper half is turning on at the same time, and to the same degree, that the lower half is turning off. In Class AB, one half of the output stage remains turned on before the other is turned off, thus giving a region (in the "middle") where both devices are conducting at full transconductance together. This is of course its Class A region.

The problem with Class B is well known -- it's due to the fact that pairs of output devices operating this way are non-conjugate, and the turning-on of one half never lines up exactly with the turning-off of the other. But in (solid-state) Class AB, the transition-region of the output device is left completely "exposed", without the other half compensating for its change in transconductance at all. Thus, the consequences are more severe - once a Class AB output stage leaves its Class A region, it produces more distortion than a pure Class B design.

On the test bench, a solid-state Class B design produces its best distortion at a specific bias level, and increases in distortion are evident both when the bias is too low AND when it's too high. Maintaining it at the proper point with regard to temperature is a very complex task, which is dependent not only on circuit design, but physical layout, assembly, and calibration procedure. Class AB on the other hand simply needs to have sufficient bias control to avoid overheating. For those who hear differences when their solid-state amps are left on all the time, variations in output stage bias current are the obvious reason why.
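For anyone who wants to see the crossover notch itself, here is a deliberately idealized sketch (a piecewise-linear toy model, not a circuit simulation). Each half of a complementary pair is treated as conducting only once its input exceeds an assumed ~0.6 V turn-on voltage, and the applied bias shrinks that dead zone. It only shows the under-bias notch; it does not capture the gm-doubling that, as described above, makes distortion rise again when the bias is set too high.

```python
# Toy model of crossover distortion in a complementary push-pull stage.
# Assumptions: each half conducts only when the input exceeds a ~0.6 V
# threshold; the total bias voltage across the pair closes that dead zone.
# This is a sketch of the notch mechanism only - it does NOT model the
# gm-doubling that penalizes over-biasing a real solid-state stage.
import numpy as np

VT_ON = 0.6          # assumed per-device turn-on voltage (volts)

def output_stage(vin, vbias):
    """Piecewise-linear push-pull stage with dead zone = 2 * (VT_ON - vbias/2)."""
    dead = max(VT_ON - vbias / 2.0, 0.0)           # uncovered region per half
    return np.sign(vin) * np.maximum(np.abs(vin) - dead, 0.0)

def thd(signal, fs, f0, n_harmonics=10):
    """Total harmonic distortion of an on-bin sine at f0, via FFT magnitudes."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    bin_of = lambda f: int(round(f * len(signal) / fs))
    fundamental = spectrum[bin_of(f0)]
    harmonics = np.sqrt(sum(spectrum[bin_of(f0 * k)] ** 2
                            for k in range(2, n_harmonics + 1)))
    return harmonics / fundamental

fs, f0, amp = 48000, 1000.0, 5.0                   # sample rate, test tone, 5 V peak drive
t = np.arange(fs) / fs
vin = amp * np.sin(2 * np.pi * f0 * t)

for vbias, label in [(0.0, "class B, no bias"),
                     (0.6, "under-biased AB"),
                     (1.2, "class AB, dead zone closed")]:
    print(f"{label:28s} THD ~ {100 * thd(output_stage(vin, vbias), fs, f0):.2f} %")
```

With zero bias the zero-crossing notch produces obvious harmonic distortion; as the (hypothetical) bias closes the dead zone the computed THD falls toward zero, which is the simplified picture behind the notch/crossover distortion discussed above.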

However, in a transformer-coupled push-pull tube output stage, the crossover behavior is fundamentally different for two reasons. First, it's a power amplifier (rather than just a current amplifier), which means the linearity is dependent on both the voltage and current across the device, as opposed to just the current - the latter more determining the cutoff characteristics. Second, this is accentuated by the fact that the primary inductance in the output transformer "forces" the plate of the non-conducting tube through its transition region, which in turn has less influence (the Rp increases) the more "turned-off" it's driven.

The consequence is that in a traditional tube amp, there is much more "grey area" between Class B and AB, because the presence/absence of crossover distortion is determined principally by the quality of coupling in the two halves of the output transformer's primary. The Class AB bias current serves mainly to keep some magnetization in the transformer core to compensate for leakage reactances. And on the test bench, P-P tube amps tend to show their most pronounced crossover distortion at the high-frequency, high-power end of the scale, where the output transformer core becomes most saturated.