Output watts per tube


I originally posted this question in the amp/pre-amp section with limited response. Thought I'd try it here, with some corrections. I'll further preface this by saying that I am not a technical guy, so everything I'm telling you here I learned last Thursday - fair warning if there are any gross inconsistencies with my explanation.

I recently visited a guy locally who was able to test the dozen EL34 Ruby output tubes from my Cary V12, with interesting results. Despite having 600-800 hours on them, ten of the twelve tested between 100 and 110 on a tube tester, and the other two came out between 85 and 100. This was with the sensitivity set at 53, per that tester's standard for the tube type.

Then we used a sine-wave signal and an oscilloscope with three tubes to establish an average control measurement, then checked all of the tubes against that control for distortion (there was none) by running pairs of tubes in a test amp operating push-pull in class A. The amp was also hooked up to a meter to measure output watts. (Someone asked, and I don't know what plate voltage was used in the amp.) The result was that those two tubes - any combination of two of the twelve - had a sustained maximum output of 21 watts, even the two that tested a little low. The results from tube to tube were, unexpectedly, almost perfectly uniform. Given the Rubies' less-than-stellar reputation, I was expecting some measurable degradation or at least inconsistency.

So here are my questions: Is the 21 watts for two tubes the equivalent of 10.5 output watts per tube? And if my amp takes twelve tubes and makes 50 watts per side in triode, am I in fact only drawing a little over 8 watts per tube at any given time? The amp can also run at 100 watts per side in ultra-linear, which would require closer to 17 watts per tube (six tubes per channel). Does the 50-watt mode equate to relatively low stress on the tubes and longer tube life? Does employing twelve tubes somehow reduce the per-tube output further? (Cary didn't have an answer for any of this.)
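The division I'm doing in my head can be sketched out like this (a rough back-of-the-envelope check in Python, assuming the output power divides evenly across the tubes in each channel, which is only an approximation for a real push-pull stage):

```python
# Rough per-tube arithmetic for a 12-tube amp (6 output tubes per channel).
# Assumes output power splits evenly across the tubes - an approximation only.

def watts_per_tube(channel_watts, tubes_per_channel):
    return channel_watts / tubes_per_channel

# Test-amp result: 21 watts from a single push-pull pair
print(watts_per_tube(21, 2))                # 10.5 watts per tube

# Triode mode: 50 watts per side from 6 tubes
print(round(watts_per_tube(50, 6), 1))      # 8.3 watts per tube

# Ultra-linear mode: 100 watts per side from 6 tubes
print(round(watts_per_tube(100, 6), 1))     # 16.7 watts per tube
```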

Any thoughts?
grimace

Showing 1 response by kirkus

Despite having 600-800 hours on them, ten of the twelve tested between 100 and 110 on a tube tester, and the other two came out between 85 and 100. This was using the sensitivity set at 53, per that tester's standard for the tube type.
This number refers to the "mutual conductance" (a.k.a. transconductance) that the tube delivers in the tester's circuit: the change in plate current divided by the change in grid voltage. For output tubes, this translates to "power gain" (power output vs. AC grid voltage), not maximum power output before clipping.
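Just to make the definition concrete, here's a tiny sketch of the transconductance calculation (the current and voltage figures are illustrative, not measurements from these tubes):

```python
# Mutual conductance (transconductance): change in plate current per
# change in grid voltage. The numbers below are illustrative only.

def transconductance_mA_per_V(delta_plate_current_mA, delta_grid_voltage_V):
    return delta_plate_current_mA / delta_grid_voltage_V

# e.g. if the plate current swings 11 mA for a 1 V swing on the grid:
gm = transconductance_mA_per_V(11.0, 1.0)
print(gm)  # 11.0 mA/V (i.e. 11000 micromhos)
```

Note that a tester's "100-110" scale reading is in the tester's own units, not mA/V directly.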

Maximum power output before clipping in a tube amp is a function of the plate voltage, the tubes' plate resistance (Rp), the load impedance (the speaker load impedance multiplied by the square of the output transformer's turns ratio), and the output transformer's insertion loss. All of these factors except for Rp are part of the amp, not the tubes . . . and this tube parameter tends to be less variable than transconductance.
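The load-impedance part of that can be sketched numerically (the turns ratio below is illustrative, not the Cary V12's actual figure):

```python
# How the output transformer "reflects" the speaker load back to the tubes:
# reflected load = speaker impedance * (turns ratio)^2.
# The 25:1 ratio here is purely illustrative.

def reflected_load_ohms(speaker_ohms, turns_ratio):
    return speaker_ohms * turns_ratio ** 2

# e.g. an 8-ohm speaker through a 25:1 transformer:
print(reflected_load_ohms(8, 25))  # 5000 ohms seen by the output stage
```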

Also note that "Pa max 25 watts" does NOT refer to the maximum power of an amplifier using this as an output tube; rather, it refers to the maximum amount of power the tube's plate (a.k.a. anode) is able to continuously dissipate (Pa = "anode power"). Plate dissipation is a function of the quiescent (idle) conditions, the loading, and the duty cycle. I've personally measured almost 220 watts output from a quad of "25-watt" EL34s, running push-pull pentode with about 730 volts on the plates.