When manually setting bias in my tube amps, how accurate must I be?


I have two systems driven by tube amps that each require manually setting bias. My custom 7591-based stereo power amp (push-pull Pilot 248 circuit) has individual pots for each of the four tubes, to be set to -.55 V. My Cary monoblocks require manual setting to 100 milliamps. How important is it to be accurate at these values? For instance, I was enjoying my 7591-based amp for several hours and then decided to check the bias voltage, because I had been too lazy to set the bias at turn-on. All tubes registered -.46 to -.48 V. Another evening they were in the -.65 V to -.70 V range. What would that be doing to the operation of the tubes, and possibly the sound? With my venerable Cary amps I have found that at turn-on the bias current may be around 88 mA and then continues to rise to near 110 mA, requiring me to wait a bit before setting them to 100 mA. Again, what is the effect of them running at, say, 92 mA or at 110 mA? Are the tubes stressed by being off the bias target setting? What would good plus-or-minus numbers be for each? And why did one designer choose manual setting of bias voltage while the other chose manual setting of bias current?
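Just to put those readings in perspective, here is a rough sketch of how far off target each measurement is in percentage terms. It uses only the numbers quoted above; no circuit details (sense-resistor values, plate voltages, etc.) are assumed, so it says nothing about dissipation, only relative error from the set point.

```python
# Rough sketch: percentage deviation of each bias reading from its target.
# All values are taken from the post; no circuit details are assumed.

def pct_off(measured, target):
    """Percentage deviation of a measured bias reading from its target value."""
    return 100.0 * (abs(measured) - abs(target)) / abs(target)

# 7591 amp: target bias reading is -0.55 V
for v in (-0.46, -0.48, -0.65, -0.70):
    print(f"{v:+.2f} V -> {pct_off(v, -0.55):+.1f}% from target")

# Cary monoblocks: target bias current is 100 mA
for ma in (88, 92, 110):
    print(f"{ma} mA   -> {pct_off(ma, 100):+.1f}% from target")
```

So the 7591 readings range from roughly 13-16% under target to 18-27% over, and the Cary readings from about 12% under to 10% over.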
elunkenheimer

Showing 1 response by elunkenheimer

Thanks George, went to the site, and thanks atmasphere. Perhaps an embarrassing question: am I right to assume that the idle current drawn across a resistance depends only on the absolute value of the voltage (whether or not there is a minus sign in front of it), making -.64 volts a "higher" value that draws more current than -.55 volts?
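A small illustration of that reasoning, assuming the reading is taken across a cathode sense resistor. The 10-ohm value below is purely hypothetical and not from this thread; the point is only that the magnitude of the voltage, not its sign, sets the magnitude of the current.

```python
# Ohm's law sketch: the current magnitude through a sense resistor depends on
# the absolute value of the voltage across it, not on its polarity.
# R_SENSE is a hypothetical placeholder value, not taken from the thread.

R_SENSE = 10.0  # ohms (hypothetical sense resistor)

for v in (-0.55, -0.64):
    i_ma = abs(v) / R_SENSE * 1000.0  # |V| / R, converted to milliamps
    print(f"{v:+.2f} V across {R_SENSE:.0f} ohms -> {i_ma:.0f} mA")

# -0.64 V has the larger magnitude, so it corresponds to more idle current
# than -0.55 V, regardless of the minus sign.
```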