It depends on which approach you prefer for cathode biasing. One approach is to bias hot (100% of max dissipation) and use a small cathode bypass cap, typically around 25uF or 47uF. The low cap value lets the cathode-bias voltage swing freely, so the amp biases itself progressively colder as it works harder; at full-throttle peaks it's probably running closer to 50% dissipation.
The other approach is to bias cooler, say around 80% or 85% of max dissipation, and use a huge cathode cap, in the 1000uF to 2200uF range. This helps extend tube life, which may be useful if you're running NOS. It also gives a somewhat stiffer feel, making the amp behave a bit more like a fixed-bias amp.
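To make the trade-off concrete, here's a rough sketch of how the target bias point maps to a cathode resistor value. All the numbers are illustrative assumptions, not measurements from this thread: a single EL84 with roughly 360V on the anode, about 3mA of screen current, and a guessed cathode voltage of 10V (in practice you iterate, since the real operating point depends on the tube's curves).

```python
# Rough cathode-resistor estimate for a target bias point.
# Assumed values (not from this thread): single EL84, ~360 V anode,
# ~3 mA screen current, ~10 V guessed cathode voltage.
P_MAX = 12.0        # EL84 max anode dissipation, watts
V_ANODE = 360.0     # assumed anode-to-cathode voltage, volts
I_SCREEN = 0.003    # assumed screen current, amps

def cathode_resistor(target_fraction, v_k=10.0):
    """Estimate Rk for a given fraction of max anode dissipation.

    v_k is a guessed cathode voltage; the real bias point shifts
    with the tube, so treat the result as a starting value only.
    """
    i_anode = target_fraction * P_MAX / (V_ANODE - v_k)
    i_cathode = i_anode + I_SCREEN   # cathode carries anode + screen current
    return v_k / i_cathode

for frac in (1.0, 0.85, 0.8):
    print(f"{frac:.0%} bias -> roughly {cathode_resistor(frac):.0f} ohms")
```

Biasing cooler means less anode current, so the same cathode voltage needs a larger resistor; that's the direction of the adjustment, even if the exact value has to be found on the bench.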
jcr1234 wrote:
357.25x0.027(anode current)=9.64
Is this right? Only 9.64 watts?
If I can go to 12 watts with an EL84 should I reduce the value of the cathode resistor?
So your actual cathode current is 30mA, which is slightly less than the Bias-Rite showed. Interesting....
And your anode (aka "plate") current is 27mA. Multiplied by the difference between anode and cathode voltage, that gives about 9.65W by my calculation. You can then choose whether to bias warmer or not, based on the two approaches I explained above.
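The arithmetic above can be checked in a couple of lines. The 365V anode voltage is inferred from the 357.25V figure in the quoted post plus the 7.75V cathode voltage, and the EL84's 12W maximum anode dissipation is the rating mentioned earlier in the thread:

```python
# Anode dissipation from the numbers in this thread.
V_ANODE = 365.0      # volts at the plate (357.25 + 7.75, inferred)
V_CATHODE = 7.75     # volts across the cathode resistor (measured)
I_ANODE = 0.027      # amps (30 mA cathode current minus ~3 mA screen)
P_MAX = 12.0         # EL84 rated max anode dissipation, watts

p_diss = (V_ANODE - V_CATHODE) * I_ANODE
print(f"Anode dissipation: {p_diss:.2f} W "
      f"({100 * p_diss / P_MAX:.0f}% of the 12 W maximum)")
```

So the amp is idling at about 80% of max dissipation, which is already in the "cooler bias" territory described above.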
However, I'm really puzzled that you only got 7.75V across the cathode resistor with around 360V on the EL84s. I would normally expect to see something more like 12V. How much does it change with a different set of tubes?