romberg wrote: ↑Fri May 06, 2022 9:41 pm
I have a couple of amps where an output transformer is being driven by tubes that can (in theory) deliver more power than the transformer is rated for. In one case I have a pair of el34s operating at 450V into an OT rated at 30 watts (marshall artist 3203). And the other amp is a fender super reverb with a pair of 6l6gcs at around 460V into a 40 watt rated output transformer.
A power amplifier can best be thought of as a power supply that lets out some of the power it takes from the wall socket under highly specialized conditions. An amplifier output stage simply cannot put out more power than the power supply makes.
The tubes are pass-through devices that let the power supply's power through, wasting some of it heating themselves in the process. Tubes cannot let through more power than the power supply can provide them. They also cannot "make" output power; they can only let through a certain amount of power before they melt themselves down from the waste heat.
The amount of power the tubes can let through is also limited by how well the output transformer matches their preferred voltage and current limits to power transferred to the load.
Output transformers do not have impedances, they have ratios. A transformer rated at (made up example here!) 6K to 8 ohms only loads the tubes with 6k when there really is 8 ohms on the secondary. When the secondary load is 4 ohms, the tubes are loaded with 3k, and when the secondary load is 16 ohms, the load on the tubes is 12K.
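To put rough numbers on that ratio arithmetic (same made-up 6k:8-ohm example as above; the function name is mine, not anything standard):

```python
# Reflected primary impedance scales with the transformer's impedance ratio.
# A "6k : 8 ohm" OT only fixes the ratio (6000/8 = 750), not an impedance.
def reflected_primary(rated_primary_ohms, rated_secondary_ohms, actual_load_ohms):
    """Load the tubes see for a given speaker impedance."""
    ratio = rated_primary_ohms / rated_secondary_ohms  # turns ratio squared
    return ratio * actual_load_ohms

print(reflected_primary(6000, 8, 8))   # 6000.0 ohms with the rated 8-ohm load
print(reflected_primary(6000, 8, 4))   # 3000.0 ohms with a 4-ohm load
print(reflected_primary(6000, 8, 16))  # 12000.0 ohms with a 16-ohm load
```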
How much power the tubes can let through the OT to the secondary is indeed affected by bias. In general, a specific pair of tubes can let through a variable amount of power depending on bias. Biased to class A, they have the lowest distortion but can let through the least power. Class A involves having a continuous high-ish DC current in the OT that causes some additional power waste/heating in the OT wires. Both the tubes and the OT are hottest in class A, and therefore can produce the least power without dying from the heat. Class A usage also pulls the most power from the power supply, so a class A power supply has to be beefier for the same tubes and OT than one used in class AB or B.
Class B involves using only half the OT primary (in a push-pull setup) at a time, so the OT heating is down a lot, and so is the heating on the tubes. You get way more output power from a given set of tubes and OT in class B than in class A before either the tubes or OT are damaged by heating. Also more distortion. Also requires less power out of the power supply.
Class AB is a hybrid. As you turn the bias down from full class A toward class B, the heating on the tubes and OT for a given output drops, so you can use more power from the power supply without melting the tubes or the OT, until you get to class B. You get first a little, then increasingly more, distortion as you make this bias change.
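Here's the idle-versus-full-drive heating trend in the two extreme classes, using the idealized textbook model (class A draws a constant supply power with 50% peak efficiency; push-pull class B draws supply current proportional to drive, 78.5% peak efficiency). All numbers are invented, not from any real amp:

```python
import math

# Idealized push-pull output stage. Invented figures:
V_B = 450.0   # supply voltage, volts
R_L = 3000.0  # load reflected by the OT, ohms
P_MAX = V_B**2 / (2 * R_L)  # ideal full-drive sine output, 33.75 W here

def class_a_dissipation(drive):        # drive = fraction of full swing, 0..1
    p_supply = 2 * P_MAX               # roughly constant draw in class A
    p_out = drive**2 * P_MAX
    return p_supply - p_out            # waste heat: highest at idle

def class_b_dissipation(drive):
    p_supply = (4 / math.pi) * drive * P_MAX  # average sine-drive supply draw
    p_out = drive**2 * P_MAX
    return p_supply - p_out            # waste heat: ~zero at idle, rises with signal

for drive in (0.0, 0.5, 1.0):
    print(drive, round(class_a_dissipation(drive), 1),
          round(class_b_dissipation(drive), 1))
```

The class A pair idles at its hottest and cools slightly as output rises; the class B pair idles nearly cold and heats up with signal, exactly the behavior described in the takeaways below.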
Takeaways:
- The tubes don't make power, they let the power supply's power through. And they only let through what you tell them to, set by both bias and signal level.
- The amount of bias on the tubes affects how much power the tubes can let through without melting. The closer to class A, the more waste heat they produce for every watt put out through the OT, so the higher the static bias current, the lower the power supply voltage is usually designed to be.
- For a given power supply, the higher the idle current, the lower the distortion, but the higher the idle heating on the tubes, and the higher the tube heating at a given output power. Continuously increasing idle current in an amp with a power supply designed for near-class-B operation (that is, higher B+) is a good way to melt tubes.
- Class A biased amps have a nearly fixed amount of power used from the power supply. The tubes heat less as the output signal rises from idle.
- Class B amps have very little waste heat in the tubes at idle; tube heating (and power supply power output) goes up radically as the signal level rises.
- Class AB amps (which is where we all work) idle cooler at biases nearer B and hotter at biases nearer A. Biasing an AB amp hotter heats the tubes more and the OT a little more, but tube and OT heating rises less with bigger signals; you pay for that by having the static heating higher all the time.
Final bit of background: OT power rating is a slippery topic. An OT rated for some fixed output power, say 40W, does have different internal heating at different biases. But it is not clear at all whether the power rating on the OT was limited by internal heating, low-frequency bandwidth, or the impedance ratio and an implied bandwidth. The real limit is internal heating breaking down the internal insulation, a number you usually don't get to know for an OT. The iron and copper will work fine at temperatures visible as a slight red glow in a dark room, and at that temperature will put out many times more power, so long as the insulation doesn't allow shorts. So heat death for an OT is determined by how hot the insulation can get and still not allow shorts. This info is generally unavailable for guitar amp OTs.
My first question is: Does this need to be taken into consideration when biasing the power tubes?
I've often seen folks mention just biasing the 6L6s up to 50-70% of their max plate dissipation (30W) and off you go. But thinking about this makes me unsure. I'm assuming that a 40W output transformer would start to experience core saturation at 40 watts. So, maybe these amps should really be biased at 50-70% of the power rating of the output transformer rather than the power tube plates? I think this might make for more headroom for such mismatched transformers. But I'm not sure. It occurs to me that this may not really make a difference as the OT is really the limiting factor, so just bias the power tubes as usual.
The OT is generally not the limiting factor. It's generally tube heating or power supply capability. Bias for the tubes, not the OT.
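For reference, the "percent of max plate dissipation" rule of thumb the poster mentions is just this arithmetic (figures taken from the question; the function is mine):

```python
# Rule-of-thumb bias: choose idle current so each tube idles at some
# fraction of its rated plate dissipation.
def idle_current_ma(plate_volts, max_dissipation_w, fraction):
    """Per-tube idle current (mA) for a target idle dissipation."""
    return 1000 * fraction * max_dissipation_w / plate_volts

# 6L6GC: 30 W plate rating, ~460 V on the plates (from the question)
for frac in (0.5, 0.6, 0.7):
    print(f"{int(frac * 100)}%: {idle_current_ma(460, 30, frac):.1f} mA per tube")
```

That prints roughly 32.6, 39.1, and 45.7 mA per tube; this sets tube heating at idle, and has nothing to do with the OT's 40W label.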
Second question is more of a design thing. What keeps such a setup from cooking the output transformer? I realise this could be a very complicated area of discussion. :)
You got it in one. That's why I started with the stuff above.
At idle, the tubes only deliver a small amount of power into the DC resistance of the primary. So, I'm assuming the limiting factor in the transformer is magnetic core saturation? If so, when this happens, what effects happen on each side of the OT? Does the impedance seen by the power tubes rise thus reducing the current on the primary? Or can the tubes (assuming they are backed by a stout power supply) just burn up the primary? I'm assuming the voltage on the secondary side may just clip.
The limiting factor on the OT is insulation breakdown temperature, as above. It's rare that an OT is so poorly put together that tube bias will push it over the edge, though. Could happen, but not often.
Magnetic core saturation is not generally an issue. An iron core can absorb a given amount of integrated volts-times-seconds before it reaches the edge of saturation and its primary inductance collapses. So a transformer saturates more readily on low-frequency signals: at higher frequencies, the signal doesn't last as long in one direction before reversing, so you can pump a lot more volts in on a half cycle before saturating. Saturation is a low-frequency issue. It's also complicated. You can't saturate a transformer from the secondary, so secondary loading isn't the issue. Magnetic saturation per se does not heat transformers, but it makes their primary inductance drop dramatically, meaning that external circuits that rely on the primary inductance to limit current are on their own; sometimes current skyrockets, overheating the externals a lot and the transformer a bit. Saturation, if it ever happens, also dramatically lowers the coupling from primary to secondary, so the secondary voltage waveform flattens off at saturation, as the primary can no longer drive it effectively.
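You can see the volts-times-seconds point from the standard transformer EMF equation for a sine wave, B_peak = V_rms / (4.44 × f × N × A): for a fixed core, the flux a given voltage produces is inversely proportional to frequency. The turns count and core area below are invented just to make the trend visible:

```python
# Peak core flux density from the sine-wave transformer EMF equation.
# Made-up core: these are NOT figures for any real guitar amp OT.
def peak_flux_tesla(v_rms, freq_hz, turns, core_area_m2):
    return v_rms / (4.44 * freq_hz * turns * core_area_m2)

N, A = 1400, 8e-4  # invented primary turns and core cross-section (m^2)
for f in (40, 80, 400):
    print(f"{f} Hz: {peak_flux_tesla(300, f, N, A):.2f} T")
```

With 300 V applied, this hypothetical core runs near typical iron saturation (~1.5 T) at 40 Hz, but only about half that at 80 Hz and a tenth at 400 Hz, which is why saturation shows up on low notes, not high ones.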