5

A transformer has a fixed output ratio, e.g. 10:1 for a 23Vac supply from 230Vac. But under light load the output voltage rises above the nominal (rated-load) value. Why?

Thomas O

4 Answers

15

The voltage rises because the transformer has been designed for a specific load. If the load is removed, the voltage will rise, perhaps by as much as 50%. With no load the core flux is at its maximum; load current in the windings generates an opposing field that reduces the flux, and with it the output voltage.

Regulated transformers are called constant voltage transformers. The additional winding and capacitor make them more expensive than unregulated transformers.

Leon Heller
  • What about unregulated DC supplies? (e.g. 12V with a cap and two diodes using a centre tapped transformer.) Why are these unregulated? – Thomas O Nov 21 '10 at 00:06
  • 1
    Why should they be? They are followed by a voltage regulator, or regulation isn't needed. – Leon Heller Nov 21 '10 at 00:21
  • 2
    @Thomas O -- regulation doesn't just happen by accident, and you have to start somewhere, don't you? – JustJeff Nov 21 '10 at 02:11
  • What I'm asking is why doesn't the transformer keep a fixed ratio, why does it vary under load? – Thomas O Nov 21 '10 at 17:22
  • An ideal transformer wouldn't vary under load, but a real one does because it's a passive device governed by the laws of electromagnetics. To maintain a constant ratio in the real world it would need feedback and compensation - but then it wouldn't be passive. If you don't understand it yet it's because you haven't had the right physics/circuit analysis classes. For an idea of what the differences between ideal and practical transformers are visit here: http://en.wikipedia.org/wiki/Transformer - look under "Equivalent Circuit" – AngryEE Nov 22 '10 at 18:44
5

Better regulation needs more copper and steel, which means more cost, size and weight. And anyway there wouldn't be much point: the mains voltage isn't precise, so even a perfect transformer wouldn't produce a consistent voltage in a real-world application.

mikeselectricstuff
4

Your question is not clear. Are you looking at a rectified DC output, or the raw AC output?

If you're looking at the raw AC, it's likely that the transformer is not in fact 10:1, it's something more like 9:1, so that the copper losses at the manufacturer's specified current draw pull the output down to the rated 23Vac (see copper losses below).
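
As a rough numeric sketch of that (the 9:1 ratio, 0.5 Ω winding resistance and 5 A rated current below are made-up illustrative values, not figures from any datasheet):

    # Sketch: why the nameplate "10:1" (230 V : 23 V) ratio only holds at rated load.
    # All component values below are illustrative assumptions.
    V_primary = 230.0        # mains RMS voltage
    turns_ratio = 9.0        # assumed actual turns ratio (primary : secondary)
    R_secondary = 0.5        # assumed winding resistance referred to the secondary, ohms
    I_rated = 5.0            # assumed rated secondary current, A

    V_no_load = V_primary / turns_ratio              # ~25.6 V with nothing connected
    V_full_load = V_no_load - I_rated * R_secondary  # ~23.1 V at rated current

    print(f"No-load output:    {V_no_load:.1f} V")
    print(f"Rated-load output: {V_full_load:.1f} V  (roughly the nameplate 23 V)")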

If you're looking at the smoothed DC output:
There are several things interacting here.

First: if you directly rectify 120V AC, what would you expect to get?
You will get ~170V DC, not 120V (specifically, about √2 × the RMS AC voltage, or ~169.7V).

  • This is because the 120V rating is RMS, i.e. the voltage that delivers the same average power as DC. Therefore, when you directly rectify it, your smoothing capacitor charges to the peak voltage of the AC (see the quick calculation below).
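
A quick sanity check of the peak-versus-RMS numbers (the two 0.7 V diode drops are an assumption about a bridge rectifier, nothing specific to this supply):

    import math

    # Standard peak-vs-RMS relationship for a sine wave: V_peak = sqrt(2) * V_rms.
    V_rms = 120.0
    V_peak = math.sqrt(2) * V_rms            # ~169.7 V
    V_diode_drops = 2 * 0.7                  # assumed: two diode drops in a bridge rectifier
    V_cap_no_load = V_peak - V_diode_drops   # what the smoothing cap charges to with no load

    print(f"Peak of {V_rms:.0f} V RMS:        {V_peak:.1f} V")
    print(f"Unloaded capacitor voltage: {V_cap_no_load:.1f} V")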

As soon as you start to load your rectified and smoothed transformer output, the voltage begins to be drawn down. However, since current only flows when the transformer output voltage (a sine wave) is higher than the capacitor voltage (less the diode drops), the periods during which energy flows from the secondary of the transformer are quite short.
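
A rough sketch of how short that conduction window is, assuming an ideal sine source and a capacitor that sits at some fixed fraction of the peak (both figures below are illustrative guesses):

    import math

    # Fraction of each half-cycle during which the rectifier diodes conduct,
    # assuming the capacitor voltage stays roughly constant at v_cap.
    def conduction_fraction(v_peak, v_cap):
        # Diodes conduct while v_peak * sin(theta) > v_cap, i.e. for theta in
        # (asin(v_cap / v_peak), pi - asin(v_cap / v_peak)).
        theta_on = math.asin(v_cap / v_peak)
        return (math.pi - 2 * theta_on) / math.pi

    V_peak = 170.0                              # from the sqrt(2) * 120 V figure above
    for frac_of_peak in (0.95, 0.90, 0.80):     # assumed capacitor voltage as a fraction of peak
        on_time = conduction_fraction(V_peak, frac_of_peak * V_peak)
        print(f"Cap at {frac_of_peak:.0%} of peak -> diodes conduct {on_time:.0%} of each half-cycle")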

Here, another factor comes into play.
Copper losses: the DC resistance of the secondary winding of the transformer.

  • Since the period during which the secondary is sourcing current is quite short, the current flow must be fairly large for the transformer to deliver any useful power.
    Since we have a large current flow, any resistance in the winding of the transformer translates into a large voltage drop.

Now, since small transformers are wound with fine wire, they have a large DC resistance. Transformers are therefore designed so that the losses in the transformer balance out the increased voltage that comes from the rectifier charging the smoothing capacitor to the peak AC voltage.
Therefore, when you load the transformer to its specified load, the rectified and smoothed DC output voltage will correspond to its labeled specs.
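
Putting the peak-charging and the copper-loss effects together numerically (every value below is an assumption chosen for illustration, not taken from a real transformer):

    import math

    # Toy numbers for a small "12 V" transformer feeding a bridge rectifier.
    V_rms_secondary = 12.0    # nameplate secondary voltage (RMS, at rated load)
    R_winding = 1.0           # assumed secondary winding resistance, ohms
    I_dc_load = 1.0           # assumed DC load current, A
    duty = 0.3                # assumed fraction of the time the diodes conduct

    V_peak = math.sqrt(2) * V_rms_secondary - 2 * 0.7   # peak minus two diode drops, ~15.6 V

    # Rough charge balance: all the load charge passes through the diodes during the
    # short conduction window, so the charging pulses are roughly I_dc / duty.
    I_pulse = I_dc_load / duty                # ~3.3 A pulses
    V_drop = I_pulse * R_winding              # IR drop during the pulses, ~3.3 V

    print(f"Lightly loaded DC output: {V_peak:.1f} V")
    print(f"Fully loaded DC output:   {V_peak - V_drop:.1f} V  (back near the nameplate figure)")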

Connor Wolf
0

A transformer secondary can be approximately modeled as a Thevenin equivalent voltage source. Like any real-world source, it has output impedance, and since it's a passive device, that impedance is essentially fixed. This means that the more current you draw, the lower the voltage gets.
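
In those terms the droop is just Ohm's law across the source impedance; a minimal sketch, with a made-up 24 V open-circuit voltage and 2 Ω output impedance:

    # Thevenin view of the secondary: V_out = V_open_circuit - I_load * Z_out.
    # The 24 V source and 2 ohm impedance are made-up values for illustration.
    V_open_circuit = 24.0
    Z_out = 2.0

    for i_load in (0.0, 0.5, 1.0, 2.0):   # load current, A
        v_out = V_open_circuit - i_load * Z_out
        print(f"I = {i_load:.1f} A -> V_out = {v_out:.1f} V")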

(More experienced engineers than I will have to speak as to the physical nature of the output impedance; it's obviously some combination of the windings' resistance and the transformer secondary's impedance at the line frequency, but the inductance of a transformer is also variable with load in ways I don't know how to quantify.)

Transformers are often rated by just how much the output voltage differs between full spec'd load and open (no) load. A 5% transformer drops 5%, a 1% transformer drops 1%. So a 5% 20:1 transformer is 20:1 at full load, but more like 19:1 at no load, since the no-load output is about 5% higher.
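
A quick check of those ratios, assuming the regulation figure is quoted relative to the full-load voltage:

    V_primary = 230.0
    ratio_full_load = 20.0    # nameplate ratio, valid at full specified load
    regulation = 0.05         # 5% regulation figure

    V_full_load = V_primary / ratio_full_load     # 11.5 V
    V_no_load = V_full_load * (1 + regulation)    # ~12.1 V
    ratio_no_load = V_primary / V_no_load         # ~19:1

    print(f"Full-load: {V_full_load:.2f} V  (ratio {ratio_full_load:.1f}:1)")
    print(f"No-load:   {V_no_load:.2f} V  (ratio {ratio_no_load:.1f}:1)")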

To make things worse, the AC input to a transformer from the power grid is not guaranteed to be 230VAC. It's more like 230VAC ±10%, depending on where you are in the world and what time it is. So even if your load were constant, and even if your transformer had zero output impedance, your output voltage would still be unregulated; it would vary with the input voltage.
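
Combining that mains tolerance with the load-dependent rise gives a surprisingly wide output window; a small sketch reusing the illustrative 5% regulation figure from above:

    V_nominal_in = 230.0
    mains_tolerance = 0.10    # +/-10% grid variation
    regulation = 0.05         # assumed 5% no-load rise for the transformer
    ratio = 20.0              # nameplate ratio at full load

    v_out_min = V_nominal_in * (1 - mains_tolerance) / ratio                     # low mains, full load
    v_out_max = V_nominal_in * (1 + mains_tolerance) * (1 + regulation) / ratio  # high mains, no load

    print(f"Worst-case output window: {v_out_min:.1f} V to {v_out_max:.1f} V")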

Stephen Collings