6

As an example, I have an AC-DC "wall wart" that specifies an input voltage of 100-230 volts and an input current of 0.3 amps. Since my mains supply is 230v:

  • Does that mean that it uses 230*0.3=69 watts?

  • If I were to use it in a 110v country, would it use 110*0.3=33 watts instead?

Dataflashsabot

5 Answers

5

First, it will only "use" what the load demands. Second, you don't know where its limitation is. If the true limitation is its input current, then yes, you get less power at a lower input voltage. However, the maximum power it can put out may very well be limited by something else. In that case, it will only put out up to that maximum power before it will either shut down, drop the output voltage, violate the input current spec, or catch fire.

Look more carefully at the nameplate. It probably says something about its output voltage and current. That is what you can rely on, as long as you give it an input voltage within its specified range, which in this case seems broad enough that you can simply plug it in anywhere in the world.
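To make this concrete, here is a minimal sketch of the arithmetic, using the 12 V, 2 A output rating the asker mentions in a comment further down and an assumed efficiency of about 85% (the efficiency is not on the nameplate; it is a typical guess for a small switcher):

    # Rough estimate of mains draw for a switch-mode wall wart.
    V_OUT, I_OUT_MAX = 12.0, 2.0     # nameplate output (from the asker's comment)
    EFFICIENCY = 0.85                # assumed, typical for a small switcher
    I_IN_MAX_SPEC = 0.3              # nameplate input current limit

    p_out_max = V_OUT * I_OUT_MAX            # 24 W maximum output power
    p_in_max = p_out_max / EFFICIENCY        # ~28 W drawn from the mains at full load

    for v_in in (110.0, 230.0):
        i_in = p_in_max / v_in               # very rough: ignores power factor
        status = "within" if i_in <= I_IN_MAX_SPEC else "exceeds"
        print(f"{v_in:>5.0f} V mains: ~{p_in_max:.0f} W in, ~{i_in:.2f} A ({status} the 0.3 A spec)")

Under those assumptions the input current stays inside the 0.3 A rating at either mains voltage, which is consistent with the nameplate figure being a maximum rather than what the adapter actually draws.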

Olin Lathrop
  • "or catch fire" made me laugh this early morning... OP, I think, just learned that input V, A ratings are intended as MAX ratings, not nominal. – Tony Stewart EE75 Dec 17 '12 at 14:53
5

Short answer: no.

First of all, the way you compute the wattage only works for DC. With AC the calculation is different: you need to use the RMS value of the voltage, and the phase between current and voltage might not be zero, which adds another corrective term (the power factor).
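As a rough illustration of the difference, here is a small sketch; the 0.5 power factor is only an assumed placeholder for a small switcher without power factor correction, not a measured value for this adapter:

    # Apparent power (V*A) versus real power (W) at the AC input.
    V_RMS = 230.0        # mains voltage
    I_RMS = 0.3          # nameplate input current
    POWER_FACTOR = 0.5   # assumed; small switchers without PFC are often this low

    apparent_power = V_RMS * I_RMS               # 69 VA, which is what 230 * 0.3 gives
    real_power = apparent_power * POWER_FACTOR   # ~34.5 W actually consumed
    print(f"{apparent_power:.0f} VA apparent, ~{real_power:.1f} W real")

So the 69 figure from the question is volt-amperes, not watts, and even then it is a maximum, not what the adapter draws in normal use.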

Anyway, assuming your power supply is a switcher, and given the allowed input voltage range I'm quite sure it is, the power consumption is practically the same at either voltage and depends on the power drawn by the load. If you read the tech specs carefully you'll find the maximum output current; that times the output voltage gives you the maximum output power (that's DC now!). The efficiency of a switching PSU is quite high, so the input power roughly corresponds to the output power required.

If you want to dig deeper into the AC power computation, you can find everything you need on Wikipedia:

AC Power

Switching power factor

Long story short, the problem is that inside a switching PSU there is a full bridge rectifier feeding large electrolytic capacitors, so current is drawn only when the input voltage exceeds the capacitor voltage. This leads to very high current spikes that make computing the consumed power a bit tricky.

Vladimir Cravero
4

To a first approximation, no, appliance wattage does not differ with line voltage because the electrical power (wattage) required to do a task at a given rate is independent of the voltage.

However, practical implementations for one voltage or another may be mildly more or less efficient, and it may also be the case that dual-voltage designs are more efficient at one voltage than the other.

In some extremely crude cases - let's imagine a transformer, rectifier, filter cap, and zener diode regulator - the power draw may reflect the applied voltage more than it does the load. And of course a simple resistive load like a heater would be an extreme example, though there, operation at half or twice the intended voltage is unlikely to produce an acceptable result. Many motors can be wired in their terminal box to a configuration appropriate for various line voltages.
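For the resistive extreme, a quick sketch makes the scaling obvious; the 230 V, 2 kW heater here is just an assumed example, not anything from the question:

    # A purely resistive load scales with the square of the applied voltage.
    V_RATED, P_RATED = 230.0, 2000.0   # assumed heater rating
    R = V_RATED ** 2 / P_RATED         # ~26.5 ohms, fixed by the heating element

    for v in (110.0, 230.0):
        print(f"{v:>3.0f} V: {v ** 2 / R:.0f} W")   # ~457 W at 110 V, 2000 W at 230 V

That is roughly a quarter of the rated power at half the voltage, which is why simple resistive appliances generally cannot just be plugged into the "wrong" mains.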

Also, distribution at higher voltages incurs lower losses.

Chris Stratton
  • +1 for the `to a first approximation...` part. I've seen some power supplies with considerable efficiency differences depending on the input voltage. – AndrejaKo Dec 17 '12 at 17:07
2

Those are maximum values.

You've not said what's on the output DC side; let's say it's at most 30 V at 1 A = 30 W. That accounts for the 110 V, 0.3 A input, allowing roughly 10% internal losses.

If you then plug it into a 220V supply it will most likely draw 0.15A, resulting in the same power throughput.
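A quick check of those numbers; the 30 V, 1 A output and the 10% loss figure are pjc50's illustrative assumptions, not the asker's actual adapter:

    # Same output power, so input current scales inversely with mains voltage.
    P_OUT = 30.0 * 1.0        # assumed 30 V, 1 A output = 30 W
    P_IN = P_OUT * 1.1        # ~33 W, allowing ~10% internal losses

    for v_in in (110.0, 220.0):
        print(f"{v_in:.0f} V: {P_IN / v_in:.2f} A")   # 0.30 A and 0.15 A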

pjc50
  • It outputs 12 V at 2 A; I didn't mention that because I was only interested in the amount of mains power it would use. I assumed those input numbers were what it would realistically draw, not maximums – thanks for correcting me on that. It's powering a strip of LEDs to light an aquarium; I'm not sure what the strip's actually drawing from it. – Dataflashsabot Dec 17 '12 at 15:05
-2

As you might know, the current is an effect of the voltage, meaning that if you decrease the voltage, the current automatically decreases too. Hence, if you plug your 220V machine into a 110V outlet, it'll draw half the necessary voltage AND current, putting it in an undervoltage state, so the equipment will not work at the power it was designed for.

  • The question mentions a "wall wart" with a 100-230V input range, which would normally mean it's a switch-mode supply. This answer would apply to a resistive load, but I don't feel it is useful in this case. – PeterJ Apr 09 '14 at 15:49