
I bought a UNI-T UT210E true-RMS clamp multimeter.

I measured the current of a 9 W LED lamp. It shows 0.033 A. For the power I get only \$0.033 A \cdot 220 V = 7.26 W\$.

But the bulb is rated 9 W. Why is there this difference? I'm new to electrical engineering.

I also measured a running 5 HP, 3-phase water pump on 240 VAC. It gives:

\$I_1 = 7.35 A\$
\$I_2 = 6.75 A\$
\$I_3 = 6.15 A\$
sss

  • Did you _measure_ the voltage at exactly the same time, or just assume it was 220 V? – Bruce Abbott Sep 29 '19 at 07:20
  • No, I measured at exactly the same time. My home AC is 220 V. – sss Sep 29 '19 at 08:53
  • Most likely the bulb isn't what it says it is. That is quite common, particularly with LED bulbs. What is the voltage rating of the bulb? If it is, say, 240 V then it won't consume 9 W at 220 V. – Peter Jennings Sep 29 '19 at 10:27
  • I would have to guess that the % accuracy of a 100 A clamp-on meter, when measuring 0.033 A, is no better than 20%. – Hot Licks Sep 30 '19 at 00:59
  • I wouldn't trust a clamp meter at 33 mA, especially a 100 A meter. – DKNguyen Sep 30 '19 at 04:55
  • Make sure you always zero the meter before measuring with the clamp; there is a zero button on the meter. Turn off the appliance, clamp, set to zero, then turn on. – Unknown123 Sep 30 '19 at 05:40
  • A lot of people colloquially describe mains power as 110/220 V, which is what it was 100 years ago when Tesla won the war of the currents and mains power started to scale. Actually, in most places it's had several bumps since then, and I'd expect to see 120 V, 230 V or 240 V. – Harper - Reinstate Monica Sep 30 '19 at 06:03

5 Answers


That user manual (which you should link to in your question) shows the following:


Figure 1. Just because it's digital, doesn't mean it's accurate.

  1. You are measuring at the bottom end of the range, and if it were an analog meter you would be squinting at it to try to make out the reading:

Figure 2. Reading position on an analog scale.

0        0.4       0.8       1.2       1.6       2.0 A
|---------|---------|---------|---------|---------|
 ^-- 0.033 A
  2. The manual shows that that accuracy applies only to readings above 5% of full scale, which is 100 mA on the 2 A range. I have no idea what is meant by "< 20 residue readings".
  3. The manual doesn't make any claims about true RMS.


Figure 3. The crest factor of an AC current waveform is the ratio of waveform's peak value to its rms value. Source: Ametek.

  4. Your LED lamp will probably have a high crest factor (ratio of peak current to RMS value) due to the rectification action of the diodes. The meter doesn't handle this well, with a further 7% error possible.

Multiplying \$V_{RMS}\$ by \$I_{RMS}\$ gives you the volt-amperes (VA), not the watts. Calculating the power actually consumed is more difficult and involves integration of the power curve. Digital power meters sample the voltage and current waveforms many times per cycle, multiply the instantaneous readings together to get the instantaneous power, then sum (integrate) and average those products to give the average power.
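To make the VA-versus-watts distinction concrete, here is a minimal Python/NumPy sketch of the sampling approach just described. The pulse-shaped LED current model (0.1 A pulses near the voltage peaks, roughly what a capacitor-input rectifier would draw) and all the numbers in it are illustrative assumptions, not measurements of the OP's lamp:

```python
import numpy as np

f = 50.0                                  # mains frequency, Hz (assumed)
fs = 100_000                              # sample rate, samples/s
t = np.arange(0.0, 1.0 / f, 1.0 / fs)    # one full mains cycle

v_pk = 220.0 * np.sqrt(2.0)               # 220 V RMS mains (assumed)
v = v_pk * np.sin(2.0 * np.pi * f * t)

# Hypothetical LED-driver current: 0.1 A pulses while |v| is within 10% of its peak
i = np.where(np.abs(v) > 0.9 * v_pk, 0.1 * np.sign(v), 0.0)

v_rms = np.sqrt(np.mean(v ** 2))
i_rms = np.sqrt(np.mean(i ** 2))
crest = np.max(np.abs(i)) / i_rms          # peak / RMS, as in Figure 3

print(f"I_RMS         = {i_rms * 1000:.1f} mA")
print(f"Crest factor  = {crest:.2f}")
print(f"V_RMS * I_RMS = {v_rms * i_rms:.2f} VA  (clamp-meter arithmetic)")
print(f"mean(v * i)   = {np.mean(v * i):.2f} W   (true average power)")
```

With this made-up waveform the RMS arithmetic gives about 11.8 VA while the averaged instantaneous power is only about 8.6 W, with a crest factor near 1.9; a narrower current pulse would widen the gap further.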

In short, it's the wrong meter for a true power calculation.

Transistor

One more thing to consider: unless it has been measured with an accurate wattmeter, you don't know how much power your '9W' LED lamp actually draws.

LEDBenchmark measured the characteristics of many LED lamps with high-quality test instruments. Their wattmeter has a claimed accuracy of 0.2%, with a voltage and current sample rate of 4800/sec. Some example test results:

  • A '9W' bulb draws 7.6 W at 246 V
  • A '9W' Warm White GU10 draws only 3.9 W at 245 V
  • A "9 Watt, Operating Voltage 80-240 Volt AC" bulb draws only 2.2 W at 123 V!

Bruce Abbott

Transistor's answer is correct, but I'm going to expand on it.

You must have had the meter set on the 2 A range. From the user manual, the accuracy is ±(3% + 10), where the 10 means "ten counts". The resolution is 1 mA, so each count is 1 mA. The accuracy at, for instance, 40 mA (0.040 A) is then ±((0.040 × 0.03) + 0.010) A, or basically ±10 mA. So your reading of 33 mA means the real current could be as high as 43 mA or as low as 23 mA. 43 mA times 220 volts equals 9.46 W; 23 mA times 220 volts equals 5.06 W.

It's also not unreasonable that your 220 VAC varies by as much as 5% (you didn't actually measure it, remember), so your real power could be anywhere from roughly 4.8 to 9.9 watts.
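The two error terms above are easy to check numerically. Here is a small Python sketch of that arithmetic; the ±5% mains variation is the assumption made above, and the code keeps the full 3% term instead of rounding to ±10 mA, so its bounds come out slightly wider than the figures in the text:

```python
reading = 0.033      # displayed current, A
count = 0.001        # 1 mA resolution, so 1 count = 1 mA

# +/-(3% of reading + 10 counts), per the spec quoted above
err = 0.03 * reading + 10 * count
i_lo, i_hi = reading - err, reading + err

v_nom = 220.0
v_lo, v_hi = 0.95 * v_nom, 1.05 * v_nom   # assumed +/-5% mains variation

print(f"Current: {i_lo * 1000:.0f} mA to {i_hi * 1000:.0f} mA")   # ~22 to 44 mA
print(f"Power:   {i_lo * v_lo:.1f} W to {i_hi * v_hi:.1f} W")     # ~4.6 to 10.2 W
```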

This does not include the crest factor problems which plague simple power measurements.

Finally, if you don't connect anything at all, the meter can read as much as 20 mA (that, I think, is what the "< 20 residue readings" spec means). Since you are reading less than the rated minimum current (100 mA), you might have an error of as much as 20 mA, which means your current could actually be as low as 13 mA and the meter would still be reading within spec.

As Transistor says, you need a different meter. Specifically, you need a dedicated power meter which samples both voltage and current at a fairly high rate, then multiplies corresponding samples and does the math. LED bulbs are not like incandescents, which behave like simple resistors. They are non-linear devices which need special attention if you're going to try to measure them.

WhatRoughBeast

You can't determine whether the meter is good or not just by calculating the supposed bulb power. You should measure the current with a second, more accurate ammeter in series.

Marko Buršič

Many LED lamps are designed in such a way that they draw more than their rated power during part of each AC cycle, but then give some of that power back during other parts. A clamp-type current meter without a voltage connection has no way of distinguishing which way power is flowing at different parts of the cycle, and thus no way of subtracting the power returned to the mains from the power taken from them. Instead, both kinds of power are added together.
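As an extreme illustration of this, here is a small Python sketch using a hypothetical purely reactive 33 mA load (not an actual LED lamp): every joule drawn from the mains is returned, yet a current-only reading would still suggest about 7.26 VA:

```python
import numpy as np

f, fs = 50.0, 100_000
t = np.arange(0.0, 1.0 / f, 1.0 / fs)    # one full mains cycle

v = 220.0 * np.sqrt(2.0) * np.sin(2.0 * np.pi * f * t)
# 33 mA RMS current leading the voltage by 90 degrees, as a pure capacitance draws
i = 0.033 * np.sqrt(2.0) * np.cos(2.0 * np.pi * f * t)

p = v * i   # signed instantaneous power: positive = drawn, negative = returned

v_rms = np.sqrt(np.mean(v ** 2))
i_rms = np.sqrt(np.mean(i ** 2))
print(f"V_RMS * I_RMS = {v_rms * i_rms:.2f} VA")   # ~7.26 VA
print(f"mean(v * i)   = {np.mean(p):.4f} W")       # ~0 W: all energy is returned
```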

Incidentally, transformers are rated in units of "VA" rather than "watts" for a similar reason: the energy lost in a transformer is proportional to the magnitude of voltage times current regardless of which way the power is flowing. If a transformer which is 90% efficient powers a device that takes a certain amount of energy from the mains during part of each cycle and returns it all during the other half, the device itself wouldn't use any energy, but the transformer would waste twice as much as if the device took all the energy it received and simply dissipated it without returning any. Thus, clamp meters may be good for estimating energy dissipation in a transformer even if they're not good at estimating total energy consumption for some kinds of loads.

supercat