Transistor's answer is correct, but I'm going to expand on it.
You must have had the meter set on its 2 A range. From the user manual, the accuracy is +/- (3% + 10), where the 10 means "ten counts". The resolution is 1 mA, so each count is 1 mA. The accuracy at, for instance, 40 mA is then +/- ((0.03 x 40 mA) + 10 mA), or about +/- 11 mA, basically +/- 10 mA. So your reading of 33 mA means the real current could be as high as 43 mA, or as low as 23 mA. 43 mA times 220 volts equals 9.46 W. 23 mA times 220 volts equals 5.06 W.
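If it helps to see that arithmetic in one place, here is a small Python sketch of the same +/- (3% + 10 counts) calculation. Only the 3% and 10-count figures come from the manual as quoted; the 220 V is the nominal (unmeasured) supply, and the function name is just for illustration.

```python
# Rough sketch of the +/-(3% of reading + 10 counts) spec applied to a 33 mA reading.
# Only the 3% and 10-count figures are from the meter's manual; the rest is arithmetic
# against a nominal, unmeasured 220 V supply.

def current_bounds_mA(reading_mA, pct=0.03, counts=10, count_size_mA=1.0):
    """Return (low, high) bounds on the true current for a +/-(pct + counts) spec."""
    err = reading_mA * pct + counts * count_size_mA
    return reading_mA - err, reading_mA + err

lo_mA, hi_mA = current_bounds_mA(33)            # about 22 mA .. 44 mA (rounded to 23 .. 43 above)
print(lo_mA / 1000 * 220, hi_mA / 1000 * 220)   # roughly 4.8 W .. 9.7 W at a nominal 220 V
```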
It's also not unreasonable that your 220 VAC varies by as much as 5% (you didn't actually measure it, remember), so your real power could be anywhere from roughly 4.8 to 9.9 watts.
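Continuing the sketch above, a couple of lines fold in that assumed 5% mains tolerance (again, nothing here was measured):

```python
# Add an assumed +/-5% tolerance on the nominal 220 V mains (not measured).
v_lo, v_hi = 220 * 0.95, 220 * 1.05    # 209 V .. 231 V
print(23 / 1000 * v_lo)                # ~4.8 W worst-case low  (23 mA at 209 V)
print(43 / 1000 * v_hi)                # ~9.9 W worst-case high (43 mA at 231 V)
```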
This does not include the crest factor problems which plague simple power measurements.
Finally, even with nothing connected at all, the meter can read as much as 20 mA (that, I think, is what the "20 residue" means). Since you are reading less than the rated minimum current (100 mA), you might have an error of as much as 20 mA, which means your current could actually be as low as 13 mA and the meter would still be reading within spec.
As Transistor says, you need a different meter. Specifically, you need a dedicated power meter which samples both voltage and current at a fairly high rate, then multiplies corresponding samples and does the math. LED bulbs are not like incandescents, which behave like simple resistors. They are non-linear loads which need special attention if you're going to try measuring them.
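To see why sample-by-sample multiplication matters, here is a toy Python sketch. The waveforms are invented (a clean 220 Vrms sine and a narrow current pulse near the voltage peak, loosely resembling what an LED driver draws), not measurements; the point is only that averaging v x i per sample gives a much smaller number than multiplying the two RMS readings a pair of ordinary meters would show.

```python
# Toy illustration of sample-by-sample power measurement. The waveforms are
# invented, not measured: a ~220 Vrms sine and a narrow 0.5 A current pulse
# around the voltage peak, loosely resembling an LED driver's input current.
import math

N = 1000                                                          # samples per mains cycle
v = [311 * math.sin(2 * math.pi * k / N) for k in range(N)]       # ~220 Vrms sine, 311 V peak
i = [0.5 if 0.20 < k / N < 0.30 else 0.0 for k in range(N)]       # 0.5 A pulse near the peak

true_power = sum(vk * ik for vk, ik in zip(v, i)) / N             # average of instantaneous v*i

v_rms = math.sqrt(sum(vk * vk for vk in v) / N)
i_rms = math.sqrt(sum(ik * ik for ik in i) / N)
naive = v_rms * i_rms                                             # what two separate meters imply

print(true_power, naive)   # ~15 W true power vs ~35 VA from multiplying the RMS readings
```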