I want to know how energy meters (electromechanical or electronic) measure energy (let's assume power and ignore time) when harmonics are present.
I know what harmonics are, how they are generated, and how they affect real power. I have read that harmonic power (in most cases) flows in the opposite direction to the real power. So if the utility is supplying power to a consumer with a dominant non-linear load, and in between them there is an energy meter (please ignore class, type, etc.), the meter will record less power. That much is understandable.
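To make that concrete, here is a minimal Python sketch of the arithmetic behind it (the amplitudes and phase angles are made-up illustration values, not measurements): only harmonics present in both voltage and current contribute to average power, and a harmonic current driven against a small harmonic voltage at the metering point subtracts from the total that a meter integrating v·i would record.

```python
import numpy as np

f0, fs, T = 50.0, 100_000.0, 0.2                  # fundamental, sample rate, analysis window
t = np.arange(0.0, T, 1.0 / fs)                   # exactly 10 cycles of 50 Hz

# Hypothetical values: a 230 V fundamental with a small 3rd-harmonic voltage
# distortion at the metering point, and a load current whose 3rd-harmonic
# component is 180 degrees out of phase with that harmonic voltage
# (harmonic power flowing back towards the source).
v = 230 * np.sqrt(2) * np.sin(2 * np.pi * f0 * t) \
    + 5 * np.sqrt(2) * np.sin(2 * np.pi * 3 * f0 * t)
i = 10 * np.sqrt(2) * np.sin(2 * np.pi * f0 * t) \
    + 2 * np.sqrt(2) * np.sin(2 * np.pi * 3 * f0 * t + np.pi)

p_metered = np.mean(v * i)          # what a meter integrating v*i over time records
p_fund = 230.0 * 10.0               # power carried at the fundamental frequency alone
p_harm = p_metered - p_fund         # net power at harmonic frequencies (negative here)

print(f"fundamental power: {p_fund:8.1f} W")
print(f"metered power:     {p_metered:8.1f} W")
print(f"harmonic power:    {p_harm:8.1f} W")      # ~ -10 W, i.e. flows back to the source
```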
I tested this scenario as follows: I applied a linear load to the meter and recorded a power of ~2.3 kW (230 V × 10 A × 1.0 PF). But when I applied a phase-fired (90°) waveform, which contains odd harmonics, the power recorded by the meter was ~1.1 kW given the same V, I and PF settings.
I know the meter is recording the power accurately, because you can see from the waveform that the RMS value has decreased (each half-cycle is clipped to zero for 50% of its duration). But therein lies my question.
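As a sanity check of those numbers, here is a small Python sketch assuming an ideal 23 Ω resistive load behind an ideal phase-angle controller (my assumed values, not your exact setup): with a 90° firing angle, each half-cycle conducts for only half its duration, and the average of v·i drops to roughly half of 2.3 kW, close to the ~1.1 kW you observed.

```python
import numpy as np

f0, fs, T = 50.0, 100_000.0, 0.2
t = np.arange(0.0, T, 1.0 / fs)
v = 230 * np.sqrt(2) * np.sin(2 * np.pi * f0 * t)   # 230 V RMS supply

R = 23.0                                            # assumed resistive load -> 10 A RMS on a full sine
i_linear = v / R                                    # linear (resistive) load current

# Phase-fired at 90 deg: each half-cycle conducts only after its first quarter,
# i.e. once the phase angle within that half-cycle exceeds pi/2.
phase = (2 * np.pi * f0 * t) % np.pi                # angle within the current half-cycle
i_fired = np.where(phase >= np.pi / 2, v / R, 0.0)  # current held at zero before firing

def report(label, i):
    p = np.mean(v * i)                              # real power the meter integrates
    vrms, irms = np.sqrt(np.mean(v**2)), np.sqrt(np.mean(i**2))
    pf = p / (vrms * irms)                          # true power factor (includes distortion)
    print(f"{label}: P = {p/1000:.2f} kW, Irms = {irms:.2f} A, PF = {pf:.2f}")

report("linear load    ", i_linear)                 # ~2.30 kW, 10.0 A, PF 1.00
report("phase-fired 90°", i_fired)                  # ~1.15 kW, ~7.07 A, PF ~0.71
```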
If meters record less power in the presence of harmonics, isn't that a loss for the utility and a benefit to the consumer? The consumer is responsible for generating the harmonics and is profiting from them on top of that?
Should the meter record the actual power while filtering out the harmonics?
To me this is really confusing. I tried to find the utilities' and meter manufacturers' point of view, and they seem to agree that the meter should take the effect of harmonics into account while measuring power, as it would be beneficial to both. How?
I have tried to phrase my question as well as I could find the right words; I hope the people here will understand it and help me find the answer.
Thank You