I recently purchased a watt meter from Adafruit to measure the cumulative watt-hours of a non-constant DC current.
When I test the circuit with a multimeter, I get what I expect: 12 V, 1 A. When I use the wattage meter instead, I get 12 V, 0 A. When I then plug in a load, the current rises to 1 A and the voltage drops to 3 V. The load is rated 5 V, 1 A. There is no internal load in the wattage meter.
How do I get the wattage meter to accumulate current and voltage correctly, and if that isn't possible, is there an alternative way to do this? I know I can calculate hours × current × voltage to find watt-hours, but the current and voltage vary a little over time, so I can't simply use a single average current and voltage.
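To show what I mean by accumulating varying readings: if the voltage and current were sampled periodically (here the samples are simulated, since I don't yet have a working way to log them from the meter), the watt-hours could be summed numerically instead of using one average. A minimal sketch:

```python
import random

def watt_hours(samples, dt_s):
    """Sum instantaneous power (V * I) over fixed-interval samples,
    then convert seconds to hours."""
    return sum(v * i for v, i in samples) * dt_s / 3600.0

# Simulated 1 Hz readings (volts, amps) for one hour; with real hardware
# these would come from the meter's logging output instead.
samples = [(12.0 + random.uniform(-0.2, 0.2),
            1.0 + random.uniform(-0.05, 0.05))
           for _ in range(3600)]

print(round(watt_hours(samples, 1.0), 2))  # roughly 12 Wh for ~12 V at ~1 A
```

This only works if I can actually get periodic readings out of the meter, which is part of what I'm asking.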
Both the multimeter and the wattage meter read 12 V with no load, although the panel's rated voltage is 10 V. Under load I wouldn't expect it to reach 10 V either, more like 7.5 to 8 V, which is roughly what the wattage meter was reading (3 V) plus the load's 5 V. That made me wonder whether the wattage meter was reading the voltage not dropped across the load.
The load is a 2014 Fire TV Stick. The energy source is a solar panel.