I am working with some small devices and checking input power requirements. Many of the standard labels look like this:
Input Power: 100-240 VAC, 50-60 Hz, 2.5 A
I am trying to determine how much current the device will draw at 120 V and at 240 V. Because the input voltage is a range of 100-240 V, my guess is that the device only pulls the full 2.5 A at the low end of the range (100 V). Can I assume that at 240 V the maximum current it will draw is about 1.0 A? There is no documentation of the total wattage.
100 V * 2.5 A = 250 W
250 W / 240 V ≈ 1.04 A at 240 V
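To make the assumption explicit, here is a minimal calculation sketch (Python). It assumes the 2.5 A rating is the worst-case draw at the 100 V end of the range and that the device pulls roughly constant power across the input range, which is exactly the part I am unsure about; the only inputs are the numbers on the label, not anything from a datasheet.

```python
# Sketch of my assumption: if the label's 2.5 A is the worst-case draw at the
# lowest input voltage (100 V), and the device pulls roughly constant power,
# what would the maximum current be at other input voltages?
# (Whether "constant power" is a valid assumption is what I'm asking.)

NAMEPLATE_CURRENT_A = 2.5    # from the label
MIN_INPUT_VOLTAGE_V = 100.0  # low end of the 100-240 VAC range

assumed_max_power_w = MIN_INPUT_VOLTAGE_V * NAMEPLATE_CURRENT_A  # 250 W

for voltage_v in (100, 120, 240):
    current_a = assumed_max_power_w / voltage_v
    print(f"{voltage_v} V -> {current_a:.2f} A "
          f"(assuming ~{assumed_max_power_w:.0f} W constant power)")
```

Under that assumption this prints 2.50 A at 100 V, about 2.08 A at 120 V, and about 1.04 A at 240 V, which matches the arithmetic above.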
UPDATE: Another way to ask this is: how many watts does this device actually use? Is it 100 V * 2.5 A = 250 W, or 240 V * 2.5 A = 600 W? Does the power consumed change with the input voltage, or will the device automatically draw less current at a higher voltage to maintain the same power? I ask because I would like to connect multiple devices to one power source.
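For context on why this matters to me, here is a hypothetical sizing sketch for putting several of these devices on one supply, comparing the pessimistic reading (each device can pull the full nameplate 2.5 A at any voltage) with the constant-power reading. The device count of 4 is just an example number I picked; it does not come from any documentation.

```python
# Hypothetical sizing exercise: total current on a shared 240 V source under
# the two possible readings of the label. DEVICE_COUNT is an example value.

DEVICE_COUNT = 4
NAMEPLATE_CURRENT_A = 2.5
SUPPLY_VOLTAGE_V = 240.0
ASSUMED_MAX_POWER_W = 100.0 * NAMEPLATE_CURRENT_A  # 250 W, if 2.5 A applies at 100 V

# Pessimistic reading: every device can draw the full nameplate current at any voltage.
worst_case_total_a = DEVICE_COUNT * NAMEPLATE_CURRENT_A

# Constant-power reading: each device draws at most ~250 W, so less current at 240 V.
constant_power_total_a = DEVICE_COUNT * ASSUMED_MAX_POWER_W / SUPPLY_VOLTAGE_V

print(f"Worst case (nameplate):  {worst_case_total_a:.1f} A total at {SUPPLY_VOLTAGE_V:.0f} V")
print(f"Constant-power estimate: {constant_power_total_a:.2f} A total at {SUPPLY_VOLTAGE_V:.0f} V")
```

The two readings give very different totals (10 A versus about 4.2 A for four devices at 240 V in this example), which is why I want to know which interpretation of the label is correct.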