In the course material I'm reading, there's an example meant to show that a low power factor leads to more current being drawn, and thus more power being lost in the transmission lines.
Assume a certain load requires 1 kW to function correctly.
If the power factor is 0.9 and the supply is 110 V, we would need about 10 A; if the power factor is 0.6 at the same 110 V, we would need about 15 A.
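To spell out the arithmetic as I understand it, the example appears to take P as the real power the load needs and solve P = V·I·pf for the current:

$$ I = \frac{P}{V \cdot \text{pf}}, \qquad I_{0.9} = \frac{1000}{110 \times 0.9} \approx 10.1\ \text{A}, \qquad I_{0.6} = \frac{1000}{110 \times 0.6} \approx 15.2\ \text{A} $$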
However, something here just doesn't make sense to me. If we model the power supply as an ideal voltage source, then the current it delivers has nothing to do with the power the load is said to require.
It seems the author just assumed the load would draw the power it requires to function properly, no matter what.
On the other hand, it seems to me that the current would stay the same, since it is set by the load's impedance, leading to lower real power and thus a malfunctioning load.
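To make my reasoning concrete, here is a quick numeric sketch (the impedance magnitude is a hypothetical value I chose so that the pf = 0.9 case delivers exactly 1 kW):

```python
# Sketch of my reasoning: an ideal 110 V source driving a load whose
# impedance magnitude is fixed, while only the impedance angle
# (and hence the power factor) changes.
import cmath, math

V = 110.0                        # ideal source voltage (RMS), volts
P_required = 1000.0              # power the load needs, watts
Z_mag = V**2 * 0.9 / P_required  # |Z| chosen so that pf = 0.9 gives 1 kW

for pf in (0.9, 0.6):
    phi = math.acos(pf)          # impedance angle for this power factor
    Z = cmath.rect(Z_mag, phi)   # same |Z|, different angle
    I = V / Z                    # current is set only by V and Z
    P_real = V * abs(I) * pf     # real power actually delivered
    print(f"pf={pf}: |I| = {abs(I):.1f} A, real power = {P_real:.0f} W")
```

This prints the same current magnitude (about 10.1 A) in both cases, but only about 667 W of real power at pf = 0.6, which is why I expect the load to malfunction rather than draw more current.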
Where did I go wrong?