I am trying to understand why some devices say they need, for example, 5 V and 2 A to work properly, or why one USB charger says it provides 5 V at 1 A while another says 5 V at 2 A.
Why is the source current mentioned at all? Isn't the source supposed to provide only a voltage, with the current determined by the resistance of whatever I'm connecting to the source?
So for example, if I have a resistor heating a blanket (or whatever) at 10 W plugged into a 5 V, 2 A USB port, then to get 10 W of heat dissipation the resistor must be 2.5 ohms and all is fine.
But if I connect the same resistor to a USB port that "supplies 5 V and 1 A", the circuit current should still be 2 A anyway because V = IR, and with that the resistor will still dissipate 10 W because P = V^2 / R, right?
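Just to double-check my own arithmetic above (using the 5 V supply and 10 W target from my example, nothing else):

```python
# Hypothetical numbers from my example: 5 V supply, 10 W desired heat.
V = 5.0           # supply voltage in volts
P = 10.0          # desired heat dissipation in watts

R = V**2 / P      # from P = V^2 / R  ->  resistance needed
I = V / R         # from V = I * R    ->  current the resistor draws

print(R)  # 2.5  (ohms)
print(I)  # 2.0  (amps)
```

So by Ohm's law alone, this resistor draws 2 A from any 5 V source, regardless of whether the port is rated for 1 A or 2 A, which is exactly what confuses me.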
What am I missing? I suppose a heater must contain other components that cause this? I read that when a device demands more current than the source can provide, the source voltage drops. But I don't see how my heater "demands" a certain amount of current if that current should be defined by I = V / R.
(I only know the basics of electricity. All of this came up because I want to buy some heat pads that run at 5 V and 2 A, and I was wondering whether plugging them into my laptop's USB 3.0 port would give satisfactory results. Also, I'm just curious xd)