
I am trying to understand why some devices say, for example, that they need 5 V and 2 A to work properly, or why one USB charger says it provides 5 V at 1 A and another says 5 V at 2 A.

Why does all this mention the source current? Isn't the source supposed to only provide a voltage, with the current defined by the resistance of whatever I'm connecting to it?

So, for example, if I have a resistor to heat a blanket or whatever at 10 W plugged into a 5 V, 2 A USB port, then to get the 10 W of heat dissipation the resistor must be 2.5 Ω, and all is fine.

But if I connect the same resistor to a USB port that "supplies 5 V and 1 A", the circuit current should still be 2 A because V = IR, and with that the resistor will dissipate 10 W because P = V²/R, right?
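Here is my arithmetic as a quick sketch (just the numbers above, assuming an ideal 5 V source and a purely resistive heater):

```python
# Ohm's law arithmetic from the question, assuming an ideal 5 V source
# and a purely resistive 10 W heater (assumed example numbers).
V = 5.0          # supply voltage in volts
P_target = 10.0  # desired heater power in watts

R = V**2 / P_target   # R = V^2 / P  -> 2.5 ohms
I = V / R             # I = V / R    -> 2.0 A
P = V**2 / R          # P = V^2 / R  -> 10.0 W

print(f"R = {R} ohm, I = {I} A, P = {P} W")
```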

What am I missing? I suppose that in a heater there must be other components that cause this? I read that when a device demands more current than the source can provide, the source voltage drops. But I don't see how my heater "demands" a certain amount of current if that current should be defined by I = V/R.

(I only know the basics of electricity. All this arose because I want to buy some heat pads that work at 5 V and 2 A, and I was wondering whether connecting them to the USB 3.0 port of my laptop would give me satisfactory results, and because I'm curious xd)

Camilo
  • You have some good comments. I would suggest you spend a few evenings doing online tutorials on Ohm's law; that will also lead you into a few related branches of the subject. A hint: the source and load need to be at the same voltage, and the source must be able to supply as much current as you need, or more. – Gil May 29 '21 at 02:13

2 Answers


Why does all this mention the source current? Isn't the source supposed to only provide a voltage, with the current defined by the resistance of whatever I'm connecting to it?

What the power supply is saying is that a load (the device that draws power from the supply) may safely draw up to the rated current and still assume the voltage will remain within the stated specifications.

So a "5V 1A" power supply should maintain an output voltage of 5V if you don't draw more than 1A from it.

But if I connect the same resistor to a USB port that "supplies 5 V and 1 A", the circuit current should still be 2 A because V = IR, and with that the resistor will dissipate 10 W because P = V²/R, right?

All bets are off if the load draws more than the current rating of the supply. Usually one of the following happens in such cases:

  1. The output voltage decreases (a rough model of this case is sketched at the end of this answer).
  2. The power supply may simply shut down and stop supplying current. The shutdown may be temporary or last until the user performs some reset action.

Some computers will even initiate a shutdown of the system if an overcurrent event is detected on a USB port.

Power supplies which don't protect themselves may even get damaged if you try to draw too much current from them.
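As a rough illustration of case 1, here is a sketch of a supply that simply current-limits at its rating. The constant-current model is an assumption for illustration only; real supplies may instead fold back, shut down, or misbehave in other ways.

```python
# Rough model of a current-limited supply driving a resistive load.
# Assumption: when the load would draw more than the rated current,
# the supply drops its voltage so the current stays at the limit.

def supply_output(v_nominal, i_rated, r_load):
    """Return (voltage, current) delivered into a resistor r_load."""
    i_demanded = v_nominal / r_load
    if i_demanded <= i_rated:
        return v_nominal, i_demanded      # normal regulation
    return i_rated * r_load, i_rated      # voltage droops at the limit

# 2.5 ohm heater on a 5 V / 2 A port: full 5 V at 2 A, i.e. 10 W
print(supply_output(5.0, 2.0, 2.5))   # (5.0, 2.0)

# Same heater on a 5 V / 1 A port: only 2.5 V at 1 A, i.e. 2.5 W of heat
print(supply_output(5.0, 1.0, 2.5))   # (2.5, 1.0)
```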

ErikR

Your understanding is correct within the limitations of the power supply. If the load attempts to draw more than the supply can deliver, the behaviour will depend on the power supply: the voltage may drop, the supply might shut down completely, or it may fail in some way if it is not adequately protected.

Frog
  • And this assumes that the load acts like an ideal resistor. For a heater this is typically a good approximation, but switch-mode devices often exhibit a *negative* resistance, which is to say that a small decrease in supply voltage results in the device attempting to draw more current. Obviously this only applies over a certain range; the device would not attempt to draw infinite current with zero volts applied. – Frog May 29 '21 at 02:04
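A rough sketch of that negative-resistance behaviour, using an idealised constant-power load (the 10 W figure is just an assumed example):

```python
# A switch-mode device regulating its output behaves roughly like a
# constant-power load at its input: I = P / V, so the input current
# *rises* as the input voltage falls (within its working range).

def constant_power_current(p_load, v_in):
    """Input current drawn by an idealised constant-power load."""
    return p_load / v_in

for v in (5.0, 4.8, 4.5):
    print(f"V = {v} V -> I = {constant_power_current(10.0, v):.2f} A")
# V = 5.0 V -> I = 2.00 A
# V = 4.8 V -> I = 2.08 A
# V = 4.5 V -> I = 2.22 A
```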