
I often see power adapters marked 5 V 900 mA, 5 V 1000 mA, or sometimes 5 V 3.4 A. Obviously they convert a 220 V / 110 V AC source to 5 V DC using some step-down converter. But how do they know how much current they can deliver, i.e. 0.9 A, 1 A, 3.4 A, etc.? What I mean is: by Ohm's law (V = IR), the current depends on V and R, where V is the stepped-down voltage supplied by the adapter (let's say 5 V in this case). However, R depends on the circuit that will be connected to the battery or charger. That circuit may have a higher or lower resistance, and the current would vary accordingly. So what does the current figure printed on the charger mean? Is it the maximum current? And if it is the maximum current the charger/battery can deliver, does that mean the resistance of the connected circuit is, let's say, 5 V / 1 A = 5 Ω?
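Just to make the arithmetic I am referring to explicit (nothing more than Ohm's law applied to the rated figures):

$$ R = \frac{V}{I} = \frac{5\ \text{V}}{1\ \text{A}} = 5\ \Omega $$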

I know this is a very basic question. Appreciate your patience.

Sridhar Y
  • Actually that is 4 questions in one text block. Along with being a possible duplicate, it is off-topic (too broad) for this site. –  Jan 19 '18 at 03:08
  • The duplicate does not answer anything asked by OP. This should be reopened. – Passerby Jan 23 '18 at 05:42

1 Answer


As you say, the current depends on the attached resistance.

The rated current is not what the supply will provide, but what it can provide. If your power supply is rated 5 V, 1 A, it means that it can deliver up to 1 A of current.

So if I take that supply and connect a 5 kΩ resistor to it, the supply will provide only 1 mA of current. If I connect a 50 Ω resistor, it will provide 100 mA. A 5 Ω resistor is the limit: at that point the supply delivers its full rated 1 A. If I attach an even lower-value resistor, all bets are off as to what happens. It could be that the power supply simply fails to hold 5 V at that current draw and, for example, sags to 4 V if we overload it a bit. It could also be that the supply does deliver the current but damages itself, because it gets too hot internally to handle it.
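A minimal sketch in Python of the same arithmetic (the resistor values are the ones from the paragraph above, plus a hypothetical 2 Ω case to show an overload):

```python
# Sketch of the reasoning above: the supply holds its output voltage roughly
# constant, the load resistance sets the current, and the rating only tells
# you the largest current the supply is designed to source.

SUPPLY_VOLTAGE = 5.0   # volts
RATED_CURRENT = 1.0    # amps (the "5 V 1 A" marking)

# Smallest load resistance the supply can drive within its rating: R = V / I
min_load_ohms = SUPPLY_VOLTAGE / RATED_CURRENT   # 5 ohms

for load_ohms in (5000, 50, 5, 2):
    current = SUPPLY_VOLTAGE / load_ohms          # Ohm's law: I = V / R
    if current <= RATED_CURRENT:
        print(f"{load_ohms:>5} ohm load draws {current * 1000:.0f} mA (within rating)")
    else:
        print(f"{load_ohms:>5} ohm load would demand {current:.1f} A "
              f"(overload: the voltage may sag or the supply may overheat)")
```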

Quick added note: sometimes the manufacturer will actually state that the supply can operate outside this limit, but with reduced performance elsewhere. For example, I have a few lab power supplies that are rated to 24 V but will happily output up to 30 V. The manufacturer (HP, now Keysight) says that doing so will not risk any damage to the device, but they cannot guarantee that the noise and output resistance of the supply stay within the advertised specifications.

Joren Vaes