  1. For example, if a phone comes with a charger rated at 5 V and 0.7 A, what dictates the current the phone draws when it's plugged in to charge? Is it the resistance of the phone?

  2. If I = V/R, do phones typically provide little resistance so that the current is the maximum the charger can provide? I.e. in the above example, if the phone was off, would it constantly be drawing 0.7 A? And if the charger were swapped for one rated at 5 V and 2 A, would the phone draw more than 0.7 A? Could it reach 2 A?

  3. A bit of a side question, but when the phone is done charging, how does it stop drawing current? Again, if I = V/R, does the phone have to alter the amount of resistance it is providing? How does it do that?

I'm only looking for fairly simple answers, to be honest, as this is just a general query and not something I need to go into in depth.

Thanks.

RJSmith92

2 Answers


There is a charge controller chip inside the phone that determines how much current to put into the battery. Generally, lithium-ion batteries are charged with a constant current until the cell voltage reaches a specific level, at which point the charge controller switches over to constant-voltage charging until the current drawn by the cell tapers off toward zero (in practice, until it falls below a small cutoff). It's a bit difficult to think about in terms of resistances, as the cell itself has chemical reactions going on inside and the charge controller is built up from many transistors.
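
As a rough sketch of that constant-current/constant-voltage (CC/CV) behaviour, the control decision can be modelled like this. It is a simplified illustration, not any particular charger IC; the 4.2 V, 0.7 A and 50 mA figures are assumed example values:

```python
# Simplified sketch of a CC/CV lithium-ion charge profile (illustrative only).
# The 4.2 V limit, 0.7 A fast-charge current and 50 mA cutoff are assumed
# example values, not taken from any specific charger IC datasheet.

V_CV_LIMIT = 4.2      # constant-voltage setpoint per cell (volts)
I_CC = 0.7            # constant-current setpoint (amps)
I_TERMINATE = 0.05    # stop charging once the current tapers below this (amps)

def charge_step(cell_voltage, cell_current):
    """Return the charger's target for the next control step."""
    if cell_voltage < V_CV_LIMIT:
        # Constant-current phase: push a fixed current while the voltage rises.
        return {"mode": "CC", "target_current": I_CC}
    if cell_current > I_TERMINATE:
        # Constant-voltage phase: hold the voltage, let the current taper off.
        return {"mode": "CV", "target_voltage": V_CV_LIMIT}
    # Current has fallen below the cutoff: charging is done.
    return {"mode": "DONE", "target_current": 0.0}

print(charge_step(3.9, 0.70))   # {'mode': 'CC', 'target_current': 0.7}
print(charge_step(4.2, 0.30))   # {'mode': 'CV', 'target_voltage': 4.2}
print(charge_step(4.2, 0.03))   # {'mode': 'DONE', 'target_current': 0.0}
```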

One thing to note about ratings: the rating on the power supply is generally the nominal voltage and maximum current. It does not supply the current on the label at all times. It's quite easy to see why this is: when nothing is connected, there is no path for the current to flow so the current is zero.
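
In other words, within its rating the supply behaves roughly like a fixed voltage source and the load decides the current. A toy model of that, assuming a 5 V / 2 A label rating and some example load resistances (real supplies may droop or shut down instead of simply clipping at the limit):

```python
# Toy model: a regulated supply holds its voltage and the load sets the
# current, up to the supply's maximum. Numbers are example assumptions.

V_NOMINAL = 5.0    # volts (label voltage)
I_MAX = 2.0        # amps (label maximum current)

def current_drawn(load_resistance_ohms):
    """Ohm's law current, limited by what the supply can deliver."""
    if load_resistance_ohms == float("inf"):   # nothing connected
        return 0.0
    return min(V_NOMINAL / load_resistance_ohms, I_MAX)

print(current_drawn(float("inf")))  # 0.0 A - no load, no current
print(current_drawn(10.0))          # 0.5 A - the load draws only what it needs
print(current_drawn(1.0))           # 2.0 A - clipped at the supply's limit
```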

Charge controllers generally regulate the flow of current into the cell in one of two ways. Depending on the design of the charge controller, the controller IC can use a transistor to act either as a switch or as a variable resistance. Linear charge controllers work like super fancy variable resistors, changing the resistance between the charger input and the battery terminal so that a specific amount of current flows. The current is usually measured with a current sense resistor, a small-value resistor (generally 0.01 to 0.5 ohms) that generates a small voltage in proportion to the current. The measured current is then used in an analog feedback loop to control the transistor. This drive transistor dissipates the difference in voltage between the charger input and the cell as heat, P = (Vcharger − Vcell) × Icell. Linear charge controllers are generally small and cheap, but inefficient. The lost power can amount to quite a bit of extra heat that has to be removed somewhere. Linear charge controllers also must have a higher input voltage than the desired cell charge voltage. Lithium-ion batteries generally charge to around 4.2 volts per cell, so a single cell with a 5 V power supply leaves the charge controller around 800 mV to work with.
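
To put numbers on that heat, here is a small worked example of the P = (Vcharger − Vcell) × Icell relation and the sense-resistor voltage; the 3.8 V cell voltage, 0.7 A current and 0.05 ohm sense resistor are assumed illustrative values:

```python
# Power dissipated in a linear charge controller, P = (Vcharger - Vcell) * Icell.
# The 3.8 V cell voltage, 0.7 A current and 0.05 ohm sense resistor are
# assumed illustrative values.

v_charger = 5.0    # USB input voltage (volts)
v_cell = 3.8       # cell voltage partway through charging (volts)
i_cell = 0.7       # charge current (amps)

p_dissipated = (v_charger - v_cell) * i_cell
print(f"Heat in the drive transistor: {p_dissipated:.2f} W")   # 0.84 W

# The current-sense resistor develops a small voltage proportional to the current:
r_sense = 0.05     # ohms, within the 0.01-0.5 ohm range mentioned above
v_sense = i_cell * r_sense
print(f"Sense voltage: {v_sense * 1000:.0f} mV")               # 35 mV
```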

Another design of charge controller is a switching controller. These controllers use a DC-to-DC converter to move charge into the cell. A DC-to-DC converter uses two switches (generally a transistor and a diode) and some form of energy storage (generally an inductor and several capacitors) to efficiently change the input voltage. A step-down converter (also known as a buck converter) works by alternately storing up and draining energy in the inductor at a high frequency (hundreds of kHz to a few MHz). Since the transistors are either fully on or fully off most of the time, less power is dissipated, making the converter more efficient. It is also possible to design a converter that can draw power from a supply with a lower voltage than the cell voltage. Aside from the DC-to-DC converter, the operation of a switching charge controller is essentially the same as that of a linear charge controller: it measures the cell current and voltage and generates a control signal to adjust the duty cycle of the switching transistor to change the current flowing into the battery. Switching charge controllers are more complex and more expensive, but more efficient than linear charge controllers.
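
For a feel of why the switching approach wastes less power, the ideal (lossless) buck converter relations below show how it trades voltage for current via the duty cycle. The 5 V input, 4.2 V output and 1 A output current are assumed example values, and real converters lose some efficiency to switching and conduction losses:

```python
# Ideal (lossless) buck converter relations - a sketch, not a real design.

v_in = 5.0         # charger input voltage (volts)
v_out = 4.2        # desired cell charge voltage (volts)
i_out = 1.0        # current delivered to the cell (amps)

duty_cycle = v_out / v_in              # fraction of the time the switch is on
i_in = i_out * v_out / v_in            # input power equals output power when ideal

print(f"Duty cycle: {duty_cycle:.0%}")          # 84%
print(f"Input current drawn: {i_in:.2f} A")     # 0.84 A
```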

Now, as for how much current the charge controller can draw to charge the battery, this is generally determined by the software running on the phone. When you connect the phone to your computer's USB port, it can only draw a limited amount of power before it has to ask the computer for permission to draw more. Cell phone chargers generally advertise their current limit via a resistor connected between the USB data lines. This resistor is detected and measured and the corresponding current limit is then passed along to the charge controller so it knows how much current it can safely draw to charge the battery.
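
The handoff from "detected supply type" to "charge-current setting" might be sketched as below. The port categories, milliamp values and function name are hypothetical illustrations, not the actual USB charging specification or any phone's firmware:

```python
# Hypothetical sketch of choosing a charge-current limit from what the port
# advertises. The port categories, milliamp values and function name are
# illustrative assumptions, not the actual USB charging specification.

PORT_CURRENT_LIMITS_MA = {
    "usb_host_unconfigured": 100,   # before asking the host for more
    "usb_host_configured": 500,     # after successfully negotiating with the host
    "dedicated_charger": 2000,      # wall charger detected via the data lines
}

def set_charge_limit(port_type, controller_max_ma=1500):
    """Pass the smaller of the port's advertised limit and the controller's own ceiling."""
    advertised = PORT_CURRENT_LIMITS_MA.get(port_type, 100)  # safe default if unknown
    return min(advertised, controller_max_ma)

print(set_charge_limit("dedicated_charger"))    # 1500 - capped by the controller itself
print(set_charge_limit("usb_host_configured"))  # 500
```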

As far as sharing power with the battery charger, the phone will certainly draw additional power above and beyond what goes into the battery. In fact, depending on how the phone is configured, it can draw more power when plugged in to a charger than it would when running off its internal battery, using this current to provide a brighter display, a longer backlight-on time before standby, higher CPU performance, etc.
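
That sharing amounts to a simple current budget: the battery gets whatever is left after the running phone takes its share. All numbers below are illustrative assumptions:

```python
# Sketch of input-current budgeting between the running phone and the battery.
# All numbers are illustrative assumptions.

adapter_limit_ma = 700   # what the connected charger can supply
system_draw_ma = 450     # screen, CPU, radio, etc. while the phone is in use

battery_charge_ma = max(adapter_limit_ma - system_draw_ma, 0)
print(f"Left over to charge the battery: {battery_charge_ma} mA")   # 250 mA

# If the system draws more than the adapter can supply, the battery keeps
# discharging even though the phone is plugged in.
```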

alex.forencich
  • Thanks for all that, much appreciated. When you say 'Generally lithium ion batteries are charged with a constant current until the cell voltage reaches a specific level', what do you mean by the cell voltage reaching a certain level, would this not always be a fixed value? Also, why is it bad to use a charger with a higher voltage than the one supplied with the phone, can the charge controller chip not provide enough resistance to lower the current to safe levels? Thanks. – RJSmith92 Aug 27 '15 at 16:15
  • The charge voltage depends on the battery chemistry. Some lithium ion batteries are charged to 4.2v, some to 3.6v, etc. And the battery voltage will vary with the current charge state - less charge means less cell voltage, but the relationship is not linear (quick drop from completely full, flatter plateau for a while, quick drop again when getting low). Using a charger with a higher voltage can be a problem as that voltage difference has to go somewhere. For a linear charge controller, this can result in generating a LOT of heat, which could cause damage if it isn't properly managed. – alex.forencich Aug 27 '15 at 17:14
  • Thanks again. Going back to the 3 questions in my post, in simple terms are the following correct - 1. The charge controller chip inside the phone dictates how much current is being drawn, acting either as a switch or as a variable resistance. 2. Software on the phone determines how much current the phone can draw; some phones may be able to draw up to 2A. 3. Again, the charge controller chip inside the phone stops the cell drawing current when it's full. I'm still not sure how it does that, though. Sorry if you have explained it and I haven't understood, I'm new to all this. – RJSmith92 Aug 27 '15 at 19:27
  • How in terms of how it determines when to stop charging, or how it actually turns off the current? The battery itself determines how much current is drawn when in constant voltage mode; I think standard practice is to electronically disconnect the charger from the battery once the current falls below some threshold current. Physically, this will be implemented with a comparator that looks at the output of the current sense resistor and a transistor. Generally this will be the same transistor that is used to limit the charging current, it simply needs to be turned all the way off. – alex.forencich Aug 27 '15 at 19:32
  • Thanks for that, really appreciate it. Sorry to nitpick, but can you just confirm the following from my previous comment regarding my original questions - "1. It is the charge controller chip inside the phone that dictates how much current is being drawn, acting either as a switch or as a variable resistance. 2. Software on the phone determines how much current the phone can draw; some phones may be able to use all the extra current the charger can provide, whereas some phones may not be able to use any." – RJSmith92 Aug 27 '15 at 19:44
  • Yeah, that's a pretty good summary. – alex.forencich Aug 27 '15 at 19:45
  • Sorry to bother you again. Do you mind clarifying 2 things you have said? 1 - "the rating on the power supply is generally the nominal voltage and maximum current", is this the maximum current the power supply can physically deliver, or what it can deliver safely? 2 - Back to using a power supply with too high a voltage, would it be the job of the charge controller to try to deliver a safe amount of current to the cell, and therefore it would need to provide a higher amount of resistance than it was designed for to lower the current, and therefore generate a lot more heat? – RJSmith92 Aug 29 '15 at 02:52
  • 1. It's the design current; the power supply should be able to maintain the voltage out to the specified current, then the voltage could start to droop. Generally power supplies should be well-designed so they behave safely with any reasonable load (i.e. it shuts down/goes into current limit when you short it instead of catching on fire). 2. Pretty much, that extra voltage has to go somewhere. In a linear regulator, that means more heat. In a switching regulator, it will draw the same amount of power (volts times amps) at a lower current, though it could be less efficient. – alex.forencich Aug 29 '15 at 08:35
  • Last thing, promise. You say the voltage could start to droop with a higher current than the design current, but could this not balance out at about the same amount of power? i.e. P = VI. I assume it doesn't work like that; I'm guessing it would only be a small drop in voltage compared to the increase in current, therefore the power will still be increased? – RJSmith92 Aug 29 '15 at 16:28
  • Depends on the design of the power supply. For a real-world example, see the bottom of page 7 of http://www.xppower.com/pdfs/LF_ECM40-100.pdf . That supply is rated for 24V and 2.5A. It will actually supply 24V at up to 4A, though you may run in to thermal and reliability issues if you operate at that output for a long time. After 4A, the output voltage will start to drop. A bit further, and the overcurrent protection trips and resets the supply. – alex.forencich Aug 29 '15 at 17:03
  • Thanks, interesting. That's sort of what I mean, looking at the graph, at 4A the voltage is 24V so the power is 96W. At 5A, the voltage looks to be 18V, which equals 90W. Why then is it worse for the power supply to be running at 18V 5A than 24V 4A? Instinctively I would have thought it would be the other way round... – RJSmith92 Aug 29 '15 at 17:15
  • It depends on how the supply is built. Linear supplies have a transformer followed by a rectifier and linear regulator. The regulator has to dissipate the extra voltage, so overcurrent protection is usually built in to let the voltage droop after a certain point. In some cases, this protection will also decrease the current limit to decrease the overall power dissipation, as you will get higher dissipation in the regulator with a lower output voltage at the same current. This is called 'foldback current limiting.' – alex.forencich Aug 29 '15 at 17:32
  • In a switching supply, you eventually run in to the duty cycle limit. Since the duty cycle is more or less proportional to the output power, the voltage will drop as the current increases for more or less constant output power. The output power will probably also drop due to decreased efficiency at high output. The efficiency curves in the datasheet I linked are only plotted to the rated output power, so you don't see this drop on those curves. – alex.forencich Aug 29 '15 at 17:35
  • It's not that it's "better" or "worse", that's just how the power supply design operates under load. Also, more information on foldback limiting with some nice graphs: https://en.wikipedia.org/wiki/Foldback_%28power_supply_design%29 – alex.forencich Aug 29 '15 at 17:38
  • Sorry to keep nitpicking, but you say "The regulator has to dissipate the extra voltage, so overcurrent protection is usually built in to let the voltage droop after a certain point." What do you mean by extra voltage, isn't the voltage always the same (until it droops, i.e. 24V in the link you sent), is it not the extra current? – RJSmith92 Aug 29 '15 at 17:58
  • No, I do mean extra voltage. In a linear power supply, you have a transformer and rectifier that generate, say, 36 volts no load, 28 volts under full load (or something). Then the linear regulator will dissipate the difference between the transformer voltage and the output voltage as heat. If you provide a simple current limit, when the load tries to draw more current, the output voltage drops. However, the voltage from the transformer might not drop as much, so the regulator has to dissipate even more power. Hence why foldback protection is used, it lowers the current limit as well. – alex.forencich Aug 29 '15 at 18:05
  • Right, this is definitely the last thing! You said 'Cell phone chargers generally advertise their current limit via a resistor'. If a phone comes with a charger rated at 2A, and then I used one rated at 0.7A, is this dangerous? I've read that it is, but from what you said, would the charger not tell the charge controller it can only produce 0.7A and that's all the phone would draw? – RJSmith92 Aug 29 '15 at 21:01
  • Let us [continue this discussion in chat](http://chat.stackexchange.com/rooms/27561/discussion-between-rjsmith92-and-alex-forencich). – RJSmith92 Aug 29 '15 at 22:58
  • Hey Alex, really sorry to be a nuisance, can you just confirm my last message in the above chat, I assume the answer is yes, just want to be certain... – RJSmith92 Sep 04 '15 at 01:49
  • I could have sworn I posted a reply to that. Anyway, I don't think it's going to be dangerous. However, the battery will only be charged with the difference between the supply capability of the power source and the current draw of the device itself. If the external power supply puts out less current than the current required by the CPU, screen, radio, etc. then the battery will actually continue to discharge. However, it's unlikely that 700 mA will be less than what the system requires to run, so the battery will charge, albeit more slowly than it would with the 2A supply. – alex.forencich Sep 04 '15 at 02:38
  • Thanks, I think the chat room isn't working correctly for some reason. The last message I posted in there was - "And I assume when the negotiation process fails that's when problems can occur, with the phone trying to pull a higher current than the supply is capable of, potentially damaging it?" The reason I say that is because of what it says [here](http://www.groovypost.com/howto/choose-right-power-adapter-charger-phone-laptop/), where it says 'current too low' about half way down. Would that be the case if the negotiation process failed? – RJSmith92 Sep 04 '15 at 11:26

Cell phone battery charging is handled by a battery charging IC, typically a switching regulator that varies the voltage and current in order to charge the battery. It also measures the battery voltage and temperature to know when to cut off charging, which it does through a MOSFET.
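
A minimal sketch of that cut-off decision, assuming example thresholds (4.2 V full-charge voltage, 50 mA taper cutoff, 45 °C temperature limit) rather than values from any specific part:

```python
# Minimal sketch of the cut-off decision a charging IC might make from its
# measurements. The 4.2 V, 50 mA and 45 degC thresholds are assumed example
# values, not taken from any particular part.

def should_stop_charging(cell_volts, cell_amps, temp_celsius):
    full_voltage_reached = cell_volts >= 4.2    # cell is at its full-charge voltage
    current_has_tapered = cell_amps <= 0.05     # CV-phase current fell below the cutoff
    too_hot = temp_celsius >= 45.0              # protect the cell from overheating
    return (full_voltage_reached and current_has_tapered) or too_hot

# When this returns True, the IC switches off the pass MOSFET to stop charging.
print(should_stop_charging(4.2, 0.03, 30.0))    # True - full and current has tapered
print(should_stop_charging(3.9, 0.70, 30.0))    # False - still charging
```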

Passerby