
Over the years, I've come across websites and people with different opinions on the "correct" way to charge rechargeable batteries (I'm more concerned with laptop, tablet and phone batteries than with rechargeable AA or AAA batteries, if that makes a difference).

  • Some (e.g. this one and some of my family members) claim that one should discharge all the way to 0% (or close to that) and charge all the way to 100% to prevent the battery from developing a "memory" (whatever that means; do they mean hysteresis?) and losing charge-storing capacity.
  • Others (mainly Apple representatives and some of my family members) claim this was the case with older batteries only and newer batteries are "smarter" (whatever that means) than that and it's battery cycles that count now.
  • Still others (e.g. this one, this one and this one) claim that the best way to charge a battery is to let it oscillate only between X% and Y% (with different values of X and Y depending on whom you ask, but all agree that X>0 and Y<100) because that somehow increases battery life by preventing overloading (no mention of what the lower limit is for).

None of those websites or people has ever explained why what they claim is actually true, so I thought I'd ask here.

I'm aware of how battery cycles work (charging a cumulative total of 100% of the capacity counts as one cycle): https://assets.pcmag.com/media/images/567210-apple-chart-iphone-charge-cycles.png I'm also aware that battery life/health decreases over time and that it's just a process that can't be helped, because entropy.
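To be explicit about what I mean by a cycle, here's a quick sketch (in Python; the function name and the numbers are mine, just to illustrate the counting rule from that chart):

    # One "cycle" = a cumulative 100% of the battery's capacity charged,
    # no matter how it is split across partial charges.
    def count_cycles(charge_events_percent):
        """Sum partial charges (each in % of capacity) and convert to cycles."""
        return sum(charge_events_percent) / 100.0

    print(count_cycles([50, 50]))          # two 50% top-ups -> 1.0 cycle
    print(count_cycles([25, 25, 25, 25]))  # four 25% top-ups -> 1.0 cycle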

My own experience with battery charging and health is varied (and I haven't performed nearly enough experiments to come up with a reasonable conclusion because I only have so much money to spend on electronics and so much time to spend draining and recharging batteries):

  • My laptop (which is from mid-2014) spends most of its time permanently plugged in and at 100% battery. If I ask it what its health is, it says it's at 82% health, it's completed 609 cycles and its actual charge is 96%... which probably means health is not simply the current full-charge capacity expressed as a fraction of the original capacity, but some more complicated quantity. When I do unplug it, the battery lasts a good while (I haven't measured it and certainly haven't compared it to what it originally lasted).
  • My old phone (an iPhone 6, so it's as old as whenever that came out, which is more recent than 2014) was charged whenever I thought the battery level was lower than I wanted it to be and I knew I wouldn't be able to charge it for several hours, and it often (but not always) got all the way to 100% and then remained plugged in for hours. If I ask it about its battery health, it says the maximum capacity is 73% and "[the] battery is significantly degraded". The battery now lasts very little and drains very quickly; the phone will turn off (and claim it's run out of battery) when the indicator is anywhere from 1% to 53% — but this may be due to either age or charging habits, so it's inconclusive without more evidence.
  • I can look up the battery health and describe the charging habits of my old tablet and my wife's tablet and phone if necessary.

I'd like to settle this once and for all, please, preferably with an answer that includes physics. What's the best way to charge a battery, and why is that the best way?

Rain
    This question is really too much about the *usage* of consumer products to belong here, and in a technical sense, too specific to a battery type and the details of the battery management circuitry around it. Generally for modern lithium-chemistry packs in consumer products, running them all the way down is *not* a benefit to the battery, apart from occasional cases where a charge metering circuit gets confused and this may force it to reconnect with reality. – Chris Stratton Jan 03 '19 at 15:48
  • Apologies for asking my question here, @ChrisStratton. I wasn't aware it was the wrong place to ask. :) – Rain Jan 03 '19 at 18:22
  • Edited the question title to make it clearer that I'm looking for the physics behind why one procedure is better than the others and not just for tips on which procedure I should use. – Rain Jan 03 '19 at 18:41
  • That edit doesn't really fix the problems with the question. You're still approaching the finished product from a consumer "outside the black box" perspective with no *specific information about the battery or how it is being managed by the circuitry*. If you were actually *designing* something within the mission of this site, you would have specifics. – Chris Stratton Jan 03 '19 at 18:54

1 Answer


The three statements you quote are all roughly correct. What's good for a battery depends heavily on the type (chemistry) of the battery.

In the old days, more than 10-15 years ago, the majority of rechargeable batteries were of the NiMH (or, before that, NiCd) type. These chemistries are susceptible to the so-called "memory effect": the battery ages sooner (starts losing capacity) if it is routinely recharged before being more or less completely discharged.

With current Lithium-based batteries (LiPo, Li-ion, LiFePO4, ...), which have largely replaced NiMH (completely in most areas, especially in mobile computing), the situation is different:

Lithium batteries take some damage from deep discharges (0% or below, i.e. over-discharge) and also suffer a bit from high charge levels (100%), especially if held there for extended periods and/or at elevated temperatures. That's why, for almost all current (Lithium) rechargeable batteries, the optimum useful lifespan — in terms of total energy stored/provided before the battery becomes more or less unusable — is achieved by maintaining neither too low nor too high charge states. A rule of thumb is to try to keep them between 30% and 70% as much as you can. The actual optimum varies by battery type, manufacturer etc.; some may yield more energy in total if cycled between 20% and 80%, for example, but the difference should not be large. That is, a given battery always cycled between 20% and 80% may live twice as long as the same type of battery cycled between 2% and 98%, but going to 30-70% instead of 20-80% will probably not make much of a difference in either direction.

It may be noteworthy that the relation "shallower cycles" = "longer lifespan" is not always true, due to non-linear effects. That is, permanently cycling between 49% and 51% will give many more charge-discharge cycles than, say, 20%-80%, but the total usable energy will likely be less. (Notice that a "cycle" comprises only 2% of the total capacity in the former case vs. 60% in the latter. Even if you get 10x as many cycles at the shallow depth, you still only get 10 * 2% / 60% = 1/3 of the total usable energy over the life of the battery.)
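To make that arithmetic explicit, here is a small sketch; the cycle-life and capacity numbers are invented purely for illustration, real figures depend on the cell:

    # Total energy a battery delivers over its life for a given cycling depth.
    # The cycle_life values below are made-up illustrative numbers, not data.
    def total_usable_energy(depth_percent, cycle_life, capacity_wh=50.0):
        # depth_percent: span of each cycle (e.g. 60 for 20%-80% cycling)
        # cycle_life:    number of such cycles before the battery wears out
        # capacity_wh:   nominal full capacity (50 Wh assumed here)
        return cycle_life * (depth_percent / 100.0) * capacity_wh

    shallow = total_usable_energy(depth_percent=2,  cycle_life=10_000)  # 49%-51%
    deep    = total_usable_energy(depth_percent=60, cycle_life=1_000)   # 20%-80%
    print(shallow / deep)  # 0.333... -> only 1/3 of the total usable energy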

There are some articles at Battery University, including at least one where they tested different strategies and plotted a graph of cycling range vs. total energy stored. I'll try to come up with a link.

Edit:

This should cover all your questions about what's best for Lithium-based batteries: https://batteryuniversity.com/learn/article/how_to_prolong_lithium_based_batteries

Some quotes:

If at all possible, avoid full discharges and charge the battery more often between uses. Partial discharge on Li-ion is fine. There is no memory and the battery does not need periodic full discharge cycles to prolong life.


Most Li-ions charge to 4.20V/cell, and every reduction in peak charge voltage of 0.10V/cell is said to double the cycle life.


On the negative side, a lower peak charge voltage reduces the capacity the battery stores. As a simple guideline, every 70mV reduction in charge voltage lowers the overall capacity by 10 percent. Applying the peak charge voltage on a subsequent charge will restore the full capacity.

Taken together, the last two quotes can be interpreted as: charging only up to ~85% (which should be at about 4.1V/cell) will double the cycle life compared to charging to 100% (4.2V/cell).
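As a sketch of how those two rules of thumb combine — this merely encodes the quoted guidelines for a generic 4.20V/cell Li-ion, not a model of any specific cell:

    # Battery University rules of thumb, applied to a 4.20V/cell Li-ion.
    def cycle_life_multiplier(charge_voltage, full_voltage=4.20):
        # Every 0.10V reduction in peak charge voltage ~doubles cycle life.
        return 2 ** ((full_voltage - charge_voltage) / 0.10)

    def relative_capacity(charge_voltage, full_voltage=4.20):
        # Every 70mV reduction in charge voltage costs ~10% of capacity.
        return 1.0 - 0.10 * (full_voltage - charge_voltage) / 0.070

    v = 4.10  # charge to 4.10V/cell instead of 4.20V/cell
    print(cycle_life_multiplier(v))  # 2.0    -> roughly double the cycle life
    print(relative_capacity(v))      # ~0.857 -> the ~85% mentioned above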

So the key points to be aware of are:

  • Practically all rechargeable batteries you'll find around you nowadays are Lithium-based, and these should not be treated like the old Nickel-based batteries
  • Rechargeable batteries gradually and permanently lose capacity over time
  • Deep discharges (like to below 5-10%) permanently reduce a Lithium battery's capacity
  • High charges (like to above 90-95%) also take a toll on battery lifetime
  • Every charge "cycle" eats a small amount of capacity, even if it's only a 1% charge cycle
JimmyB
  • In hybrid vehicles, the (Li based) batteries are deliberately not charged beyond about 70% as charging efficiency drops off above this. – Peter Smith Jan 03 '19 at 16:13
  • Exactly. IIRC, the Prius keeps the batteries between 30% and 80% at all times, thereby using only half of the potential capacity, in order to prolong battery life. – JimmyB Jan 03 '19 at 16:15
  • I think using percentages when discussing charging/discharging is confusing (e.g. 70% of *what* exactly?). What 100% means depends entirely on the battery management circuitry and the setpoints programmed into it, and the same goes for 70%, 50%, 30%, 10% etc. – anrieff Jan 03 '19 at 16:19
  • As an example of a less ambiguous statement: I've heard that Li-ion batteries charged to a 4.1V cut-off (at room temperature) age significantly slower than if they are charged to the standard 4.2V. Reducing the cut-off to 4.0V makes them last even longer. – anrieff Jan 03 '19 at 16:22
  • @anrieff I believe the consensus is that *voltage* is *not* the right variable, capacity is. The reason is that there are just too many different batteries out there with, e.g., vastly different discharge curves. I.e., at 4.1V one battery may be 95% full, while another type (HV, for example) may at that point be only 75% charged. Agreed that, in principle, the BMS can deliberately "lie" to us, like it does in the Prius ("battery empty" = "30% s.o.c.", "battery full" = "80% s.o.c."), but most probably won't; and even if they did, it's often still the best/only information we have. – JimmyB Jan 03 '19 at 16:26
  • @anrieff To maybe clear this up a little: the percentages I use are percentages of the battery's *actual* capacity. That's not necessarily the same as indicated by a BMS. So if you take a single cell, charge it to the rated maximum charge voltage (about 4.2V for most Lithiums) and then discharge it down to the rated 0% voltage (e.g. 2.5V) while measuring the drained mAh, you have established what 100% means for that specific battery at that point in its lifetime. – JimmyB Jan 03 '19 at 16:35
  • @JimmyB, agreed, but then capacity is not a fixed value either, and you'd get different capacities depending on discharge rate and temperature. Also, since the OP is asking about consumer products, the % he sees is what the BMS says, i.e. potentially misleading. So it's wrong to give advice like "keep the battery between 30% and 70%", because you don't know what that really does under the hood. – anrieff Jan 03 '19 at 16:44
  • @anrieff You're right. One would have to measure the *voltages* while discharging at e.g. the 90%, 80%, ... state-of-charge points and then keep using those established voltages as the indicators for that battery. However, unlike in the e.v. sector, battery lifetime in consumer goods is not paramount; runtime on a charge is more important. Hence, even if the BMS is a bit conservative, e.g. allowing only the 5-95% range to be used, manually limiting charging cycles to 20-80% *of that range* will still be beneficial to the battery's lifetime. – JimmyB Jan 03 '19 at 16:51
  • @JimmyB the manual 20-80% limiting is only really useful for products that are aggressively optimized for runtime. If the BMS is conservative, especially *very* conservative as in the Prius case, the benefits of staying in 20-80% would be marginal, while the runtime lost would be quite substantial. Personally, I like to think that the designers of the product made the best possible educated decision when they calibrated the BMS setpoints, and in well-designed products, worrying too much about the battery life is irrational. – anrieff Jan 03 '19 at 17:10
  • Then the only real advice I have applies to storage. If your gadget is going to be shelved for extended periods (e.g. more than a few weeks), turn it off when its battery is around the midpoint value (40-60% of whatever its BMS tells you). – anrieff Jan 03 '19 at 17:15
  • JimmyB & @anrieff Thanks for your answer and comments. This is all very educational. The one complaint I have is that the physical mechanism for batteries becoming worse over time, and for one charging method being better than the others, was never mentioned. The link provided in the answer contains some nice graphs and tables showing the performance of batteries subjected to different charging procedures, but it doesn't explain the physics of it either. What was the physical principle behind memory in NiMH, and what is the underlying physical principle of over-discharging and overcharging? – Rain Jan 03 '19 at 18:39
  • @Rain that question may be a better fit for Physics or Chemistry.SE, IMO. In the meantime, here's a worthwhile [academic paper search](https://scholar.google.com/scholar?q=degradation+mechanism+rechargeable+batteries&hl=en&as_sdt=0&as_vis=1&oi=scholart). – Shamtam Jan 03 '19 at 19:09
  • @anrieff Again, I agree. I just doubt that manufacturers always optimize for maximum battery lifetime, or optimize in just the way an individual user would like their product to be optimized. In part it's just marketing logic: it's hard to sell a laptop with a statement like "our laptop runs only half as long on a battery charge as the competitor's, but our battery will be good for about 8 years of use instead of the competitor's 2 years." For an e.v., on the other hand, you don't want to tell customers that they'll need to shell out 5000€ for a new battery every 2 years. – JimmyB Jan 03 '19 at 22:45
  • @Rain See if you can find details in the link Shamtam provided. To my knowledge, the mechanisms behind both the nickel chemistries' memory effect and Lithium battery degradation are generally not yet fully understood. In fact, it was empirically observed that NiMH suffers less from the memory effect than NiCd, but some theories attribute this, to some extent, to the more sophisticated chargers that were developed and introduced for NiMH, resulting in far fewer overcharge situations in average use, which are known to damage cells' capacity. – JimmyB Jan 03 '19 at 23:00
  • Sorry for the long silence; crazy few days. Thanks for the advice; will take the question to Physics.SE. – Rain Jan 08 '19 at 20:52
  • @Rain, I'm interested too. Could you please post the new link here? – Martian2020 Sep 05 '21 at 03:17
  • Good link to Battery University. Pity they have not tested storage in a freezer; the internet jury is still out on that. A Google search finds numerous claims that freezing restores capacity, while in this Q&A here the discussion is about protection from freezing: https://electronics.stackexchange.com/questions/467518/what-is-the-effect-on-lithium-ion-batteries-from-long-term-storage-in-sub-freezi – Martian2020 Sep 05 '21 at 03:40
  • "Deep discharges (like to below 5-10%) permanently reduce a Lithium battery's capacity" - that is just direct consequence of previous one - that battery constantly loses capacity no matter what. – Martian2020 Sep 05 '21 at 07:01
  • NiMH did not suffer from the memory effect; only NiCd did. NiMH did suffer from voltage depression, however. The difference is that the memory effect decreases the mAh rating, while voltage depression increases the internal resistance (and thus reduces the nominal voltage). If you test batteries by cutoff voltage, voltage depression looks like the memory effect, but you will see that a battery affected by it returns to a higher voltage once the load is removed. – Ferrybig Jan 12 '22 at 15:01