
Reading a few other answers, including this detailed answer, this one, and this one on Apple Discussions:

The evidence in those answers focusses on the 100% start point in the discharge cycle.

There is general agreement that the net amount of usage (total charge in and out over the battery's life) increases with a lower depth of discharge, or at least a narrower cycle range. I wonder whether the word 'depth' isn't a little misleading, since Li-ion apparently degrades fastest at the highest state of charge.

Many of my devices have design lives of 1000 cycles. If I want to increase this by reducing the depth of discharge, I might cycle over 20% of the capacity rather than 80%. If I do that, is it best to start at 100% and discharge to 80%, or to start at a lower value and go down from there? What is the optimal window?

I presume starting at 20% and draining to 0% is a bad idea.

The use case I'm most interested in is my laptop, and mobile phone.

  • As I understand it, the last few percent of charge (where voltage increases quite rapidly) also stresses the battery, so 90% down to 70% is probably better than 100% down to 80%. (That would correspond to stopping charge at 4.0 or 4.1 rather than 4.2 V per cell.) –  Dec 22 '17 at 01:54
  • Interesting. That sounds like an important piece of the puzzle. Any evidence of that? If we could build the picture further with evidence of damage at low voltage, we might get a lot of the way there to building a full picture. – Not a chance Dec 22 '17 at 04:12
  • I just found [this](https://www.sciencedirect.com/science/article/pii/S0026271415301505) but they don't say what is ideal in percentage charge terms. – Not a chance Dec 22 '17 at 04:15
  • I personally have knowledge of at least one battery pack that only goes to 4.1V instead of 4.2 on the high side. I have designed several mass produced lithium ion battery devices, and I always cut off on the low side at around 3.5 or 3.4V per cell. Battery pack designers have the option to define what voltage or capacity 100% and 0% really are. It may not be the same as what is shown in the cell datasheet. Personally, when I buy a consumer device, I assume the pack designer has made a reasonably good choice, and I just don't worry about it. – user57037 Dec 22 '17 at 04:28
  • The point being that you don't really know what "100%" and "0%" really are. But you are right that cycling near the middle will probably prolong battery cycle life. – user57037 Dec 22 '17 at 04:30
  • Agreed, it's hard to find numbers on this use case. There's a lot of work done on maximising capacity, but much less on minimising the cost/kWh of storage (mainly coming from battery cost over life cycles). But with almost-free solar, I believe that's about to change. MIGHT be worth looking at LiFePO4 (3.2 V/cell, but lasts 2000 cycles at 80% discharge) or LTO (2.4 V/cell, 20000 cycles ditto). Both have lower storage density, but you can use most of it, so they may win overall. –  Dec 22 '17 at 10:35
  • Mkeith, thanks for your input. I agree that in general we should accept what the manufacturer has decided. That seems safe when the manufacturer doesn't sell the product by advertising the battery pack's mAh capacity or hours of usage. But many portable Li-ion batteries (and maybe MacBooks too) are advertised on total capacity, which leads me to think the makers might push the limits in order to sell more. I just bought a 20 Ah portable Li-ion pack and it says nothing about the number of design cycles. – Not a chance Dec 28 '17 at 23:58
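The per-cell cutoff voltages discussed in the comments above (4.0 or 4.1 V instead of 4.2 V) can be related to an approximate state of charge with a simple interpolation sketch. The OCV table below is an illustrative assumption for a generic 4.2 V Li-ion cell, not datasheet data:

```python
# Rough, illustrative open-circuit-voltage to SOC mapping for a generic
# 4.2 V Li-ion cell. Table values are assumptions, not datasheet figures.
ocv_soc = [(3.0, 0.0), (3.5, 0.10), (3.7, 0.40), (3.9, 0.70), (4.1, 0.90), (4.2, 1.0)]

def soc_from_ocv(v):
    """Estimate SOC by linear interpolation over the illustrative table."""
    if v <= ocv_soc[0][0]:
        return 0.0
    if v >= ocv_soc[-1][0]:
        return 1.0
    for (v0, s0), (v1, s1) in zip(ocv_soc, ocv_soc[1:]):
        if v0 <= v <= v1:
            return s0 + (s1 - s0) * (v - v0) / (v1 - v0)

print(soc_from_ocv(4.0))  # stopping charge at 4.0 V/cell lands near 80% on this table
```

On this (assumed) curve, cutting charge off at 4.0 V rather than 4.2 V trades roughly the top fifth of capacity for reduced time spent at high cell voltage.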

1 Answer


Both extremes of full charge, and full discharge, reduce the life of a lithium cell. Operating the cell between nominally 80% and 30% SOC will give a greatly improved lifetime.

NASA apparently charge their Li-ion cells to 3.92 V instead of the rated 4.1 V to maximise their life.
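To see why a narrower SOC window can pay off overall, here is a rough throughput comparison. The cycle-life figures are hypothetical placeholders chosen only to illustrate the trade-off, not measured values for any real cell:

```python
# Hypothetical cycle-life figures for different SOC windows (illustration only).
hypothetical_cycle_life = {
    (0.00, 1.00): 500,    # full 100% DoD cycles
    (0.30, 0.80): 1900,   # the 30-80% window suggested above
    (0.40, 0.60): 5000,   # shallow 20% window near mid-SOC
}

def total_throughput(window, cycles, capacity_wh=50.0):
    """Total energy delivered over the pack's life, in Wh."""
    low, high = window
    return (high - low) * capacity_wh * cycles

for window, cycles in hypothetical_cycle_life.items():
    print(window, round(total_throughput(window, cycles)), "Wh")
```

The point of the sketch: each shallow cycle delivers less energy, so a narrower window only wins if cycle life improves faster than the usable fraction shrinks, which published degradation data suggest it often does.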

Neil_UK