A question about charge-cycle-induced cell wear.

JCD (Enlightened; joined Apr 12, 2010; 892 messages)
Consider the following two identical Li-ion cells:

Cell A is charged to 4.15 V. It is then discharged in a (hypothetical) light with a 100% efficient buck/boost regulator to 3.60 V, at which point x mWh of stored energy have been depleted from the cell.

Cell B is charged to 4.25 V. It is then discharged in the same light until x mWh of energy have been used. Its voltage is greater than 3.60 V (about 3.65 V?), and it should have roughly 10% capacity remaining (based on 4.20 V = 100% capacity).

Repeat process 100 times.

Which cell will suffer the most wear? Will cell A wear faster because it is consistently discharged to lower levels, or will cell B wear faster because it is consistently charged to higher levels? Will there be any difference?

What I want to know is whether charging cells to higher voltages wears them quicker because of the higher voltage itself, or if it's because they have more energy depleted from them per charge cycle, due to the additional capacity. Does a complete discharge (to the minimum voltage that leaves the cell safely rechargeable) cause more, less, or similar wear compared to charging a cell to its maximum recommended level?
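To make the two-cell bookkeeping concrete, here is a toy numerical sketch. It assumes a purely hypothetical linear voltage-vs-charge curve (3.25 V empty, 4.25 V at 3000 mAh); real Li-ion discharge curves are not linear, so the specific numbers are only illustrative of the equal-energy accounting, not predictions.

```python
# Hypothetical linear model: terminal voltage rises from 3.25 V (empty)
# to 4.25 V at 3000 mAh stored. All numbers are made up for illustration.
V0, SLOPE = 3.25, 1.0 / 3000.0   # volts, volts per mAh

def voltage(q):
    """Cell voltage (V) at stored charge q (mAh)."""
    return V0 + SLOPE * q

def charge_at(v):
    """Stored charge (mAh) at cell voltage v."""
    return (v - V0) / SLOPE

def energy_between(q_hi, q_lo, steps=10000):
    """Energy (mWh) released discharging from q_hi down to q_lo (midpoint rule)."""
    dq = (q_hi - q_lo) / steps
    return sum(voltage(q_hi - (i + 0.5) * dq) * dq for i in range(steps))

def discharge_by_energy(q_start, target_mwh, dq=0.01):
    """Drain target_mwh starting at q_start; return the ending charge (mAh)."""
    q, drained = q_start, 0.0
    while drained < target_mwh:
        drained += voltage(q - 0.5 * dq) * dq
        q -= dq
    return q

# Cell A: 4.15 V down to 3.60 V defines the energy budget x.
x = energy_between(charge_at(4.15), charge_at(3.60))
# Cell B: charged to 4.25 V, then drained by the same x mWh.
v_end = voltage(discharge_by_energy(charge_at(4.25), x))
print(f"x = {x:.0f} mWh; cell B ends above 3.60 V, at about {v_end:.2f} V")
```

In this toy model cell B indeed finishes each cycle well above 3.60 V, i.e. in a higher state of charge than cell A, which is exactly the asymmetry the question is about.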
 
Last edited:
or if it's because they have more energy depleted from them per charge cycle, due to the additional capacity.

I'm not going to pretend to be a Li-ion expert, but this quoted statement cannot be the case, because nobody intentionally and habitually drains their cells that deeply to realize any "additional capacity".

A simpler question is, if you always shallowly drain two cells by just 200mAh, which is going to last longer over time, the one charged to 4.20V or the one charged to 4.25V? Or will they age the same? I'm guessing, just guessing, that the 4.25V victim will go first.
 
or if it's because they have more energy depleted from them per charge cycle, due to the additional capacity.
I'm not going to pretend to be a Li-ion expert, but this quoted statement cannot be the case, because nobody intentionally and habitually drains their cells that deeply to realize any "additional capacity".

All else equal, a Li-ion cell charged to 4.25 V will have about 10-11% more capacity than one charged to 4.15 V.

Perhaps my description wasn't sufficiently clear. The experimental methodology is designed specifically to not utilize the extra capacity available in cell B, but rather to discharge cell A from 4.15 V to 3.60 V and cell B from 4.25 V to ~3.65 V, such that exactly x mWh of energy is drained from each cell at exactly the same rate. One cell is charged to its safe limits, and the other cell is discharged to its safe limits.

A simpler question is, if you always shallowly drain two cells by just 200mAh, which is going to last longer over time, the one charged to 4.20V or the one charged to 4.25V? Or will they age the same? I'm guessing, just guessing, that the 4.25V victim will go first.

I think you would see the results you predict with that experiment. However, that doesn't offer us any information about cell wear at the bottom end of the charge/discharge cycle, since we would only be partially discharging the cell, not even utilizing 25% of the available capacity. It would only provide half of the information I'm looking for.
 
Last edited:
Oops I missed the fact that we are assuming equal watt hours here. My mistake. So the 4.25V guy isn't going to be drained quite as much with respect to mAh.

I'm not sure how "capacity" comes into play at all here. I see two possible questions here:

1) The simplistic "Which is worse, discharging all the way down to 3.60V or 'over' charging to 4.25V?"

2) And then the more interesting,
Does the benefit of having to draw fewer mAh to get the same mWh negate the detrimental effects of charging to 4.25V?

Is that where you're going with this?
 
I see two possible questions here:

1) The simplistic "Which is worse, discharging all the way down to 3.60V or 'over' charging to 4.25V?"

2) And then the more interesting,
Does the benefit of having to draw fewer mAh to get the same mWh negate the detrimental effects of charging to 4.25V?

Is that where you're going with this?

Exactly. Thanks for showing me I needed to clarify and for stating the questions so clearly! :)
 
Which cell will suffer the most wear?

All lithium-ion cells degrade with time (which is true of any battery chemistry actually). The three parameters that affect performance degradation are:

1.) Temperature
2.) State of charge (cell voltage)
3.) Number of cycles

All things being equal, a lithium-ion cell cycled to an upper voltage cut-off of 4.25V will degrade much faster than one cycled to 4.15V. The degradation will likely include both capacity loss and impedance rise.

So the simple answer to your question is this: the cell charged to 4.25V will degrade faster than the one charged to 4.15V.

Cheers,
Battery Guy
 
Last edited:
All things being equal, a lithium-ion cell cycled to an upper voltage cut-off of 4.15V will degrade much faster than one cycled to 4.25V.

Did you rather mean this: ?
All things being equal, a lithium-ion cell cycled to an upper voltage cut-off of 4.25V will degrade much faster than one cycled to 4.15V.
 
All things being equal, a lithium-ion cell cycled to an upper voltage cut-off of 4.25V will degrade much faster than one cycled to 4.15V. The degradation will likely include both capacity loss and impedance rise.

Thanks for your response. I believe that has been well established, but I don't believe it answers the questions I'm trying to ask.

Strictly speaking, if the upper voltage cutoff is different, all else cannot be equal. If the two cells are discharged to the same voltage level while providing the same power, then cell B will deliver more total energy than cell A. If the cells discharge the same amount of energy at the same rate, then cell B will end the discharge (via manual termination) at a higher voltage. The two cells will be in different states of charge.

From what I can tell, most "all things being equal" comparisons refer to the former scenario, where the two cells are discharged to the same voltage levels. I'm interested in the latter scenario, where the two cells have equal amounts of energy discharged, at the same wattage but with differing currents and voltages.

I'm under the impression that charging to a higher upper cutoff voltage decreases cell life, but so does a higher current discharge and a deeper discharge. I'm interested to see how the effects of the first factor compare to the combined effects of the second and third factors.
 
Last edited:
As I understand it, a lithium ion cell will degrade in proportion to the time spent at higher states of charge (higher voltages). This is because when charged to higher voltages the unwanted side reactions that cause deterioration proceed faster.

Conversely, the side reactions proceed more slowly at lower voltages, unless you make the voltage too low (like below 2 V). In this region other unwanted side reactions occur that also cause problems.

So to make a lithium ion cell last the longest, you need to make the time integrated average voltage as low as you can. Obviously this is best achieved by not charging it up at all and keeping it at 3.7 V or so, but then the cell no longer has any utility.
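The "time-integrated average voltage" idea can be sketched with a bit of arithmetic. The two one-week usage patterns below are invented purely for illustration (hours spent resting at each voltage); only the time-weighted bookkeeping is the point.

```python
# Two made-up one-week usage patterns: (resting voltage, hours spent there).
pattern_topped_up = [(4.20, 150.0), (3.90, 18.0)]  # kept topped off, used briefly
pattern_partial   = [(3.90, 150.0), (3.70, 18.0)]  # kept partially charged

def time_avg_voltage(pattern):
    """Time-weighted average voltage over a usage pattern."""
    return sum(v * h for v, h in pattern) / sum(h for _, h in pattern)

print(f"topped up: {time_avg_voltage(pattern_topped_up):.3f} V")
print(f"partial:   {time_avg_voltage(pattern_partial):.3f} V")
```

The cell kept topped off spends its week near 4.17 V on average, the partially charged one near 3.88 V, so by the time-integrated-voltage argument the second cell should age more slowly.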
 
So to make a lithium ion cell last the longest, you need to make the time integrated average voltage as low as you can. Obviously this is best achieved by not charging it up at all and keeping it at 3.7 V or so, but then the cell no longer has any utility.

If this is the case, (and I have no reason to doubt it necessarily) then why are we cautioned against deeply discharging LiIon cells if cell life is important? (i.e. why the 40-60% in the hybrid vehicle thread, why not 10%-50% instead?) Also, why does conventional wisdom say to store them at 40%? Why not lower, if you can monitor it closely enough to keep it there?

Just trying to get a handle on this; not disagreeing.
 
If this is the case, (and I have no reason to doubt it necessarily) then why are we cautioned against deeply discharging LiIon cells if cell life is important? (i.e. why the 40-60% in the hybrid vehicle thread, why not 10%-50% instead?) Also, why does conventional wisdom say to store them at 40%? Why not lower, if you can monitor it closely enough to keep it there?

Just trying to get a handle on this; not disagreeing.
I can't give you full answers to all of those questions -- I am not really an expert.

In the hybrid thread the 40-60% was in reference to NiMH cells, which have different characteristics from lithium ion cells.

As to why 40% and not 0%, I think this is probably a law of diminishing returns. Once you get below 40% state of charge the rate of deterioration is so low there is no point in discharging them further. (In chemistry, reaction rates are an exponential function of energy, and voltage is a measure of energy. This means, for example, that a small increase in voltage can double the rate of deterioration.)
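The exponential point can be illustrated with a made-up doubling interval: suppose (purely as an assumption for the sketch, not a measured constant) that the side-reaction rate doubles for every 100 mV increase in cell voltage.

```python
import math

V_PER_DOUBLING = 0.100             # assumption: rate doubles every 100 mV
K = math.log(2) / V_PER_DOUBLING   # exponential slope, per volt

def relative_rate(v, v_ref=3.70):
    """Side-reaction rate relative to a cell resting at 3.70 V."""
    return math.exp(K * (v - v_ref))

for v in (3.70, 3.90, 4.10, 4.20):
    print(f"{v:.2f} V -> {relative_rate(v):6.1f}x the 3.70 V rate")
```

Even with this mild assumed slope, the rate at 4.20 V comes out 32x the 3.70 V rate, which shows why modest changes in the charge cutoff can have a large effect, while going from, say, 40% down to 0% state of charge buys comparatively little.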
 
In the hybrid thread the 40-60% was in reference to NiMH cells

Oops, you're quite correct. (That's what happens when I don't look up the thread I'm quoting.) I still think it is unwise to discharge Li-ion cells to empty if you can help it, though.

As for the storage recommendations, that explanation makes sense and is reasonable.
 
Did you rather mean this: ?
All things being equal, a lithium-ion cell cycled to an upper voltage cut-off of 4.25V will degrade much faster than one cycled to 4.15V.

Mr. Happy,

YES! Thanks for noticing my error and not dumping on me! I edited my original post. My apologies, and thanks again for catching this mistake.

Cheers,
Battery Guy
 
I'm under the impression that charging to a higher upper cutoff voltage decreases cell life, but so does a higher current discharge and a deeper discharge. I'm interested to see how the effects of the first factor compare to the combined effects of the second and third factors.

For a lithium-ion cell, this is going to be very dependent on the particulars of cell design, manufacturing quality, etc... But, here are the general rules of thumb for lithium-ion:

1.) higher upper cutoff voltage decreases capacity and increases internal resistance
2.) higher temperature reduces capacity and increases internal resistance
3.) higher current does not do much to decrease cell performance
4.) lower cutoff voltage does not impact cell performance as long as the cell discharge cutoff is >2.5V and the cell is not stored for long periods of time <3.5V.

Bottom line is that high voltage and high temperature are the enemies of lithium-ion. Low voltage can do a lot of damage too, but the voltage needs to be below 2.5V (cell potential). The exact voltage where overdischarge will damage the cell is dependent upon cell design and usage history.
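The rules of thumb above can be collapsed into a quick sanity check. The thresholds (2.5 V overdischarge, 3.5 V long-term storage floor, 4.20 V nominal full charge) come straight from this thread; the helper function itself, name and all, is just an illustrative sketch.

```python
def check_cell(voltage, storing_long_term=False):
    """Return warnings for a resting cell voltage, per the rules of thumb above."""
    warnings = []
    if voltage < 2.5:
        warnings.append("below 2.5 V: overdischarge damage is likely")
    elif storing_long_term and voltage < 3.5:
        warnings.append("stored long-term below 3.5 V: too close to the edge")
    if voltage > 4.20:
        warnings.append("above 4.20 V: accelerated capacity loss and impedance rise")
    return warnings

print(check_cell(3.8))                          # no warnings
print(check_cell(3.3, storing_long_term=True))  # storage warning
```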

Cheers,
Battery Guy
 
For a lithium-ion cell, this is going to be very dependent on the particulars of cell design, manufacturing quality, etc... But, here are the general rules of thumb for lithium-ion:

1.) higher upper cutoff voltage decreases capacity and increases internal resistance
2.) higher temperature reduces capacity and increases internal resistance
3.) higher current does not do much to decrease cell performance
4.) lower cutoff voltage does not impact cell performance as long as the cell discharge cutoff is >2.5V and the cell is not stored for long periods of time <3.5V.

Bottom line is that high voltage and high temperature are the enemies of lithium-ion. Low voltage can do a lot of damage too, but the voltage needs to be below 2.5V (cell potential). The exact voltage where overdischarge will damage the cell is dependent upon cell design and usage history.

Cheers,
Battery Guy

Thank you. That's very helpful and answers my questions. :)

I would like to ask a follow-up question. What constitutes a "long period of time" with respect to cells at less than 3.5 V? As a rough estimate: 1 hr? 3 hr? 6 hr? 12 hr? 1 day?
 
I would like to ask a follow-up question. What constitutes a "long period of time" with respect to cells at less than 3.5 V? As a rough estimate: 1 hr? 3 hr? 6 hr? 12 hr? 1 day?

The real worry about storage below 3.5V is that you are on the steep part of the discharge curve. A conventional lithium-ion cell at 3.5V has almost no capacity left, and a small amount of further discharge (either self-discharge or discharge through some leakage current in a circuit) will bring the voltage down quickly.

So there is really nothing wrong with 3.5V; it is only that 3.5V is right on the edge.

With that said, a high quality lithium-ion cell with a low self-discharge rate can be discharged to 3.5V and sit on a shelf for months with no problems. However, if you are planning on storing your lithium-ion cells for a few months or more, I would recommend bringing them up to 20-30% SOC.

Cheers,
Battery Guy
 
Bottom line is that high voltage and high temperature are the enemies of lithium-ion. Low voltage can do a lot of damage too, but the voltage needs to be below 2.5V (cell potential). The exact voltage where overdischarge will damage the cell is dependent upon cell design and usage history.

I think BG has brought up a good point. Cells made by different manufacturers likely have slightly different characteristics; however, it is my understanding that lithium cobalt oxide cells have a higher internal oxidation rate when stored at either extreme of their normal state-of-charge range. This is why it is recommended to store them at 40% SOC (~3.80 volts).

Storing LiCo cells at a higher or lower state of charge apparently results in faster degradation of the cell. I'm not sure which extreme is more severe, and I don't know about other Li-ion chemistries. As far as I know this only applies to LiCo cells, but other Li-ion chemistries may have similar characteristics.

It has also been shown that the total amount of watt-hours delivered from a LiCo cell during its lifetime can be substantially increased if the cell is only discharged to, say, 3.8 volts rather than 3.7 volts, or 3.7 volts rather than 3.6 volts, etc. A common recommendation is to only discharge cells to 20% remaining capacity (~3.7 volts). To get the maximum service from a LiCo Li-ion cell, you want to avoid prolonged storage or use at either end of the 3.5-4.2 volt range.

Dave
 
It has also been shown that the total amount of watt-hours delivered from a LiCo cell during its lifetime can be substantially increased if the cell is only discharged to, say, 3.8 volts rather than 3.7 volts, or 3.7 volts rather than 3.6 volts, etc. A common recommendation is to only discharge cells to 20% remaining capacity (~3.7 volts). To get the maximum service from a LiCo Li-ion cell, you want to avoid prolonged storage or use at either end of the 3.5-4.2 volt range.

Oh boy.

That would mean I'm still looking for an answer to my questions! :thinking:
 
It has also been shown that the total amount of watt-hours delivered from a LiCo cell during its lifetime can be substantially increased if the cell is only discharged to, say, 3.8 volts rather than 3.7 volts, or 3.7 volts rather than 3.6 volts, etc.
I have seen 10% increases under constant, perfect experimental conditions, but in normal retail use I don't think anyone could tell the difference. You also should only charge to 4.1V, but now you are only using 50% of the cell's capacity for a small increase in longevity.

However, you've now made lithium batteries perform more poorly than NiMH. Why bother?

As for the original question, I don't think you'd see a detectable difference in 100 cycles. At 500-1000 cycles, slightly over-charged batteries should decline more quickly than slightly over-discharged batteries, but the obvious answer is that over-charging and over-discharging both decrease cycle life. The details are very battery dependent.

Rick.
 
Last edited:
Oh boy.

That would mean I'm still looking for an answer to my questions! :thinking:

Yeah, sorry. :( I can't really answer your original question. My guess would be that it would work out about the same.


However, you've now made lithium batteries perform more poorly than NiMH. Why bother?

Let's make it clear we're talking about Li-ion cells, not lithium primaries. I wouldn't say Li-ion cells perform more poorly than NiMH cells in general; I'd say they are about the same. The biggest advantages Li-ion cells have over NiMH are their inherently higher voltage, which allows more efficiency in circuits, and the fact that they are much lighter than nickel-based cells. In a Wh-to-Wh comparison, though, they are actually about the same. (EDIT to add: by volume)

Dave
 
Last edited: