Battery voltage drops

ank

Newly Enlightened
Joined
Feb 1, 2017
Messages
61
Hi

I have a few Samsung 35E 18650 batteries. They came charged at 3.4V.
So I put them on the charger and left them until it showed 4.05V.
Then I took the batteries out.
Five minutes later I put one battery back on the charger just to check the voltage, and it showed 3.97V!
Is it normal for the voltage to drop like that, without using the battery?
 

ank

Newly Enlightened
Joined
Feb 1, 2017
Messages
61
I heard that charging batteries to 4.2V decreases their lifetime
 

KITROBASKIN

Flashlight Enthusiast
Joined
Mar 28, 2013
Messages
5,448
Location
New Mexico, USA
Batteries show a higher voltage while charging. Figure out what resting voltage you want, note what the charger reads while charging once the cell is as charged as you want it, and then you've got your stopping point.
 

ven

Flashaholic
Joined
Oct 17, 2013
Messages
22,533
Location
Manchester UK
As said, it's normal. Once the cell has rested, the voltage will drop if the charge cycle has not terminated. IMO you're better off just charging them and using them; life is too short, unless you can set your charger (to 4.1V, for example) so you don't have to keep checking it constantly. If you get 300 full cycles or 270 full cycles... who is counting? Who would know, unless monitoring over years of use? Unless you plan on storing them (3.6V-3.8V ish), use and enjoy. I still have cells going from many years back. :)
 

KITROBASKIN

Flashlight Enthusiast
Joined
Mar 28, 2013
Messages
5,448
Location
New Mexico, USA
Agreed with ven, not to sweat the small stuff. Nevertheless, I will terminate charging early if I think of it and plan not to use the flashlight for a while. Maybe that is what the OP is doing.
 

ven

Flashaholic
Joined
Oct 17, 2013
Messages
22,533
Location
Manchester UK
True kitrobaskin, around 3.9V seems a happy medium: fine for storage and ready to top off and go. It's not as long a wait as charging up from 3.6V, and 3.9V is also enough voltage to be of use in an emergency (an unexpected power cut, for example).
 

terjee

Enlightened
Joined
Jul 24, 2016
Messages
730
Location
Bergen, Norway
A couple of quick things. First up, yes, 4.2V will add some stress to the cells, but before you give too much thought to forming habits around that, it's good to put it on a relative scale.

I'm inventing some numbers below; I'm just aiming to illustrate and put things on a relative scale, not aiming for precision.

Throwing out some ballpark numbers, the whole thing looks a bit like this: if you cycle the cell daily, discharging pretty far down and charging back up to 4.2V, you could expect about 1-2 years of life before battery capacity is down to about 80% and you should start thinking about a new cell. For most cases, that's a cost of something like $2-5 per year of usage.

If you have 4 cells in rotation for a light, that's a bit more than 1/4th the wear.

If you have just one cell but only charge it up every 4th day, that's also a bit over 1/4th the wear.

If you do both, rotating four batteries and charging every 4th day, it'd be a bit over 1/16th the wear (from charging).
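
Just to make that arithmetic explicit, here's a rough Python sketch with the same kind of made-up numbers. The simple model gives exactly 1/4 and 1/16; the "a bit more than" above leaves room for calendar aging and other effects.

```python
# Rough illustration only: invented numbers, just showing how the two factors
# (cells in rotation, days between charges of any one cell) multiply.

def relative_wear(cells_in_rotation: int, days_between_charges: int) -> float:
    """Cycle wear per cell, relative to one cell cycled every day."""
    return 1.0 / (cells_in_rotation * days_between_charges)

print(relative_wear(1, 1))  # 1.0    -> baseline: one cell, cycled daily
print(relative_wear(4, 1))  # 0.25   -> four cells in rotation
print(relative_wear(1, 4))  # 0.25   -> one cell, charged every 4th day
print(relative_wear(4, 4))  # 0.0625 -> both: roughly 1/16th the cycle wear
```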

But batteries also age no matter what you do with them and how well you treat them. So even if you're doing the 1/16th thing, you still wouldn't get 16-32 years and still have 80% capacity. The effect of the cell aging - which happens anyway - would be much larger than the voltage related stress of charging to 4.2V.

In other words, it's true that it's more stress to charge to 4.2V, but it's also true that it doesn't make much of a practical difference for somewhat casual users, for example.

There are a couple of cases where it does matter, such as if you're cycling the cell continuously 24/7; at that point I'd be interested in charging to 4.1V.

Same if you're charging many cells for extended storage; then charging to about 3.9V could be worth considering.



One final thing: the charging algorithm for Li-ion says to bring the battery voltage to 4.2V and hold it there. The current then starts to decrease so the battery doesn't rise above 4.2V, but there's still a lot of charging left to be done.

If you remove the battery before the charge has terminated, the voltage will "snap back" to a lower value, but that's entirely normal and expected. If you wanted to charge to a lower voltage for storage, just charge to a bit higher than your target and experiment to see roughly where you'd need to stop. The "snap back" effect will be smaller with a lower charge current, so you could use a lower current to make it easier as well.
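
To make the "charge a bit higher than your target" idea concrete, here's a minimal sketch. It assumes the snap-back is roughly the charge current times the total (cell plus contact) resistance, and the 0.08Ω figure is just an example, not a measurement.

```python
# Minimal sketch: how far above a target resting voltage to stop charging,
# assuming the "snap back" is roughly charge_current * total_resistance.
# The 0.08 ohm (internal + contacts) figure is an assumption, not a measurement.

def stop_voltage(target_resting_v, charge_current_a, total_resistance_ohm=0.08):
    return target_resting_v + charge_current_a * total_resistance_ohm

print(round(stop_voltage(3.9, 1.0), 2))  # ~3.98 V on the charger at 1 A
print(round(stop_voltage(3.9, 0.5), 2))  # ~3.94 V at 0.5 A: smaller snap-back
```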

There are chargers where you can set a termination voltage other than 4.2V, but unless you'll be doing this a lot, it's not really needed.
 

Gauss163

Flashlight Enthusiast
Joined
Oct 20, 2013
Messages
1,604
Location
USA
I have a few Samsung 35E 18650 batteries. They came charged at 3.4V. So I've put them on the charger and left them until the charger showed 4.05V. Then I took the batteries out. 5 minutes later I put a battery back on the charger just to see the voltage and it showed 3.97V !
Is it normal for the voltage to drop like that, without using the battery?

Yes, the voltage drop ΔV is proportional to the charge current I at termination (roughly ΔV = I * R by Ohm's law, where R is the total resistance, internal + external). You terminated at 4.05V, so the charger was probably still in the CC phase at high current. So if you were charging at 1A, with internal R = 0.06Ω (per HKJ) plus 0.02Ω external (wires + contacts), this yields a resting voltage drop of roughly 1A * 0.08Ω = 0.08V, exactly what you observed.

This is why the standard CC/CV charge algorithm tapers down the current to a small value before terminating (otherwise the voltage would drop far below the target charge voltage).
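
As a quick numeric check of that estimate, with the same assumed figures (1A charge current, 0.06Ω internal per HKJ, 0.02Ω external):

```python
# Numeric check of the I*R estimate above, using the same assumed figures.
charge_current = 1.0    # A, still in the CC phase when charging was stopped
internal_r = 0.06       # ohm, HKJ's figure for the cell
external_r = 0.02       # ohm, wires + contacts

charger_reading = 4.05  # V, what the charger showed when the cell was pulled
drop = charge_current * (internal_r + external_r)

print(round(drop, 2))                    # 0.08 V
print(round(charger_reading - drop, 2))  # 3.97 V resting, matching the OP
```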
 

Gauss163

Flashlight Enthusiast
Joined
Oct 20, 2013
Messages
1,604
Location
USA
[..] The effect of the cell aging - which happens anyway - would be much larger than the voltage related stress of charging to 4.2V.

This is generally false. One can achieve great gains in cumulative capacity by avoiding extreme voltages (and temperatures) - even as high as a factor-of-10 improvement for extreme optimizations (a fact that is exploited, e.g., by NASA). Generally one can achieve significant gains by using only smaller SOC ranges (centered around 50% SOC). Whether or not this is worth the trouble will be highly context-dependent.
 

terjee

Enlightened
Joined
Jul 24, 2016
Messages
730
Location
Bergen, Norway
This is generally false. One can achieve great gains in cumulative capacity by avoiding extreme voltages (and temperatures) - even as high as a factor-of-10 improvement for extreme optimizations (a fact that is exploited, e.g., by NASA). Generally one can achieve significant gains by using only smaller SOC ranges (centered around 50% SOC). Whether or not this is worth the trouble will be highly context-dependent.

In all fairness, I did try to make it clear that I was oversimplifying things. The major point was mostly just that while you can extend the life of a cell, that doesn't automatically mean that the effect is significant enough to matter, especially when you take standby-aging into account as well.
 

iamlucky13

Flashlight Enthusiast
Joined
Oct 11, 2016
Messages
1,139
I haven't seen good data from which to draw conclusions about how big the effect on lifetime cycle count is when charging to 4.1V vs. 4.2V and actually using your batteries. There is, according to every source I've seen, a benefit, but I don't know exactly how large. I figure if you're regularly recharging your batteries, you're getting your money's worth out of them regardless. If you get the typically rated 400 or so cycles, I'd be very happy with that for a flashlight application.

Like Gauss said, NASA controls state of charge like this to maximize the lifespan of their batteries (as does Tesla, I believe). However, it's a very, very high priority for them since they can't replace worn-out batteries on many of their spacecraft, and they need very long lives. When your half-billion-dollar mission depends on its battery to function, and you can't replace it because it's tens of millions of miles away, that's really critical. You or I can go online anytime and order a replacement for less than $10.

The results of extraordinarily careful battery management are impressive, however. For any space nerds here, since I went ahead and looked this info up out of curiosity:

The Opportunity Mars Exploration Rover was launched 14 years ago, and included a lithium ion battery pack to allow the rover to run high peak power loads like its drive motors off the low power output of its solar panels, as well as continue to function at night. The minimum mission goal was 90 days, and there was potential for up to 365 days.

My understanding is they are ICR cells (lithium cobalt oxide) with a low temperature electrolyte, in a custom-made 10 Ah size, with 16 cells in an 8S2P arrangement (~30V nominal). Discharge cutoff was 3.0V per cell. Engineers could select charge cutoffs of 3.85, 3.95, 4.15, or 4.2V to balance battery longevity with forecast power needs and solar array production. 4.15V was the normal setting. A paper assessing the battery performance after one Martian year (1.9 earth years) indicated depth of discharge was typically less than 50%, and capacity was estimated at 90-95% of original.

Opportunity is still running after 14 years and at least 4891 partial discharge cycles.
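
For the curious, the pack-level figures follow from those cell numbers; the ~3.7V nominal per cell below is a typical lithium-cobalt-oxide assumption, not a figure from the paper.

```python
# Back-of-the-envelope pack figures from the cell data above. The 3.7 V
# nominal per-cell value is a typical LCO assumption, not from the paper.
cell_capacity_ah = 10.0
cell_nominal_v = 3.7
series, parallel = 8, 2   # 8S2P

pack_voltage = series * cell_nominal_v         # ~29.6 V (the "~30 V nominal")
pack_capacity = parallel * cell_capacity_ah    # 20 Ah
pack_energy_wh = pack_voltage * pack_capacity  # ~592 Wh

print(pack_voltage, pack_capacity, round(pack_energy_wh))
```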
 

Gauss163

Flashlight Enthusiast
Joined
Oct 20, 2013
Messages
1,604
Location
USA
In all fairness, I did try to make it clear that I was oversimplifying things. The major point was mostly just that while you can extend the life of a cell, that doesn't automatically mean that the effect is significant enough to matter, especially when you take standby-aging into account as well.

It's automatic if you know what you're doing. Anyone with the appropriate knowledge and tools can usually achieve significant gains in calendar life / cycle life / cumulative capacity by employing optimized (dis)charging and storage policies. But doing so might not be worth the effort for run-of-the-mill cells (vs. large expensive packs). Deciding where the tradeoff boundary lies for life vs. cost / convenience etc. will - of course - be a highly personal decision.
 

TinderBox (UK)

Flashlight Enthusiast
Joined
Jan 14, 2006
Messages
3,488
Location
England, United Kingdom
[...] My understanding is they are ICR cells (lithium cobalt oxide) with a low temperature electrolyte, in a custom-made 10 Ah size, with 16 cells in an 8S2P arrangement (~30V nominal). [...]

Opportunity is still running after 14 years and at least 4891 partial discharge cycles.

Interesting stuff, amazing battery life. Any links to the battery used and its specifications? I did google.

Thanks

John.
 

Gauss163

Flashlight Enthusiast
Joined
Oct 20, 2013
Messages
1,604
Location
USA

I don't see how that is relevant to the above (and it's far too vague to be of much use). Instead, see this post where I show some prototypical results from recent studies. These should give you a feel for what sort of optimizations can be obtained.

There you'll find graphs (Wöhler curves) showing how cycle life depends on the depth of discharge. They illustrate how Li-ion chemistry differs dramatically from other chemistries (e.g. lead acid), which show little improvement in cumulative capacity under shallower depths of discharge. This is probably why many folks are confused about these matters - they expect their intuition from other chemistries to transfer to Li-ion. But that is not always correct. Beware of claims on random websites (e.g. Battery "University"), especially when - as is often the case - they are not supported by scientific studies.
 

sbj

Newly Enlightened
Joined
Feb 19, 2017
Messages
173
I linked to it because the table shows how battery life lengthens when the batteries are charged to less than 4.20V.
I have already read similar values in other sources. I therefore cannot refute these statements, and assume that they are correct.
 

terjee

Enlightened
Joined
Jul 24, 2016
Messages
730
Location
Bergen, Norway
Guys, I think (almost?) everyone agrees that both DoD and termination voltage have been shown to have *some* impact on the useful life of a Li-ion cell. I haven't really seen much attempt at disputing it.

The interesting thing here is probably not whether it's theoretically such and such, but rather how that applies in practical use. For a cellphone, for example, you'll probably cycle it just about daily for years, and this stuff can really make a difference.

But what about flashlights?
Usage is often significantly different from a cellphone: the light might be used significantly less, there might be a rotation between multiple batteries, and replacing batteries costs significantly less.

My conclusion so far has been that for cellphones, yeah, it probably makes sense for most people to try not to drain to 5% every day, but it's probably too much hassle to actively avoid charging to 100%.

For flashlights, I'm less convinced about both DoD and termination voltage. I'm not at all trying to dispute that you can gain some extra life if you terminate at 4.1V rather than 4.2V, rather the opposite. I'm backing the claim, but I'm also arguing that the effect is probably too small for it to make sense for most people to care about.

Rather than discussing whether it can make a difference or not - which isn't all that useful to the average user - it would be interesting to see some back-of-the-envelope ballpark calculations showing that it does or should matter. And for which cases does it become an issue?

Take an example user: rotating four batteries, charging once a week, topping up from 50% DoD when the battery hits the charger. My take is that it's not worth the hassle to avoid charging above 4.1V. I'm not disputing that it'd be good for the battery; I'm suggesting it wouldn't be worth it for the user.
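
Putting rough numbers on that example (assuming the weekly charge is shared across the four-cell rotation, and borrowing the "typically rated 400 or so cycles" mentioned earlier as a placeholder):

```python
# Rough numbers for the example user above. Assumes the weekly charge is
# shared across a four-cell rotation (each cell charged ~every 4 weeks) and
# uses the "typically rated 400 or so cycles" as a placeholder figure.
rated_full_cycles = 400
dod = 0.5                            # ~50% depth of discharge per charge
charges_per_cell_per_year = 52 / 4   # about 13

equivalent_full_cycles_per_year = charges_per_cell_per_year * dod
years_to_cycle_limit = rated_full_cycles / equivalent_full_cycles_per_year
print(round(years_to_cycle_limit))   # ~62 years: calendar aging, not cycle
                                     # wear, is what retires this cell
```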
 

Gauss163

Flashlight Enthusiast
Joined
Oct 20, 2013
Messages
1,604
Location
USA
[...] Rather than discussing whether it can make a difference or not - which isn't all that useful to the average user - it would be interesting to see some back-of-the-envelope ballpark calculations showing that it does or should matter. [...]

For some ballpark estimates, using the Wöhler curves from the Ecker study that I linked above, if you charge to 90% SOC (about 4.07V), and do 80% DODs then you'd gain about 33% in equivalent full cycles (vs. 100% (dis)charges).

But you'd more than double the number of equivalent full cycles if you only charge to 75% SOC (about 3.94V) with 50% DOD. Said equivalently, you'd get more than double the total number of Ah (cumulative capacity) from the cell before it degrades to 80% of its initial capacity. But you'd have to charge it twice as frequently since you're using only half of its capacity in each discharge (25% - 75% SOC).
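
If it helps, here is the same ballpark expressed as total Ah per cell. The baseline cycle count and the cell capacity are placeholders; only the roughly 1.33x and 2x throughput ratios come from the figures above.

```python
# Same ballpark as total Ah delivered before the cell fades to 80% capacity.
# Baseline cycle count and cell capacity are placeholders; only the ~1.33x
# and ~2.1x throughput ratios reflect the figures quoted above.
cell_capacity_ah = 3.5     # e.g. a 35E-class cell
baseline_cycles = 400      # placeholder: full 0-100% (dis)charges to 4.20 V

scenarios = {
    "100% DOD, charge to 4.20 V":           (1.00, 1.00),
    "80% DOD, charge to ~4.07 V (90% SOC)": (0.80, 1.33),
    "50% DOD, 25-75% SOC (~3.94 V)":        (0.50, 2.10),
}

for name, (dod, throughput_gain) in scenarios.items():
    total_ah = cell_capacity_ah * baseline_cycles * throughput_gain
    charges = baseline_cycles * throughput_gain / dod
    print(f"{name}: ~{total_ah:.0f} Ah total over ~{charges:.0f} charges")
```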
 

terjee

Enlightened
Joined
Jul 24, 2016
Messages
730
Location
Bergen, Norway
For some ballpark estimates, using the Wöhler curves from the Ecker study that I linked above, if you charge to 90% SOC (about 4.07V), and do 80% DODs then you'd gain about 33% in equivalent full cycles (vs. 100% (dis)charges).

But you'd more than double the number of equivalent full cycles if you only charge to 75% SOC (about 3.94V) with 50% DOD. Said equivalently, you'd get more than double the total number of Ah (cumulative capacity) from the cell before it degrades to 80% of its initial capacity. But you'd have to charge it twice as frequently since you're using only half of its capacity in each discharge (25% - 75% SOC).

Yeah, this is all fine, but what I mean is, what does all of this mean to a user?

If a battery can live for 10 years (I just made that up), or 30-6000 full-ish cycles (ballpark, roughly), whichever comes first, would those increases make sense for the user? Or would the difference between charging to 4.07V vs. 4.2V be beyond the 10-year horizon anyway?

Put differently: if a battery is retired after 10 years regardless, it doesn't matter much whether charging-related stress would kill it after 15 years and careful charging would push that to 30 years. It wouldn't give the user any practical improvement that matters, but it would add the extra work of charging carefully, or of getting a better charger.
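
A minimal sketch of that point, using the same made-up figures as above:

```python
# Minimal sketch: if calendar aging retires the cell first, stretching the
# cycle life doesn't change when the cell is retired. All figures here are
# the made-up ones from the post.
calendar_limit_years = 10

def retirement_years(cycle_life_years):
    return min(calendar_limit_years, cycle_life_years)

print(retirement_years(15))  # 10 -> charge to 4.2 V, cycle wear would allow 15
print(retirement_years(30))  # 10 -> charge to 4.07 V, cycle wear would allow 30
                             # either way the cell is retired at 10 years
```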
 