Charging LiFePO4 batteries, algorithms

HKJ
Flashaholic · Joined Mar 26, 2008 · Messages: 9,715 · Location: Copenhagen, Denmark
[size=+3]Charging LiFePO4 batteries, algorithms[/size]

DSC_1927b.png


Chargers use slightly different methods to charge LiFePO4 batteries; does this change how much energy is stuffed into the battery?
I am working on a computer-controlled charger for testing, and for one of the tests I decided to check this.



[size=+2]The methods[/size]



  • CC/CV charging with 3.4 to 3.8 volt as target and a very low termination current.
  • CC/CV charging with 3.6 volt as target and a high termination current.
  • Voltage charge, i.e. check battery voltage with current off.

Easy to test when you have a charger where these parameters can be adjusted.
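For anyone wanting to script a similar test, here is a minimal sketch of the CC/CV logic in Python. The instrument wrappers (set_current, read_voltage, read_current) are hypothetical stand-ins for whatever API your programmable supply actually exposes; this illustrates the method, it is not the code used for the tests below.

[code]
import time

# Hypothetical instrument wrappers - replace with your supply's real API.
def set_current(amps): ...
def read_voltage(): ...
def read_current(): ...

def cc_cv_charge(target_v, charge_a, term_a, poll_s=1.0):
    """CC/CV charge: constant current until target_v is reached, then
    hold the voltage and stop when the current decays to term_a."""
    set_current(charge_a)
    # CC (bulk) phase: full current until the cell reaches target_v.
    while read_voltage() < target_v:
        time.sleep(poll_s)
    # CV phase: throttle the current to hold target_v, stop at term_a.
    while True:
        i = read_current()
        if i <= term_a:
            break
        if read_voltage() > target_v:
            set_current(max(term_a, i * 0.9))  # crude voltage regulation
        time.sleep(poll_s)
    set_current(0.0)

# The profiles tested below then become different parameter pairs, e.g.:
# cc_cv_charge(target_v=3.4, charge_a=1.0, term_a=0.010)  # low termination
# cc_cv_charge(target_v=3.6, charge_a=1.0, term_a=0.200)  # high termination
[/code]

The "voltage charge" method would instead pulse the current off periodically and sample read_voltage() during the pauses.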



[size=+2]The tests[/size]

All tests are done as a charge followed by a discharge, with capacity measured during both operations.

Charge-18650-LiFePO4-CCCV-1.0%203.4%2010m.png


The first test charges to 3.4V; this is only slightly above the full cell voltage.

Charge-18650-LiFePO4-CCCV-1.0%203.6%2010ma.png


The usual voltage used for charging LiFePO4 is 3.6V or 3.65V.

Charge-18650-LiFePO4-CCCV-1.0%203.8%2010m.png


A 3.8V charging voltage is rather high, but it does not stuff more capacity into the cell. A slight temperature increase is present when the voltage goes high.

Charge-18650-LiFePO4-CCCV-1.0%203.6%20200m.png


Using a faster termination, i.e. a higher termination current.

Charge-18650-LiFePO4-VOLTAGE-1.0%203.6.png


Using a pulsing current and measuring the voltage while the current is off. This is supposed to be a fast charge method, but as can be seen above it is not faster than the other methods.

Result.png


Capacity is in mAh; TC is termination current in mA.

The capacity difference between the charge voltages is very small, but using a low voltage (3.4V) means slower charging. The higher termination current does not have much influence on the charge time, but may affect capacity.

Charge-18650-LiFePO4-CCCV-1.0%203.6%2010ma%20pct.png


How much can a faster termination affect the end result? The current starts dropping at 97.2%, or 1119mAh charged; this means that even a charge without any CV (constant voltage) phase will be within about 3% of full capacity.
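As a quick sanity check on those numbers, the arithmetic (using only the figures read from the chart above):

[code]
# Numbers read from the chart above.
charged_at_cv_start_mah = 1119   # mAh delivered when the current starts dropping
fraction_at_cv_start = 0.972     # i.e. 97.2% of the full charge

full_charge_mah = charged_at_cv_start_mah / fraction_at_cv_start
cv_phase_mah = full_charge_mah - charged_at_cv_start_mah

print(f"Full charge: {full_charge_mah:.0f} mAh")          # ~1151 mAh
print(f"CV phase adds only {cv_phase_mah:.0f} mAh "       # ~32 mAh
      f"({100 * (1 - fraction_at_cv_start):.1f}%)")       # ~2.8%
[/code]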



[size=+3]Conclusion[/size]

This is just a simple and fast test, but it shows that the different charge methods only have a small influence on the final charge of a LiFePO4 battery.
 
First off, unlike lead-acid, getting to 100% Full is completely unnecessary; in fact it is stressful for the cells and harms longevity.

The top voltage on the data sheet is a "don't get near this maximum", not a recommended termination / CV / Absorb setpoint.

3.8V is just silly! And 3.6V is not conducive to longevity, although immediately drawing down to storage voltage would help mitigate the effect; it's sitting for long periods at high SoC that really kills lifetime cycles.

But the point is, going above 3.45-3.50Vpc is pointless, no reason at all to do so. You can see the top "shoulder" in the voltage curve; that should really be avoided for longevity.

A resting 3.33-3.35Vpc really is as high as you want; as you note, trying to push in more does not result in actual usable Ah capacity, the charger output is just harmfully "churning" chemical processes and generating heat.

Fact is, there are hundreds of varying profile specs that all get to a given SoC point.

Voltage on its own says nothing; you need the initial current rate too.

A CC-only (Bulk) HVC cutoff works fine. At a low charge current, say under 0.2C, just stop at 3.45Vpc. After an hour, check the resting voltage; you might tweak a bit.

At a high rate, up to 0.4C in cool ambient, you could stop at 3.50V or so.

There is no reason to go higher in normal cycling.
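As a compact summary of those two rules, a sketch (the cutoff values are the ones just given):

[code]
def cc_only_hvc_vpc(charge_c_rate):
    """CC-only (Bulk) high-voltage cutoff per cell, per the rules above:
    3.45Vpc under 0.2C, 3.50Vpc at higher rates up to ~0.4C (cool ambient)."""
    return 3.45 if charge_c_rate < 0.2 else 3.50

print(cc_only_hvc_vpc(0.10))  # 3.45
print(cc_only_hvc_vpc(0.35))  # 3.5
[/code]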

If you need a precise benchmark for, say, capacity / SoH testing purposes, then use 3.45Vpc and hold the CV (Absorb) stage until the trailing current drops to 0.03C.
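Expressed as numbers (a hedged illustration; the 3.45Vpc and 0.03C figures come from the text, while the 100Ah bank is just an example):

[code]
def benchmark_profile(capacity_ah):
    """Benchmark-Full profile as described: 3.45Vpc CV (Absorb),
    terminating when the trailing current falls to 0.03C."""
    return {
        "cv_setpoint_vpc": 3.45,
        "end_amps": 0.03 * capacity_ah,   # 0.03C expressed in amps
    }

print(benchmark_profile(100))   # {'cv_setpoint_vpc': 3.45, 'end_amps': 3.0}
[/code]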

If you want to compare different profiles' "SoC deltas", note that "surface charge" voltages over 3.35Vpc do not signify actual usable Ah stored.

A tiny load for a few seconds will dissipate the voltage down, and be sure to give that standard isolated resting period, say an hour.

To get a very precise benchmark, do a CC dummy load drawdown to 3.0Vpc as 0% SoC and precisely time it, using a smartphone app if your load does not feature that (for other readers, of course OP's does).
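The bookkeeping behind that benchmark is simple: with a truly constant-current load, capacity is just current multiplied by time. A sketch, with the 5A load and runtime as invented example values:

[code]
def capacity_ah(load_current_a, hours, minutes=0, seconds=0):
    """Ah delivered by a constant-current drawdown, timed from start
    until the bank hits the 3.0Vpc 0%-SoC floor."""
    elapsed_h = hours + minutes / 60 + seconds / 3600
    return load_current_a * elapsed_h

# e.g. a 5A dummy load that ran 2h 24m before reaching 3.0Vpc:
print(capacity_ah(5.0, 2, 24))   # 12.0 Ah
[/code]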

A coulomb counter is quite a bit less accurate.
 
When datasheets say "Charge termination voltage" or something similar, I will use that as the charge voltage.
Just about all chargers terminate around 3.6V to 3.65V.
I was curious about chargers that terminate outside this range and about the termination current; the above test gave me a good idea about it.
 
Glad to see this post, as I was about to make a new thread about some confusion I have regarding charging LiFePO4.

I'm attempting to charge a pair of K2 Energy 3.2V LFP123A rechargeables on my liitokala lii-402 using the 3.2V setting. I'm confused, as I was under the assumption that these cells should charge to around 3.65V, according to K2 Energy's description on the back of the packaging.

http://www.led-resource.com/wordpress/wp-content/uploads/2012/03/SF_LFP_01.jpg

The cells are coming off the charger at 3.35V, which is described as optimal by john61ct. Yet the packaging and various other posts I've read say they should be coming off at 3.65V.

Are my cells at full capacity being only charged to 3.35V? Or is there something unique about these cells that requires them to be charged up to 3.65V to obtain full capacity? Is there another setting I must use on the charger?
 
If you check the voltage curves on my charts, you will see that the battery drops to 3.35V or thereabouts when the charging current is turned off.
 
There are many different voltage points to be aware of.

What is described as "termination" or "CV / Absorb" is the voltage setpoint on the charger; call that A. It does not by itself correlate to any specific SoC; that depends on the current rate, whether the CV stage is held and for how long, and/or the endAmps spec used.

Then you have B, the actual measured voltage at the battery when isolated.

This will be close to A immediately after charging, but at rest, with surface charge removed, will settle to 3.33-3.35Vpc if the battery is Full.

And C, the maximum "stress voltage" on the datasheet, which the cells should never exceed even when benchmarking / stress testing. In your case 3.65V, and really, do not even approach that unless you know exactly what you're doing; it is very bad for longevity to let the battery sit there. Most of the industry uses that for A, out of stupidity or carelessness and a lack of concern for longevity.

And D, 3.2Vpc is just the nominal voltage, a label for LFP chemistry, at about the 50% SoC MPV.

Are my cells at full capacity being only charged to 3.35V?
So that is B, where they sit at 100% SoC after resting isolated.

Use A=3.45V, or a bit higher at a high C-rate; you can go lower with a longer, gentler charge, to get to that point.

Anything over 3.55V won't add more than a mAh or two, IOW no significant range / cap utilisation, and at the cost of shortening lifespan.
 
Note my "obsession" with longevity is based on using much larger (and more expensive) LFP cells for various low C-rate use cases, where 4-5000 cycles is the norm.

With thousands of dollars invested, being able to go for decades, maybe 10,000 cycles, is the goal.

In a high C-rate application where 1000 cycles might be a lot, going the extra mile in your care protocols to only get a few hundred extra cycles might not be worth it.

Especially for cells costing less than a hundred bucks.

In which case just relax and use the standard LFP settings on your charger.
 
In regard to degradation, the missing element, documented but not discussed, is the amount of *time* spent at or near full charge. Time is more of a degrading factor than voltage - within reason, like limiting a higher-voltage charge to 3.6V max for CV.

As the charts show, any charge set for a CV from 3.4 to 3.6 will achieve full charge. It just takes different amounts of time for the so-called absorb to full to take place.

The general idea is that if one is cycling to full charge (not usually necessary), it is better to limit the time spent at or near full charge by using a higher voltage like 3.6V and stopping the charge, rather than lingering for a long time at a 3.45V CV and spending way too much time achieving near-full capacity.

In other words, get it charged and over with quickly.

The overly-cautious route of using a low CV like 3.45V can actually be more damaging in the long run if one tries to achieve full charge each cycle - and has the time for it.

Solar is a particularly good example - if one is daily cycling, it is best to set the CV high, to get as much in as possible during intermittent sunny periods. If the CV is set low AND you run into intermittent sun, you may fall short in your charging needs.

From HKJ's awesome charting, it is easy to see that a small nudge in CV, say to 3.5V for solar users, is kind of an ideal middle-of-the-road setting to get charged quickly while possibly avoiding the concerns of using a high CV voltage like 3.6V for long-term life.

The moral of the story is that most assume voltage is the bad actor, when the REAL bad actor is TIME. Unfortunately, that kind of degradation is a long-term slope, not seen with brand-new DIY builds that are claimed as a success.

But, with so-called cheap LFP, there is a willing market to sell you cells over and over at the 3rd or 4th year. Sound familiar to lead-acid practice? You bet. :)
 
In regard to degradation, the missing element, documented but not discussed, is the amount of *time* spent at or near full charge. Time is more of a degrading factor than voltage - within reason, like limiting a higher-voltage charge to 3.6V max for CV.
Already clearly stated right off the bat:
3.6V is not conducive to longevity, although immediately drawing down to storage voltage would help mitigate the effect; it's sitting for long periods at high SoC that really kills lifetime cycles
> The overly-cautious route of using a low CV like 3.45V can actually be more damaging in the long run if one tries to achieve full charge each cycle - and has the time for it.

Poppycock.

> As the charts show, any charge set for a CV from 3.4 to 3.6 will achieve full charge. It just takes different amounts of time for the so-called absorb to full to take place.

No Absorb Hold Time is actually required to get to "functionally Full", but the key variable for both the ending SoC and the time it takes is the (CC / Bulk stage) **current** rate.

It becomes possible to optimize charging for a particular pack with a simple power source, ideally with variable current and an HVC.

But either way is fine, if we're only talking minutes or a few hours, or only occasionally.

It's **habitually sitting** at high SoC for an overall significant proportion of lifespan that causes significantly reduced longevity.

Whereas a too-high charge termination **voltage** can actually damage cells right then and there, and if held too long can even cause a fire that is very hard to extinguish.
But the point is, going above 3.45-3.50Vpc is pointless, no reason at all to do so. You can see the top "shoulder" in the voltage curve; that should really be avoided for longevity.

A resting 3.33-3.35Vpc really is as high as you want; as you note, trying to push in more does not result in actual usable Ah capacity, the charger output is just harmfully "churning" chemical processes and generating heat.
There is just **no point** to going to a higher voltage, absolutely nothing to be gained!

Especially if you are holding CV anyway, a standard endAmps spec like 0.03C will lead to an exact benchmark 100% SoC every time, no matter how much the CC / Bulk stage current varies.

And even with a 200+Ah bank charged with a small-amp charger, I've never had CV last for more than a few minutes; it's the starting SoC% and the CC / Bulk stage current that determine charge duration, not the end-charge algorithm.


With low C-rates like solar, as I said, 3.5V is a fine setpoint.

But going any higher on CV / Absorb won't get you there any faster; the actual bank voltage won't hit that point until you're at 97-99% Full anyway.

And of course if you know you won't be getting to Full anyway (who cares?) then even a 3.8V setpoint won't do any harm, unless you forget to reset it before insolation conditions improve, and then you risk real damage.

> The moral of the story is that most assume voltage is the bad actor, when the REAL bad actor is TIME.

If you mean AHT then I agree. But with a CC-only control scenario, time becomes irrelevant!

And in either case, it's the charge **current** that really needs to be taken into account. Sure, most may always stay in a narrow range of variability there, say 0.1C to 0.4C,

but for those whose sources can vary a lot, say from a solar trickle to a 12kW alternator, it becomes **critical** to regularly adjust your end-charge profile as needed, depending on temperature (the other super-critical safety variable!).

And you are 100% correct that vendors don't want you to figure out how to get maximum longevity out of their cells; with LFP and LTO they'd literally go out of business, since such banks at low C-rates become capital investments, not regularly replaced consumables.

I have yet to see a charge source touted as "LFP Ready" that I would allow to touch any expensive low-C-rate bank of mine.

All that's needed are ones that let you adjust to user-custom setpoints, and ideally de-rate current when needed.
 
I have come across rigs where the solar charge current is at **such** a low C-rate that it's often below any reasonable endAmps spec.

So the owner ended up using a SoC coulomb-counting BM to determine the stop-charge point, in order to avoid overcharging, which is entirely possible in this context, even with a setpoint of 3.39Vpc!

Say 350Ah was used up by 6am; with a CEF of 1.05, he'd set the BM to isolate the bank from the charge source once about 370Ah had been accepted.
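The arithmetic behind that cutoff, restating the example above (CEF being the charge efficiency factor, Ah accepted per Ah previously delivered):

[code]
ah_consumed = 350   # Ah drawn from the bank by 6am
cef = 1.05          # charge efficiency factor: Ah accepted per Ah delivered

ah_to_return = ah_consumed * cef
print(f"Isolate the bank after {ah_to_return:.1f} Ah accepted")  # 367.5, ~370
[/code]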

Of course, such drastic measures are rarely needed, and then only when getting to Full is important.

Usually it isn't, unlike other chemistries, so again, I'd rather just use a CC-only cutoff, an HVC with its sensor at the bank itself.

IMO SoC guesstimation is too inaccurate to use for determining the stop-charge point.
 
I appreciate the replies. I'm a bit frustrated though, not at anyone here, but at myself for my lack of experience/knowledge of much of what's being said here, and my trouble interpreting some of the responses! :grin2: Hopefully I can learn from this thread.

I do have a few questions remaining.

1. The following statement is from the OP: "The usual voltage used for charging LiFePO4 is 3.6V or 3.65V". My issue is that my cells never read this; I tested them again and the cells now read around 3.44V immediately when pulled off the charger. Am I understanding HKJ's answer and charts correctly that the cells are actually hitting 3.6V at some point in the charging process, but by the time the charger reads "FULL" they have dropped down to the 3.44V reading I'm getting? I understand the data showing that there is no benefit to a higher voltage charge; I'm just trying to verify whether or not my charger is charging these properly.
2. In regard to longevity, the lowest charge current my charger allows is 500mA. This is near the 600mA limit that K2 puts in the specs for these cells. How harmful is 500mA charging going to be on a small quality 123A cell like this?

I appreciate the help.
 
1. The following statement is from the OP: "The usual voltage used for charging LiFePO4 is 3.6V or 3.65V". My issue is that my cells never read this; I tested them again and the cells now read around 3.44V immediately when pulled off the charger. Am I understanding HKJ's answer and charts correctly that the cells are actually hitting 3.6V at some point in the charging process, but by the time the charger reads "FULL" they have dropped down to the 3.44V reading I'm getting? I understand the data showing that there is no benefit to a higher voltage charge; I'm just trying to verify whether or not my charger is charging these properly.

When the charger stops charging, the voltage is about 3.6V; it will start dropping immediately and after some time will end up at about 3.35V.
 
When the charger stops charging, the voltage is about 3.6V; it will start dropping immediately and after some time will end up at about 3.35V.

But does this peak voltage of 3.6V occur at some point before the readout on the charger itself says "100 percent full"? Because as I said, I can pull the cell from the charger the split second it reads 100% full, and the voltage reads around 3.44V.
 
But does this peak voltage of 3.6V occur at some point before the readout on the charger itself says "100 percent full"? Because as I said, I can pull the cell from the charger the split second it reads 100% full, and the voltage reads around 3.44V.

It happens some minutes before the charger says full, and the voltage is maintained until the charger says full. How fast the voltage drops depends on the battery and the actual termination current.
 
But does this peak voltage of 3.6V occur at some point before the readout on the charger itself says "100 percent full"? Because as I said, I can pull the cell from the charger the split second it reads 100% full, and the voltage reads around 3.44V.
The key here is to understand that there are several voltages going on.

Re-read my post #6.

The **charging** voltage is influenced by both the battery (lower) and the charger (higher).

The point at which the charger terminates has little to do with where the battery ends up once isolated.

Your 3.44V is nearly a tenth of a volt **higher** than the "at rest" voltage indicating 100% Full.

IOW there is a surface charge, extra "meaningless voltage" level from the residue of the charging process. Remove even one mAh of energy, and you will see the voltage drop.

Same with letting the battery just sit for 24hrs. In which case it has **not** lost any actual energy, its internal chemistry reactions have just had a chance to "settle down."
 
> My issue is that my cells never ... actually hitting 3.6V ... I understand the data showing that there is no benefit to a higher voltage charge; I'm just trying to verify whether or not my charger is charging these properly
If, at rest and isolated, your cells are sitting anywhere above 3.32V, they have been fully charged.

Getting there at the lowest voltage / current rate possible is optimal for longevity.

Getting there quickly and conveniently is how most chargers are designed.
 
If, at rest and isolated, your cells are sitting anywhere above 3.32V, they have been fully charged.

Getting there at the lowest voltage / current rate possible is optimal for longevity.

Getting there quickly and conveniently is how most chargers are designed.

How do you feel about that 500mA recharge rate with regard to longevity?
 
Depends on cell capacity of course, and temperature.

Assuming sweatshirt weather, 0.4C would be my usual ceiling.

As it gets up to warmer temps, maybe 0.6-0.7C.

1C+ only at 30°C and hotter; I actually pre-heat to 40°C+ for really fast charging, but that's pretty rare.
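Those rules of thumb could be captured roughly as below; note the temperature breakpoints are my own guesses at "sweatshirt weather" and so on, and the ~600mAh LFP123A capacity in the comment is an assumption, so treat this as a sketch rather than a spec:

[code]
def max_charge_c_rate(cell_temp_c):
    """Rough longevity-oriented charge-rate ceiling vs temperature,
    per the guidelines above (breakpoints are assumptions)."""
    if cell_temp_c < 0:
        return 0.0    # never charge LFP below freezing (lithium plating)
    if cell_temp_c < 15:
        return 0.4    # "sweatshirt weather"
    if cell_temp_c < 30:
        return 0.7
    return 1.0        # 30 deg C and hotter; pre-heat to 40+ for faster still

# The 500mA question above: assuming a ~600mAh LFP123A, that's ~0.83C,
# above the cool-weather ceiling here, though within K2's 600mA spec.
print(max_charge_c_rate(20))  # 0.7
[/code]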
 
So I get 3.45V for CV and 0.03C for the termination current.

I see that some battery DISCHARGE curves show little gain in time below 2.65V. Does discharging the cells only to a particular voltage above 2.5V also help longevity (so discharge to 2.65V, charge to 3.45V as a 'cycle')?

Regards
titanize
 