Li-ion charge termination at C/10 or lower?

malow

Newly Enlightened
Joined
Sep 2, 2009
Messages
144
Location
Brazil
I just got some single-IC Li-ion chargers after my cheap charger blew up. After testing them, and according to the datasheet, they cut off charging when the current drops to 1/10 of the charging current (C/10).

I was testing with 800 mAh, 520 mAh, 2200 mAh and other Li-ion and Li-poly batteries.

One thing I noticed: after the charger cuts off, the voltage of the Li-ion cell drops in proportion to the charging current.

For example, if I charge an 800 mAh battery at 400 mA, when the charger terminates at 4.20 V the battery voltage falls to 4.16 V. If the battery is 400 mAh, the voltage falls to 4.14-4.15 V.

If I use the 100 mA charger on this 400 mAh battery, after the 4.20 V cut-off it settles at 4.18 V. If I charge the 2200 mAh battery at 100 mA, it stays at 4.20 V after charging.

So the lower the charging current, the higher the battery voltage stays after it is disconnected.
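
One way I can make sense of those numbers (just my assumption, treating the sag as the termination current flowing through an effective cell resistance):

# Back out an effective resistance from the post-cutoff sag, assuming
# V_rest = 4.20 - I_term * R_eff (R_eff lumps ohmic drop and
# polarization - my simplification, not from any datasheet).
measurements = [
    # (capacity in mAh, charge current in mA, resting V after 4.20 V cutoff)
    (800, 400, 4.16),
    (400, 400, 4.15),
    (400, 100, 4.18),
]
for cap_mah, i_chg_ma, v_rest in measurements:
    i_term_a = (i_chg_ma / 10) / 1000   # cutoff at 1/10 of charge current, in A
    r_eff = (4.20 - v_rest) / i_term_a
    print(f"{cap_mah} mAh charged at {i_chg_ma} mA: "
          f"I_term = {i_chg_ma / 10:.0f} mA, R_eff ≈ {r_eff:.1f} ohm")

By the same logic, the 2200 mAh cell at 100 mA would only sag 10-20 mV at its 10 mA cutoff, which is probably below what I can read on my meter.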

So, my question: isn't ending the charge at C/10 too soon? I know that charging to "95%", let's say, will make batteries last longer, but even so, I would prefer to use the full battery capacity.

If I recall correctly, I've seen some charging graphs where chargers go lower than C/10 before cut-off, something like C/20, C/40 or C/100.
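
For reference, here is what those thresholds work out to in cutoff current for one of my cells (just arithmetic, nothing measured):

# What different termination thresholds mean in mA. Note the two
# conventions - "C/20" on a graph usually means 1/20 of the cell's
# capacity, while my charger ICs cut off at 1/10 of the *charging*
# current. Both are shown here for comparison.
capacity_mah = 800
charge_ma = 400
for n in (10, 20, 40, 100):
    print(f"C/{n}: {capacity_mah / n:.0f} mA   |   I_chg/{n}: {charge_ma / n:.0f} mA")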

any thoughts?
 
Hi malow. Most hobby chargers terminate charge at 0.1C. I think it likely has to do both with speed and with the fact that cells last longer when charging stops at the higher rate. Most cell manufacturers used to say 0.03C, but now things are changing. Of course, any rate higher than that is OK; the cell just won't receive as much of a charge. And yes, the lower the initial charge rate, the higher the resulting capacity of the charged cell.

Part of the problem, as I understand it, is that charging towards the end of the charge cycle, when the voltage is higher, promotes more rapid oxidation of the electrodes. Terminating the charge at 0.1C, as opposed to 0.03C, reduces this to some extent. This is also why the cheap CC-only chargers that are so popular damage cells: they drive higher currents through the cells towards the end of charge than a CC/CV charger does, attempting to achieve a cell voltage (i.e. not the circuit voltage) of ~4.20 volts, further damaging the cells.
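
A toy model shows the difference (all values assumed for illustration, treating the cell as an open-circuit voltage behind an internal resistance; not data for any particular cell):

# Near the end of charge, compare the current a CC/CV charger delivers
# (tapering so the terminal voltage holds at 4.20 V) with a CC-only
# charger that keeps pushing its full current. R_INT and I_CC are
# assumed, illustrative values.
R_INT = 0.15   # ohm, assumed internal resistance
I_CC = 1.0     # amp, the charger's constant-current setting
V_SET = 4.20   # volt, CV setpoint

for ocv in (4.00, 4.05, 4.10, 4.15, 4.19):
    i_cccv = min(I_CC, (V_SET - ocv) / R_INT)   # CV stage tapers the current
    i_cc_only = I_CC                            # CC-only never tapers
    print(f"OCV {ocv:.2f} V: CC/CV -> {i_cccv * 1000:.0f} mA, "
          f"CC-only -> {i_cc_only * 1000:.0f} mA "
          f"(cell terminal at {ocv + i_cc_only * R_INT:.2f} V)")

The CC-only charger ends up holding the cell terminals well above 4.20 volts once the cell is nearly full, which is where the damage comes from.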

You may find Mr Happy's post of interest, in this thread by pae77.

Dave
 
Thanks for the answer and the links, Dave. It's much clearer to me now ;)

I will build multiple chargers with different currents, to match each battery capacity.

Also, since I build huge battery packs, I will stick with slower chargers that can pack more charge into the batteries. Every drop of juice is important to me ;)

 
IMO this is where hobby chargers are great - you can set whatever you like and see what they are doing. For example, I can charge my 5000mAh Li-ion cells at 3A or 4A and they will CV charge until the current drops to 1/10 of the starting current (300mA or 400mA). Then, if I want to know how far short of a full charge that is, I can take the cell that just finished charging and charge it again at, say, 200mA to see how much more capacity goes in by the time the current drops to 20mA. If the cell only accepts 100mAh more, then I know it was within 2% of full charge when the first CV charge terminated.
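
The arithmetic of that check, as a sketch (the helper name is just something I made up):

def percent_short_of_full(capacity_mah, topup_mah):
    """Treat the low-current top-up charge as the capacity the first
    charge left out, and express it as a percentage."""
    return 100.0 * topup_mah / capacity_mah

# Hypothetical example with the 5000 mAh cell described above:
print(percent_short_of_full(5000, 100))   # -> 2.0 (percent)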

I might just try this with a 2200mAh 18650 cell - charging it at a higher rate, then finishing it off at a lower rate to see how much more goes in. I'll post with results soon.
 
I charged it at 1.6A, which means the charge terminated at 160mA, then I charged it at 200mA so that it terminated at 20mA - the cell took another 59mAh on the 2nd charge. That is less than 3% extra charge, and 3% doesn't matter that much to me.
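
Quick check of that figure:

# 59 mAh extra on a 2200 mAh cell:
print(100.0 * 59 / 2200)   # -> ~2.7 percent short of the low-current "full"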

It is probably worth understanding that 4.2V is not 100% charge either. A Li-ion cell can be charged to a higher voltage and will store more energy, or charged to a lower voltage and store less. 4.2V is chosen because it is a fairly high state of charge with minimal damage to the cell. Charging to 4.1V would give the cell a longer life but less capacity. Charging to 4.3V would give more capacity initially, but much less over time as the cell becomes more damaged - going from 4.1V to 4.2V increases the damage somewhat, but going from 4.2V to 4.3V would cause several times the rate of damage and cut the cell's life short very quickly - not really worth it to most of us. 4.16V would be a pretty good voltage: more than 95% of the highest advisable charge level and still a pretty good capacity for the cell. So 4.2V is rather arbitrary, and hobby chargers have the option to charge to 4.1V instead for good reason.
 
Interesting results, KiwiMark. That is similar to what I have observed, although I must admit I've never watched it closely enough to have commented. Suffice it to say, there isn't a whole lot of difference as far as final capacity is concerned, but there is a difference.

I pretty much charge all my cells at 0.5C, unless I'm in a hurry or something. It seems a good compromise between extending cell longevity and charging speed. As I mentioned in another thread somewhere, charging at 0.5C doesn't take anywhere near twice as long as charging at 1C, because of the disproportionate length of the CV stage at the two rates.
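
A toy CC/CV simulation makes that plausible (all numbers assumed for illustration - a linear OCV curve and a fixed internal resistance, not a real cell):

# Simulate a CC/CV charge at two rates, terminating at 1/10 of the CC
# current. Q_AH, R and the linear OCV curve are assumed values.
Q_AH = 2.2              # cell capacity in Ah
R = 0.15                # ohm, assumed internal resistance
V_SET = 4.2             # CV setpoint
OCV0, OCV1 = 3.0, 4.2   # assumed open-circuit voltage at 0% and 100% SoC

def charge_time(c_rate, dt=0.001):
    """Time-stepped CC/CV charge; returns (hours, final state of charge)."""
    i_cc = c_rate * Q_AH
    i_term = i_cc / 10.0
    soc, t = 0.0, 0.0
    while True:
        ocv = OCV0 + (OCV1 - OCV0) * soc
        i = min(i_cc, (V_SET - ocv) / R)   # CC phase, then CV taper
        if i <= i_term:
            return t, soc
        soc += i * dt / Q_AH
        t += dt

t1, s1 = charge_time(1.0)
t2, s2 = charge_time(0.5)
print(f"1C:   {t1:.2f} h, ends at {s1 * 100:.1f}% SoC")
print(f"0.5C: {t2:.2f} h ({t2 / t1:.2f}x the 1C time), ends at {s2 * 100:.1f}% SoC")

With these assumptions, 0.5C comes out around 1.7 times the 1C charge time rather than 2 times, and it also terminates at a slightly higher state of charge.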

Dave
 