Appropriate Charge Rate for NiMH Batteries

One thing I don't understand is why people call the 12-14 hour chargers "Travel" chargers, when I get maybe 6 hours in a motel before the maid is turning the bed out on me. Checkout time :wave:, and at home I've got all day.

You plug in at home, go traveling, and they are done when you get back :nana:
 
Now that's old school, even discontinued.


When using the DDVC charger, you must keep in mind that it is not a peak-detect or automatic cut-off charger. You are the only one who can stop it. Don't forget about it, or you could damage your cells.

So it has two options, I guess: the old 16-hour slow charge, or fast charging while holding the battery in hand and terminating on "OK, it's warm enough."
Uh, no. Its sole purpose is to output a user-defined current (in 5mA steps); that's it!

2 channels
1-10 Nixx cells on each channel
Each channel adjustable from 5mA - 500mA, in 5mA increments.
You supply the timer, or don't.

Outside of the LCD that displays current, the switch for changing channels, and the pots for adjusting current, there's nothing else on this guy. That's the beauty; it's dead simple! When I want to do a break-in on some AA cells, I'll load 'em up in the plastic cell holders, eight in each, stick them on the DDVC, and plug that into a digital lamp timer. No need to tie up the C9000 with a break-in, unless I want to see the capacity from the break-in mode. I can do a 0.1C charge on 20 (TWENTY) cells at one time! How's that for efficiency?
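As a rough illustration of the arithmetic behind that kind of manual-charger-plus-lamp-timer setup, here's a quick sketch; the 2000mAh capacity and 16-hour duration are assumed example values for a standard 0.1C charge, not figures from the DDVC's documentation:

```python
# Sketch: picking a current setting and lamp-timer duration for a 0.1C
# "standard" (break-in style) charge. Cell capacity is an assumed example.

CAPACITY_MAH = 2000          # assumed AA cell capacity
STEP_MA = 5                  # the DDVC adjusts in 5 mA increments
CHANNEL_MAX_MA = 500         # DDVC per-channel limit

def standard_charge(capacity_mah, hours=16):
    """0.1C current, rounded to the charger's 5 mA steps."""
    current_ma = round(capacity_mah * 0.1 / STEP_MA) * STEP_MA
    if current_ma > CHANNEL_MAX_MA:
        raise ValueError("cell too large for a 0.1C charge on one channel")
    charge_in_mah = current_ma * hours
    return current_ma, hours, charge_in_mah

current, hours, charge_in = standard_charge(CAPACITY_MAH)
print(f"Set {current} mA, timer for {hours} h -> {charge_in} mAh input "
      f"(~{charge_in / CAPACITY_MAH:.0%} of rated capacity)")
```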
 
Duracell writes "For fast charging and optimum performance"; you can interpret it any way you want. They also recommend an indefinite trickle charge.

Unless you are seeing something I'm not, they ONLY recommend the 1C charge. They merely present other options, but those are not recommended. Certain charge rate ranges are actually specifically NOT recommended.

I'm seeing what you, and Duracell, wrote:

Duracell states for "optimum performance", use this method:

For fast charging and optimum performance, Duracell recommends a three-step procedure:
  1. Charge at 1C rate, terminated by using dT/dt = 1°C (1.8°F)/minute
  2. Apply a C/10 top-up charge, terminated by a timer after 1/2 hour charge (optional, not required)
  3. Apply a maintenance charge of indefinite duration at C/300 rate
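
To put some concrete numbers on that procedure, here is a small sketch for a hypothetical 2000mAh cell; the capacity is an assumed example, and the currents simply follow from the C-rates Duracell lists:

```python
# Worked currents for the Duracell three-step procedure, using an assumed
# 2000 mAh cell purely as an example.
capacity_mah = 2000

steps = [
    ("1C fast charge",         1.0 * capacity_mah,   "terminate on dT/dt = 1 degC/min"),
    ("C/10 top-up (optional)", 0.1 * capacity_mah,   "timer: 30 minutes"),
    ("C/300 maintenance",      capacity_mah / 300.0, "indefinite, while left in the charger"),
]

for name, current_ma, termination in steps:
    print(f"{name:24s} {current_ma:7.1f} mA  -> {termination}")
```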

I think it is a good example of why manufacturers' recommendations cannot be taken literally and, without additional information, cannot be trusted. I think the third step of the recommended procedure will shorten battery life. The current is low so it won't be severe, but I think it is still not good for batteries. This step is there to compensate for self-discharge; it will ensure the batteries are fully charged, so they decided to recommend it as "optimum performance". The problem is that their optimum (keeping the cells fully charged) is not the optimum we are discussing here.
 
Hmm, looks like quoting manufacturers' data sheets/specs is the best we can do without any real expert on battery chemistry and the theory and physics behind it.
Even when we have someone with that information, there are also differences between the various generations of NiMH alone, and again, it has to be dissected and explained.
So take it easy guys, let's learn together :grin2:

Now that I know the complexities of this supposedly simple activity of charging batteries, I think I'd prefer to be the 'average' user with a simple 'no-options' charger rather than tinkering with all the various options in a smart charger to get the 'best' results from my batteries :green:
I guess that's because I don't have scores (hundreds or more) of batteries to maintain and be responsible for every day 😀. Oh, and I use lithium primaries in my lights :nana:

of course YMMV :thumbsup:
 
Hmm, looks like quoting manufacturers' data sheets/specs is the best we can do without any real expert on battery chemistry and the theory and physics behind it.
Even when we have someone with that information, there are also differences between the various generations of NiMH alone, and again, it has to be dissected and explained.
So take it easy guys, let's learn together :grin2:

We could also quote more independent documents, but I, for one, haven't yet even started searching for them 🙂 I have nothing against quoting the manufacturers' documents, but I wanted to warn that they frequently have to be taken with a grain of salt. There used to be a time when there were generally useless marketing brochures and rather honest data sheets. Now it seems that many data sheets are written not to explain how things work but to show "why our product is better than the competition", with good-looking numbers and graphs. In the case of charging recommendations, I think they recommend the procedure that will please most users, but not necessarily the batteries.

Now that I know the complexities of this supposedly simple activity of charging batteries, I think I'd prefer to be the 'average' user with a simple 'no-options' charger rather than tinkering with all the various options in a smart charger to get the 'best' results from my batteries :green:
I guess that's because I don't have scores (hundreds or more) of batteries to maintain and be responsible for every day 😀. Oh, and I use lithium primaries in my lights :nana:

of course YMMV :thumbsup:

Fast charging has its place, of course, and I definitely won't give it up; in many situations it is more convenient. The standard charge takes much longer, but the only thing it requires is knowledge of the battery capacity. If it turned out to be better for the batteries, I would prefer using it when I'm not in a hurry.
 
I think it is a good example of why manufacturers' recommendations cannot be taken literally and, without additional information, cannot be trusted. I think the third step of the recommended procedure will shorten battery life. The current is low so it won't be severe, but I think it is still not good for batteries. This step is there to compensate for self-discharge; it will ensure the batteries are fully charged, so they decided to recommend it as "optimum performance". The problem is that their optimum (keeping the cells fully charged) is not the optimum we are discussing here.

C/300 is well below the threshold to harm a battery. As long as you didn't leave a cell on for days or weeks, you'd be fine. It's not really a "trickle CHARGE" per se.
 
C/300 is well below the threshold to harm a battery. As long as you didn't leave a cell on for days or weeks, you'd be fine. It's not really a "trickle CHARGE" per se.

There is no point in recommending the indefinite C/300 charge if the battery is not left for days or weeks. This recommendation is for people who want to leave the batteries in the charger for months and have them fully charged when taken out. They will have better performance then: the batteries will be fully charged, while non-trickle-charged competitors may already be empty. It's just a different definition of performance than the one we were talking about.
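
For a sense of scale, here is the simple arithmetic for a hypothetical 2000mAh AA; the capacity is an assumed example, and no self-discharge rate is assumed since that varies a lot between cell types:

```python
# Rough arithmetic for an indefinite C/300 maintenance charge on an assumed
# 2000 mAh cell. Real self-discharge rates vary by cell type and age.
capacity_mah = 2000
trickle_ma = capacity_mah / 300          # roughly 6.7 mA

for days in (1, 7, 30, 90):
    delivered_mah = trickle_ma * 24 * days
    print(f"{days:3d} days at C/300: {delivered_mah:7.0f} mAh "
          f"(~{delivered_mah / capacity_mah:.1f}x rated capacity)")
```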
 
In fact they do, and I specifically linked the reference documents above.

In fact I have 2 Eneloop datasheets here that plainly state:

(AAA) "Single cell capacity under the following condition.
Charge: 80mAx16h, Discharge: 160mA (E.V.=1.0V) at 25C"

(AA) "Single cell capacity under the following condition.
Charge: 200mAx16h, Discharge: 400mA(E.V.=1.0V) at 25C"

So not only do they find this an acceptable charge rate, it is the rate at which the capacity itself is guaranteed, not at any other rate.

They changed their tune over trends, but not quite how you assume. Back in the NiCad days, slow chargers worked just fine, as NiCads were very tolerant of them. When NiMH consumer cells were introduced, the chargers did not drastically change. If you know the precise capacity of the cell in question, you can charge at a slow rate on a timer. Since it was easier to use the same chargers and just change the time, it was easy to simply assume consumers would recharge cells only when depleted. This is also when you started to notice more strong language to the effect of "use only our cells in our charger". Put a lower capacity cell in a charger meant for a higher capacity one, and you can damage the cell.
You're simply arguing backwards toward a position that supports your assumption. Your assumption was that if/when a battery manufacturer touts the ability to fast charge, they are also suggesting it shouldn't be done any other way. If that were what they were suggesting, THEY would have stated it, not left ambiguity for 3rd parties to reinterpret.

They clearly list charge rates THEY use for their own testing which take 16 hours.

...That drove smart chargers to a -dV/dT condition to sense end of charge. But to sense this signal, the charger must be able to measure it. A typical NiMH cell will show this signal more prominently above a 0.5C charge rate. Below that, the signal is VERY difficult to measure, or the cell may not even display one at all. Above a 1C charge you can get into other issues with charging the cell too fast.
So you keep claiming, but in fact the signal is not so difficult to measure: every single low-priced charger I've seen in recent years that has Delta -V detection detects it just fine below 0.5C. ALL of them, so obviously what is difficult in your mind is not so hard to do when a charger is designed to do this very thing. They not only detect Delta -V at lower rates, they do it successfully with only one cell having reached Delta -V, if the cells were different enough in capacity that both did not reach Delta -V at an overlapping moment in time.

Certainly there is a charge rate low enough that Delta -V is too small to be detected, but once again (why is it necessary to keep repeating what chargers prove to be true?), if you look at the chargers, those which detect Delta -V have a high enough charge rate to do so. They aren't making 200mA chargers that detect Delta -V, for this very reason, but they are making chargers below a 0.5C charge rate because that clearly works.

0.5-1.0C is a sweet spot where you can both measure a -dV/dT signal, and not overcharge the cell. If your equipment is capable of it, that is the current "best" method to charge NiMH cells of an unknown state of discharge.
I can put a 90% charged AA battery in a low-end, 5-year-old 700mA charger and it Delta -V terminates just fine. What you claim is not supported, except perhaps on the first generations of Delta -V capable chargers, but as with most electronics, successive product generations improved in performance.

The Delta -V change doesn't need to be as high or as fast as possible, only large enough that a charger can detect it, which they do.
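
For anyone wondering what the detection itself amounts to, here is a bare-bones sketch of a -dV (peak-detect) loop; the 5 mV threshold, the sampling interval, and the backup timer are illustrative assumptions, not values from any particular charger, and read_voltage just stands in for whatever measures the cell:

```python
# Minimal sketch of -dV (peak-detect) charge termination.
import time

def charge_until_minus_dv(read_voltage,
                          delta_v_threshold=0.005,   # assumed 5 mV per-cell drop
                          sample_interval_s=30,      # assumed sampling period
                          max_charge_hours=4):       # backup timer if no -dV appears
    peak = 0.0
    deadline = time.monotonic() + max_charge_hours * 3600
    while time.monotonic() < deadline:
        v = read_voltage()
        peak = max(peak, v)
        # Stop when the cell voltage has fallen measurably below its peak.
        if peak - v >= delta_v_threshold:
            return "terminated on -dV"
        time.sleep(sample_interval_s)
    return "terminated on backup timer"
```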
 
Not all of the medium-speed chargers use the voltage drop; some don't wait for that and just bail out when the voltage is no longer increasing.
 
In fact I have 2 Eneloop datasheets here that plainly state:

(AAA) "Single cell capacity under the following condition.
Charge: 80mAx16h, Discharge: 160mA (E.V.=1.0V) at 25C"

(AA) "Single cell capacity under the following condition.
Charge: 200mAx16h, Discharge: 400mA(E.V.=1.0V) at 25C"

So not only do they find this an acceptable charge rate, it is the rate at which the capacity itself is guaranteed, not at any other rate.

I recommend you look up what "Standard Charge" actually means. It does NOT mean that is the recommended charge scheme. That is the method by which the capacity is determined, NOT the typically recommended method.

If you look at the same data sheet, it recommends a typical charge of 1C...
 
So you keep claiming, but in fact the signal is not so difficult to measure: every single low-priced charger I've seen in recent years that has Delta -V detection detects it just fine below 0.5C. ALL of them, so obviously what is difficult in your mind is not so hard to do when a charger is designed to do this very thing. They not only detect Delta -V at lower rates, they do it successfully with only one cell having reached Delta -V, if the cells were different enough in capacity that both did not reach Delta -V at an overlapping moment in time.

I'm afraid it's not so simple. You are right that the chargers will detect the -dV condition most of the time with currents lower than 0.5C. The problem with this algorithm is that it doesn't always work. There is really no good sensitivity level that you could set. If you set it high, there may be false peaks that result in premature termination. If you set it low, the voltage drop may be too subtle to detect and you will overcharge the battery. I believe this was the problem with the Maha C9000 that I wrote about previously.

I think that -dV by definition slightly overcharges batteries: they only produce this signal when already overcharged. In the documents that I read, when various fast-charge termination methods were compared, the temperature-based dT/dt was, I think, always recommended. It was described as the most reliable for fast charging and for ending the charge just in time. The only problem is that you need to be able to closely monitor the cell temperature, which is costly.
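
For comparison with the -dV approach, a dT/dt cutoff watches the rate of temperature rise rather than the voltage. The 1°C/minute limit below is the figure from the Duracell procedure quoted earlier; the sampling details and the read_temperature stand-in are assumptions for illustration:

```python
# Sketch of dT/dt fast-charge termination: stop when the cell temperature
# rises faster than a set rate, not when it reaches any absolute value.
import time

def charge_until_dt_dt(read_temperature,
                       rise_limit_c_per_min=1.0,   # per the Duracell procedure
                       sample_interval_s=60,       # assumed sampling period
                       max_charge_hours=2):        # backup timer
    last_temp = read_temperature()
    deadline = time.monotonic() + max_charge_hours * 3600
    while time.monotonic() < deadline:
        time.sleep(sample_interval_s)
        temp = read_temperature()
        rate = (temp - last_temp) / (sample_interval_s / 60.0)  # degC per minute
        if rate >= rise_limit_c_per_min:
            return "terminated on dT/dt"
        last_temp = temp
    return "terminated on backup timer"
```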

EDIT: here is the cycle life comparison from the Duracell document quoted earlier by Marduke:



This overcharging by the universally used -dV method, and the reduced cycle life, is yet another reason why I think the standard 0.1C charge is better. Unfortunately I haven't seen a comparison similar to the one above with the standard charge included, but I believe it does not reduce the cells' life in the way -dV does.
 
I recommend you look up what "Standard Charge" actually means. It does NOT mean that is the recommended charge scheme. That is the method by which the capacity is determined, NOT the typically recommended method.

If you look at the same data sheet, it recommends a typical charge of 1C...

LOL. You are actually suggesting they recommend against the very rate they use for their own testing? That is not believable; it is far more likely they use that rate because it gives their cells maximum performance. Certainly they are not going to specify a testing parameter that isn't even a valid method to recharge them!

On the same data sheet, it does not recommend 1C charge rate, it merely specifies what rate someone wanting to do a fast charge would use so they don't exceed the maximum.

Specifying a maximum alone never means there is no minimum or other usable value. Haven't you seen this on lots and lots of datasheets? On almost ALL datasheets for electrical/electronic components? For example, a 3A, 40V diode is not meant to be run only at exactly 3A and 40V; that is only the upper limit.
 
LOL. You are actually suggesting they recommend against the very rate they use for their own testing? That is not believable; it is far more likely they use that rate because it gives their cells maximum performance. Certainly they are not going to specify a testing parameter that isn't even a valid method to recharge them!

Unfortunately, a single manufacturer can use for its testing the method that gives the best capacity, recommend another method that it thinks users will like the most, and then write in its own technical documents that the method it just recommended is worse than another one it didn't recommend (see Duracell above). As I wrote previously, these documents are valuable, but only if you can decide how to interpret them and which parts can be trusted...
 
I'm afraid it's not so simple. You are right that the chargers will detect the -dV condition most of the time with currents lower than 0.5C. The problem with this algorithm is that it doesn't always work. There is really no good sensitivity level that you could set. If you set it high, there may be false peaks that result in premature termination. If you set it low, the voltage drop may be too subtle to detect and you will overcharge the battery. I believe this was the problem with the Maha C9000 that I wrote about previously.

I suggest that most chargers don't set it high or low; they use testing during product development to set the most appropriate level possible, but one where there is still a short period of trickle charging needed to top off the cell.

I think that -dV by definition slightly overcharges batteries: they only produce this signal when already overcharged. In the documents that I read, when various fast-charge termination methods were compared, the temperature-based dT/dt was, I think, always recommended. It was described as the most reliable for fast charging and for ending the charge just in time. The only problem is that you need to be able to closely monitor the cell temperature, which is costly.

I would never want a temperature-based termination. Put one 90% charged battery in a charger and one 10% charged battery in a second identical charger. The 90% charged battery will be cooler when it reaches full charge than the 10% charged battery is.

I don't necessarily think it really costs much to implement a cell temperature circuit. They implement Delta -V cheaply enough; if a charger can sense voltage changes, a typical thermal sensor would provide voltage changes too, so the remaining factor is a good conductive interface to the cell body, perhaps embedding the sensor in a semi-flexible silicone pad that the cell casing rests against.

EDIT: here is the cycle life comparison from the Duracell document quoted earlier by Marduke:



This overcharging by the universally used -dV method, and the reduced cycle life, is yet another reason why I think the standard 0.1C charge is better. Unfortunately I haven't seen a comparison similar to the one above with the standard charge included, but I believe it does not reduce the cells' life in the way -dV does.

I suspect it is more stressful to charge at 1C, such a high rate being why it is often spec'd as the upper limit, and that at a lower rate Delta -V detection does not subject the cell to as high a temperature. Unfortunately these comparisons come back to the issue of full charge and discharge cycles, when personally my goal is not to run out of battery charge every time I use a device; that is the least desirable usage pattern to me. Nor do I accept the extra wear of discharging a cell each time before recharging.

The rapid charge rate, with its higher initial capacity for ~260 cycles, seems a fair tradeoff to make, especially when you don't have to wait as long, but I think we are drifting off on a tangent again, so I will restate my position.

I am not interested in getting every last cycle or every last mAH possible out of a $2 battery. I find most methods of charging entirely acceptable, except medium-current, timer-shutoff dumb chargers with no other safety/shutdown circuitry, as these really do have a tendency to regularly cook the batteries any time they weren't fully discharged, especially if the charger is newer than the batteries and so is timed for a higher-capacity battery.

I'd much rather rotate aging batteries out of demanding applications into lesser applications on a periodic basis, and if 20% one way or the other mattered, I'd have spare batteries along. As mentioned above, fully draining cells, which is the usage profile the cited data assumes, is the opposite of my goal; I'd rather choose when to stop using a device than be forced to.
 
LOL. You are actually suggesting they recommend against the very rate they use for their own testing? That is not believable; it is far more likely they use that rate because it gives their cells maximum performance. Certainly they are not going to specify a testing parameter that isn't even a valid method to recharge them!

On the same data sheet, it does not recommend 1C charge rate, it merely specifies what rate someone wanting to do a fast charge would use so they don't exceed the maximum.


As I said, you obviously have no clue what "Standard Charge" actually means, and have absolutely no interest in educating yourself about it...
 
I suggest that most chargers don't set it high or low; they use testing during product development to set the most appropriate level possible, but one where there is still a short period of trickle charging needed to top off the cell.

As far as I know, such a level simply doesn't exist. You either have a higher risk of overcharging or a higher risk of premature termination. With the usual level there is probably a non-negligible risk of both. Apart from the false positives or negatives, the sensitivity doesn't change the moment of termination in any important way. The -dV method overcharges the cells; if you apply the trickle charge afterwards, they will be even more overcharged.

I would never want a temperature-based termination. Put one 90% charged battery in a charger and one 10% charged battery in a second identical charger. The 90% charged battery will be cooler when it reaches full charge than the 10% charged battery is.

dT/dt is not based on the temperature value but on the detection of the temperature rise rate. Please read the descriptions of appropriate algorithms if you are interested in comparing them.

I don't necessarily think it really costs much to implement a cell temperature circuit. They implement Delta -V cheaply enough; if a charger can sense voltage changes, a typical thermal sensor would provide voltage changes too, so the remaining factor is a good conductive interface to the cell body, perhaps embedding the sensor in a semi-flexible silicone pad that the cell casing rests against.

You may be right but the documents I read claim that it would be more costly to implement it correctly than to have a simple -dV. I tend to believe they are right here.

The rapid charge rate, with its higher initial capacity for ~260 cycles, seems a fair tradeoff to make, especially when you don't have to wait as long, but I think we are drifting off on a tangent again, so I will restate my position.

I am not interested in getting every last cycle or every last mAH possible out of a $2 battery.

I understand your position, and sometimes it is more appropriate for me too. Nevertheless, I am interested in finding out what the results of using the various available charging methods are, especially since there seems to be a popular impression that there exists a single best method that can be recommended to everyone, and I think that is simply not true.
 
Gentlemen, let us be nice to each other, no slights, or disparaging remarks here. This is about our fun hobby, and info should pass in good cheer.

Bill
 
It's beginning to seem like a couple of posters are mistaking this for a fundamentalist religious argument.

We are only discussing opinions on how best to charge a cell, not run the rest of your life or whether a certain mode of living will get you into heaven.

I'd guess some of the previous posters are having difficulty accepting that others can have different opinions than they have. This isn't a good thing for your blood pressure. :faint:
 
It's beginning to seem like a couple of posters are mistaking this for a fundamentalist religious argument.

You have to Believe in the battery, man.

I'd guess some of the previous posters are having difficulty accepting that others can have different opinions than they have. :faint:

It's about different charge rates 🙂 The difficulty is only in the different chargers :thinking: and the way people have seen them work, making their opinions 100% valid.

If everyone had the same charger as these guys, they would all have the same opinion 🙂

If only the manufacturers would follow ANY of their own specs when selling their chargers, if they would say the same thing in the data sheet that they actually do, or tell people to do.
If only all the chargers weren't so heavily varied in their approaches, and if so many of them weren't using cheap tricks.

The dates people acquired their information make a big difference; the "trend" has changed, even though the reality is basically the same.
The machines have changed, which changes the information anyone would even provide.

In reality, everyone here is 100% correct, BUT only if your machine uses the algorithm they are correct about.
 