Looking for raw voltage vs time data for an NiMH charge cycle.

LMF5000

Newly Enlightened
Joined
Mar 10, 2011
Messages
84
Location
Malta
I'm curious as to how -dV/dt detection may be implemented in practice. Specifically, what kind of algorithm would work best at detecting the peak.

To this end, I've been looking for a plot of dV/dt (the first derivative of voltage with respect to time) for a typical NiMH charge cycle. Unfortunately this doesn't seem to be available anywhere; sites like batteryuniversity and lygte typically only give voltage vs time plots, not their derivatives.

I was wondering if anyone with a data logger (or a charger that stores logs) could post their raw data for NiMH charges? Just a table of voltage vs time for, say, an Eneloop charging, and maybe an old/cheap cell with a less-defined curve (for an added challenge). The data could be uploaded as a text file or an Excel spreadsheet.

I can then calculate the dV/dt curve myself and post my findings. I've been reading through the CPF stickies and came across a 10-year-old thread saying that some chargers before the time of the C9000 actually used the second derivative of voltage with respect to time (d2V/dt2) for termination - which got me wondering whether this could have some advantages compared to dV/dt?
 

iamlucky13

Flashlight Enthusiast
Joined
Oct 11, 2016
Messages
1,139
User HKJ has done very impressively detailed testing of quite a few chargers. He has the kind of information it sounds like you're looking for. The index of charger reviews on his personal website is here, so you can sort through to see different chargers:

http://lygte-info.dk/info/roundCellChargerIndex UK.html

Not all of them actually use -dV/dt. Dumb chargers often just use a time limit or perhaps a voltage limit. Some use -dV/dt or other methods, depending on which criterion is reached first. There's a lot to learn from reading his reviews.

* Edit - sorry I missed that you mentioned you've seen this site already.
 
Last edited:

HKJ

Flashaholic
Joined
Mar 26, 2008
Messages
9,715
Location
Copenhagen, Denmark
lygte only give voltage vs time plots, not their derivatives.

Correct, but in this article: http://lygte-info.dk/info/batteryChargingNiMH UK.html
You can see some very detailed voltage vs time plots.

I believe a normal -dV/dt detection just records the maximum measured voltage, and if the current voltage is a set number of mV below that, it terminates. There must be some averaging and delays to prevent a single bad measurement (too high or too low due to noise) from spoiling it.
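A minimal sketch of that peak-hold idea in Python; the 8-sample averaging window, the 4 mV drop threshold, and the 3-check confirmation count are illustrative assumptions, not values any particular charger is known to use:

```python
from collections import deque

def should_terminate(readings_mv, window=8, drop_mv=4, confirmations=3):
    """Peak-hold -dV/dt check on a stream of per-cell voltage readings (mV).

    Averages the last `window` samples to suppress noise, tracks the highest
    averaged value seen so far, and only reports termination after the average
    has sat at least `drop_mv` below that peak for `confirmations` checks in a row.
    """
    peak_mv = 0.0
    below_count = 0
    buf = deque(maxlen=window)
    for mv in readings_mv:
        buf.append(mv)
        if len(buf) < window:
            continue                       # not enough samples to average yet
        avg = sum(buf) / window
        if avg > peak_mv:
            peak_mv = avg                  # new peak: reset the drop counter
            below_count = 0
        elif peak_mv - avg >= drop_mv:
            below_count += 1
            if below_count >= confirmations:
                return True                # -dV confirmed: stop charging
        else:
            below_count = 0
    return False
```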
 

SilverFox

Flashaholic
Joined
Jan 19, 2003
Messages
12,449
Location
Bellingham WA
Hello LMF5000,

I have a charger that pulses a 0.1C charge with no termination. This is designed to bring the cells of a lead acid, NiCd, or NiMH battery pack into balance without doing overcharging damage to the other cells.

I found it very interesting that I could observe the -dV/dt indication even at this very low rate of charge. The issue was that it occurred over several hours. I don't believe that current charging algorithms allow you to look back over 3 or 4 hours to make the charge termination decision.

Obviously, a faster charge rate would only have to look back over a much shorter period of time. I think this drives the recommendation to use a charging rate in the 0.5 - 1.0C range when using -dV/dt termination.

Tom
 

LMF5000

Newly Enlightened
Joined
Mar 10, 2011
Messages
84
Location
Malta
Hi HKJ. I'm honoured to be getting a reply from the author of lygte.info himself, I'm a huge fan of your charger and battery reviews :). What I was planning to do was to work out the differentiation of the voltage-time curve and actually plot dV/dt in the mathematical sense. This is simply the gradient at each point.

A numerical approximation can be calculated by working out the change in voltage and dividing by the change in time for the points before and after the point of interest. For example, if there are 100 data points on the voltage curve, dV/dt at, say, point 42 would be approximately equal to (v43-v41)/(t43-t41) (which strictly speaking is the gradient of a straight line drawn between points 41 and 43. It may or may not actually pass through point 42, but with enough points in the curve the inaccuracy would be minor).

The second derivative (d2V/dt2) would then simply be the same formula applied to the dV/dt values.
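As a rough sketch, that central-difference recipe looks something like this in Python (assuming the logged data is just two equal-length lists of time and voltage; the function name is mine):

```python
def central_differences(t, v):
    """Approximate dV/dt and d2V/dt2 from sampled voltage data.

    t and v are equal-length lists of time (s) and voltage (V). The derivative
    at point i uses the neighbours i-1 and i+1, as described above, so the
    end points have no value (left as None).
    """
    dv_dt = [None] * len(v)
    d2v_dt2 = [None] * len(v)
    for i in range(1, len(v) - 1):
        dv_dt[i] = (v[i + 1] - v[i - 1]) / (t[i + 1] - t[i - 1])
    for i in range(2, len(v) - 2):
        # same formula applied to the dV/dt values
        d2v_dt2[i] = (dv_dt[i + 1] - dv_dt[i - 1]) / (t[i + 1] - t[i - 1])
    return dv_dt, d2v_dt2
```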

Looking at the curve, I predict dV/dt would be close to zero for most of the charge, become positive nearing full charge, and fall negative after the peak. I suspect most chargers may not actually work on dV/dt; I think they simply look for a delta between peak voltage and current voltage and stop when it reaches a preset limit (like 4mV).

Could you share the raw data for one of the plots you linked to? Or maybe calculate the derivatives numerically and post an image of those superimposed with the voltage curve?
 

HKJ

Flashaholic
Joined
Mar 26, 2008
Messages
9,715
Location
Copenhagen, Denmark
Looking at the curve I predict dV/dt would be close to zero for most of the charge, becomes positive nearing full charge, and falls negative after the peak.

It will be positive during the charge, the voltage rises all the time, except at the end.


I suspect most chargers may not actually work on dV/dt, I think they simply look for a delta between peak voltage and current voltage and stop when it reaches a preset limit (like 4mV).

Correct.


Could you share the raw data for one of the plots you linked to? Or maybe calculate the derivatives numerically and post an image of those superimposed with the voltage curve?

You will have to send an email with a link to the curve you want the data for (email address is on my website).
 

LMF5000

Newly Enlightened
Joined
Mar 10, 2011
Messages
84
Location
Malta
Re Silverfox - Interesting theory. I have been cycling a set of new AAs in a battery box on my hobby charger at different charge rates and I can't say I really noticed a voltage drop after full charge at 0.1C. I did however notice an increase in temperature (about 7-10°C over ambient) for the entire overcharge portion of the 15-hour charge.

In contrast, charging at about 0.7C and watching a millivolt meter across one of the cells as it reached full charge, I saw the voltage distinctly stop rising and then fall as full charge was reached. The charger's default peak detection setting is 4mV/cell, and sure enough it terminated 2-3 minutes after the cell I was measuring fell from 1700mV to 1696mV. This was for a set of four new Vapex 2900mAh AAs charging at 1.5 amps on a Charsoon Antimatter 300W model. I was quite impressed actually, my other hobby chargers are very poor at terminating correctly on AA batteries.
 

LMF5000

Newly Enlightened
Joined
Mar 10, 2011
Messages
84
Location
Malta
It will be positive during the charge, the voltage rises all the time, except at the end.

Correct.

You will have to send an email with a link to the curve you want the data for (email address is on my website).

Thanks for the offer 😁. I will study the curves and send an email with the one or two I want over the coming days. Much appreciated!
 

Gauss163

Flashlight Enthusiast
Joined
Oct 20, 2013
Messages
1,604
Location
USA
It seems your goal is to understand how such charge algorithms are implemented in practice and for that probably your best bet is to peruse various patents, e.g. try this patent search for starters. You should find much of interest there. Anything further is usually proprietary so will not be publicly accessible.
 

LMF5000

Newly Enlightened
Joined
Mar 10, 2011
Messages
84
Location
Malta
Just wanted to update this thread with the status. Firstly thanks Gauss for the tip, I will read through some of those eventually to see how they implement the algorithms commercially. What I'm trying to do, I suppose, is derive a workable algorithm from first principles. To that end, here's the progress so far.

HKJ was kind enough to send me the data I requested. I've tried doing some preliminary tests, and a simple derivative fails because the charger is switching itself off every 30 seconds or so to get a voltage reading. The sudden collapse of the terminal voltage is obviously generating a large voltage gradient that drowns out the desired dV/dt data. So my efforts have so far shifted to smoothing the data and removing noise.

A small 30-second snippet is shown here.
[Image: 30-second snippet of the raw -dV/dt curve with the SMA, EMA, and "Smooth" overlays described below]


The dV/dt line is simply change in voltage divided by change in time for the points before and after the point of interest. You can see the huge jump from the charger switching off at around the 13-second point. The other lines are my attempts at smoothing using different techniques. I've tried a simple-moving-average ("SMA"), exponentially-weighted moving-average ("EMA"), and eventually found a very efficient formula for smoothing which calculates the next point by blending a fraction of the raw data for that point with a fraction of the latest smoothed value ("Smooth").

The formula is: new smoothed point = [fraction * previous smoothed point] + [(1-fraction) * raw data of new point ]

The sensitivity of the curve is adjusted by varying the fraction. With fraction = 0.99 the curve is very smooth (and hence slow to respond to changing data); with fraction = 0.01 the smoothed curve is practically identical to the raw data. In this context the raw data is the -dV/dt value based on raw voltage readings. The graph in the pic shows f=0.99. The EMA in the pic is averaging over 30 periods (30 seconds).

It probably would have made more sense to apply smoothing to the raw voltage and then work out dV/dt on smoothed voltage, but I have yet to try it in that order. I will continue the analysis (and work towards actually identifying charge termination point from dV/dt data) when time permits.
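For reference, a minimal sketch of that smoothing recursion in Python (f = 0.99 is just the value used in the plot above; applying it to the voltage before differencing, as mentioned, only means swapping the order of the two steps):

```python
def exponential_smooth(values, fraction=0.99):
    """Smoothing recursion from the post:
    new = fraction * previous_smoothed + (1 - fraction) * raw.
    A higher fraction gives a smoother but slower-responding curve."""
    smoothed = [values[0]]
    for raw in values[1:]:
        smoothed.append(fraction * smoothed[-1] + (1 - fraction) * raw)
    return smoothed
```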
 
Last edited:

HKJ

Flashaholic
Joined
Mar 26, 2008
Messages
9,715
Location
Copenhagen, Denmark
Why not make a filter that will split each curve into two data sets: one with current on and one with current off.
 

Kurt_Woloch

Enlightened
Joined
Nov 12, 2014
Messages
290
I would try to compress the curve so that the data of each 30-second period gets averaged into one data point. These should be fairly smooth then, I think, so then you could try further analysis on that compressed version of the curve.
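A rough sketch of that compression in Python (the 30-second bin width is the figure from the suggestion above; the list-based data layout is an assumption):

```python
def bin_average(t, v, bin_seconds=30.0):
    """Average all samples falling in each 30-second window into a single
    (time, voltage) point, so the charger's on/off switching within a window
    largely cancels out."""
    bins = {}
    for ti, vi in zip(t, v):
        bins.setdefault(int(ti // bin_seconds), []).append(vi)
    out_t, out_v = [], []
    for k in sorted(bins):
        out_t.append((k + 0.5) * bin_seconds)       # bin centre as the timestamp
        out_v.append(sum(bins[k]) / len(bins[k]))   # mean voltage in the bin
    return out_t, out_v
```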
 

hiuintahs

Flashlight Enthusiast
Joined
Sep 12, 2006
Messages
1,840
Location
Utah
I'm curious as to how -dV/dt detection may be implemented in practice. Specifically, what kind of algorithm would work best at detecting the peak.

To this end, I've been looking for a plot of dV/dt (the first derivative of voltage with respect to time) for a typical NiMH charge cycle. Unfortunately this doesn't seem to be available anywhere as typically, sites like batteryuniversity and lygte only give voltage vs time plots, not their derivatives..............
You don't need to know the actual slope after peak voltage, in my opinion.........just that you've reached peak voltage. With continued charging after the voltage drops a little off its peak, you get a battery temperature rise that you don't see prior to -dV/dt. The voltage initially decays a little from the peak and then pretty much stalls out below V peak, and that is when you see a significant rise in battery temperature that wasn't so pronounced in the charging phase before -dV/dt. The question in my mind is at what point after peak voltage you consider the battery fully charged. I'll tell you what I have done with my battery charging project.

But first: if you have a power supply with a settable current limit, then you can gather the data with your volt meter. You don't need to track the entire charge cycle as I think you're just interested in seeing what happens once the -dV/dt slope occurs. As the battery is being charged, the voltage pretty much ramps linearly, or at least it ramps upward. What you could do is watch the last 30 minutes or so of a charge cycle and take data yourself using a watch for time..........say take a voltage measurement every 15 seconds or as often as you'd like. But you will have to turn the power supply off momentarily to take the voltage measurement. The power supply will have to be set to something around 2 volts if charging at a 1 amp rate. This will only work if your power supply has a current limit setting. To catch the last 30 minutes, so you're not doing something tedious for hours, use a battery that is sitting in the upper 1.3x volts.........say something that was charged a couple of months ago.

What I have noticed with NiMH is that I could charge an AA battery at 1000mA and the battery only gets barely warm at the tail end of the charge cycle. That tells me that as long as the battery is still taking on charge, no significant heat is generated. But once the battery is fully charged and can't take on any more, it transforms that excess charge into heat. In the development of my charger, I overcharged a battery. From what I can remember, the voltage simply will not rise any further once -dV/dt is hit and it just gets warmer and warmer at a 1 amp charge rate............and eventually pretty hot. At a lower rate, it may not get as warm. I guess what I'm saying is that one could almost charge a battery by temperature alone and get fairly close. Definitely it's a good idea in the design of a charger to use temperature sensing and/or a timer in the event that a -dV/dt termination is missed.

I have yet to have a missed termination on my charger. This is my system:

The microcontroller that I am using to manage everything has a 10-bit A/D converter. I use a reference voltage of 4.452 volts and thus the resolution is 0.00435v/bit. I wanted a reference just over 4.20v (i.e. this charger also does lithium ion) so as to be able to measure as accurately as possible over the full range of batteries. 4.452V was chosen because it makes the software that has to convert a binary A/D value to a meter reading easier to do (but I won't elaborate on that here).

Anyhow, I was originally concerned that a 0.00435v resolution wouldn't be fine enough to pick up on the -dV following peak voltage. But so far that hasn't been a concern. With a 1 amp charge rate I was getting a 0.02 to 0.030 volt drop-off. I can't remember the exact amounts. With a lower charge rate like 250mA, it wasn't so pronounced but was still higher than 0.00435v.
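(For reference, that resolution is just the reference voltage divided by the 10-bit range: 4.452 V / 1024 ≈ 0.00435 V per count, so the 0.02-0.03 V drop mentioned above corresponds to roughly 5-7 A/D counts, which is why it is comfortably detectable.)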

This is my charge process:

I don't pulse charge with a high current. Just a constant current. I charge for 500ms and then turn off charging and take a voltage measurement 10ms later. That allows for everything to have settled. After the voltage measurement is taken, I turn the charger right back on. (10/500 = 2% time lost in the charging process to measurement.......so basically an insignificant time loss). Those numbers could be changed but they work fine and I haven't bothered to change them. I sample that often to keep the displayed voltage and current up to date.

But I only compare or check for a -dV event once every 30 seconds. That's where I log into memory if a new sampling is higher than the previous one (30 seconds ago) and also where I check to see if the present sampling has dropped below the previous one. There just needs to be enough time, while the voltage is rising, to overcome any errors associated with noise or my A/D's 0.00435v resolution. So what is happening during the 30 seconds is that new readings are coming in every 0.5s but are only compared to a stored value every 30 seconds.

Once a -dV event has been noted, I continue charging for another 5 minutes and then call it good. This is the time frame where the battery actually takes on some warmth but never hot even at the 1 amp rate.
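As a rough Python sketch of the loop described above (the hardware hooks are placeholders I've invented for illustration; the 500 ms / 10 ms / 30 s / 5 min numbers are the ones given in the post):

```python
import time

CHARGE_MS, MEASURE_DELAY_MS = 500, 10   # on-time and settling time from the post
COMPARE_PERIOD_S = 30                   # compare interval from the post
TOPOFF_S = 5 * 60                       # extra charge after the -dV event

def charge_nimh(set_current_on, set_current_off, read_cell_voltage):
    """Hardware hooks are passed in as placeholders; only the logic is shown."""
    last_compared_v = 0.0
    next_compare = time.monotonic() + COMPARE_PERIOD_S
    while True:
        set_current_on()
        time.sleep(CHARGE_MS / 1000)
        set_current_off()
        time.sleep(MEASURE_DELAY_MS / 1000)      # let the voltage settle
        v = read_cell_voltage()
        if time.monotonic() >= next_compare:
            if v < last_compared_v:              # -dV event: cell has peaked
                break
            last_compared_v = max(last_compared_v, v)
            next_compare += COMPARE_PERIOD_S
    set_current_on()                             # 5-minute top-off, then done
    time.sleep(TOPOFF_S)
    set_current_off()
```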

I like to credit HKJ for his suggestions and his report.
http://lygte-info.dk/info/batteryChargingNiMH UK.html
That way I was able to see the charging charts and the various methods different companies went about charging an NiMH battery.
 
Last edited:

LMF5000

Newly Enlightened
Joined
Mar 10, 2011
Messages
84
Location
Malta
Why not make a filter that will split each curve into two data sets: one with current on and one with current off.

That was the first thing I tried - setting a filter to disregard data when the current reading in the row is below a threshold value. It wasn't sufficient, because the voltage slope (dV/dt) in the preceding and trailing data points is still many orders of magnitude greater than the dV/dt value elsewhere. I can refine the filter (although I'd have to figure out a way to exclude rows adjacent to the ones that meet the filter condition, and I don't know how to do that just yet), but then I thought I'd try and find a way that would work even with the fluctuation induced by the charger switching on and off.
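One possible way to do that adjacent-row exclusion, as a sketch in Python (the 50 mA threshold and the one-sample guard band on each side are assumptions):

```python
def mask_charging_points(current_ma, threshold_ma=50, guard=1):
    """Return a keep/discard flag for each sample: True only when charge
    current is flowing AND the sample is not within `guard` samples of an
    off-period, so the voltage-collapse and recovery edges are excluded."""
    on = [c >= threshold_ma for c in current_ma]
    keep = list(on)
    for i, is_on in enumerate(on):
        if not is_on:
            for j in range(max(0, i - guard), min(len(on), i + guard + 1)):
                keep[j] = False
    return keep
```

The kept points could then be fed into the central-difference calculation from earlier, skipping across the gaps.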

I confess I haven't had time to look into it these last two weeks, and CPF doesn't seem to be sending me email notifications so I only just noticed now that there were replies to my posts.

In the next couple weeks I will re-examine the approach of filtering out on vs off points, and some of the other suggestions raised.
 

LMF5000

Newly Enlightened
Joined
Mar 10, 2011
Messages
84
Location
Malta
@hiuintahs

Interesting idea. So let me see if I understood your algorithm. I've paraphrased it in steps below:
1. Pulse-charging for 500ms with 10ms off time (98% duty cycle).
2. Store voltage in memory
3. Every 30 seconds, check if current voltage is greater than stored voltage.
4a. If yes, store current voltage in memory (as new "max voltage")
4b. If no, assume -dV has occurred
5. Continue charge for 5 minutes
6. Terminate

Could you clarify whether you're averaging the voltage during those 30-second periods, or using one instantaneous voltage reading that happened to fall on the 30s mark?

Also, as an enhancement, perhaps you could tweak the 5-minute period to make it proportional to the charge rate? So instead of a fixed 5 minutes regardless of current, perhaps you could program it to add only a fixed 50mAh or 100mAh... or maybe look at the total charge time multiplied by the current level (to guesstimate battery capacity) and add 1% or 3% of that value? That way it wouldn't cook a battery too much when you charge at higher current.
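For illustration, the fixed-mAh variant of that suggestion is just charge divided by current (the 50 mAh figure is only the example from the post):

```python
def topoff_seconds(charge_ma, topoff_mah=50):
    """Time needed to deliver a fixed top-off charge at a given current:
    50 mAh at 1000 mA -> 180 s (3 min); at 250 mA -> 720 s (12 min)."""
    return topoff_mah / charge_ma * 3600
```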
 

hiuintahs

Flashlight Enthusiast
Joined
Sep 12, 2006
Messages
1,840
Location
Utah
I guess you can call it pulse charging, but mostly it's just a constant charge current with the removal just long enough to measure battery voltage. Typically when I think of pulse charging it's with a very high current, and thus with its associated duty cycle you end up with an average current. So I'm not really doing it that way.

Yes, your summary is accurate. I use a 30 second interval to update to a new max stored voltage because that allows a significant enough change to take place that is measurable with the 10-bit A/D converter that I'm using. When the battery hasn't reached peak voltage, the voltage is pretty much just on a gradual ramp upwards. I don't worry about averaging during this time frame because not much voltage change happens in 30 seconds, so 30 seconds isn't excessive even if peak voltage happens early in that 30 second interval and I don't actually detect it until say 25 seconds later. Continuing to charge another 5 minutes after -dV/dt is detected only adds a little bit of heat at the 1 amp level and not much at all at the 500mA or 250mA level.

I looked over some of the charts that HKJ produced on the various charge methods and I just don't see the need to keep charging past -dV/dt beyond say 5 minutes. I noticed many do various things afterwards. I wanted to keep charging past -dV/dt just to make sure the battery got fully charged. And the way that I can tell that is happening is that after you reach -dV/dt, that is when the battery starts to warm up. Charge current must get converted to heat past -dV/dt, because during the entire charge cycle the battery doesn't warm up; it seems to only increase in warmth past -dV/dt. The 5 minutes that I picked is basically an arbitrary number. If the battery got hot during that time, I'd back it down or drop the current level. It's just getting a little warm after 5 minutes, and with the 500mA charge rate you really don't notice much heat increase at all. So I think the 5 minutes is OK. Charging longer just seems like a waste of time to me if the battery is only getting warmer, because that tells me that the battery is basically fully charged and the excess charge is just being converted to heat.

I noticed that I have to put well over a couple of tenths of a volt onto the battery just to force current into it. That's why I have to cut the charge periodically so as to measure the actual battery voltage. Since this charger also does lithium ion (constant current followed by constant voltage), I can't allow the control to go into a constant-voltage mode at an NiMH voltage level, as that would cause the current to drop. I wanted constant current right up until you detect -dV/dt. Since you don't use a voltage level (like 4.20v on lithium ion), I don't think it makes sense to allow the current to roll back before -dV/dt detection. This is just a bit of trivia for those interested.

The circuit uses a buck regulator that drops the voltage down to the right level so as to maintain a constant current. The feedback is both current and voltage. Whichever one is the highest rules as the feedback signal to the buck regulator (an op-amp diode-OR configuration). I've set the feedback such that 4.20v is the max voltage and whatever charge current level is selected (250mA, 500mA, or 1A) acts as the feedback mechanism. So when a lithium ion battery is in need of charging, it doesn't take 4.20v to create 1 amp of charge current. But as soon as the voltage hits 4.20v, then the current feedback kicks in as the regulating feedback mechanism.

Anyhow, I just threw in a little heads-up as to how it works. Thus with NiMH, I set the voltage limit at 2.00 volts. That seems to be high enough not to limit a 1 amp charge rate even as the battery approaches full charge. Then I'm relying on the current feedback mechanism to hold a constant current into the battery, with software control detecting -dV/dt to keep the system from trying to charge that battery up to 2.00v. There is also an emergency time-out and a 10K NTC thermistor to detect excessive heat indicative of a missed termination. So far that hasn't happened and -dV/dt detection has been very reliable.

It's been a fun project. There's a variety of ways of charging a battery. This is how I do it and it mostly has to do with accommodating the lithium ion charge circuit with NiMH.
 
Last edited:

hals

Newly Enlightened
Joined
Jul 6, 2023
Messages
1
Location
Sunnyvale, CA
Here's some NiMH charging dV/dt data, in a spreadsheet:

[spreadsheet attachment]

and the control code, using a triple output programmable power supply for charge, discharge, and 4-wire (Kelvin contact) voltage measurement:
https://github.com/halsampson/ChargeNiMH

I often see significant improvement in capacity after the first deep cycle of long-stored cells, but little improvement with subsequent cycles. Optimizing discharge currents, charge termination, and anything else might help. The ISR vs. SOC (Internal Series Resistance vs. State Of Charge) curve often gets smoother (as crystallization sizes decrease?). Cells with very high ISR show little improvement with any cycling, perhaps due to anode/cathode oxidation.
 

turbodog

Flashaholic
Joined
Jun 23, 2003
Messages
6,425
Location
central time
This is an old thread, but I can tell you from charging NiMH (by hand, using a multimeter) that the peak is slight with them.

By the time the cells heat up, you are into overcharge territory with a depressed cell voltage.

With the main complaint being low voltage... I'd make SURE to not overcharge my cells.

NiMH charging is actually endothermic, which is why the cells stay cool until the end.
 