I'm curious as to how -dV/dt detection may be implemented in practice. Specifically, what kind of algorithm would work best at detecting the peak.
To this end, I've been looking for a plot of dV/dt (the first derivative of voltage with respect to time) for a typical NiMH charge cycle. Unfortunately, this doesn't seem to be available anywhere: sites like Battery University and lygte-info typically give voltage-vs-time plots, not their derivatives.
In my opinion you don't need to know the actual slope after peak voltage, just that you've reached the peak. With continued charging after the voltage drops a little off its peak, the battery temperature rises in a way you don't see prior to -dV/dt. The voltage initially decays a little from the peak and then pretty much stalls out just below V peak, and that is when you see a significant rise in battery temperature that wasn't so pronounced earlier in the charge. The question in my mind is at what point after peak voltage you consider the battery fully charged. I'll tell you what I have done with my battery charging project.
But first: if you have a power supply with a settable current limit, you can gather the data yourself with your voltmeter. You don't need to track the entire charge cycle; I think you're mainly interested in what happens once the -dV/dt slope occurs. As the battery is being charged, the voltage ramps more or less linearly, or at least it ramps upward. What you could do is watch the last 30 minutes or so of a charge cycle and take the data yourself, using a watch for timing: say a voltage measurement every 15 seconds, or as often as you'd like. You will have to turn the power supply off momentarily to take each voltage measurement. The supply will have to be set to something around 2 volts if charging at a 1 amp rate, and this only works if your supply has a current-limit setting. To catch just the last 30 minutes, so you're not doing something tedious for hours, start with a battery that is already sitting in the upper 1.3x volts, say one that was charged a couple of months ago.
What I have noticed with NiMH is that I can charge an AA battery at 1000 mA and the battery gets only barely warm at the tail end of the charge cycle. That tells me that as long as the battery is still taking on charge, no significant heat is generated; once the battery is fully charged and can't take on any more, it converts the excess charge into heat. During development of my charger I overcharged a battery. From what I can remember, the voltage simply will not rise any further once -dV/dt is hit, and at a 1 amp charge rate the cell just gets warmer and warmer, eventually pretty hot. At a lower rate it may not get as warm. I guess what I'm saying is that one could almost terminate a charge on temperature alone and get fairly close. It's definitely a good idea in a charger design to use temperature sensing and/or a timer in case a -dV/dt termination is missed.
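A temperature/timer backup like the one suggested above could be sketched as below. The function name, the 1.0 C-per-minute rise threshold, and the 150-minute timeout are my own illustrative assumptions, not values from the post; real chargers pick these to suit the cell and charge rate.

```c
/* Hypothetical backup termination check, run once per minute:
   trip if the cell temperature climbed faster than a dT/dt limit
   since the last check, or if an absolute charge timer expires.
   Temperatures are in tenths of a degree C (integer-friendly). */

#define MAX_CHARGE_SECONDS (150U * 60U)  /* assumed backup timer    */
#define DT_LIMIT_DECI_C    10U           /* assumed 1.0 C/min limit */

int backup_should_terminate(unsigned int prev_deci_c,
                            unsigned int now_deci_c,
                            unsigned int elapsed_seconds)
{
    if (elapsed_seconds >= MAX_CHARGE_SECONDS)
        return 1;                        /* timer backup tripped    */
    if (now_deci_c > prev_deci_c &&
        now_deci_c - prev_deci_c >= DT_LIMIT_DECI_C)
        return 1;                        /* dT/dt backup tripped    */
    return 0;                            /* keep charging           */
}
```

This only backs up the primary -dV check; it should never be the first line of termination at a 1 amp rate, since the cell is already heating by the time it trips.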
I have yet to have a missed termination on my charger. This is my system:
The microcontroller that I am using to manage everything has a 10-bit A/D converter. I use a reference voltage of 4.452 volts, so the resolution is 4.452 V / 1024 = 0.00435 V per bit. I wanted the reference just above 4.20 V (this charger also does lithium-ion) so as to measure the full range of batteries as accurately as possible. 4.452 V was chosen because it makes the software that converts a binary A/D value into a meter reading easier to write (but I won't elaborate on that here).
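The count-to-millivolts conversion implied by those numbers could look like this; the function name is mine, and the integer math with a rounding term is just one convenient way to do it on a small micro.

```c
/* Convert a 10-bit ADC count to millivolts.
   Vref = 4.452 V over 1024 steps gives the ~0.00435 V/bit
   resolution quoted above. Pure integer math: the half-step
   added before dividing rounds to the nearest millivolt. */
#define VREF_MV   4452UL  /* reference voltage in millivolts */
#define ADC_STEPS 1024UL  /* 10-bit converter                */

unsigned long adc_to_mv(unsigned int count)
{
    return (count * VREF_MV + ADC_STEPS / 2) / ADC_STEPS;
}
```

For example, a mid-scale count of 512 maps to 2226 mV, and one LSB is about 4 mV, which is what matters for resolving the post-peak voltage drop discussed next.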
Anyhow, I was originally concerned that 0.00435 V resolution wouldn't be fine enough to pick up the voltage drop following the peak, but so far that hasn't been a problem. At a 1 amp charge rate I was seeing a 0.02 to 0.03 volt drop-off (I can't remember the exact amounts). At a lower charge rate like 250 mA it wasn't as pronounced, but it was still larger than one 0.00435 V step.
This is my charge process:
I don't pulse charge with a high current, just a constant current. I charge for 500 ms, turn charging off, and take a voltage measurement 10 ms later; that allows everything to settle. After the measurement I turn the charger right back on (10/500 = 2% of the time lost to measurement, so basically insignificant). Those numbers could be changed, but they work fine and I haven't bothered to change them. I sample that often mainly to keep the displayed voltage and current up to date.
But I only check for a -dV event once every 30 seconds. That's where I log the new sample into memory if it is higher than the previous one (from 30 seconds ago), and also where I check whether the present sample has dropped below the previous one. There just needs to be enough time, while the voltage is rising, to overcome any errors from noise or my A/D's 0.00435 V resolution. So during each 30-second window, new readings come in every 0.5 s but are only compared against the logged value once.
Once a -dV event has been noted, I continue charging for another 5 minutes and then call it good. This is the time frame where the battery actually takes on some warmth but never hot even at the 1 amp rate.
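The sample-every-0.5-s, compare-every-30-s, then top-off-for-5-minutes logic described above could be sketched as a small state machine like this. All names here are my own, and I've left out the actual ADC read and charger on/off switching; this only shows the termination decision fed one voltage sample (in mV) per 0.5 s tick.

```c
/* Sketch of the -dV termination logic: of the samples arriving
   every 500 ms, only the one at each 30 s boundary is compared
   against the value logged 30 s earlier. Higher -> log it;
   lower -> -dV event -> charge 5 more minutes, then done. */

#define SAMPLES_PER_WINDOW 60U   /* 30 s  / 0.5 s per sample */
#define TOPOFF_SAMPLES     600U  /* 5 min / 0.5 s per sample */

enum charge_state { CHARGING, TOPOFF, DONE };

struct nimh_detector {
    enum charge_state state;
    unsigned int tick;         /* samples in current 30 s window */
    unsigned int logged_mv;    /* value logged 30 s ago          */
    unsigned int topoff_left;  /* samples remaining after -dV    */
};

void detector_init(struct nimh_detector *d, unsigned int first_mv)
{
    d->state = CHARGING;
    d->tick = 0;
    d->logged_mv = first_mv;
    d->topoff_left = TOPOFF_SAMPLES;
}

/* Feed one 0.5 s voltage sample; returns the current state. */
enum charge_state detector_step(struct nimh_detector *d, unsigned int mv)
{
    switch (d->state) {
    case CHARGING:
        if (++d->tick >= SAMPLES_PER_WINDOW) {
            d->tick = 0;
            if (mv > d->logged_mv)
                d->logged_mv = mv;   /* rising: log the new high  */
            else if (mv < d->logged_mv)
                d->state = TOPOFF;   /* -dV event: voltage fell   */
        }
        break;
    case TOPOFF:
        if (--d->topoff_left == 0)
            d->state = DONE;         /* 5-minute top-off finished */
        break;
    case DONE:
        break;
    }
    return d->state;
}
```

Note that a reading equal to the logged value neither logs nor terminates, which is what gives the rising voltage time to outrun the one-LSB resolution and noise; a backup timer and temperature check (as mentioned earlier) would sit alongside this in a real charger.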
I'd like to credit HKJ for his suggestions and his report:
http://lygte-info.dk/info/batteryChargingNiMH%20UK.html
That way I was able to see the charging charts and the various methods different companies use to charge NiMH batteries.