Some details on Sanyo's "Peak-cut control" termination scheme

PeAK

Enlightened
Joined
Jan 30, 2009
Messages
238
Sanyo has often mentioned their innovative scheme to determine when a battery is fully charged and to stop the charge. They are short on details, but a news release describing their universal charger capable of handling "C" and "D" sized eneloops reveals that the scheme may be very similar to Maha's (C9000) scheme of using a set voltage to stop charging, as speculated by one of our Candlepower members, NiOOH:

Had the time for some more testing on the MQH02.
.
.
.
Charge completeness is also good, equal to what C9000 + 2 hours top-off does at 1 amp.
IMO, the charger does not terminate on -dV. Sanyo talks about "peak-cut control" with -dV and absolute temperature as a backup. I think this peak cut is not a real 0dV (or peak voltage detection) but rather a max V cutoff. In other words, it works similarly to the Maha C9000. Most probably, however, the value is set higher than on the C9000 and a top-off is not necessary. After the fast charge the charger shuts off completely, i.e. there is no trickle charge (not a bad thing with LSD cells).
.
.
.



Some Details from Sanyo:
- Adopts SANYO's time-proven rapid charger 'peak-cut method' to control charging
- System allows the voltage change of each battery to be individually monitored, halting charging as peak voltage is detected (when fully charged)
- Designed to prevent batteries from being overcharged or damaged by charging

Equipped with an overheating protection function for safe and reliable use
- To protect against overheating, an 'Overheating prevention function*5' with a built-in current protection device (a Positive Temperature Coefficient (PTC) device*6) has been adopted
- New function designed to protect against abnormal heating and electrolyte leakage


Red lamp: Charging (Charge is less than 50%)
Blue lamp: Charging (Charge is greater than 50%)
No lamp (blue lamp turns off): Fully charged

The "D" size eneloop battery charges in 8.5 hours with a capacity of 5700mA-hr. This would correspond to a current of about 670mA. If they charged at the minimum 0.5C level recommended for reliable negative delta-V termination, the current would have to exceed 2.8 amps.

If it seems so obvious, why has this scheme not been used before? And if it works, it would mean that someone could just hook up a variable power supply set to the appropriate voltage to charge batteries. The answer, I think, lies in the fact that batteries have very low resistance, so small changes in voltage can result in large changes in current. This new scheme is actually a combination of ideas: a current source (which puts out whatever voltage is needed to drive a fixed current) combined with a voltage limit. In their later charger catalogue, they actually specify that the chargers use negative delta-V in conjunction with Peak Voltage Control (PDF). Great minds at Sanyo and Maha seem to think alike.
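To illustrate how those two ideas could be combined in a single charge loop, here is a rough sketch. This is not Sanyo's actual algorithm; the 1.50 V limit, the 5 mV -dV threshold, the sample period and the hardware hook functions are all made up for the example.

#include <stdbool.h>
#include <stdint.h>

/* Hypothetical hardware hooks -- not a real charger API. */
extern void     set_charge_current_mA(uint16_t mA);  /* constant-current source    */
extern uint16_t read_cell_voltage_mV(void);          /* per-cell voltage sense     */
extern void     sleep_seconds(uint16_t s);

#define PEAK_CUT_MV        1500   /* absolute "peak voltage" cutoff (assumed)      */
#define MINUS_DV_MV           5   /* -dV backup threshold (assumed)                */
#define SAMPLE_PERIOD_S      30   /* time between voltage samples (assumed)        */

void charge_cell(uint16_t charge_mA)
{
    uint16_t highest_mV = 0;
    bool     done = false;

    set_charge_current_mA(charge_mA);           /* fixed current, like a current source  */

    while (!done) {
        sleep_seconds(SAMPLE_PERIOD_S);
        uint16_t v = read_cell_voltage_mV();

        if (v > highest_mV)
            highest_mV = v;

        if (v >= PEAK_CUT_MV)                   /* "peak-cut": absolute voltage limit    */
            done = true;
        else if (highest_mV - v >= MINUS_DV_MV) /* -dV backup: voltage fell off its peak */
            done = true;
    }

    set_charge_current_mA(0);                   /* stop completely, no trickle           */
}

The current source supplies whatever voltage it needs to hold the set current, so the voltage limit and the -dV test only act as termination conditions, not as the thing regulating the charge.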


It seems as if the use of PTC material is yet another step by Sanyo to extend their leadership role in the area of mobile power, and it is consistent with the "cut-off" scheme. I'm not sure how the material will affect the conventional negative delta-V termination scheme, since the PTC's rising resistance will add to the measured voltage as the batteries get hot.
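To put a rough number on why that matters for -dV detection: if the PTC is in series with the cell and the voltage is sensed on the charger side of it, the charger sees the cell voltage plus the I*R drop across the PTC, and that resistance rises with temperature. A toy calculation (every value here is invented, purely to show the effect):

#include <stdio.h>

int main(void)
{
    double i_amps     = 0.67;      /* charge current from the example above          */
    double v_cell     = 1.47;      /* cell voltage just past its peak                */
    double r_ptc_cold = 0.010;     /* PTC resistance when cool (invented)            */
    double r_ptc_warm = 0.060;     /* PTC resistance after the cell heats up         */

    /* Voltage actually measured at the charger terminals. */
    double v_seen_cold = v_cell + i_amps * r_ptc_cold;
    double v_seen_warm = (v_cell - 0.005) + i_amps * r_ptc_warm;  /* cell drops 5 mV */

    printf("Measured cold: %.4f V\n", v_seen_cold);   /* ~1.4767 V                   */
    printf("Measured warm: %.4f V\n", v_seen_warm);   /* ~1.5052 V: the -dV is masked */
    return 0;
}

In other words, with these made-up numbers the measured voltage keeps rising even while the cell itself has started to droop, which would hide the -dV signal.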

More information in Sanyo's PDF

PeAK
 

Mr Happy

Flashlight Enthusiast
Joined
Nov 21, 2007
Messages
5,390
Location
Southern California
If it seems so obvious, why has this scheme not been used before? And if it works, it would mean that someone could just hook up a variable power supply set to the appropriate voltage to charge batteries. The answer, I think, lies in the fact that batteries have very low resistance, so small changes in voltage can result in large changes in current. This new scheme is actually a combination of ideas: a current source (which puts out whatever voltage is needed to drive a fixed current) combined with a voltage limit. In their later charger catalogue, they actually specify that the chargers use negative delta-V in conjunction with Peak Voltage Control (PDF). Great minds at Sanyo and Maha seem to think alike.
The drawback is that peak voltage is not consistent from one variety of NiMH cell to another. For instance, eneloops easily reach 1.50 V at the end of a 0.5C charge, whereas I have other old-technology cells that have trouble reaching 1.46 V and do not activate the MH-C9000's 1.47 V cutoff. (Eneloops, in contrast, end up undercharged if you cut them off at 1.47 V as the C9000 does.)

If you can sell a charger and say "use this to charge eneloops and only eneloops" then you have a lot more scope to design a good and reliable charge termination algorithm.
 

Mr Happy

Flashlight Enthusiast
Joined
Nov 21, 2007
Messages
5,390
Location
Southern California
See this thread, and posts #15 and #23.

With a "crap" cell the peak charging voltage flattened out at 1.43 V:
[graph: chicagoelectric.png]



But with an eneloop it turned skywards and rocketed up to 1.54 V:
[graph: eneloop2rf2.png]
 

PeAK

Enlightened
Joined
Jan 30, 2009
Messages
238
This seems to argue in favour of using eneloop batteries with Sanyo chargers that use "Peak Voltage control" as a backup to the negative delta-V scheme. The problem with the cheapo batteries is that their max voltage of 1.43 V may never allow the "Peak Voltage control" (max V) detection to kick in... not so for the eneloop.
 

uk_caver

Flashlight Enthusiast
Joined
Feb 9, 2007
Messages
1,408
Location
Central UK
Though it may well be an absolute voltage cutoff, in the absence of any other information, my first thought would have been that
"halting charging as peak voltage is detected"
would suggest a 0dV cutoff.

At first glance it would seem odd to have 0dV and -dV in the same charger, but not only is it possible, it isn't a bad idea for a charger that has to cope with a wide variation in the capacity of the cells being charged.

When making a homemade charger that had to terminate reliably at typical charge rates of 0.1 to 0.15C, where temperature sensing wasn't an option, I ended up having multiple termination conditions, which included a -dV which kicked in fairly quickly on a drop over a relatively short timescale, and a 0dV which looked at readings over a longer period.

That was mainly to avoid the risk of premature termination in the case of either noise or a relatively slowly-rising signal.

Though the charger was made mainly for charging 3-cell NiMH packs of ~3.5-4.5Ah, it will charge 7Ah 2+3 cell NiCd packs, and 1,2,3 cell AA/AAA packs, and seems to terminate fine on all of those targets.

On the larger capacity cells, I think it's more likely to terminate on 0dV, but the -dV certainly does kick in on AAAs (where it's close to a 1C charge).
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
Though it may well be an absolute voltage cutoff, in the absence of any other information, my first thought would have been that
"halting charging as peak voltage is detected"
would suggest a 0dV cutoff.

I agree here. Charge termination based solely on some set peak voltage, as some have speculated, would be flaky at best due to differences in cells. The best way to terminate charge is using dT/dt, but that's not always practical. The second best way is to stop charging when the voltage stops increasing. If you wait until voltage begins to decrease, then you need to accept a certain amount of overcharge (and possible overheating if your -dV/dt circuit isn't sensitive enough). So in the end the best way is to measure voltage change over some time interval, and when the voltage change is zero either stop charging, or go to a low top-off charge.
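As a rough sketch of that last idea (not any particular charger's code; the 2-minute window, the 1 mV threshold and the hardware hooks are all assumed values):

#include <stdint.h>

extern uint16_t read_cell_voltage_mV(void);   /* hypothetical voltage sense hook */
extern void     sleep_seconds(uint16_t s);

#define WINDOW_S        120   /* compare readings 2 minutes apart (assumed)      */
#define ZERO_DV_MV        1   /* "no rise" means less than 1 mV gained (assumed) */

/* Returns when a 0dV condition is seen: the voltage has stopped climbing. */
void wait_for_zero_dv(void)
{
    uint16_t previous = read_cell_voltage_mV();

    for (;;) {
        sleep_seconds(WINDOW_S);
        uint16_t current = read_cell_voltage_mV();

        /* Signed change over the window; negative means -dV already happened. */
        int16_t delta = (int16_t)(current - previous);
        if (delta < ZERO_DV_MV)
            return;            /* flat (or falling): terminate or drop to top-off */

        previous = current;
    }
}

A real charger would also need to guard against reading a false plateau earlier in the charge, which is exactly the difficulty discussed further down in this thread.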
 

Mr Happy

Flashlight Enthusiast
Joined
Nov 21, 2007
Messages
5,390
Location
Southern California
So in the end the best way is to measure voltage change over some time interval, and when the voltage change is zero either stop charging, or go to a low top-off charge.
One thing I don't often see mentioned is to normalize the dV signal with charge rate, especially on multi-rate chargers. In other words, don't measure dV/dt but dV/dC, where C is supplied charge.
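For example, something like this (only a sketch; the units, function name and integer scaling are mine):

#include <stdint.h>

/*
 * Normalise the voltage change by the charge supplied (dV/dC) rather than by
 * time (dV/dt), so that whatever plateau/peak thresholds the charger applies
 * no longer depend on the charge rate.  Returns microvolts per mAh supplied.
 */
int32_t dv_per_dc_uV_per_mAh(int32_t dv_mV, uint16_t current_mA, uint16_t interval_s)
{
    /* Charge supplied between the two samples, in mA*seconds (dC). */
    int32_t charge_mAs = (int32_t)current_mA * interval_s;

    /* dV[mV] * 1000 -> uV;  charge[mA*s] / 3600 -> mAh;  combined scale factor: */
    return (dv_mV * 3600000L) / charge_mAs;
}

As a sanity check, a 25 mV rise over 800 mAh works out to about 31 µV/mAh whatever the charge current happened to be, which (to a first approximation) is the point of normalising this way.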
 

PeAK

Enlightened
Joined
Jan 30, 2009
Messages
238
Though it may well be an absolute voltage cutoff, in the absence of any other information, my first thought would have been that
"halting charging as peak voltage is detected"
would suggest a 0dV cutoff.


...I ended up having multiple termination conditions... a 0dV which looked at readings over a longer period.
.
.
.

Thanks for your comments.

From the graph of the eneloop battery posted by Mr. Happy you can see the voltage plotted against the estimated charge (in mAh). Since the charge current is 1600 mA, you can divide the scale by 1600 to convert the numbers to hours, as in the graph below:

[graph: 2d85fdc.png]

Between the x-axis labels 700 mAh and 1500 mAh is a charge increment of 800 mAh. The corresponding time is half an hour, or 1800 seconds. The rise in voltage is about 25 mV between these two charge/time points. The effective rate of change in voltage is 25 mV / 1800 s = 0.014 mV/s at a current of 1.6 A. The battery is doing a good job of keeping the voltage steady, changing only by about 0.1 mV every six seconds (or roughly 1 mV per minute). This is a pretty small amount of voltage to detect and could easily be misinterpreted as "zero slope" at a point where the battery is only half charged.
What time frame and voltage difference did you use for your "detection point"?
How much did you need to play with these parameters to get a dependable termination?


thanks in advance,
PeAK
 

Mr Happy

Flashlight Enthusiast
Joined
Nov 21, 2007
Messages
5,390
Location
Southern California
Between the x-axis labels 700 mAh and 1500 mAh is a charge increment of 800 mAh. The corresponding time is half an hour, or 1800 seconds. The rise in voltage is about 25 mV between these two charge/time points. The effective rate of change in voltage is 25 mV / 1800 s = 0.014 mV/s at a current of 1.6 A. The battery is doing a good job of keeping the voltage steady, changing only by about 0.1 mV every six seconds (or roughly 1 mV per minute). This is a pretty small amount of voltage to detect and could easily be misinterpreted as "zero slope" at a point where the battery is only half charged.
The situation is even worse with the cheap cell (see post #3 above, reformatted). How is a poor charger to know when that cell is fully charged? It can't even look at dT/dt, because that is not showing a significant uptick at the end either. (As a point of fact, the MH-C9000 actually did terminate at the last data point on the graph, so it is not as dumb as it might sometimes appear to be.)
 

uk_caver

Flashlight Enthusiast
Joined
Feb 9, 2007
Messages
1,408
Location
Central UK
Thanks for your comments.

The effective rate of change in voltage is 25 mV / 1800 s = 0.014 mV/s at a current of 1.6 A. The battery is doing a good job of keeping the voltage steady, changing only by about 0.1 mV every six seconds (or roughly 1 mV per minute). This is a pretty small amount of voltage to detect and could easily be misinterpreted as "zero slope" at a point where the battery is only half charged.

What time frame and voltage difference did you use for your "detection point"?
How much did you need to play with these parameters to get a dependable termination?
My charger uses [cheap] buck driver modules running at ~750mA for charging.
The microcontroller has a 10-bit A/D, with a resistor divider on the input to give a full range of ~0-7.5V, or 7.5mV/bit resolution.

The charging algorithm has a cycle time of ~2.6 seconds.
Charging happens for 2500ms, then a voltage reading is taken at the end of a 100ms waiting period, then there's a 50ms further wait before recommencing charge, to allow any other channels to do their measurement (channels synchronise their cycles to cut down on possible noise sources).
I total up 64 voltage readings into a 'full' voltage reading, to help reduce noise, so one 'full' reading takes place over something like 2.8 minutes.

Once all 64 measurements have been made, I calculate the deltas (differences with the previous full reading), and I keep a record of the last five deltas.
I also keep a record of the highest full reading for a channel, and how many readings ago it happened.

(The main reason for keeping the deltas rather than the absolute readings was space - I didn't have enough space in the chip I used for the single-channel charger to keep records of both absolute readings and deltas, and it was also possible to clip the delta to fit it into a signed 2-byte number without losing any real information, which allowed more readings to be averaged. Having the deltas stored also made checking what was happening while debugging easier.)

I started off just doing terminations on various -deltaV conditions, partly to see which actually caused terminations in practice. The conditions (in order of testing) are:

a) Terminate charge if the sum of the deltas (ie the change over the last 5x2.8 = 14 minutes) is less than -0.5*the sample count.

b) Terminate charge if the latest reading is more than 'MAXVOLTDROP' units lower than the highest recorded reading.
('MAXVOLTDROP' is set at 1.2 * the sample count (64).)

c) Terminate charge if the maximum took place more than 8 readings ago (which is close to being a weak 0dV condition).

When modifying the charger to cope with larger NiCd cells, I added another condition:

d) Terminate charge if the sum of the deltas (ie the change over the last 5x2.8 = 14 minutes) is less than +0.5*the sample count.

Condition d) effectively supersedes condition a), but I left both in place to see what happens.
Much of the time, condition a) still triggers; had it not been there, d) would have fired anyway on the same cycle, but it's still interesting to know when a -dV would actually have happened.
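In rough code form, the four conditions above could be expressed something like this (a simplified sketch following the description, not the actual firmware; the structure and names are guessed):

#include <stdbool.h>
#include <stdint.h>

#define SAMPLE_COUNT     64                      /* readings summed into one "full" reading */
#define NUM_DELTAS        5                      /* ~14 minutes of history                  */
#define MAXVOLTDROP     (12 * SAMPLE_COUNT / 10) /* 1.2 * the sample count                  */
#define MAX_AGE           8                      /* readings allowed since the highest value */

typedef struct {
    int16_t  deltas[NUM_DELTAS];   /* change between consecutive full readings  */
    uint32_t latest;               /* most recent full (summed) reading         */
    uint32_t highest;              /* highest full reading seen so far          */
    uint8_t  readings_since_max;   /* how long ago the highest reading happened */
} channel_t;

bool should_terminate(const channel_t *ch)
{
    int32_t sum = 0;
    for (int i = 0; i < NUM_DELTAS; i++)
        sum += ch->deltas[i];

    /* a) -dV: summed change over ~14 minutes below -0.5 * the sample count */
    if (sum < -(SAMPLE_COUNT / 2))
        return true;

    /* b) drop from peak: latest reading well below the highest recorded */
    if (ch->highest - ch->latest > MAXVOLTDROP)
        return true;

    /* c) weak 0dV: the maximum happened more than 8 readings ago */
    if (ch->readings_since_max > MAX_AGE)
        return true;

    /* d) near-flat: summed change over ~14 minutes below +0.5 * the sample count */
    if (sum < (SAMPLE_COUNT / 2))
        return true;

    return false;
}

Because a) is tested before d), a genuine -dV still gets reported as a), even though d) would have caught the same cycle, which matches what was said above.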

When not running on the debugger, the unit displays data on an LCD.
Displayed are time, off-charge voltage, cumulative mAh, impedance and a summary of the deltas.
For space reasons, I only have 1 character per delta, so use 0-9 and A-Z for positive deltas and a-z for negative, clipping at +35/-26.
With a quickly rising signal I'd get a fairly solid line of 'Z's.
With a slower signal, I might get the odd smaller upper-case letter or zero appearing, but even with a rise of only a single 7.5 mV unit per 4 reading periods (>10 minutes), I'd still only get 2 or 3 zero deltas out of every 4, and a total over 5 deltas at least large enough to stop the 0dV termination happening.
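In code, that one-character mapping works out to something like the following (a sketch consistent with the description, not the exact display routine):

/* Map a signed delta to a single display character:
 * '0'-'9' then 'A'-'Z' for positive values (clipped at +35),
 * 'a'-'z' for negative values (clipped at -26).
 */
char delta_to_char(int delta)
{
    if (delta >= 0) {
        if (delta > 35)
            delta = 35;
        return (delta < 10) ? (char)('0' + delta)
                            : (char)('A' + delta - 10);
    }
    if (delta < -26)
        delta = -26;
    return (char)('a' - delta - 1);   /* -1 -> 'a', -26 -> 'z' */
}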

However, in the case of charging smaller cells, where a better -dV signal would be generated, a sufficiently large -dV can cause charge cutoff relatively quickly, since the positive deltas immediately preceding a drop are likely to be relatively small, and so can be cancelled out by two significant falls (or maybe even just one).

The figures used for MAXVOLTDROP, etc were developed by looking at the readings I got from a range of cells.

Actually, looking back through my code, I realised I had lengthened the charge time period for the one-off charger I made to deal with 7Ah NiCd packs from 2.5 seconds to 6 seconds, with a note that the 2.5 seconds seemed to just about work, but since the NiCd charger was only going to be used for those cells, it was best to build in a safety factor to ensure against premature termination.
However, the figures above seem to work fine for 3-cell NiMH packs of up to 4.5Ah and 1/2/3 AA/AAA cells, and from the displayed characters for the deltas, there don't seem to be any close calls such as consecutive strings of zeros when cells are only half charged.

Termination is on a mixture of the -dV, drop from max and 0dV conditions, and quite possibly a given pack/cell will terminate on different conditions on different charges, potentially down to nothing more than the luck of when the samples are taken.

On the ~4Ah packs, which are being charged at close to 0.15C, to the extent it's possible to estimate usage, the charge taken in does seem to relate fairly closely to the charge previously used.

I'm really not sure what would happen at even lower charge rates - possibly if I went much lower, there really wouldn't be anything to reliably detect?

I don't want to alter the actual charge rate, but I suppose I could experiment by changing the charge-waiting time to give a 1:1 charge/wait duty cycle, and see what that does to reliability of termination?
 

PeAK

Enlightened
Joined
Jan 30, 2009
Messages
238
My charger uses [cheap] buck driver modules running at ~750mA for charging.
The microcontroller has a 10-bit A/D, with a resistor divider on the input to give a full range of ~0-7.5V, or 7.5mV/bit resolution.

The charging algorithm....
.
.
.
...I'm really not sure what would happen at even lower charge rates - possibly if I went much lower, there really wouldn't be anything to reliably detect?

I don't want to alter the actual charge rate, but I suppose I could experiment by changing the charge-waiting time to give a 1:1 charge/wait duty cycle, and see what that does to reliability of termination?

Thanks for your detailed answer, I'll get back to you on it when I've more time to digest it. Two questions:

  1. Can I buy your charger?
  2. When are you going to send your résumé to Maha or La Crosse?
cheers,
PeAK
 

uk_caver

Flashlight Enthusiast
Joined
Feb 9, 2007
Messages
1,408
Location
Central UK
Thanks for your detailed answer, I'll get back to you on it when I've more time to digest it. Two questions:

  1. Can I buy your charger?
  2. When are you going to send your résumé to Maha or La Crosse?
cheers,
PeAK
Thanks for that.
As for the charger, would you really be interested in one?

The analyser with the LCD display probably isn't economical to make for sale when compared to commercial offerings, but I do make a few single-channel chargers, for the rechargeable packs for my caving lights.
 