Any open hardware charger/discharger?

Jaunedeau

Newly Enlightened
Joined
Feb 11, 2009
Messages
11
Hi,

I wanted to make a DIY battery charger to learn about charging batteries.

After reading about ten methods for charging, discharging, and regenerating, I came to the conclusion that what I really want is a battery study platform.

I tried to find open hardware/software for that, but no luck yet. I only found the MBA thing, which is fairly priced but not open.

My plan is to:
-Start with the simplest thing that works (step one: something that measures the voltage of a single battery and sends it to the PC via USB emulation)
-Add a constant-current driver to control the charge.

At that point we have an almost-working charger (you just have to stop the charge manually!)

Then possible directions are:

-Add a function to set the current from the PC.
-> From there, we can already have a working charger with Java or C# software to play with different charging schemes (constant, pulsed) and end-of-charge detection (timer, V, -dV/dt or dV/dt = 0, d²V/dt²)

-Add discharging capability
-> this is just another current control plus a cooled resistive load; it should allow regeneration and cycling

-Add current measurement when discharging
-> this adds the capability to test cell behaviour under load and to measure capacity.

With these tools, all the algorithms for regeneration, charging, and end-of-charge detection can be refined. Then it should be easy to port them to the µC to have:

-a completely autonomous charger (with USB connectivity)
-a modular design where you can build as many channels as you want (limited only by the µC's number of analog I/O pins)

Is anyone interested in working on this project?

Regards,
John.
 
That sounds interesting.

I'm currently working on a charger/discharger/analyser using a PIC chip, mainly for 3-cell NiMH packs, but also able to charge 1-4 cells in series per channel.

My intention is to have a standalone device, so rather than have the charger designed to send data to a PC, I'm developing the charger from the beginning in a PIC chip.
Using a debugger on the PC, I can stop the microcontroller to see what the code is doing, and by connecting a cheap LCD display to the PIC chip I can have a real-time readout of various numbers (voltage, mAh, voltage changes, etc.) without even needing to stop the program running.

So far, the charging algorithm seems to be working, cutting off at about the right point, and I'm already producing a few basic single-channel chargers for friends who want them.

I'm about to start on the discharger/analyser part of the code, which should be fairly straightforward.
 
...I tried to find open hardware/software for that, but no luck yet. I only found the MBA thing, which is fairly priced but not open.

My plan is to:
-Start with the simplest thing that works (step one: something that measures the voltage of a single battery and sends it to the PC via USB emulation)
-Add a constant-current driver to control the charge.

At that point we have an almost-working charger (you just have to stop the charge manually!)

Then possible directions are:

-Add a function to set the current from the PC.
-> From there, we can already have a working charger with Java or C# software to play with different charging schemes (constant, pulsed) and end-of-charge detection (timer, V, -dV/dt or dV/dt = 0, d²V/dt²)

---SNIP---

...Is anyone interested in working on this project?

FYI: BattMan II: A Computer Controlled Battery Manager (Stefan's Electronics Web Site)
 
TakeTheActive: Thank you for the link. Some of the design could be a bit simpler with today's PICs (which include 13 A/D channels and USB), but parts of the schematic could be used as a good guideline (the current source at least :) )

uk_caver :

>So far, the charging algorithm seems to be working, cutting off at about the right point,

What algorithm did you use? Could you implement d²V/dt² ("inflection point") detection without any simulation and graphical visualisation of the curve? My intention was to try a very fast charge until the inflection point, then switch to 0.1C until dV reaches x% of what it was just after the inflection (so I can charge a little more than with the inflection alone, if tests prove it is better, but still not have the batteries get as hot as they can with -dV/dt).
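To make the idea concrete, here is a minimal hypothetical sketch of the inflection test in C: a plain second-difference check on equally spaced, smoothed voltage samples (units, function name, and data are invented):

```c
/* Hypothetical inflection detector: v[] holds equally spaced,
 * smoothed voltage samples (say in mV). The discrete second
 * difference v[i-1] - 2*v[i] + v[i+1] approximates d²V/dt²;
 * the inflection point is where it first goes negative (the
 * charge curve stops steepening and bends toward its peak).
 * Returns the sample index, or -1 if not yet reached. */
int find_inflection(const int *v, int n)
{
    for (int i = 1; i + 1 < n; i++) {
        int d2 = v[i - 1] - 2 * v[i] + v[i + 1];
        if (d2 < 0)
            return i;
    }
    return -1;
}
```

On real data the samples would need heavy averaging first, since the second difference amplifies noise badly.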

And for the charge itself, what did you use? Constant current? Charge pulses? Small discharges between charge pulses?

Regards,
John.
 
For the current design, which charges at 750mA (using cheap switch-mode regulators from KD for charge control) and is mainly intended for ~4Ah 3-cell packs, I terminate charge basically on zero dV/dt. The low charge rate has the potential to mask some of the more delicate signals, but even at low rates the pack should always eventually reach a maximum voltage.

To try to cope with noise, I sample the cell voltage off-charge every second and sum those voltages over 32 or 64 readings. I then keep a rolling record of the last 5 averaged readings, compare the current reading with the oldest stored average, and terminate if it isn't higher.
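The bookkeeping is roughly this (a sketch in C of the structure described above, not the actual PIC code; the history depth and reading format are simplified):

```c
#include <string.h>

/* Sketch of the rolling-average cutoff: each call takes one new
 * averaged voltage reading; once HISTORY averages are stored,
 * charging terminates when the newest reading is not higher than
 * the oldest one (i.e. dV/dt has reached zero over the window). */
#define HISTORY 5

struct terminator {
    int avg[HISTORY]; /* rolling record of averaged readings */
    int count;        /* how many have been stored so far    */
};

/* returns 1 to terminate the charge, 0 to keep charging */
int feed_average(struct terminator *t, int new_avg)
{
    if (t->count >= HISTORY) {
        if (new_avg <= t->avg[0])
            return 1; /* voltage has stopped rising */
    }
    /* shift the history and append the new reading */
    memmove(t->avg, t->avg + 1, (HISTORY - 1) * sizeof t->avg[0]);
    t->avg[HISTORY - 1] = new_avg;
    if (t->count < HISTORY)
        t->count++;
    return 0;
}
```

With one averaged reading every 32-64 seconds, five stored averages mean the comparison spans a few minutes, which is roughly what lets a slow, noisy peak register as dV/dt = 0.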

Judging from the pack temperature, the cutoff seems to be around the right point, since packs are starting to warm above room temperature, but not greatly, by the time the charger stops, and it does seem to avoid unwanted early cutoffs.
 
Have you ever tried hacking an existing charger? Sometimes you can feed a charger's display lines into a microcontroller's inputs and read what it's doing that way. I think it would be interesting to see if it can be done on a C9000, though its display is quite complex.
 
I have considered building something like this as well. A couple of power FETs driven in analog mode with PWM from an ATtiny84 AVR would do this easily, using the free C compiler WinAVR and Atmel's free AVR Studio IDE. Very open platform, software- and hardware-wise. Talk to it over serial, or use an XBee wireless interface so cables would not be required. USB is also possible; free USB libraries are available.

(Note that PICs are okay too; been there, done that. But I prefer the AVR programming tools (the high-quality open/free C compiler GCC), the more open environment, the all-flash processor lineup, the compiler-friendly CPU architecture, and the one-instruction-per-clock performance where 20 MHz == 20 MIPS. Either will work fine, though.)

Find the schematic for the West Mountain Radio CBA for ideas on the discharge side. It is somewhere around CPF or one of the RC forums.
 
I guess for a charger, clock speed isn't an issue, and I'm sure pretty much any microcontroller would do.
Likewise, I imagine that there are good-enough tools around for development on various systems - the Microchip environment and free C compiler certainly seem pretty usable.

For discharging, I was planning on using MOSFETs in linear mode, but rather than PWM as such, I was thinking of using the tri-state nature of the microcontroller port pins to get an analog drive signal.

I've done that in some LED drivers I built, and even just using the gate capacitance of a FET it can work, though adding an extra capacitor does make it smoother (and in need of less frequent adjustment).
Basically, the pin drives the FET through a ~1M resistor, and most of the time the pin is set to be a high-impedance input. Control is done by measuring the current via a sense resistor and an analog input: if the current is too high, the pin is turned on as a low output to lower the gate voltage until the current hits the target value, and vice versa if the current is too low.

That effectively gives a PWM-like effect, but one where, even if done in software, it only takes an occasional examination and adjustment to keep the current close to the desired value, so the code can go away and do other things. I'm sure it must be an established technique, though I'm not sure what it might be called.

The LED drivers I made were written in assembler for a very small chip which wasn't possible to debug. Since my battery analyser will be easy to debug, and has an LCD, it'll be interesting to see how much (or little) correction the discharge FETs need.
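As a toy illustration of why so little attention is needed, here's a rough simulation one could run on a PC (every number here is invented, and a real FET's transfer curve is far from linear):

```c
/* Toy model of the tri-state gate drive: the FET gate is a
 * capacitor that leaks charge slowly, and load current is taken
 * as proportional to gate voltage. Once per control period, if
 * the current is below target, the pin is driven high through
 * the resistor until the target is reached, then released back
 * to high impedance. All constants are made up for illustration. */

#define TARGET_MA 500
#define LEAK_MV   3   /* gate droop per period while floating */
#define STEP_MV   1   /* gate rise per tick while driven high */
#define MA_PER_MV 1   /* assumed (linearised) FET transfer    */

static int current_ma(int gate_mv) { return gate_mv * MA_PER_MV; }

/* one control period; returns the new gate voltage in mV */
int control_period(int gate_mv)
{
    gate_mv -= LEAK_MV;                 /* natural droop          */
    while (current_ma(gate_mv) < TARGET_MA)
        gate_mv += STEP_MV;             /* pin high: charge gate  */
    return gate_mv;                     /* pin back to hi-Z       */
}
```

The point of the sketch is that after the first period the loop only ever has to make up the few millivolts of droop, so the current stays pinned at the target with almost no work per period.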
 
...

For discharging, I was planning on using MOSFETs in linear mode, but rather than PWM as such, I was thinking of using the tri-state nature of the microcontroller port pins to get an analog drive signal.

I've done that in some LED drivers I built, and even just using the gate capacitance of a FET it can work, though adding an extra capacitor does make it smoother (and in need of less frequent adjustment).
Basically, the pin drives the FET through a ~1M resistor, and most of the time the pin is set to be a high-impedance input. Control is done by measuring the current via a sense resistor and an analog input: if the current is too high, the pin is turned on as a low output to lower the gate voltage until the current hits the target value, and vice versa if the current is too low.

That effectively gives a PWM-like effect, but one where, even if done in software, it only takes an occasional examination and adjustment to keep the current close to the desired value, so the code can go away and do other things. I'm sure it must be an established technique, though I'm not sure what it might be called.

...

Very interesting way to generate an analog voltage from a micro. Basically a bang-bang output filtered by an RC network where the C is the FET gate, and the tri-state is used as a sample-and-hold (really sample-and-drift). These are sometimes referred to as charge-pump circuits. The one concern I would have is protecting the FET gate: it is easy to damage an unprotected gate like this, and just touching the PCB could blow the FET. If you add a resistor to ground to protect the gate, the gate capacitance will probably be insufficient; then capacitance can be added, and so on.

The modern micros already have PWM peripherals with 8 or 10 bits of resolution, and filtering their output to drive the gate takes one resistor and one capacitor. Load a number into the PWM register and you get a nice stable voltage.
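As a rough sizing check (component values purely illustrative): for a first-order RC filter with a PWM period T much shorter than RC, the peak-to-peak ripple is about Vcc·D·(1-D)·T/RC.

```c
/* Approximate peak-to-peak ripple of an RC-filtered PWM output,
 * valid when the PWM period is much shorter than R*C.
 * All values passed in are illustrative, not a recommendation. */
double pwm_ripple_pp(double vcc, double duty, double f_pwm,
                     double r_ohm, double c_farad)
{
    double period = 1.0 / f_pwm;
    return vcc * duty * (1.0 - duty) * period / (r_ohm * c_farad);
}
```

For example, 20 kHz PWM from 5 V into 10k and 1 µF gives about 6 mV of ripple at 50% duty, plenty quiet for a gate drive.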

Also the micros often have differential inputs and high gain input options to get a good current reading from a low resistance shunt. So measuring current takes two pins and a small resistor.
 
I've done that in some LED drivers I built, and even just using the gate capacitance of a FET it can work, though adding an extra capacitor does make it smoother (and in need of less frequent adjustment).

The one concern I would have is protecting the FET gate: it is easy to damage an unprotected gate like this, and just touching the PCB could blow the FET. If you add a resistor to ground to protect the gate, the gate capacitance will probably be insufficient; then capacitance can be added, and so on.

If you add the extra capacitor as uk_caver mentioned, I imagine that would tend to safeguard the gate from static damage by bypassing static discharges to ground (in reality, by soaking up the discharge energy into a big reservoir and reducing the dangerous voltage spike to a safe level). Then a high-value discharge resistor could be paralleled with this capacitor to drain it when the circuit is switched off and ensure a safe starting condition when it is re-energized. So you end up where Alan said, by a different path, but it still seems like an interesting way to drive a FET...
 
Here is a part that would work well for this project:

AT90PWM216-16SU

This micro has a multichannel 10-bit ADC (both single-ended and differential), a DAC, PWM, a USART, 16K of program flash, etc., for $3.50.
 
The discharger circuitry seems to be working fairly well.

I started off with a 1M drive resistor and a 10 µF gate capacitor, and with that it seems fairly easy to keep the discharge current very stable, even when only making up/down adjustments once every 500ms.
In practice, I'm only making 'up' adjustments, since the capacitor voltage has a natural tendency to drop over time.

Given the time constant, it does take a little while to turn on and off (about 900ms to drop the current to near-zero levels), but it doesn't need to react quickly.

However, there's a lot of room to lower the drive resistor value to reduce the time constant.
The code for an 'up' adjustment is basically to set the FET drive pin as a high output, loop until the A/D reading goes above the target, and then disconnect the pin. When done every 500ms, that cycle typically takes on the order of a few hundred A/D measurements before the target is reached. That means I have a large amount of potential adjustment resolution that I'm not using, and I could easily drop the resistance by a factor of 10 or so without making the adjustments too 'jerky', which would also save execution time.

Currently, at ~500mA discharge current, the current seems to be stable to within +/-1mA much of the time, and to within +/-2mA almost all the time.

Given the load resistor and amplification I'm using, the ADC has a resolution of ~4mA of load current, so most of the time, I'm actually getting a current stable to a fraction of the A/D resolution.
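For reference, that resolution figure works out as follows (the sense resistor and gain below are hypothetical stand-ins, since I haven't given the real values):

```c
/* mA of load current per ADC count, for a shunt-plus-amplifier
 * front end. The example values used in the note below (10-bit
 * ADC, 5 V reference, 0.1 ohm shunt, gain of 12) are hypothetical
 * but land near the ~4 mA/LSB figure mentioned above. */
double adc_ma_per_lsb(double vref, int bits, double r_sense, double gain)
{
    double v_lsb = vref / (1 << bits);         /* volts per ADC count */
    return 1000.0 * v_lsb / (r_sense * gain);  /* mA of load current  */
}
```

With those example numbers, adc_ma_per_lsb(5.0, 10, 0.1, 12.0) comes out a little over 4 mA per count.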

Is it generally the case that to detect termination, cell voltage is measured under load?
At the moment, I have my code set to measure the cell voltage when the discharge current is off, since the circuit is a bit of a lash-up, and some cells are at the end of cobbled-together leads, possibly with a DMM in circuit to measure current, which could give a fairly inaccurate voltage reading.

That means I currently need to have a ~1s wait before I can measure the off-load cell voltage, but that doesn't need measuring very often, so it doesn't really take much of a bite out of the discharge time, and that time will be reduced greatly when I drop the drive resistor value.

I can see that having/using hardware PWM may be rather simpler, but this approach doesn't take much code or execution time, or any more external hardware than a PWM system would use.
 