Charging video camera battery

The_LED_Museum

*Retired*
Joined
Aug 12, 2000
Messages
19,414
Location
Federal Way WA. USA
Just wanted to be sure I was doing this right before doing any irreversible damage.

I have a Telepower II (model TP-8U) 9.6 volt 1.8 amp-hour ni-cad, and I'm charging it with approximately 12 volts at 600mA (reading off the 20A scale on my DMM). No heating of the battery pack has ever been noticed. I actually started it at 400mA and left it overnight, then turned it up to 600mA a few hours ago. Is 600mA too much?

FYI this is a first-charge scenario. The battery has never been used before. A quick test before charging showed it was much too low to run the camera at that point.
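A quick way to sanity-check that 600 mA figure is to express it as a C-rate, i.e. as a fraction of the pack's 1.8 Ah rating. A minimal Python sketch of the arithmetic (the "C/10 is gentle, C/3 needs watching" framing is the conventional NiCd rule of thumb, not anything from the TP-8U's paperwork):

```python
# C-rate check for a 9.6 V / 1.8 Ah NiCd pack charged at 600 mA.
capacity_ah = 1.8
charge_current_a = 0.6

c_fraction = charge_current_a / capacity_ah   # 0.33 -> a C/3 rate
print(f"Charging at {c_fraction:.2f}C, i.e. C/{capacity_ah / charge_current_a:.0f}")
# C/10 (180 mA) is the classic "leave it alone" rate; C/3 is a fast
# charge that needs supervision or a temperature cutoff.
```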
 

**DONOTDELETE**

Guest
I think that's a little faster than it needs to be. At that rate the math says it would be charged in 3 hours ;) (I mean 1.8 Ah / 600 mA = 3 hours, right?) and the battery actually done in 3 1/2 or 4 (because it isn't a mathematically perfect world in battery charging) (ok, well, maybe sometimes) -- no reason not to double the time and lower the amperage and the voltage. I'd set the volts like 1 volt higher than the pack is measuring, then up it slightly after a while... that's really pampering the batteries, I think... I've been experimenting with my new variable power supply, and I think if you keep the forward volts way down you can keep the batteries cooler, given the same amperage..
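Worked through, that estimate looks like this (the ~1.4x padding for charge inefficiency is a common rule of thumb, not a measured figure for this pack):

```python
# Naive charge time = capacity / current, padded for charge inefficiency.
capacity_ah = 1.8
current_a = 0.6

naive_h = capacity_ah / current_a    # 3.0 hours on paper
real_h = naive_h * 1.4               # ~4.2 hours, allowing for losses
print(f"{naive_h:.1f} h naive, ~{real_h:.1f} h in the real world")
```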
 

Silviron

Flashlight Enthusiast
Joined
Jun 24, 2001
Messages
2,477
Location
New Mexico, USA
I'd like a real battery expert to chime in on this, but at 12V & 400ma, I would think that any more than 6 hours would be too much, and at 600ma I wouldn't do more than 4 hours, and that only on a completely discharged battery.

It is good that you are monitoring for heat, but remember NiCd doesn't heat up like NiMH, and you can damage it before it gets noticeably hot.

Personally, (unless using a smart charger) my preference is to trickle charge these things at 1/10 of the Ah capacity (C/10) or less, at about 1.0V over nominal voltage. Unless there is a need for a quick recharge, what I'd do is charge overnight (10-12 hours max) at 150-200 mA, and you could leave it on maintenance charge as long as you want at ~50 mA.
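For this particular 1.8 Ah pack, Silviron's numbers work out as follows (a sketch of the same arithmetic):

```python
# Silviron's rates applied to a 1.8 Ah pack.
capacity_ah = 1.8
trickle_a = capacity_ah / 10    # C/10 = 180 mA, consistent with his 150-200 mA
maint_a = 0.050                 # his ~50 mA maintenance charge

print(f"C/10 trickle: {trickle_a * 1000:.0f} mA")
print(f"maintenance: {maint_a * 1000:.0f} mA (about C/{capacity_ah / maint_a:.0f})")
```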
 

**DONOTDELETE**

Guest
well I'd like two battery experts to chime in,
in harmony, if possible. :>)
 

Silviron

Flashlight Enthusiast
Joined
Jun 24, 2001
Messages
2,477
Location
New Mexico, USA
Ted types faster than I do!

But we essentially agree- 12V is the absolute maximum voltage I'd use, and 10.5 -11V would be better if you can control it to that point.
 

The_LED_Museum

*Retired*
Joined
Aug 12, 2000
Messages
19,414
Location
Federal Way WA. USA
Originally posted by Silviron:
Ted types faster than I do!

But we essentially agree- 12V is the absolute maximum voltage I'd use, and 10.5 -11V would be better if you can control it to that point.
<font size="2" face="Verdana, Arial">I just metered across and got 11.56 volts, and the current is 300 to 310 milliamps. I unhooked it.
 

**DONOTDELETE**

Guest
ok so what ya got is (8) 1.2-volt-nominal cells, right? hence the '9.6 volt' label on it. can you tell us what voltage the camera is supposed to be working at? is it a 12 volt camera that still runs with an 8-nicad pack, at a slightly lower voltage? or is it designed to run at a lower voltage than 12?
if the battery pack is supposed to be around (8 x 1.45 =) 11.6 volts (1.45 being the voltage of a freshly charged nicad cell) then I would keep charging it at 12 to 12.5 volts for a 'little while', or till the batteries got slightly warm -- and the pack should measure approx 11.5 to 11.6 closed circuit, after an hour or so of rest...
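Spelled out, the per-cell arithmetic being used here (1.2 V nominal and ~1.45 V fresh off the charger are the standard NiCd per-cell figures):

```python
# Pack voltages from standard NiCd per-cell figures.
cells = 8
print(f"nominal: {cells * 1.2:.1f} V")           # 9.6 V -- matches the label
print(f"freshly charged: {cells * 1.45:.1f} V")  # 11.6 V, per Ted's figures
```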
Silviron, I am typing as slowly as I can with two fingers and one thumb...
 

The_LED_Museum

*Retired*
Joined
Aug 12, 2000
Messages
19,414
Location
Federal Way WA. USA
It's a 9.6 volt battery, and the labelling on the back of the camera says 9.6 volts; and the metal stamped UL label on the bottom says 9.6VDC 8 watts.

It stops working at 9 volts (confirmed with a variable power supply). It also stops working at just under 11 volts. So it's a persnickety little thing with a very narrow operating voltage range.


Having charged the battery all last night and most of today (I took it off the charger a couple of hours ago), let's hook it up and see what happens... OPEN CIRCUIT voltage is 10.97 volts.

I cannot measure it once it is installed on the camera; but it *will* sag once I hit that power button. Ok, it's on there, and the camera comes on & stays on, the transport functions, and the "BATTERY" warning doesn't flicker when the transport is being used.

My guess: there's an extra cell in the pack (9 cells instead of 8), and it sags to very close to 9.6 volts under load, so the camera doesn't protest about either an undervolt or an overvolt condition.
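One way to test the 9-cell guess is to divide the measured open-circuit voltage by each candidate cell count and see whether the per-cell number is plausible; a sketch:

```python
# Does 10.97 V open circuit look more like 8 or 9 NiCd cells?
v_open = 10.97
for cells in (8, 9):
    print(f"{cells} cells -> {v_open / cells:.2f} V/cell")
# 8 cells -> 1.37 V/cell (reasonable for a rested, freshly charged cell)
# 9 cells -> 1.22 V/cell (right at nominal)
# Both are credible, so the open-circuit reading alone can't settle it.
```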
 

Frank Schwab

Newly Enlightened
Joined
Jun 11, 2002
Messages
46
OK, guys, listen up.

You don't charge NiCd or NiMH batteries using any kind of reference to a voltage. Much like an LED, you care about the current going into the battery. If you're going to charge your battery manually, use a current-limited charger.

A fully-charged (or charging) NiCd or NiMH battery can be 1.4-1.5V per cell. It quickly drops to 1.2V during discharge, and stays there for most of its life. Thus, your 9.6V, 8-cell pack can read as high as 11-12V during or just after charging - almost exactly what you're seeing! Remarkable.

Most NiCd and NiMH batteries are perfectly happy being charged at a 3-hour rate (A-Hr rating / 3). Many are happy being charged at a 1 hour rate. Some are even happy being charged at 20 minute rates. It's all based on the way the manufacturer built the cell, and charging at too high a rate for the cell you have is a BAD THING.

OK, so let's assume you've chosen the 3-hr rate (600 ma in this example). How long should you charge? The simple answer is, until the battery starts to get warm. This length of time will vary based, of course, on how far discharged the battery is. What happens is that, while the battery is getting charged, most of the power going into the battery is fairly efficiently converting chemicals inside the battery from one state to another. There will be some heating due to resistance losses inside the battery, inefficiencies in the charging, etc., but the battery shouldn't rise more than, say, 10 degrees above ambient (depending on how well it's insulated, etc).

Once the battery is nearly fully charged, different chemical reactions start occurring inside the battery that cause most of the power being dumped into the battery to be converted to heat. The battery temperature will rapidly rise (think about it, at 12V and 600ma you're pumping 7 watts of power into a closed, fairly well insulated plastic box), and if the charging current isn't stopped, permanent damage to the battery will start occurring. This normally shows up as a significantly reduced operating time on the battery (your 1.8AH battery may become a 1.5AH, 1.0AH, or even 0.0AH battery!).
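Frank's wattage figure is just V x I; for the record:

```python
# Power going into the pack near end of charge -- nearly all of it
# becomes heat once the cells are full.
volts, amps = 12.0, 0.6
print(f"{volts * amps:.1f} W")   # 7.2 W, inside a closed plastic box
```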

So, my suggestion is, if you're going to charge your battery by hooking it up to a power supply:
1. Set the current limit on the power supply to something reasonable (3-hour rate, say), and keep a finger on the battery. When it starts to get hot, shut off the power supply (there's a sketch of this approach right after this list). If you walk away to go to dinner or something, plan on buying a new battery.
or
2. Set the current limit to something low (30-hour rate), and let it charge for a couple of days. NiCD batteries are generally specified to be able to handle a 30-hour charge rate indefinitely; they won't get damaged. NiMH batteries generally aren't - the manufacturers claim that any overcharge on a NiMH battery is a bad thing.
or
3. If you have profiled the battery you are planning to charge, you can set your power supply to the voltage that the pack reaches at full charge. Keep the current limit set to a C/3 or C/4 rate, and the current should naturally taper off as the battery approaches full charge. Unfortunately, you'll probably have to keep adjusting the endpoint voltage as your batteries age. What probably happened in this situation is that the 12V you were cranking into the pack was close enough to the endpoint voltage that, by the time the pack was charged, the current going into the battery had dropped well below the 600 mA that you had seen early in the charging process.
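Here is the sketch promised in option 1: a manual charge as a crude control loop. `read_pack_temp_c()` and `set_supply_current_a()` are hypothetical stand-ins for whatever thermometer and programmable supply you actually have; the 10 degree rise cutoff is the figure from the paragraph above:

```python
import time

# Hypothetical hardware hooks -- stand-ins for your own thermometer and supply.
def read_pack_temp_c() -> float:
    raise NotImplementedError("wire up your temperature sensor here")

def set_supply_current_a(amps: float) -> None:
    raise NotImplementedError("wire up your programmable supply here")

def charge_until_warm(capacity_ah: float, ambient_c: float,
                      max_rise_c: float = 10.0) -> None:
    """Option 1 above: charge at the 3-hour rate, stop on temperature rise."""
    set_supply_current_a(capacity_ah / 3.0)   # C/3, e.g. 600 mA for a 1.8 Ah pack
    try:
        while read_pack_temp_c() - ambient_c < max_rise_c:
            time.sleep(10)                    # the electronic "finger on the battery"
    finally:
        set_supply_current_a(0.0)             # warming fast = full; stop charging
```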

Never use the cheap 10-hour chargers. They pump in too much current to leave the batteries on the charger indefinitely without damage, but don't have any intelligence to shut off the charger when the battery is full.

Never, ever think about charging a LiIon battery pack this way. Ever wonder why you can't buy AA LiIon rechargeable batteries down at Radio Shack? For liability reasons, the manufacturers of LiIon cells won't sell them except to manufacturers who show that they have designed circuitry into their product to keep the LiIon cells from being over-charged. If you attempt to charge a LiIon cell without the specific charging circuitry designed to protect the cell, you risk a right nasty little explosion.

And that's my opinion...

/frank
 

**DONOTDELETE**

Guest
Except for "You don't charge NiCd or NiMH batteries using any kind of reference to a voltage..." (which I don't understand), I agree with Frank Schwab. Yup, you can adjust your voltage down then if it's not supposed to be a 12 volt pack.. setting the charging volts at 12 should do it.

Frank, can you tell us anything more about selecting the voltage on other batteries using a regulated variable power supply? How much forward voltage should there be? Or should it ideally reach zero when the battery is full? Or is a half or quarter volt higher OK? Is keeping the voltage down a factor in the heating of the battery, i.e. does higher forward voltage get the battery hotter faster? I think this is how my telephone recharger kills the battery packs; its voltage is too high: 9 volts into a 3-AA battery pack. I use both NiMH and NiCd packs, and they both get warm much more quickly in the phone base charger than when I use the power supply set at 4.5 volts and about 200 mA to charge them.
 

Frank Schwab

Newly Enlightened
Joined
Jun 11, 2002
Messages
46
Well, you can adjust the charging current by changing the voltage, but you really can't even guess what kind of charging current is being delivered to the batteries by measuring voltage. For example, if the pack is completely dead, then setting your power supply to deliver 12V to it might result in several amps of current running through the battery pack (presuming the power supply can deliver that much). This may or may not damage the pack, depending on how the cells were built.

If you want to use a voltage measurement to charge batteries, you must also use a current measurement. You then have to adjust the voltage to get the appropriate current. You will have to adjust the voltage to get the appropriate current several times during the process.

If you call yourself "Ted the Led", you must have some understanding of how LEDs work. For example, asking "how much voltage should I use to drive an LED" is pretty much a nonsensical question. Three identical LEDs may take drastically different currents at identical voltages. For example, setting a power supply to 3.65V and attaching a bare white LED may drive 5 mA of current through one (creating a dull glow), 20 mA through another (creating a bright glow), or 100 mA through a third (possibly blowing it up). That's why most LED circuits attempt to control the current going through the LED, rather than attempting to set a precise voltage.
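The same point is why LED circuits use a series resistor (or a current regulator) instead of a bare voltage: you choose the current and let the forward drop be whatever it is. A sketch with typical white-LED numbers, which are assumptions here:

```python
# Series-resistor current limiting: pick R for the current you want.
v_supply = 5.0     # assumed supply rail
v_forward = 3.5    # typical white LED drop -- varies from unit to unit
i_target = 0.020   # 20 mA

r_ohms = (v_supply - v_forward) / i_target
print(f"R = {r_ohms:.0f} ohms")   # ~75 ohms; the resistor soaks up Vf variation
```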
NiCd and NiMH cells are actually pretty robust. You can cause a lot of damage to them, and they'll still work. For example, I have heard about Radio-Control model guys who, needing a quick charge on their 7.2V packs, will connect them directly across the battery in their car, holding onto the pack and disconnecting the wires when it starts to get hot. You can get an 80% charge that way in a few minutes; you won't get many charges that way, however, before you'll be buying a new pack.

If you want your battery packs to last a long time, I'd suggest paying attention to my previous post. If you don't care so much how long they last, there are a lot of "wrong" ways to do it that'll charge the pack up while doing damage.

/frank
 

**DONOTDELETE**

Guest
Well, Frank, maybe you didn't understand my question. I am using a regulated power supply; the current/amps are regulated as well as the voltage. I measure what current and voltage the battery(s) are getting, in addition to observing the power supply meters, and I watch the temperature. (edit) With this regulated power supply I can put 3 amps at 1 and a half volts through a battery, or 20 mA at 30 volts... I already asked about how much forward voltage to use in charging batteries, and its effect on heating the cells; you don't have to give a straight answer if you don't want to...
Any other experts want to chime in?
 

**DONOTDELETE**

Guest
Just to make sure we're all on the same page: I use the term "forward voltage" for the voltage reading you get when you take the positive charge wire off the battery pack and measure between it and the positive battery-pack terminal (while the charger source is on). I am not an electrical engineer, so anyone feel free to correct my usage of electrical terms; I shall endeavor not to obfuscate. :>)
 

Frank Schwab

Newly Enlightened
Joined
Jun 11, 2002
Messages
46
I guess you're not reading my posts either, or I'm not being as clear as I think I am.

Unfortunately, if the equation V=IR doesn't mean anything to you, none of my explanation is going to make sense. I don't have time to compose a two-week undergraduate section on basic electricity.

It doesn't matter what the voltage on your regulated power supply is. What's important is the current that you're putting into the battery. That's what must be limited, what must be controlled in order to safely charge a battery pack. In order to drive a current into the battery, some voltage (depending on the number of cells, the state of charge, etc) will need to be applied to the pack. What is really important is that the current remain constant while the voltage varies. This is what all commercially available, "smart" battery chargers do.
 

Frank Schwab

Newly Enlightened
Joined
Jun 11, 2002
Messages
46
Well, I thought of what might be a helpful analogy while standing in the shower this evening. Of course, the snide little rolly eyes almost stopped me from posting it, but I'm a nice guy so you get it anyway. Jeez, you try to be helpful and people give you poop because you're not helpful enough.

Analogy: Voltage in an Electrical circuit is similar to water pressure in your plumbing. Current (Amps) is similar to the AMOUNT of water that is actually coming out of the tap.

Now, to continue the analogy, put yourself in the place of a battery pack, and imagine you are drinking from a hose (getting charged by a power supply). Of course, to perfect the analogy, your lips are stuck around the hose, and you are relying on your friend to adjust the flow rate.

You can drink from the hose almost indefinitely if the flow is just a trickle. At higher flow rates, you have to gulp like mad to keep up. After a short period, you won't be able to swallow any more water, and something bad will start happening. If the flow rate is way too high, your cheeks blow out causing permanent damage.

Now, if your friend is really good, and you are really thirsty, he can turn the flow rate up to the maximum level that you can gulp water. He can watch your eyes (or check for water coming out your nose) which is his signal that you are full of water, and he can shut it down. Of course, if the phone rings while you're still gulping, and he isn't there when you run out of room to swallow, POP go the cheeks.
Notice that you, as the battery pack, don't care what pressure is in the hose - your ability to swallow water is based strictly on how much water is actually coming out of the hose. Your friend adjusts the flow of water (the amount of current flowing) to match what you are capable of. If he has a gauge to watch the water flow rate, he could determine what flow rate you were most comfortable with. You could then go to another house with different water pressure, and he could adjust the flow rate to match your comfort rate, and everyone would be happy.

And that's my final word.

/frank
 

snakebite

Flashlight Enthusiast
Joined
Mar 17, 2001
Messages
2,721
Location
dayton oh
it is now fully charged.
the voltage drops when you get to full charge.
if using a regulated source, the current will go up at full charge. safer to limit it to 1/10 C, or 180 mA.
so if you forget, no problem.
you can do a timed charge if you remember that you must put in 150% of what you use. use a blocking diode and a good timer.
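snakebite's two numbers combine into a simple timed-charge recipe; worked out for this pack:

```python
# Timed charge at C/10 with the 150% charge-input rule.
capacity_ah = 1.8
rate_a = capacity_ah / 10              # 1/10 C = 180 mA
hours = (capacity_ah * 1.5) / rate_a   # 15 h to put in 150% of capacity
print(f"{rate_a * 1000:.0f} mA for {hours:.0f} hours")
```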

Originally posted by The LED Museum:
Just wanted to be sure I was doing this right before doing any irreversible damage.

I have a Telepower II (model TP-8U) 9.6 volt 1.8 amp-hour ni-cad, and I'm charging it with approximately 12 volts at 600mA (reading off the 20A scale on my DMM). No heating of the battery pack has ever been noticed. I actually started it at 400mA and left it overnight, then turned it up to 600mA a few hours ago. Is 600mA too much?

FYI this is a first-charge scenario. The battery has never been used before. A quick test before charging showed it was much too low to run the camera at that point.
<font size="2" face="Verdana, Arial">
 

Coherence

Newly Enlightened
Joined
Mar 7, 2001
Messages
130
Location
Bend, Oregon
Originally posted by DONOTDELETE:
with this regulated power supply I can put 3 amps at 1 and a half volts through a battery, or 20 mA at 30 volts

I think this is a misunderstanding. I'm guessing you have a knob to adjust the voltage, and one to adjust the current, on your supply.

The power supply is EITHER voltage OR current limited. There is probably an indicator on the supply as to which mode is active (on mine an LED labeled 'current limited' comes on).

Try this: hook up an LED, a resistor, or a rechargeable battery, etc. to the supply. Set the current limit at 20mA.

Starting from 0.0 volts, increase the voltage. If you have a resistor across the supply, you will see the current increase linearly with the voltage. An LED is a semiconductor, and does not have a linear response.

Once the current exceeds 20mA, the supply should switch modes from voltage limited to current limited. Look at the voltage. Turning the voltage knob higher should not increase the voltage to the load.


Originally posted by DONOTDELETE:
I already asked about how much forward voltage to use in charging batteries, and its effect on heating the cells; you don't have to give a straight answer if you don't want to...

There is a lot of good info in Frank Schwab's posts. Here is a 'straight' (?) answer:

Use whatever voltage is needed to push the desired amount of current into the battery. Note that this changes over time; you cannot just set one voltage for the whole charge cycle.

Try this: while you still have that resistor hooked to the supply current limited at 20mA, add another in parallel to the first. You have now decreased the resistance in the load. The supply should compensate by decreasing the voltage, and still maintain 20mA.
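Coherence's resistor experiment can be simulated in a few lines: into a resistive load, the supply delivers whichever of its two limits binds first; a sketch:

```python
# A CV/CC bench supply driving a resistor: the binding limit wins.
def supply_output(v_set: float, i_limit: float, r_load: float):
    v_for_limit = i_limit * r_load     # voltage needed to push i_limit through R
    if v_for_limit < v_set:
        return v_for_limit, i_limit, "current limited"
    return v_set, v_set / r_load, "voltage limited"

r = 250.0
print(supply_output(12.0, 0.020, r))       # one resistor: (5.0 V, 20 mA, current limited)
print(supply_output(12.0, 0.020, r / 2))   # add a parallel twin: voltage falls to 2.5 V,
                                           # current stays pinned at 20 mA
```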
 