finding optimal - light / battery combination

burpee

Newly Enlightened · Joined: May 27, 2010 · Messages: 62 · Location: Outside
If I have a meter and can read some of the values in a flashlight/battery circuit - can I determine whether increasing the battery capacity will be a worthwhile modification or just a waste of time and battery?

I have an Energizer headband light with a single one-watt spot LED, plus several dual-LED modes.

The light is sold for use with its own three-cell AAA alkaline battery bay. I wanted to attach a remote "pocket" battery pack that would hold four NiMH AA batteries.

This would effectively change the battery source from 4.5 V / ~1000 mAh to 4.8 V / ~2450 mAh.

Is there a way to measure the circuit with each battery source to see whether this is a useful increase in light/battery runtime that isn't negatively affecting the LED or wasting battery capacity?

Thank you, any comment appreciated.
 
It's much simpler than you seem to think it is. Given a constant draw, more battery capacity equals more runtime. There's no way to 'waste' battery capacity.
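A rough back-of-the-envelope sketch of that point in Python. The capacities and the constant current draw below are illustrative assumptions, not measurements from this light:

```python
# Idealized runtime: capacity divided by a constant current draw.
# All numbers here are assumed for illustration, not measured values.

def runtime_hours(capacity_mah, draw_ma):
    """Ideal runtime in hours at a constant current draw."""
    return capacity_mah / draw_ma

draw = 300  # mA, an assumed constant LED draw

aaa_alkaline = runtime_hours(1000, draw)  # ~3xAAA alkaline pack capacity
aa_nimh = runtime_hours(2450, draw)       # ~NiMH AA pack capacity

print(f"3xAAA pack: {aaa_alkaline:.1f} h, AA NiMH pack: {aa_nimh:.1f} h")
```

Under this idealization, more capacity at the same draw always means proportionally more runtime.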

Switching from 3 AAAs to 4 AAs is a bad idea. Alkalines run below their nominal voltage under load, while nickel-based rechargeable cells run above theirs, so matching alkalines cell-for-cell with nickel-based rechargeables is the right approach.

Another issue to consider is the sophistication of the light's regulation circuitry. If the light relies on the voltage sag of 3 series AAA alkalines (due to their internal resistance) to keep the current through the LEDs safely low, then using a power source with lower internal resistance (nickel-based rechargeable AAs, for example) can raise the current through the LEDs to levels that damage them over time. This is unlikely in your case.
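That internal-resistance effect can be sketched with a toy Ohm's-law model: treat the LED as a fixed forward-voltage drop and the pack as an EMF plus internal resistance. All resistances and voltages below are assumed round numbers, not measurements:

```python
# Toy model of a resistor-limited LED circuit. The LED is approximated as a
# fixed forward-voltage drop Vf; the battery pack as an EMF plus internal
# resistance. All component values are illustrative assumptions.

def led_current_ma(pack_v, pack_r_ohm, series_r_ohm, vf=3.5):
    """Current through the LED in mA (zero if the pack can't reach Vf)."""
    i = (pack_v - vf) / (pack_r_ohm + series_r_ohm)
    return max(i, 0.0) * 1000

# 3 alkaline AAAs: higher internal resistance helps limit the current.
i_alk = led_current_ma(pack_v=4.5, pack_r_ohm=0.9, series_r_ohm=1.0)
# 3 NiMH cells: slightly lower EMF but far lower internal resistance.
i_nimh = led_current_ma(pack_v=4.2, pack_r_ohm=0.15, series_r_ohm=1.0)

print(f"alkaline: {i_alk:.0f} mA, NiMH: {i_nimh:.0f} mA")
```

Even with a lower open-circuit voltage, the low-resistance pack pushes more current through the LED in this model.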
 
I've experienced this first hand: I installed a 4xAA battery holder in place of the 3xAAA carrier on a Coleman headlamp and melted the plastic of the lamp assembly with 4 NiMH cells, even though the lamp had forced-air cooling (mounted on my helmet while cycling at night). The Coleman uses a simple resistor to regulate the current reaching the emitter, so going from 3 cells to 4 increased the current through the emitter, which increased the heat it generated. Going from 3xAAA to 3xAA will just about triple the runtimes you get from your light without any worry about overheating it.
 
Thanks for all the information. It's true that I'm using a cheap meter to test the few lights I have. And I have another cheap meter that reads slightly differently and doesn't have a high-amperage scale, so I'm SOL on using it as a backup.

One post says that I "can't waste" battery capacity - but I think that is exactly what I was asking about.

While all my testing is old news to you experts, please bear with me.

Is there any way to track either the resistance of a circuit or the luminosity/voltage ratio that would predict an optimal voltage/current combination for a given LED? Can an LED draw additional current without being any brighter?

Isn't there "waste" in a flashlight circuit if an LED operates at a voltage that causes more heat to be dissipated than is necessary to emit all the photons possible for a given diode?

Is there such a specification as a photon-emission-to-semiconductor-material ratio to predict LED efficiency? Could it be expressed as the resistance across an LED operating at optimal capacity?

I assume LEDs are far more efficient than any other type of light emitter, is this true?

Sorry about the redundancy - but are there FAQs addressing LED technology and the nature of my questions?

Thanks. I've googled but can't find this kind of info in condensed form although I guess it is buried in some technical writings.
 
"One post says that I 'can't waste' battery capacity - but I think that is exactly what I was asking about."

If you discharge at too high a current, you may not get all of the capacity out of the cell. I think the Ragone plots in this thread demonstrate that very well.

Here is a simple Ragone plot showing three different AA batteries:

[PartialAARagonePlot.jpg — Ragone plot: energy delivered (Wh) vs. power drain (W) for three AA batteries]


From this plot you can see that if you discharge the Energizer e2 lithium AA cell at 0.3W or less, you will essentially get all of the energy out of the cell, ~4.2 Wh. However, if you discharge the same cell at a higher drain rate, say 3W, you get substantially less energy out of the cell, ~2 Wh. Some of that energy is lost as heat dissipated within the cell itself, and some is capacity that is not utilized within the cell.

So ideally you would select a cell that operates at near 100% energy efficiency (i.e. on the vertical part of the Ragone curve) for a given power or current drain. However, in real life, we are often trying to get as much power out of the cells as possible. Let's say you have a flashlight configuration that needs 9W per AA cell. Well, you might select the Energizer NiMH cell shown on that Ragone plot because it is capable of providing that much power, even though you will only get 1/3 of the total energy out of the cell.
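To put numbers on the trade-off, here is a small sketch using the two points read off the plot above (an assumed ~4.2 Wh delivered at a 0.3 W drain versus ~2 Wh at a 3 W drain; these are the thread's illustrative figures, not datasheet values):

```python
# Runtime follows from the energy the cell actually delivers at a given
# power drain. The (drain, delivered-energy) pairs below are illustrative
# values read off the Ragone plot discussed above, not datasheet numbers.

def runtime_h(delivered_wh, drain_w):
    """Runtime in hours: delivered energy divided by the power drain."""
    return delivered_wh / drain_w

low_drain = runtime_h(4.2, 0.3)   # gentle drain: essentially full capacity
high_drain = runtime_h(2.0, 3.0)  # hard drain: much energy lost or unused

print(f"0.3 W drain: {low_drain:.0f} h, 3 W drain: {high_drain:.2f} h")
```

The hard-drained cell loses twice: less total energy is delivered, and what is delivered is consumed ten times faster.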

I hope this helps to clarify.

Cheers,
Battery Guy
 
This is all very nice, but it runs quite contrary to the point of the thread, which asks whether there is some optimal battery for a 3xAAA headlamp beyond simply the battery with the same voltage as the original and the highest practical capacity. There is not.

Beyond that, LEDs tend to become less efficient as drive current increases; for a single-die LED, efficacy peaks at around 40 mA of drive current and decreases thereafter. A multimeter is not necessary.

However, it's all irrelevant to the OP's situation, because the source already has a driver, so an increase in battery capacity means nothing to any component in the headlamp beyond an increase in runtime. There will be no change in current or temperature relative to the light's operation from 3 alkaline AAAs.
 
The problem with the 3xAAA configuration is that most designs rely on alkaline behavior to throttle the "effective" 4.5 V down to drive a ~3.5 V LED (or LEDs). Some use resistors, others linear regulators ("smart resistors"). Resistored lights simply waste the voltage above the target Vf; as the input voltage rises they waste more, though the voltage at the LED does rise with it, while linear regulators waste all excess voltage above the target. In an unregulated 3xAAA light, alkalines start at 4.5 V and sag under load to perhaps 4.0 V, but 3 NiMH cells hot off the charger sit at 4.2 V (1.4 × 3): the alkalines will cave below 4 V quickly, while the NiMH cells cave much more slowly, so more heat builds up. Going to 4 NiMH raises the pack to 5.6 V. Your first mistake was not measuring the battery pack: 1.2 V is a nominal voltage, not the starting voltage, and is typically seen under a decent load about halfway through the cell's discharge.
If you want to run that headlamp off external batteries, get a buck circuit or a linear regulator, or use 3 AAs with a dropping resistor.
Typically my advice is to buy a headlamp that doesn't use 3 AAAs, rather than invest a lot of time adapting what is, IMO, the cheapest design possible (3xAAA) into something more useful.
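The "wasted voltage" point can be made concrete with a simple power budget: in a resistored or linearly regulated light, everything above the LED's forward voltage becomes heat. The voltages and the 300 mA drive current below are assumed illustrative values:

```python
# Power budget for a linear-regulated (or resistored) light: the LED gets
# Vf * I, and everything above Vf is burned off as heat at the same current.
# Voltages and current are illustrative assumptions, not measurements.

def power_split_w(pack_v, led_vf, led_i_a):
    """Return (power into the LED, power wasted as heat) in watts."""
    led_w = led_vf * led_i_a
    waste_w = (pack_v - led_vf) * led_i_a
    return led_w, waste_w

# 3 fresh NiMH cells (~4.2 V) vs 4 (~5.6 V) driving a ~3.5 V LED at 300 mA:
for cells, pack_v in ((3, 4.2), (4, 5.6)):
    led_w, waste_w = power_split_w(pack_v, 3.5, 0.3)
    print(f"{cells} cells: {led_w:.2f} W to LED, {waste_w:.2f} W as heat")
```

In this sketch the fourth cell adds nothing at the LED; it only triples the heat the regulator or resistor must dump.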
 
I guess I get it. Using four cells produces more current but no more light.
The voltage I read from either source isn't that much different.

In theory the overloaded 4-cell arrangement produces a two-watt light, although the LED is supposed to be a one-watt part. On another, smaller light I tried this with, it does in fact seem to run at full brightness for over twice as long as with the normal 3xAAA source. I guess they build some tolerance into the light head; maybe the resistors protect the LED.

Thanks.
 
"I guess I get it. Using four cells produces more current but no more light. The voltage I read from either source isn't that much different."
If the circuit isn't a linear regulator, the extra voltage will push more power (and current) into the LED; a linear regulator will just try to burn off the excess voltage, and the extra heat could be a problem (melting).
"In theory the overloaded 4-cell arrangement produces a two-watt light, although the LED is supposed to be a one-watt part... I guess they build some tolerance into the light head; maybe the resistors protect the LED."
Most "one watt" LED lights are rarely driven at one watt; many resistored lights overdrive the LED to 1.5 watts or more on 3 AAAs, and as the alkalines deplete they quickly fall below one watt.
Basically, if you have a resistored headlamp, 4 NiMH AAs will overheat and fry the LED, and a linear regulator will most likely overheat and fry as it tries to dump the extra cell's voltage, gaining you only trouble. Only a buck circuit will help with 4 AAs on a 3xAA platform, because it converts much of the excess voltage into additional available current, turning the extra voltage headroom into usable capacity.
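Why a buck converter helps can be sketched in a few lines: instead of burning off excess voltage, it trades it for output current at some conversion efficiency. The voltages, current, and 90% efficiency below are assumptions for illustration:

```python
# Idealized buck converter: output power = input power * efficiency, so the
# battery-side current drops as the input voltage rises above the LED's
# voltage. All values (voltages, current, 90% efficiency) are assumptions.

def buck_input_current_a(v_in, v_out, i_out_a, efficiency=0.9):
    """Input current a buck converter draws to supply i_out_a at v_out."""
    return (v_out * i_out_a) / (v_in * efficiency)

# 4 NiMH AAs (~4.8 V nominal) feeding a ~3.5 V LED at 300 mA:
i_in = buck_input_current_a(4.8, 3.5, 0.3)
print(f"battery draw: {i_in * 1000:.0f} mA for 300 mA at the LED")
```

Because the pack is drained at less than the LED current, the extra voltage headroom really does turn into extra runtime rather than heat.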
 