mAh vs. mWh and the effect on LED torches/batteries?

piojo

Newly Enlightened
Joined
Apr 13, 2009
Messages
31
In some types of circuits, does a higher-voltage battery translate to longer runtime? (mAh × voltage = mWh, so a 2000mAh li-ion battery stores roughly three times the energy of a 2000mAh NiMH.) And for most tasks, doesn't the power matter more than the current? Or does this distinction not matter, because there is a constant voltage across the load?

For regulated LED lights, is there any such distinction? NiMH AAs (2000mAh) and li-ion 14500s (750mAh) have similar energy capacities, but their current×time capacities are vastly different. Am I correct in thinking that the energy capacity doesn't actually matter for this application, and that the li-ion battery will be depleted faster?
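The arithmetic behind that comparison, sketched in Python (the nominal voltages of 1.2 V for NiMH and 3.7 V for li-ion are assumptions; real cell voltage varies with load and state of charge):

```python
# Rough energy comparison: mWh = mAh * nominal voltage.
# Nominal voltages (1.2 V NiMH, 3.7 V li-ion) are assumed round figures.
nimh_mwh = 2000 * 1.2    # 2000 mAh NiMH AA  -> about 2400 mWh
liion_mwh = 750 * 3.7    # 750 mAh 14500     -> about 2775 mWh

# Similar stored energy despite very different mAh ratings.
print(nimh_mwh, liion_mwh, liion_mwh / nimh_mwh)
```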

(I'm thinking about getting a Quark AA, and I think that this light actually operates differently, depending on the battery type, so the point is moot.) Thanks to anyone who can help clear this up! It's been a while since my physics classes ;-)
 
You are correct to consider the energy in the battery (mWh) rather than relying only on the mAh rating. It depends on the type of circuit in the light: some run better on lower voltages (e.g. 1.5V), some do better on li-ion voltages (e.g. 3.7V), and some simply behave differently at different voltages.

The Quark AA in particular gets brighter on a 14500 battery, with some sacrifice in runtime. In particular, the Turbo mode is significantly brighter on a 14500. If you check out selfbuilt's Quark review (http://www.candlepowerforums.com/vb/showthread.php?t=234960) you can see the differences between a 14500 and a regular AA in his graphs.
 
Definitely have to make the call on an individual application basis.

...However, there are some rules of thumb when dealing with LED flashlights and the regulation circuits often found in them.

Boost regulators are in most cases less efficient than buck regulators. Also, a mild boost will generally be more efficient than a substantial boost (compare the boost required to run an LED off a 3.0V cell vs. a 1.2V cell).

So... one could say that if you had a 1.2V cell and a 3.7V cell, both with the exact same mWh capacity, an LED flashlight will deliver more of that stored energy to the LED when using the 3.7V cell than when using the 1.2V cell. With the 1.2V cell, more of the stored energy will be converted to heat in the regulator. Runtime will not necessarily be better with one or the other, as that depends on what variations in output come into play.
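As an illustration of that point, here's a quick sketch in Python. The efficiency figures (80% for a heavy boost from 1.2 V, 90% for a buck from 3.7 V) are made-up round numbers for the sake of the example, not measurements of any real regulator:

```python
# Illustrative only: the efficiency figures below are assumptions.
def energy_to_led(cell_mwh, converter_efficiency):
    """Energy actually delivered to the LED; the rest becomes heat."""
    return cell_mwh * converter_efficiency

cell_mwh = 2500                                    # same stored energy in both cells
boost_from_1p2v = energy_to_led(cell_mwh, 0.80)    # heavy boost, assumed 80% efficient
buck_from_3p7v = energy_to_led(cell_mwh, 0.90)     # buck, assumed 90% efficient

# Same mWh in, but the higher-voltage cell gets more of it to the LED.
print(boost_from_1p2v, buck_from_3p7v)
```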
 
In short: most LED lights won't put out as much light on NiMH, which makes them last longer, even though NiMH cells store less energy and lose more of it in the boost stage. (Then again, LEDs are more efficient at lower currents, so the boosting loss partly evens out.)

Which one suits you is entirely up to you and your flashlight. Do check how it performs on different chemistries beforehand, though. :popcorn:
 
Thanks for the answers! I just read about buck/boost circuits. I see that a 1.2V battery will be stepped up to 2.0V (or whatever the LED needs), and a 3.7V battery will be stepped down to 2.0V.

The amount of power through the LED will be constant for a given brightness and will not depend on the battery type. Because of conservation of energy (and assuming a 100% efficient circuit), the power drawn from the battery will also be constant, regardless of the battery voltage. So for a perfect circuit, the mWh of the battery matters and the mAh does not. For imperfect circuits, buck is better than boost, which is better than buck/boost--but mWh still matters, and mAh does not.
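That conservation-of-energy reasoning can be written out in a few lines of Python (the 600 mW drive level is a hypothetical figure, and the regulator is assumed 100% efficient):

```python
# Ideal (100% efficient) regulator: battery power equals LED power,
# so runtime depends only on stored energy (mWh), not on mAh or voltage.
def runtime_hours(cell_mah, cell_volts, led_power_mw):
    cell_mwh = cell_mah * cell_volts
    return cell_mwh / led_power_mw

led_mw = 600  # hypothetical drive level
print(runtime_hours(2000, 1.2, led_mw))   # NiMH AA:  2400 mWh / 600 mW
print(runtime_hours(750, 3.7, led_mw))    # 14500:    2775 mWh / 600 mW
```

Similar mWh, similar ideal runtime, despite the 14500 having well under half the mAh.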
 
You read correctly and understand correctly. I would point out, though, that as you noticed, the world isn't "perfect": while conservation of energy remains an unavoidable rule, it is not always easy to control what happens to the energy in a given circuit...

Where I am going with this is that having more voltage can, in and of itself, be a benefit that more than makes up for a small deficit in watt-hours. Very low voltage circuits are going to suffer dramatic losses in every point of possible resistance. Increasing the voltage lowers the percentage lost to that resistance. So... even assuming a "near-perfect" scenario with 100% efficient regulators, one driven at 3.7V and the other at 1.2V, the 3.7V system would be more efficient after taking into account losses from resistance in the contacts/switches/electrical paths.
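A rough sketch of that effect in Python; the 0.05 Ω series resistance is an assumed figure standing in for contacts, switch, and springs, and the current is approximated as P/V:

```python
# I^2*R loss across a fixed series resistance (assumed 0.05 ohm).
# For the same power delivered, a lower-voltage cell draws more current,
# and resistive loss grows with the square of that current.
def fractional_i2r_loss(cell_volts, load_watts, series_ohms=0.05):
    current = load_watts / cell_volts        # I = P / V (approximation)
    loss_watts = current**2 * series_ohms    # P_loss = I^2 * R
    return loss_watts / load_watts

print(fractional_i2r_loss(1.2, 0.6))   # ~2.1% of the power lost at 1.2 V
print(fractional_i2r_loss(3.7, 0.6))   # ~0.2% lost at 3.7 V
```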
 
Very low voltage circuits are going to suffer dramatic losses in every point of possible resistance.

That makes sense--thanks for pointing that out. My gut feeling, though, is that in the Quark AA, it won't make much difference. But I'm very curious to find out, if I end up getting this light.
 
From this and other searches I've done, it looks like the difference is only in the "max" mode--there are 3 or 4 dimmer modes that are regulated more strictly, and are the same brightness, no matter what the battery's input voltage.
 