Cost per lumen/hr.

batmanacw

Enlightened
Joined
Aug 5, 2007
Messages
367
Location
Andover, Ohio
Has anyone who has proper equipment ever done a comparison of the cost per lumen per hour for AA's and 123 batteries?

Let's say you have an alkaline AA. How many lumens does it produce per hour on average, and how much does that cost compared to lithium AAs and then lithium 123s?

I am just curious if the different types of batteries end up yielding similar amounts of light per dollar spent.
 
No, but then we'd have to graph that alongside the enjoyment factor (I like tiny lights), so it would be dollars per enjoyment... and that's a rather subjective science. Then again, I also use rechargeables, so you'd have to figure the cost of electricity in there... and how many times I have to take my wife out for dinner before she forgives a new flashlight purchase (so that full enjoyment potential can be reached). All in all... my time just isn't worth the graph, lol.

Sorry, I had to say it. I actually would be very interested in an OBJECTIVE perspective on lumens per dollar.
 
The metric that you want is lumen-hours/cell.

This is going to depend in part on the flashlight itself. But assuming similar emitters and drivers, you should look at the energy capacity (watt-hours, milliwatt-hours, or joules) of the cell.

Of the three battery types that you mentioned, alkaline AA cells are going to be the loser in energy capacity. This is especially true under high current draw, where the high internal resistance of the alkaline cell causes a lot of the energy to be used in heating up the cell itself.

Alkaline AA cells may still make sense to buy for some purposes, since they're a lot cheaper than lithium cells.

The debate over lithium AA vs. CR123A has come up in other threads. If I remember correctly, the lithium AA has a slight advantage in nominal capacity. But since you have to draw twice the current to get the equivalent power draw compared to a CR123A cell (since the AA has half the voltage), the practical capacities come out pretty close.
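To put the twice-the-current point in concrete terms, here's a trivial back-of-envelope sketch (the 2 W load is just an arbitrary example figure, not from any test):

```python
# Equal power from a lower-voltage cell means proportionally more current: I = P / V.
# The 2 W load is an arbitrary illustration, not a measured figure.
load_watts = 2.0
for name, nominal_volts in [("Lithium AA", 1.5), ("CR123A", 3.0)]:
    amps = load_watts / nominal_volts
    print(f"{name}: {amps:.2f} A to deliver {load_watts} W")
```

The AA ends up pulling about 1.33 A versus 0.67 A for the CR123A, which is why the practical capacities converge despite any nominal-capacity advantage for the AA.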
 

I can't believe none of the super geeks here have done a study on this! :grin2:

It would be great to get an idea of the amount of bang for your buck with each different technology. That would be of major interest to many people who think 123 batteries are too expensive.
 
Has anyone who has proper equipment ever done a comparison of the cost per lumen per hour for AA's and 123 batteries?


I understand what you are asking, but the fact is, there are far too many variables to draw any one conclusion from such a comparison that would have real meaning.

The simplest way to go about it, I suppose, would be to compare the relative energy of the cells in question, information which is widely available. Then you can compare the cost per watt-hour, figure out how that relates to individual LEDs, drivers, etc. (which will each give a different result), and take it from there.

Your question is a bit like comparing the relative benefits of apples vs. oranges. Ten different tests could provide ten different answers, depending on the parameters used for the tests.

Dave
 
Alkaline AA capacity in Watt-hours
(from http://www.candlepowerforums.com/vb/showthread.php?t=64660 )

Duracell @0.5A draw - 1.75Wh
Duracell @1A draw - 0.89Wh

My expectation is that at low drains (<100mA), max output would be about 3.3Wh, given an average voltage of ~1.25V and a nominal capacity of ~2700mAh

Output will also be lower (possibly much lower) at low temperature.

(For comparison, with Eneloop NiMH AA cells, the 0.5/1A capacity figures are ~2.3/2.4Wh, so they'd be better than alkalines at a 0.5A drain even if stored for a year after charging, and would be miles ahead at higher loads. Alkalines would only win at low loads (maybe under ~200mA?) )

Primary lithium CR123a
(from http://www.candlepowerforums.com/vb/showthread.php?t=67078 )

Decent brands @0.5A draw - 3.5-4.0 Wh
Decent brands @1A draw - 3.0-3.2 Wh

Lithium AA
At 1A draw, ~3.7Wh
(from http://www.candlepowerforums.com/vb/showthread.php?t=157799 )

So, primary lithium AAs and 123s are pretty similar in capacity per cell, and are way ahead of alkaline AAs at anything other than mild loads.

Comparison between alkaline and lithium is highly load-dependent and also temperature-dependent, and even if it weren't, capacity/$ varies so much from person to person depending on where they buy cells that it's not really worth calculating as a general figure.
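That said, for anyone who wants to play with the figures above, here's a rough sketch of the watt-hours-per-dollar arithmetic at a ~1A drain. The prices are placeholder assumptions (roughly the figures used later in this thread), not part of the measurements:

```python
# Nominal energy: Wh = average voltage (V) x capacity (Ah); compare with the ~3.3 Wh estimate above.
print(f"Alkaline AA at low drain: ~{1.25 * 2.7:.1f} Wh")

# Delivered Wh at ~1 A drain (figures quoted above) and assumed per-cell prices.
cells_at_1a = {
    "Alkaline AA": (0.89, 0.40),
    "CR123A":      (3.1, 1.25),
    "Lithium AA":  (3.7, 2.00),
}
for name, (wh, price) in cells_at_1a.items():
    print(f"{name}: {wh} Wh at ~1 A, {wh / price:.1f} Wh per dollar")
```

With those assumptions the CR123A edges out the alkaline AA in Wh per dollar at a 1A load, with the lithium AA trailing; drop the load to 0.5A or change the assumed prices and the ranking shifts, which is really the point.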
 
batmanacw said:
I can't believe none of the super geeks here have done a study on this!

Sounds like a challenge :)

[edit in]

I was just thinking about all of the constants that would have to be assumed (things that are really variables in the real world) to get any sort of chart to fit into something readable, and done inside of a day, lol...

I might pluck away at it in a spreadsheet or something as it sounds like a fun idea..
 
Yeah, I was thinking about putting together a GUI calculator to figure out relative cost per watt-hour for any given cell at a given load, as well as break-even points for rechargeable systems. The stumbling blocks are interpolating discharge curves for an arbitrary load, and, um, actually learning how to do a GUI in Python (the language I'm currently learning). I actually typed out a PM to silverfox requesting the raw test data, but the server ate it without a trace, and I couldn't be assed to compose it again.
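For what it's worth, the interpolation step is the easy part. Here's a minimal, GUI-less sketch of it; the (current, delivered Wh) points are made up, just loosely shaped like the alkaline AA figures quoted earlier, so don't treat them as test data:

```python
# Minimal sketch of "interpolate delivered capacity at an arbitrary load".
# The data points below are hypothetical placeholders, not actual test results.
from bisect import bisect_left

alkaline_aa_points = [(0.1, 3.3), (0.5, 1.75), (1.0, 0.89)]  # (load in A, delivered Wh)

def capacity_at(load_amps, points):
    """Linearly interpolate delivered energy (Wh) at a given load current (A)."""
    points = sorted(points)
    currents = [amps for amps, _ in points]
    if load_amps <= currents[0]:
        return points[0][1]
    if load_amps >= currents[-1]:
        return points[-1][1]
    hi = bisect_left(currents, load_amps)
    (x0, y0), (x1, y1) = points[hi - 1], points[hi]
    return y0 + (y1 - y0) * (load_amps - x0) / (x1 - x0)

for load in (0.25, 0.75):
    print(f"~{capacity_at(load, alkaline_aa_points):.2f} Wh at {load} A")
```

A real calculator would want more measured points (and probably interpolation against the log of the current), but the structure would be the same.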
 
First of all. You guys are awesome. Thanks for the replies and any other work that is done on this project.

I think it would be wise to just pick a certain emitter and stick with that for testing. I do understand that different temperatures would affect the output of the different cells and there are many, many different variables.

I do think it's possible to get a very general idea of the lumen-hours with basic brand-name alkalines and lithiums.

Unfortunately I don't have the time or knowledge to do a test like this, but I can't help but think it would be worth the time.
 
It is likely that any geek who cares about the cost per lumen would simply use rechargeables - the cost would be a fraction of what you would pay for primary cells.

Actually, lots of people debate buying 123 lights because of the perception that the batteries are too expensive. I believe the cost ends up nearly the same, or cheaper, with 123 batteries, or at least with lithium AAs.
 
The primary vs secondary debate depends on usage patterns and personal preference. Most of us will have a smattering of both, with rechargeable options for daily carry or heavily used tools and primary backups for emergencies.

As far as the lumen-hour/$ concept goes, here are some things I started thinking about:

Very low levels on many flashlights are not actually all that efficient. On many of the ~1-lumen modes, the driver itself consumes more power than it feeds to the LED. Also, high-flux LEDs (actually, all LEDs) have a sort of "cut-in" point where they ramp up to their peak efficiency; lower drive levels do not keep improving the lumens-per-watt conversion all the way down to nothing. I'm not sure where the efficiency peaks on a modern Cree, but this would technically be a factor for some of the extreme low modes on some lights, which could actually be operating the LED at a pretty low point of efficiency. These issues are partially offset by the fact that primary cells benefit most from slow drain rates, but the lumen-hours/$ would still probably calculate out to be relatively poor on some of the low modes out there.

Medium or "normal" modes on many LED flashlights will probably have both the driver and the emitter running in pretty good efficiency ranges, while high modes (or "turbo") will usually take an efficiency hit at both the driver and the LED.

Generally speaking, buck regulation is more efficient than boost (buck is often as high as 99%, though non-flickering designs take a few percent hit). Extreme boost, such as from a single 1.2V cell up to LED voltage, is usually the worst, sometimes as bad as 50% efficiency but usually closer to 75%.

All of these factors and more come into play and make such a comparison very hard to do. One starts to realize that even an extensive chart is not going to properly cover every flashlight out there unless each individual light is tested and included in the chart.

----------------

With that in mind, how about I throw out some constants and assumptions, and we'll run a few numbers comparing a few hypothetical lights:

1xAA alk vs 1xAA Lithium vs 1xCR123 LED light with 3 modes (5, 25, 125 lumen):

We'll assign the LED a constant 100 lm/W and assume it holds across all drive levels.

When driven by a single alkaline cell, the input voltage will range from ~0.8-1.5V in order to make near-full use of the cell. The voltage will decline steadily through this range over the discharge, resulting in continuously diminishing efficiency. Driver efficiency will range from ~50-85% across this range; we'll assume an average of 75%.

Power consumption in each mode will average ~0.067W, 0.33W, and ~1.67W (low, medium, high respectively).

I should point out that the high mode in this comparison creates a very inefficient situation for an AA alkaline cell, as it demands more than 1 amp at all times (probably averaging close to 1.5A). Alkaline cells perform horrendously in capacity tests at these drain rates. I ran the example this way to illustrate the strength of alkaline at low drain versus its weakness at high drain. Lithium chemistry and pretty much all rechargeable cells tolerate these higher drain rates with less penalty.

Since I don't have test results handy, and don't care to take the time to locate them right now, I'll throw out some extrapolated, educated estimates of the capacity we can expect from the cell at these various drain rates:

2.8 Wh, 2.2 Wh, 0.6 Wh (low, medium, high).

42 hours, 7 hours, 20 minutes (low, medium, high).

Assuming $0.40 per cell.

Alkaline
Low Mode: 525L-H/$
Medium Mode: 438L-H/$
High Mode: 104L-H/$


--

Lithium AA:

Same flashlight as above; the higher average operating voltage increases average driver efficiency to 80%.

power draw from unit:
0.063W, 0.313W, 1.563W (low medium high)

Battery capacity in each mode:
5 Wh, 4.8 Wh, 3.5 Wh (low, medium, high)

Runtimes:
79 hours, 15 hours, 2 hours 15 minutes (low, medium, high)

Assuming $2.00 per Energizer Lithium.

Lithium AA:
Low Mode: 198L-H/$
Medium Mode: 188L-H/$
High Mode: 140L-H/$


*Note: Marked improvement in operating cost on high mode; everything else gets dramatically more expensive. This comparison does not adequately cover the effects of temperature, so keep that in mind! In very cold weather, the alkaline is likely to leave you in the dark very quickly!

----

CR123:

Another increase in average input voltage results in an increase in average operating efficiency. Let's call it 85% now.

power draw from unit:
0.059W, 0.294W, 1.471W (low medium high)

Battery capacity in each mode:
4.2 Wh, 4.1 Wh, 3.9 Wh (low, medium, high)

Runtimes:
71 hours, 14 hours, 2 hours 40 minutes (low, medium, high)

Assuming $1.25 each

CR123
Low Mode: 284L-H/$
Medium Mode: 280L-H/$
High Mode: 266L-H/$
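If anyone wants to poke at the assumptions, here's a small Python sketch that just re-runs the arithmetic above from the same constants; it lands within a few percent of the tables only because the tables used runtimes rounded before dividing:

```python
# Re-runs the hypothetical comparison above: lumen-hours per dollar for a
# 3-mode light (5/25/125 lm, 100 lm/W emitter), using the assumed driver
# efficiencies, per-mode cell capacities, and per-cell prices from this post.

MODES_LM = (5, 25, 125)          # low, medium, high output in lumens
EMITTER_LM_PER_W = 100           # assumed constant at every drive level

CELLS = {
    # name: (driver efficiency, (Wh at low/med/high drain), price per cell in $)
    "Alkaline AA": (0.75, (2.8, 2.2, 0.6), 0.40),
    "Lithium AA":  (0.80, (5.0, 4.8, 3.5), 2.00),
    "CR123A":      (0.85, (4.2, 4.1, 3.9), 1.25),
}

for name, (eff, capacities_wh, price) in CELLS.items():
    print(name)
    for lumens, wh in zip(MODES_LM, capacities_wh):
        watts_in = lumens / EMITTER_LM_PER_W / eff   # power drawn from the cell
        hours = wh / watts_in                        # runtime on one cell
        print(f"  {lumens:>3} lm: {watts_in:.3f} W in, {hours:5.1f} h, "
              f"{lumens * hours / price:4.0f} lumen-hours/$")
```

Swap in different capacities, prices, or efficiencies and the rankings shift accordingly, which is part of why a single universal chart is so hard to make.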




-----


but but but.... Yea... :)

-Eric
 
Very interesting! At the low output levels, the alkalines ran away with the cost per lumen-hour. Since most of us like the higher-powered buggers, we get more use from the lithiums.

I pay about $1.66 per AA lithium at Sam's Club, but $2 is typical when buying them at Walmart.

Great post and thanks for all the work. I hope others will chime in on this.
 
but but but.... Yea... :)

My compliments on a very well written post, mdocod. :thumbsup:

It's interesting how many assumptions, "probably"s, and effective "if"s you presented. These, of course, would have to be eliminated to accomplish anything resembling an accurate study, and you haven't even scratched the surface as far as variables are concerned!

I think batmanacw has asked a reasonable question. To answer it, however, the field would have to be limited to one specific cell of a certain brand, date of manufacture, temperature, and so on. Additionally, the LED would have to be one particular model, chosen from a certain Vf bin (which one? :thinking:), and there is quite a bit of variability even within bins; the driver, if used, would have to be a particular model as well (another range of variables within the same make and model, whether it was made on a Tuesday or a Friday, etc.). Then there is, as you pointed out, the question of what level the LED should be driven at. High, medium, low, best, worst?

I suppose you could then repeat this with other cells, provided the same date of manufacture etc. was used. I'm sure I've missed at least some essential requirements, but once it was all worked out, you could possibly come up with a conclusion. That conclusion, however, would only be meaningful for the exact components used. What about the other gazillion component combinations available? :)

My point being, the result of this endeavor would be useless, in any practical sense, for comparison with any other combination. There are just too many variables. My suggestion would be to ditch the magnifying glass and take a look at the big picture: read the posts and reviews on the Forums, and try to gain some personal experience, to assess the economics and efficiency of the different cell chemistries.

Dave
 