LD12 - LD22 - 1 battery vs 2 batteries performance at low settings

skypirate

Newly Enlightened
Joined
Aug 8, 2012
Messages
10
According to the Fenix site, the runtime for the LD12 (1 AA) at 3 lumens is 97 hours and the LD22 (2 AA) at 3 lumens is 110 hours.

Can anyone help a newbie understand why the runtime for the 2 AA flashlight isn't much longer? I'm guessing it has something to do with the regulated power.
 

AnAppleSnail

Flashlight Enthusiast
Joined
Aug 21, 2009
Messages
4,200
Location
South Hill, VA
According to the Fenix site, the runtime for the LD12 (1 AA) at 3 lumens is 97 hours and the LD22 (2 AA) at 3 lumens is 110 hours.

Can anyone help a newbie understand why the runtime for the 2 AA flashlight isn't much longer? I'm guessing it has something to do with the regulated power.

Bingo! It's probably got better regulation. I don't think we have runtime graphs for those modes, but the 2xAA '3 lumens' will probably stay around 3 lumens for much longer, while the 1xAA is likely to tail off in output (and current draw) more quickly.
 

SimulatedZero

Enlightened
Joined
Nov 23, 2011
Messages
586
Location
SouthEast, USA
Bingo! It's probably got better regulation. I don't think we have runtime graphs for those modes, but the 2xAA '3 lumens' will probably stay around 3 lumens for much longer, while the 1xAA is likely to tail off in output (and current draw) more quickly.

Close, but it has more to do with the forward voltage the extra battery gives the light. Having a higher input voltage means the driver can draw less current to run the same output level, or draw the same current and get a brighter light. This only works to a point, though; I would say the current draw for 3 lumens at 1.5 V and 3 V is close enough that you aren't going to see a huge efficiency bump at that output level.
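
If it helps to picture it, here is a minimal sketch of that idea for an idealized regulated driver; the 85% efficiency and the LED power figure are placeholder assumptions, not Fenix numbers:

```python
# Idealized regulated driver: battery current for a fixed LED power scales
# inversely with input voltage. Efficiency and LED power are assumptions.
EFFICIENCY = 0.85   # assumed driver efficiency
P_LED_W = 0.033     # assumed LED power for roughly 3 lumens on low

def battery_current_a(v_in):
    """Current drawn from the battery stack at input voltage v_in (volts)."""
    return P_LED_W / (EFFICIENCY * v_in)

print(battery_current_a(1.5))  # ~0.026 A from a single cell
print(battery_current_a(3.0))  # ~0.013 A from two cells in series
# Note the current halves, but the input power (V x I) stays the same
# in this ideal case.
```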
 

reppans

Flashlight Enthusiast
Joined
Mar 25, 2007
Messages
4,873
....Having a higher input voltage means the driver can draw less current to run the same output level, or draw the same current and get a brighter light...

This seems to be the opposite though... the OP quoted greater efficiency with the lower voltage.
 

SimulatedZero

Enlightened
Joined
Nov 23, 2011
Messages
586
Location
SouthEast, USA
This seems to be the opposite though... the OP quoted greater efficiency with the lower voltage.

You misunderstood; the higher voltage of two batteries will make the low mode more efficient. In a two-cell light you have 3 V at, say, 2500 mAh instead of 1.5 V at 2500 mAh. By adding the second cell in series, and not in parallel, you are doubling the voltage and not the capacity. Let's throw some hypothetical numbers out there; I say hypothetical because I don't feel like calculating everything out right now, it's too early for me to go that far. Say the LD12 draws 100 mA at 1.5 V to achieve 3 lumens of flux, and the same circuit only needs to draw about 88 mA at 3 V to achieve 3 lumens. That is roughly a 12% decrease in the current being drawn from the batteries, so the same capacity lasts about 13% longer than it did before. 110 hours is about a 13% increase over 97 hours.
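
A rough sketch of that arithmetic, using the same hypothetical figures (these are illustrations, not measured values):

```python
# Hypothetical low-mode figures from above -- illustrations, not measurements.
CAPACITY_MAH = 2500   # assumed capacity per AA cell
I_1XAA_MA = 100       # hypothetical draw at 1.5 V for 3 lumens
I_2XAA_MA = 88        # hypothetical draw at 3 V for 3 lumens

run_1xaa_h = CAPACITY_MAH / I_1XAA_MA   # 25.0 h
run_2xaa_h = CAPACITY_MAH / I_2XAA_MA   # ~28.4 h (series cells see the same current)
gain = run_2xaa_h / run_1xaa_h - 1      # ~0.14, roughly the 97 h vs 110 h gap

print(f"1xAA: {run_1xaa_h:.1f} h, 2xAA: {run_2xaa_h:.1f} h, gain: {gain:.0%}")
```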

There are other factors too, such as the efficiency of the LED and of the driver circuit at various voltages and currents. Keep in mind that LEDs need a minimum forward voltage to produce light, so you can only get so much of an efficiency increase at the lowest level. Another big factor is heat: as the light gets brighter and hotter, the runtime gain shrinks. These factors are some of the reasons you will not get a linear increase in either brightness or efficiency across the board.
 

reppans

Flashlight Enthusiast
Joined
Mar 25, 2007
Messages
4,873
Doesn't make sense to me... runtime should depend more on the total energy of the batteries, and in series your example works out to 7.5 Wh for two cells vs. 3.75 Wh for a single cell. I understand the LED and driver should be more efficient at 3 V than at 1.5 V, so two cells should give greater than 2x the runtime of one. The Foursevens XPG Quark QPA vs. QP2A runtimes at 4 lumens are 2 days and 5 days, respectively... that makes sense.
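
A quick sketch of that watt-hour reasoning, assuming 2500 mAh cells at a nominal 1.5 V (the capacity and voltage are just the figures from the example above):

```python
# Energy-based view: runtime scales with stored energy divided by input power.
CELL_V = 1.5       # nominal AA voltage (assumed)
CELL_MAH = 2500    # assumed capacity from the example above

energy_1cell_wh = CELL_V * CELL_MAH / 1000   # 3.75 Wh
energy_2cell_wh = 2 * energy_1cell_wh        # 7.5 Wh (series doubles voltage, not mAh)

# If the driver were equally efficient at both voltages and the input power for
# 3 lumens were identical, the runtime ratio would simply track the energy ratio:
print(energy_2cell_wh / energy_1cell_wh)     # 2.0, far more than the 110/97 spec ratio
print(110 / 97)                              # ~1.13
```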
 

SimulatedZero

Enlightened
Joined
Nov 23, 2011
Messages
586
Location
SouthEast, USA
I don't think the amount of power needed to run 3 lumens is simply going to be cut in half because the voltage doubled. It seems to me it will suffer from diminishing returns there. But who knows; you could always e-mail Fenix and ask them.
 

lightwait

Newly Enlightened
Joined
Jun 30, 2005
Messages
157
Location
New York
This was asked here a few months back. Somebody suggested it might have to do with how the output is measured for the ANSI flashlight standard; I think runtime is measured to the point where the light output falls to 10% of its initial value.
I just tried my LD12 and LD22 side by side. On fresh Eneloops the LD22 is noticeably brighter. I took the tailcaps off and measured current: 29 mA for the LD12 and 19 mA for the LD22.
Since the LD22 is brighter, maybe it reaches the 10% point relatively sooner, making the runtimes look similar for both lights. I'm just guessing though.
 

hiuintahs

Flashlight Enthusiast
Joined
Sep 12, 2006
Messages
1,840
Location
Utah
Well, some of you are on the right track. Power out = power in minus the efficiency loss of the driver circuit. A 2-cell light should last twice as long as a 1-cell light... if the output current to the LED is the same. It most likely is not. That's the problem: both the LD12 and LD22 are rated at 3 lumens, but one of those ratings has to be off because the math wouldn't work out otherwise. I usually don't get too worried about these things, because I have tested many lights with a data-logging light meter and pretty much everything comes into balance from an engineering perspective.

Let's take lightwait's measurements, because that looks to be reality since he measured it. I'll estimate the batteries at 1.35 V, the forward voltage drop across the LED at around 3.15 V, and the efficiency of the driver circuit at 85%. I don't think the LD22 having two batteries vs. the LD12's one gives either light much of an efficiency advantage over the other. It really does boil down to power out = power in minus the efficiency loss of the driver circuit. The LD22 simply has twice as much energy available at the input. We don't care much what the voltages are, because boost circuits are typically 85 to 90% efficient and the driver circuit is going to bump the input voltage to whatever is necessary to drive the LED at the desired regulated output.

Here's an example of how all the calculations stack up. Run time is simply the 2000 mAh capacity of the batteries divided by the current being drawn from them.
LD12 run time: 2000mAh divided by 29mA = 69 hours
LD22 run time: 2000mAh divided by 19mA = 105 hours (keep in mind that 19mA is coming out of both batteries since they are in series)

LD12 Power in: = .029A x 1.35V = .039W
LD22 Power in: = .019A x 2.70V = .051W (from this alone I can tell the LD22 would be brighter, because it's taking in more power from the batteries. And I can tell the LD22 should last longer, because it has 2 batteries but the power being drawn from them is not twice the LD12's)

LD12 Power to LED: = .039W x 0.85 = .033W
LD22 Power to LED: = .051W x 0.85 = .043W

LD12 LED current: .033W divided by 3.15V = .0105A
LD22 LED current: .043W divided by 3.15V = .0137A (based on lightwait's measurements, the LD22 should be about 30% brighter, assuming a linear relationship of lumens to current)

These are theoretical calculations; to be exact, actual measurements would have to be made. Plus, the battery voltage doesn't stay constant, so strictly this would be a rate-of-change calculation requiring some differential math, but you can pretty much estimate things the way I did above.
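
Here is the same chain of estimates in one place; the 1.35 V per cell, 3.15 V forward voltage, 85% efficiency, and 2000 mAh figures are the same assumptions as above:

```python
# Estimate chain from the measured tailcap currents plus the assumptions above
# (1.35 V per cell, 3.15 V LED forward voltage, 85% driver efficiency, 2000 mAh).
CAPACITY_MAH = 2000
CELL_V = 1.35
VF_LED = 3.15
EFF = 0.85

def estimate(tailcap_ma, n_cells):
    runtime_h = CAPACITY_MAH / tailcap_ma            # same current flows through every series cell
    p_in_w = (tailcap_ma / 1000) * CELL_V * n_cells  # power drawn from the battery stack
    p_led_w = p_in_w * EFF                           # power delivered to the LED
    i_led_ma = p_led_w / VF_LED * 1000               # implied LED drive current
    return runtime_h, p_in_w, p_led_w, i_led_ma

ld12 = estimate(29, 1)   # ~69 h, ~0.039 W in, ~0.033 W to the LED, ~10.5 mA
ld22 = estimate(19, 2)   # ~105 h, ~0.051 W in, ~0.044 W to the LED, ~13.8 mA

# The LED-current ratio suggests the LD22 is roughly 30% brighter on "3 lumens".
print(ld22[3] / ld12[3])  # ~1.31
```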
 