Is there an optimal driver input voltage for efficiency?

flatline

Flashlight Enthusiast
Joined
Jul 6, 2009
Messages
1,923
Location
Tennessee
It's pretty obvious to me that boost drivers are relatively inefficient, since a 2AA light gets significantly more than 2x the run time of a 1AA light at a particular output level.

But it's not obvious to me whether buck drivers have a similar problem. Will a 50L light running on 2xCR123A last 2x as long as a 50L light running on 1xCR123A, or is the coefficient higher or lower than 2? Does the efficiency improve as the voltage goes significantly above 3v, does it suffer, or does it stay about the same?
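To make the question concrete, here's the back-of-envelope arithmetic I have in mind (just a sketch; the cell energy, LED power, and efficiency numbers are all assumptions, not measurements):

Code:
# Rough runtime model: runtime = usable energy * driver efficiency / LED power.
# All numbers below are assumptions for illustration only.
CELL_WH = 4.2        # ~1.4 Ah * 3.0 V for one CR123A (ballpark)
LED_POWER_W = 0.5    # rough power draw for a ~50 lumen level

def runtime_hours(num_cells, driver_efficiency):
    return num_cells * CELL_WH * driver_efficiency / LED_POWER_W

# If the buck driver were equally efficient at both input voltages,
# two cells would give exactly 2x the runtime:
print(runtime_hours(2, 0.85) / runtime_hours(1, 0.85))  # 2.0

# The question is whether efficiency itself shifts with input voltage,
# e.g. 90% at 6 V in vs 85% at 3 V in would push the ratio above 2:
print(runtime_hours(2, 0.90) / runtime_hours(1, 0.85))  # ~2.12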

--flatline
 

joshk

Newly Enlightened
Joined
Aug 8, 2019
Messages
45
Location
USA
The closer the input and output voltages are for the boost driver, the better. Less work = less loss.

Also, when comparing your run-times between flashlights, keep in mind that an alkaline battery with no boost driver is going to be dead around 1.3v per cell, but the one with a boost driver can keep going well below 1.3v per cell.
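To put a rough number on that cutoff effect, here's a toy sketch (the linear discharge curve is a crude assumption, not measured alkaline data):

Code:
# Toy discharge model: assume the cell voltage falls linearly from
# 1.5 V (full) to 0.9 V (empty). Crude, but it shows the cutoff effect.
V_FULL, V_EMPTY = 1.5, 0.9

def usable_fraction(cutoff_v):
    """Fraction of capacity delivered before voltage sags to the cutoff."""
    return (V_FULL - cutoff_v) / (V_FULL - V_EMPTY)

print(usable_fraction(1.3))   # ~0.33: direct drive quits with 2/3 left
print(usable_fraction(0.9))   # 1.0: a boost driver can drain the cell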
 

flatline

Flashlight Enthusiast
Joined
Jul 6, 2009
Messages
1,923
Location
Tennessee
The closer the input and output voltages are for the boost driver, the better. Less work = less loss.

Is the same true for buck drivers? Will a 6v to 3v buck driver be more efficient than a 12v to 3v buck driver?

--flatline
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
Is the same true for buck drivers? Will a 6v to 3v buck driver be more efficient than a 12v to 3v buck driver?

--flatline

Everything else being equal, yes.

But the construction of the driver could have a lot more to do with it, especially at low output voltages. FET switches, synchronous rectification, careful design, and quality parts all contribute to high efficiency.
 

Lynx_Arc

Flashaholic
Joined
Oct 1, 2004
Messages
11,212
Location
Tulsa,OK
In boost drivers, the greater the voltage difference, the more current is needed from the battery, and higher current causes power loss and heating because of the resistance in the cells, switches, contacts, and driver. This explains why boosting from 1 cell instead of 2 means each cell has to deliver more than double the power, and you get less than half the runtime, since the current is a lot greater. On the opposite end, a buck circuit loses less power the higher the input voltage is, because it draws less current from the battery. In boost circuits you are converting current into higher voltage, while in buck circuits the opposite happens: higher voltage is converted into current. Essentially the formula for boost/buck is power out = power in - losses. The lower the power needed to drive the LED, the lower the losses will be. As DIWdiver said, a high-quality circuit can reduce losses a lot, and circuitry can be optimized for a particular current/power output with better engineering.

IMO buck circuits are more efficient than boost circuits, but buck circuits with not much headroom above the LED voltage may not gain much, since they drop out and the light runs in direct drive.

As far as a 6V-to-3V vs a 12V-to-3V buck driver goes, on today's lights it would probably be more like 6V to 4V and 12V to 4V, since they tend to want huge lumen outputs that tax the circuitry: at high current the LED needs more than 4V while the battery sags below 6V. I don't see the need for a 12V battery for a 3-4V LED; typically they go with an 8-12V LED instead.
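To illustrate the current argument with numbers (the series resistance and output power below are assumed values, and the same total resistance is used for both cases for simplicity):

Code:
# Why 1-cell boost hurts: the same output power at half the input voltage
# means double the current, and I^2 * R losses quadruple.
# R_SERIES and P_OUT_W are illustrative assumptions; driver losses and
# the extra current needed to cover them are ignored here.
P_OUT_W = 3.0     # LED power target
R_SERIES = 0.2    # ohms of cell + switch + contact resistance (assumed)

for cells in (1, 2):
    v_in = 1.2 * cells             # NiMH nominal voltage per cell
    i_in = P_OUT_W / v_in          # current the boost converter must draw
    loss_w = i_in ** 2 * R_SERIES  # resistive loss before the driver
    print(f"{cells} cell(s): {i_in:.2f} A draw, {loss_w:.2f} W resistive loss")

# 1 cell(s): 2.50 A draw, 1.25 W resistive loss
# 2 cell(s): 1.25 A draw, 0.31 W resistive loss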
 

hiuintahs

Flashlight Enthusiast
Joined
Sep 12, 2006
Messages
1,840
Location
Utah
It's pretty obvious to me that boost drivers are relatively inefficient since a 2AA light gets significantly more than 2x the run time for a particular output level than a 1AA light does...............
I haven't seen it significantly different. Maybe 10-20% better. But the lumen level has to be low enough that the single AA battery isn't being held back by a lack of current-sourcing ability. And that difference would close at the low lumen levels.

If a boost driver is 90% efficient and a buck driver is 90% efficient, then they are equal as long as the batteries can supply the current. It's really a question of power: power out (to the LED) = power in (from the battery) minus the power loss in the driver. But at lower voltages the battery has to source higher current, and that is where an efficiency problem can occur, tied to the battery's ability.
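A quick sketch of that power balance (the battery voltage, efficiency, and power levels are illustrative, not measured):

Code:
# Power balance: P_battery = P_led / efficiency, so the battery current
# is I = P_led / (efficiency * V_battery). Numbers are illustrative.
def battery_current(p_led_w, efficiency, v_batt):
    return p_led_w / (efficiency * v_batt)

# A modest level is easy for one AA:
print(battery_current(0.5, 0.9, 1.2))   # ~0.46 A
# A turbo level asks ~2.8 A of one AA cell -- this is where it chokes:
print(battery_current(3.0, 0.9, 1.2))   # ~2.78 A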

Even still, I prefer 1AA over 2AA lights. I have no problem with boost drivers as long as the manufacturer doesn't try to squeeze so much current out of the single AA that the efficiency drops off, like the Fenix LD12, NW version. Keep that light at the lower lumen levels and it's very efficient. Put it on its highest output and it chokes big time on a single AA.

As DIWdiver points out: "the construction of the driver could have a lot more to do with it".
 