LED Power/efficiency relationship?

Yoda4561

So here I am dreaming about flashlights that might be out there a couple of years from now, and I got to thinking about how LEDs decrease in efficiency as you pump more current through them. At first I figured it was entirely a heat issue, so as we got more efficient LEDs we'd be able to pump them harder without dropping output efficiency.

I did a quick search through CPF via the Google search thing and read that it's a characteristic of the diode that as you increase voltage/current you lose efficiency, not because of heat but because of *insert large words I can't remember*. Ah, here's the post in question: http://www.candlepowerforums.com/vb/showthread.php?t=171182&page=2
and from post #33:
Voltage is all that matters.

It's not the higher current level, it's the higher voltage level resulting in a smaller depletion region; the barrier through which the electrons have to tunnel has been lowered. The probability that an electron will tunnel through the barrier follows a rather non-linear relationship (I think it's exponential, but I can't be bothered to look it up for now). Suffice it to say, this is why efficiency tends to drop off rather quickly as you push Vf higher and higher.

So the question is, is that a hard limit for a single-die LED, or is it something that can be improved on as technology progresses, i.e., a 250 lumen/watt LED that produces 1250 lumens at 5 watts and 2500 lumens at 10 watts, as opposed to the rapid falloff of current LEDs? There are always multi-die emitters, but they give up throw for the increase in output.
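For anyone who wants to see the exponential relationship the quoted post is alluding to, here's a minimal sketch of the ideal (Shockley) diode equation. The saturation current and ideality factor below are made-up illustrative values, not data for any real emitter:

```python
import math

# Ideal (Shockley) diode equation: I = I_s * (exp(V / (n * V_T)) - 1)
# The constants below are illustrative placeholders, not specs for a real LED.
I_S = 1e-30    # saturation current, amps (illustrative)
N   = 4.0      # ideality factor (illustrative)
V_T = 0.02585  # thermal voltage at room temperature, volts

def diode_current(v_f):
    return I_S * (math.exp(v_f / (N * V_T)) - 1.0)

# Each 0.1 V step in Vf multiplies the current by roughly the same factor,
# i.e. the current-voltage relationship really is exponential.
base = diode_current(3.0)
for v in (3.0, 3.1, 3.2, 3.3):
    print(f"Vf = {v:.1f} V -> current is {diode_current(v) / base:6.1f}x the 3.0 V value")
```

Whichever mechanism ends up dominating the droop in a given emitter, the non-linearity itself is basic diode behavior.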
 
You're always going to have efficiency decreases at higher currents simply because the diode has greater-than-zero resistance. Higher currents mean more energy dissipated as resistive heat, even if other factors like current density and depletion regions are taken completely out of the equation. That being said, I think we'll do much better than we do now. Your hypothetical LED might not get 2500 lumens at 10 watts, but there's a good chance it'll get 2000.
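To put a rough number on that, here's a toy model that treats the LED as an ideal junction at a fixed voltage in series with a small resistance. Both values below are illustrative guesses, not specs for any particular emitter:

```python
# Toy model: LED = ideal junction at a fixed voltage V_J in series with R_S.
# Input power     = (V_J + I * R_S) * I
# Junction power  = V_J * I      (the part that can become light)
# Resistive loss  = I^2 * R_S    (pure heat, grows with the square of the current)
V_J = 3.0   # volts, illustrative junction voltage
R_S = 0.5   # ohms, illustrative series resistance

for i in (0.35, 0.70, 1.0, 1.5, 2.0, 3.0):
    p_in = (V_J + i * R_S) * i
    loss = i * i * R_S
    print(f"I = {i:4.2f} A: input {p_in:5.2f} W, resistive loss {loss:4.2f} W "
          f"({100 * loss / p_in:4.1f}% of input)")
```

Even in this idealized model the resistive fraction climbs from a few percent at 350 mA to roughly a third of the input at 3 A, which is why lower package and bulk resistance would buy real gains.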

For a good analogy of what can eventually be done, look at what has happened with MOSFETs over the last ten years. At one time their on-resistance was fairly high; nowadays, for many types, the package resistance is the dominant part of it. It may well end up that way with LEDs, where the package leads, bond wires, and bulk semiconductor resistance account for most of the losses.
 
Yeah, those numbers I threw out were just "theoretical, with all else being 100% efficient" numbers. Nothing would please me more in a flashlight than 2000 lumens in a SureFire two-cell that runs for an hour... well, aside from a two-stage version with a low that could actually be used for more than blinding airplane pilots.
 
2000 lm, 6 Wh, and one hour? 333 lm/W. Impressive and impossible.
 
According to 'Yoda4561's post, SureFire uses CR123 cells only. No chance for higher capacity.
 
I'm talking about the light body, a SureFire two-cell body like a 6P, C2, etc. A 10-watt light with current tech would run maybe 45 minutes to an hour. A pair of 3-volt cells at 1500 mAh (about what we have now) is 9 watt-hours, which is close enough for me, and I'm sure batteries will only get better.
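For what it's worth, the arithmetic behind those figures, using the nominal cell numbers quoted above:

```python
# Rough pack energy and runtime from the nominal figures quoted above.
cells = 2
cell_voltage = 3.0    # volts, nominal CR123-class cell
cell_capacity = 1.5   # amp-hours (the ~1500 mAh mentioned above)

pack_energy = cells * cell_voltage * cell_capacity   # watt-hours
print(f"Pack energy: {pack_energy:.1f} Wh")          # -> 9.0 Wh

drive_power = 10.0                                   # watts drawn by the light
runtime_minutes = pack_energy / drive_power * 60
print(f"Runtime at {drive_power:.0f} W: {runtime_minutes:.0f} minutes")  # -> 54 min
```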
 
2000 lm, 6 Wh, and one hour? 333 lm/W. Impressive and impossible.
The theoretical maximum for 3500 K / 80 CRI using red-amber-green-blue is over 400 lumens/watt. I believe we will see 75% conversion efficiency at some point in the future, so be careful what you call impossible (though I don't expect it for a long time).
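Just as a sanity check on the numbers in that post (arithmetic only, not a prediction):

```python
# Quick arithmetic on the figures quoted in this thread.
max_efficacy = 400.0          # lm/W, quoted theoretical ceiling for 3500 K / 80 CRI RAGB
conversion_efficiency = 0.75  # electrical-to-light conversion suggested in the post

print(f"Ceiling x conversion: {max_efficacy * conversion_efficiency:.0f} lm/W")  # -> 300 lm/W

# And with the 9 Wh two-cell pack worked out earlier in the thread,
# 2000 lm for one hour needs only 2000 / 9 ~ 222 lm/W.
print(f"Needed for 2000 lm over 1 h on 9 Wh: {2000 / 9:.0f} lm/W")  # -> ~222 lm/W
```

So the 333 lm/W figure applies to the 6 Wh assumption; with the 9 Wh pack discussed above, the bar is noticeably lower.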
 