Most efficient LED?

pseudonomen137

Sorry if this has already been asked before, but I was wondering: roughly what is the highest efficiency LED (any wavelength) being made these days, in terms of percent optical output from electrical input? Is there any other light emitter that's more efficient?

Not looking for what the theoretical max is for some process - just wondering what types of efficiencies we can currently achieve. Thanks!
 
Oh c'mon, don't tell me I've stumped the CPF gods this easily! :laughing:
 
Perfect efficiency at 555 nm (all energy in monochromatic green light) is 683 lumens per watt. The best white LED reported on Wikipedia is a 150 lumens per watt prototype announced by Nichia last year.

The most efficient type of lighting right now is low-pressure sodium, which puts out 180+ lumens per watt.
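
To see how lumens per watt relates to the percent efficiency the original poster asked about, here is a quick back-of-the-envelope sketch (Python). The V(λ) values are approximate CIE photopic figures, and treating low-pressure sodium as effectively monochromatic at 589 nm is an assumption of this sketch, not something stated above:

```python
# Rough conversion from luminous efficacy (lm per electrical watt) to
# wall-plug (radiant) efficiency, for an effectively monochromatic source.
# V(lambda) values are approximate CIE photopic sensitivities.

V = {555: 1.00, 589: 0.77}      # approximate photopic luminous efficiency
PEAK_LM_PER_RADIANT_WATT = 683  # lm per radiant watt at 555 nm

def wall_plug_efficiency(lm_per_electrical_watt, wavelength_nm):
    """Fraction of electrical input emitted as light, for a ~monochromatic source."""
    ler = PEAK_LM_PER_RADIANT_WATT * V[wavelength_nm]  # lm per *radiant* watt
    return lm_per_electrical_watt / ler

# Low-pressure sodium at ~180 lm/W, emitting essentially at 589 nm:
print(wall_plug_efficiency(180, 589))  # ~0.34, i.e. roughly a third of the input becomes light
```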
 
Thanks, but I was kinda wondering what the most efficient is percentage-wise - it doesn't have to be white light. I see a lot about lumen output at X amps, but I can't seem to find any info on how energy efficient they actually make these things.

But yeah, just wondering in general, not for white light production. I.e., if you removed the phosphor from the emitter on a white LED, what's the most optical power out you can get relative to electrical power in?
 
Energy efficiency isn't a very useful metric for lights. For example, if you are interested in radiant energy in the beam, then incandescent lights are very good ... except that you can't see most of that radiated spectrum because it's in the infrared range.
 
Energy efficiency isn't a very useful metric for lights. For example, if you are interested in radiant energy in the beam, then incandescent lights are very good ... except that you can't see most of that radiated spectrum because it's in the infrared range.

That's what I'm interested in actually. Wavelength doesn't matter to me. We can be talking deep IR or UV for all I care. I'm just wondering how efficient these things are for converting electricity into EMR.
 
So far as I know, there is no measurement that can compare the efficiency of devices that turn electricity into EMR. That's because the EMR spectrum is extremely wide (theoretically infinite), going from wavelengths far shorter than the width of a proton to as long as the universe. The ones that scientists usually deal with run from gamma rays (wavelengths in picometers) to ELF (wavelengths in megameters). That includes visible light, x-rays, all the "radio" waves (microwaves, VHF, FM, etc), and lots of other stuff that most people never deal with. Even gravitational waves behave somewhat like EMR, with a measurable speed and wave-like action, although they aren't electromagnetic at all. So you have to specify some spectrum of wavelengths in which the EMR behaves in a way that we can interact with, describe, and measure.

Efficiency is really a measure of how well a thing does what we want it to, and how much energy it spends doing things we don't. We would call incandescent light bulbs inefficient because they use a lot of energy producing heat and not visible light, but they are sometimes used as a heater, in which case they're 100% efficient - every bit of energy going in is turned into heat, with none lost. Even all the visible light, assuming you confine it to a sealed box or something, is turned into heat when it's absorbed by the things it hits.

Speaking of heat, that's probably how you should phrase your question, since most of what we consider inefficiencies are just measuring how much energy gets turned into heat as a byproduct of doing whatever else we wanted. So you could ask, what LED, or what light source, has the least percentage of its electricity turned directly into heat? Even that is a tricky question, since many light sources emit some IR, which is totally absorbed as heat by most surfaces. Still, it would be a better way to measure the kind of "pure efficiency" you're talking about. In that case, I suspect that low pressure sodium, since it produces the most lumens/watt, is right up there with anything else in terms of least heat per watt.

Alex
 
Blue LEDs. Production is in the 25-30% efficiency range (current dependent), with prototypes in the lab in the 40% range. This is the efficiency for converting input electricity into actual light emitted from the diode. I understand internally more light is actually created; they just cannot get it all out of the die, and it ends up reabsorbed as heat.

No, it was not a difficult question.

Semiman
 
It is possible to measure radiant (radiometric) flux in watts as opposed to luminous (photometric) flux in lumens. It's not that hard to do with LEDs since their EM spectra do not extend far beyond the visible light range. If you look at the data sheets from Lumileds, Cree etc. some of them actually publish the radiant flux.
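
For a roughly monochromatic LED the conversion between the two is straightforward. A minimal sketch (Python); the wavelengths and approximate V(λ) values are illustrative, not taken from any particular datasheet:

```python
# Convert radiant flux (watts) to luminous flux (lumens) for a
# near-monochromatic emitter: Phi_v = 683 * V(lambda) * Phi_e.
# V(lambda) values below are approximate CIE photopic sensitivities.

V_APPROX = {460: 0.06, 530: 0.86, 555: 1.00, 630: 0.27}

def luminous_flux(radiant_flux_w, wavelength_nm):
    """Lumens produced by radiant_flux_w watts at a single wavelength."""
    return 683 * V_APPROX[wavelength_nm] * radiant_flux_w

# Example: 30 mW of 460 nm blue light is only a lumen or so,
# while the same 30 mW at 555 nm green would be about 20 lm.
print(luminous_flux(0.030, 460))  # ~1.2 lm
print(luminous_flux(0.030, 555))  # ~20.5 lm
```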
 
So far as I know, there is no measurement that can compare the efficiency of devices that turn electricity into EMR. That's because the EMR spectrum is extremely wide (theoretically infinite), going from wavelengths far shorter than the width of a proton to as long as the universe. The ones that scientists usually deal with run from gamma rays (wavelengths in picometers) to ELF (wavelengths in megameters). That includes visible light, x-rays, all the "radio" waves (microwaves, VHF, FM, etc), and lots of other stuff that most people never deal with. Even gravitational waves behave somewhat like EMR, with a measurable speed and wave-like action, although they aren't electromagnetic at all. So you have to specify some spectrum of wavelengths in which the EMR behaves in a way that we can interact with, describe, and measure.

Efficiency is really a measure of how well a thing does what we want it to, and how much energy it spends doing things we don't. We would call incandescent light bulbs inefficient because they use a lot of energy producing heat and not visible light, but they are sometimes used as a heater, in which case they're 100% efficient - every bit of energy going in is turned into heat, with none lost. Even all the visible light, assuming you confine it to a sealed box or something, is turned into heat when it's absorbed by the things it hits.

Speaking of heat, that's probably how you should phrase your question, since most of what we consider inefficiencies are just measuring how much energy gets turned into heat as a byproduct of doing whatever else we wanted. So you could ask, what LED, or what light source, has the least percentage of its electricity turned directly into heat? Even that is a tricky question, since many light sources emit some IR, which is totally absorbed as heat by most surfaces. Still, it would be a better way to measure the kind of "pure efficiency" you're talking about. In that case, I suspect that low pressure sodium, since it produces the most lumens/watt, is right up there with anything else in terms of least heat per watt.

Alex
That is a fascinating post. I did not know EMR had such a wide range. But presumably a wavelength "as long as the universe" would have a frequency of about 1 cycle per ~14,000,000,000 years, i.e. essentially zero, and therefore almost no energy. So how could it exist?
 
Blue LEDs. Production is in the 25-30% efficiency range (current dependent), with prototypes in the lab in the 40% range. This is the efficiency for converting input electricity into actual light emitted from the diode. I understand internally more light is actually created; they just cannot get it all out of the die, and it ends up reabsorbed as heat.

No, it was not a difficult question.

Semiman

Thanks for all the comments but this is basically the type of answer I was looking for.

My primary interest is in lasers, and I know 52%+ efficient 808 nm diode arrays are readily available on the market (for those with deep pockets at least), and I was wondering how LEDs compare. Basically, if an LED is meant to put out a specific band, what kind of efficiency can you get on that?

So far I'm hearing up to 30-40% or so from blue LEDs. Any chance there are more efficient ones? I.e., how do the most efficient AlGaAs or InGaAs LEDs compare to the laser diodes I mentioned?

Thanks in advance!

PS: Yeah, I understand if we want to get technical we could complicate this question beyond belief, but I hope by now you see what I'm getting at... Sorry if my phrasing was poor :-/
 
In the graphs I've seen, virtually all the current LEDs show highest efficiency at very low current -- much lower than the typical operating current. So unless you're interested in efficiency at very low light levels where the efficiency is maximum, you'll also need to specify some operating point when comparing various LEDs.
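
To make that concrete, here is a tiny sketch of the comparison involved (Python). The currents, forward voltages, and radiant flux figures below are made-up illustrative values, not from any real datasheet:

```python
# Wall-plug efficiency = radiant power out / electrical power in.
# The two operating points below are hypothetical, purely to show why the
# drive current has to be specified when quoting an efficiency.

def wall_plug_eff(radiant_mw, forward_v, current_ma):
    return radiant_mw / (forward_v * current_ma)  # mW / (V * mA) = W / W

# Hypothetical blue die, low current vs. nominal current:
print(wall_plug_eff(radiant_mw=8.0,  forward_v=3.0, current_ma=5))   # ~0.53 at 5 mA
print(wall_plug_eff(radiant_mw=22.0, forward_v=3.3, current_ma=20))  # ~0.33 at 20 mA
```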

c_c
 
This sort of thread comes up every few months. Here's a copy and paste of my response in the last one:

In terms of percentage of electrical input converted to visible light the best I've heard of so far are the EZBright blue dice made by Cree. The best bins produce 30 to 33 mW of 460 nm blue light from a typical power input of 64 mW, making them 47% to 52% efficient. Note however that since the eye is relatively insensitive to blue light these only have a visual efficiency of about 30 lumens per watt. The best production white LEDs produce around 80 to 85 lm/W but in terms of conversion efficiency this is only around 25%. This is true of both the small, low power 5mm ones and the higher powered ones like the Cree XR-E. Nichia is supposed to start selling a 5mm white LED of 100 lm/W efficiency around now but I haven't heard of it so far. Next year both Cree and Seoul Semiconductor will be selling 100 lm/W white power LEDs. I have heard of some infrared LEDs which convert about 60% of the input power to light. Of course, the light produced by these can't be seen by human eyes.
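
As a worked check of those EZBright numbers, here is a short sketch (Python). Splitting the ~64 mW electrical input into roughly 20 mA at ~3.2 V is my assumption, although 20 mA is the drive current mentioned later in this post:

```python
# Wall-plug efficiency of the blue die: radiant power out / electrical power in.
# Assumes the ~64 mW electrical input corresponds to roughly 20 mA at ~3.2 V.

electrical_in_mw = 20e-3 * 3.2 * 1000   # 20 mA * 3.2 V = 64 mW
for radiant_out_mw in (30, 33):
    print(radiant_out_mw / electrical_in_mw)  # ~0.47 and ~0.52
```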

In terms of raw conversion efficiency the approximate best numbers I've seen so far by color are as follows:

blue: 52%
true green: 14%
red-orange: 25%
red: 40%
yellow green: 5% (?)
infrared: 60%
white (blue plus YAG phosphor): 40% (this is the 131 lm/W sample made by Cree)

I've heard of infrared semiconductor lasers which have gotten around 70% efficiency at room temperature and as high as 85% when cooled to cryogenic temperatures. While these numbers are indicative of what may happen with visible LEDs I suspect it will be some time, if ever, before LEDs reach 90% efficiency. It is theoretically possible, though.

What has changed since then is that 100 lm/W white LEDs are on the horizon. These will have conversion efficiencies of roughly 30% at nominal current, as high as 45% at lower currents. I still feel long term LEDs can reach conversion efficiencies of at least 75%. The 52% efficient at 20 mA Cree blues probably already get close to that at a few mA. It's just a matter of developing ways to prevent efficiency "droop" as we increase current to useable levels.
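
For white LEDs, the lm/W figures and the conversion-efficiency percentages above are linked by the luminous efficacy of the emitted spectrum. The ~330 lm per radiant watt used in this sketch is an assumed typical value for a cool-white blue-plus-YAG spectrum, implied by the 131 lm/W ≈ 40% pairing above rather than taken from any datasheet:

```python
# For a white LED: (lm per electrical W) = conversion efficiency * (lm per radiant W).
# Assume ~330 lm per radiant watt for a typical blue + YAG phosphor spectrum.

LM_PER_RADIANT_W_WHITE = 330  # assumed luminous efficacy of the emitted white spectrum

def conversion_efficiency(lm_per_electrical_w):
    return lm_per_electrical_w / LM_PER_RADIANT_W_WHITE

print(conversion_efficiency(131))  # ~0.40, the Cree 131 lm/W sample
print(conversion_efficiency(100))  # ~0.30, the coming 100 lm/W parts
```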
 
Thank you! So judging from past responses, would it be safe to say that laser diodes attain higher efficiencies than LEDs, or is it the other way around?
 
Thank you! So judging from past responses, would it be safe to say that laser diodes attain higher efficiencies than LEDs, or is it the other way around?
At the present time laser diodes in general can attain higher efficiencies, but my guess is that long term LEDs should come close.
 
