Most Efficient LED 2020?

Remembertheslap (Newly Enlightened, joined Dec 12, 2020, 51 messages)
Hi, happy new year everybody. Say, what is the most efficient LED at the moment? Is there some kind of ranking for efficiency? Don't care about tint, CRI, etc.; I just need to know which one(s) put out the most lumens per watt. Cheers, wishing you a better year ahead.
 
If no one knows what the most efficient LED at the moment is, maybe a member
can tell us which of the more efficient LEDs came out recently?

My ignorant guess would be that if the OP wants something that puts out more
lumens for the same amount of power on a clear night, he would be better off
with a cool white LED that was brought to market recently, because the highly
desirable LEDs that render colors more accurately don't put out the same level
of light.

Or if the OP wants to see farther for the same amount of power, he can go with
a green light, as is available with a K1 in the W1 configuration.

But I am new to this and may misunderstand...
 
Moved this thread to the "LED" subforum, since the OP has not limited this to flashlights and it will have better visibility to the emitter experts there.
 
Mid-power LEDs, used in things like LED fluorescent-tube replacements, tend to be the most efficient. The highest I'm aware of at the present moment is 230 lm/W, from Samsung:

https://www.led-professional.com/pr...ciency-of-mid-power-packages-to-higher-levels

We're probably not going to get much higher than that with blue plus phosphor LEDs as we're approaching the maximum theoretical efficiency of 250 to 270 lm/W.
 
Yes, in high-CRI white it is impossible to get much higher than 270 lm/W no matter what the technology. The exact limit depends on color temperature, CRI, etc. 230 lm/W in a commercial product is impressive. And that was over a year and a half ago!
 
I've been reading up a bit on white LED efficacy and the theoretical limit with phosphor (and other) methods.

Estimates varied depending on the method of generation and the end criteria
(CCT, CRI); above 300 lumens/watt, with some above 400. One paper described
laser excitation, along with R+G versus yellow phosphors, etc. One discussion
included RGB generation, while noting some lag in efficacy improvements for
green, etc., and using LEDs other than blue with the phosphor.

In the end, of course, it's the practical limits that count, and it looks like
we're approaching some. The question in my mind is how far we "need" to go with
improvements, now that LEDs have well surpassed incans and fluorescents (though
not in all respects).

As far as higher efficacies go, some discussion of differentiating markets came
up, i.e., suitability for some applications but not others, e.g., industrial
lighting versus residential. Examples were high efficacy achieved at CCTs above
5000K, or light with a greenish tint.

There is also interesting reading out there about UV LEDs for germicidal use, a
timely topic these days, and about some of the reasons for the very low
efficiency of UV-C LEDs compared to visible LEDs.

Dave
 
You can get over 400 lm/W in theory with RGB but right now red and green LEDs lag blue in terms of efficiency. Also, that 400 lm/W number assumes 100% efficient emitters. In reality after two decades of development we're around 75% wall plug efficiency for blue emitters. Most likely efficiency will top out at 80% to 85%. If you reach similar numbers for red and green emitters then maybe you'll be able to emit white light at ~320 to 340 lm/W. That's a big if. Also note that this is for CRI 80 light. The numbers drop for higher CRIs.

Looking at blue plus phosphor LEDs, you'll always have losses in the phosphor conversion process. For CRI 80 light the efficacy of the emitted spectrum for phosphor white LEDs is roughly 330 lm/W. For CRI 95 it's about 270 lm/W. But you lose perhaps 15% in the conversion process. So even with 100% efficient blue emitters the best we can do with CRI 80 white is around 280 lm/W, maybe close to 300 lm/W with better phosphors. For CRI 95 the maximum is around 230 lm/W, assuming a 100% efficient blue emitter. As I said, we'll never get there. If we get to 85% efficient blue emitters then you'll have CRI 80 LEDs with roughly 240 to 255 lm/W efficiency. Note how close the best production LEDs already are to those numbers. With CRI 95 you'll be lucky to break 200 lm/W. Still, these numbers are more than double that of the next best light source.
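
To make that arithmetic explicit, here's a rough back-of-the-envelope sketch in Python. The LER, phosphor-loss, and WPE figures are just the round numbers from the two paragraphs above, not measured or datasheet values:

# Back-of-the-envelope efficacy ceilings for white LEDs.
# efficacy ~= LER of the emitted spectrum x phosphor conversion
#             efficiency x emitter wall-plug efficiency (WPE).

def phosphor_white(ler_lm_per_w, phosphor_eff, wpe):
    # Blue pump + phosphor: losses multiply down the chain.
    return ler_lm_per_w * phosphor_eff * wpe

def rgb_white(ler_lm_per_w, wpe):
    # Direct RGB mix: no phosphor conversion step.
    return ler_lm_per_w * wpe

print(phosphor_white(330, 0.85, 1.00))  # CRI 80, perfect blue emitter: ~280 lm/W
print(phosphor_white(330, 0.85, 0.85))  # CRI 80, 85%-WPE emitter:      ~240 lm/W
print(phosphor_white(270, 0.85, 1.00))  # CRI 95 ceiling:               ~230 lm/W
print(phosphor_white(270, 0.85, 0.85))  # CRI 95, realistic emitter:    ~195 lm/W
print(rgb_white(400, 0.80))             # RGB at 80% WPE:               ~320 lm/W
print(rgb_white(400, 0.85))             # RGB at 85% WPE:               ~340 lm/W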

We don't really "need" to go much further in terms of efficiency improvements anyway. From my standpoint, the biggest benefit of slightly more efficient LEDs is better heat management, not slightly less energy use. For example, those 230 lm/W LEDs I linked to earlier have a wall-plug efficiency of perhaps 70%. If we're using them in a 200 watt light bulb replacement (3200 lumens), then we'll be using 13.9 watts of power and generating 4.2 watts of waste heat. If we manage to reach a wall-plug efficiency of 80%, then we'll be using 12.2 watts of power but generating only 2.5 watts of waste heat. A relatively small improvement in wall plug efficiency cuts waste heat generation by 40%, even though you're only using 12% less power.
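
A quick sketch of that bulb arithmetic, for anyone who wants to try other numbers (Python; the 230 lm/W at ~70% WPE starting point is the assumption from the parts linked above, and the hypothetical 80% part just scales efficacy in proportion to WPE):

# Input power and waste heat for a 3200 lm (200 W equivalent) bulb.
def bulb(lumens, efficacy_lm_per_w, wpe):
    power_in = lumens / efficacy_lm_per_w  # electrical input, W
    waste_heat = power_in * (1.0 - wpe)    # everything that isn't light, W
    return power_in, waste_heat

print(bulb(3200, 230, 0.70))                # ~13.9 W in, ~4.2 W of heat
print(bulb(3200, 230 * 0.80 / 0.70, 0.80))  # ~12.2 W in, ~2.4 W of heat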
 
The big efficiency leaps of the past have come to a close; the dominant white LED technology (blue diode+yellow phosphor) is a very mature product. There will be tweaks to efficiency, but they won't be for the sake of MOAR LUMENS. I believe that thermal ruggedness is now the goal: improve the LED such that it can operate at higher temperatures without damage or performance sag, reduce or eliminate the need for such prominent heatsinking.

Philips invented a blue-pumped green LED several years ago, which greatly narrowed the dramatic performance gap between high-flux green and high-flux blue die. Red is still well behind blue - maybe someone will find the right phosphor formulation to get blue to make red and we'll see a high-efficiency RGB white LED that's more efficient than blue+yellow.

The other two technologies I'm aware of - NUV+white phosphor and ZnSe - don't seem to have achieved market acceptance. NUV+white attempted to replicate the fluorescent-lamp mechanism using simple DC, and such parts were available on the market for a time; I gather that package degradation from leaking UV meant short-ish lifespans, and I suspect that the NUV die were not very efficient relative to blue. ZnSe had awful color rendition (relative to even "angry Nichia" blue) and I doubt it was ever a regular production item for the entity (or entities) that made it, so those were surely passing novelties even 15 years ago.
 
We seem to have lost interest in NUV + phosphor mainly for efficiency reasons. As wavelengths get shorter, efficiency goes down dramatically. See here. Basically, we'll be able to use the same mechanisms we used to increase blue LED efficiency, but UV LEDs are about 20 years behind. Then you have the second problem. Even if we make UV LEDs with ~75% to 80% efficiency, the Stokes losses in the phosphor conversion process are inherently higher, since they're proportional to the difference between the emitted wavelength and the pumping wavelength. Really, at this stage there's probably no point to UV + phosphor. However, the pandemic gave us a reason to continue development of UV LEDs which otherwise wouldn't have existed, namely for sterilization purposes. The 220 to 230 nm band is particularly interesting because those wavelengths kill pathogens without affecting skin or eyes. If we make efficient, long-life emitters in that region, we can continuously disinfect public spaces.
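
A quick illustration of why a deeper-UV pump hurts (Python; the 450 nm blue, 365 nm NUV, and 550 nm emission wavelengths are just illustrative picks, not specific parts):

# Stokes loss: fraction of a pump photon's energy lost when it is
# down-converted to a longer emission wavelength. Photon energy scales
# as 1/wavelength, so the lost fraction is 1 - pump/emit.
def stokes_loss(pump_nm, emit_nm):
    return 1.0 - pump_nm / emit_nm

print(stokes_loss(450, 550))  # blue pump -> yellow-green: ~18% lost
print(stokes_loss(365, 550))  # NUV pump  -> same output:  ~34% lost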

My guess is if we eventually move away from blue plus phosphor, it will be to RGB, or even RAGB for better CRI.

Increased LED efficiency is already dramatically reducing the need for heatsinking and the need for LEDs to operate at higher temperatures.
 
Lots of interesting facts and discussion here.

Not doubting 230 lumens/watt, but looking at the Samsung datasheets, how is it derived, and under which conditions?

From my understanding for efficacy in general:

  • increasing CRI -> efficacy decreases
  • increasing CCT -> efficacy increases
  • increasing drive current -> efficacy decreases

Regarding alternative methods like RGB or RxGB, I imagine getting the correct
colour mix, and maintaining it over operating conditions and LED life, would be
challenging.


Dave
 
Not doubting 230 lumens/watt, but looking at the Samsung datasheets, how is it derived, and under which conditions?

The product page claims the following:
  • 65mA If
  • 3.0 Vf
  • 38.8 Lm

38.8Lm / (0.065A x 3V) = 38.8Lm / 0.195W = 198.97 Lm/W

However, this is likely representative of a typical product rather than the top-binned product, which is invariably where the performance claim originates.

The CRI70 data sheet tells a more complete story. If we go for the lower limits of Vf (AY bin) and the upper limits of flux at CCT 4000K and higher (SM bin), using midpoint values, the claim begins to materialize:
41Lm / (0.065A x 2.65V) = 41Lm / 0.172W = 238.03 Lm/W

One suspects that the premium AY fine-binning isn't available, thus the Vf is the average for the coarse bin, (2.6+2.9)/2 = 2.75V, which leads to 229.37 Lm/W - and with only the slightest nudging that becomes 230 Lm/W, a fudge of a mere 0.3%.
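
For anyone who wants to plug in other bins, here's the same arithmetic as a tiny Python sketch (the flux, current, and voltage values are the datasheet figures quoted above):

# efficacy (lm/W) = flux / (If x Vf)
def efficacy(lumens, amps, volts):
    return lumens / (amps * volts)

print(efficacy(38.8, 0.065, 3.00))  # product-page typicals:  ~199 lm/W
print(efficacy(41.0, 0.065, 2.65))  # SM flux bin, AY Vf bin: ~238 lm/W
print(efficacy(41.0, 0.065, 2.75))  # coarse-bin average Vf:  ~229 lm/W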

EDIT: Also of note is that Samsung is using a very old trick - rating at 25C (slightly above 'room temperature', which generates more flux) while the rest of the industry has moved on to more real-world 85C binning. If their output-vs-temperature chart is accurate, it looks like you'll only lose about 6% at 85C vs. 25C - not a bad drop at all. Vf will also drop, to about 97%, at that temperature, which largely negates the output loss in terms of Lm/W.
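
Putting those two derating numbers together (the ~94% flux and ~97% Vf factors are read off the charts as described above, so treat them as approximate):

# Net lm/W change going from 25C to 85C ratings:
flux_factor = 0.94  # ~6% flux loss at 85C
vf_factor = 0.97    # Vf drops to ~97%, cutting input power too
print(flux_factor / vf_factor)  # ~0.97, i.e. only ~3% net efficacy penalty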
 
The 220 to 230 nm band is particularly interesting because those wavelengths kill pathogens without affecting skin or eyes. If we make efficient, long-life emitters in that region, we can continuously disinfect public spaces.
So so good to see your digital face. I've been away a while. Thanks for that post. You taught me something I didn't know. Now you've got me curious.
 
So so good to see your digital face. I've been away a while. Thanks for that post. You taught me something I didn't know. Now you've got me curious.
I only started posting here again recently after being away for over 5 years. But anyway, until the pandemic came along there was really no incentive to do much R&D for UV LEDs. UV plus phosphor could never reach the efficiency of blue plus phosphor for white light. And the market for UV LEDs for disinfecting was niche at best but the pandemic changed that. It'll still take well over a decade for far UV LEDs to catch up to blue in terms of WPE, but at least now there's an incentive to push forward with their development.
 