Some thoughts on the practical limits of LED efficiency

jtr1962
Flashaholic
Joined: Nov 22, 2003
Messages: 7,505
Location: Flushing, NY
The last year has seen enormous leaps in white LED efficiency after about two years of stagnation in the 40 lm/W area. The next year promises to be just as exciting, if not more so, as we have production LEDs breaking the 100 lm/W barrier. Besides being an important psychological barrier, 100 lm/W is roughly the point where LEDs become the equal of any other available white light source. What interests me even more is the accompanying reduction in waste heat as LEDs get more efficient. The earliest white LEDs were less than 5% efficient, so for all practical purposes the waste heat equaled the input power. When 35 lm/W was reached, this meant about 10% efficiency. You could use a figure of 90% of input power for the waste heat generated, but accuracy didn't suffer much if you still assumed waste heat equaled input power.
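For anyone who wants to play with those numbers, here's a minimal sketch of that waste-heat estimate in Python, using the same ~330 lm/W "100% efficiency" reference I use in the calculations below:

```python
# Waste heat as a fraction of input power, assuming ~330 lm/W represents 100%
# efficiency for a white spectrum (the same reference used in the table below).
MAX_EFFICACY = 330.0  # lm/W at 100% efficiency (assumed reference)

def waste_heat_fraction(efficacy_lm_per_w):
    """Fraction of input power that ends up as heat in the LED package."""
    efficiency = efficacy_lm_per_w / MAX_EFFICACY
    return 1.0 - efficiency

for efficacy in (35, 100, 150):
    print(f"{efficacy:>3} lm/W -> {waste_heat_fraction(efficacy):.0%} of input power becomes heat")
# 35 lm/W  -> ~89% (close enough to "waste heat equals input power")
# 100 lm/W -> ~70%, the figure mentioned below for today's best emitters
```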

The Cree XR-E late last year started to change all that. With the debut of 100 lm/W LEDs, only 70% of input power will appear as waste heat. It's starting to make sense to use wall-plug efficiency when calculating waste heat in order to properly size heat sinks. Anyway, it occurred to me a few weeks ago that while more efficient LEDs are a good thing, we will eventually reach a point of diminishing returns. I think the following set of calculations for the amount of power and cooling needed to replace a 100 watt incandescent lamp (1700 lumens) should make that clear:

[Image: LED_Cooling_Comparison.gif - power and cooling needed to produce 1700 lumens at various LED efficiencies]


In all cases I used 330 lm/W as 100% efficiency. The exact value doesn't matter, though. What does matter are the percentage efficiency numbers. As we can see, the point where we are now requires either a very large passive heat sink or a smaller fan-cooled one to effectively replace a 100 watt incandescent lamp. Note that as efficiency increases further, cooling requirements drop dramatically. Once we get to 75% efficiency, a figure I feel we can reach, we can use small passive heat sinks not much larger than the bulb base. What I find even more interesting is that there aren't really any advantages once we get much above 90% efficiency. Sure, power requirements drop a bit, but at 90% efficiency we can pretty much fit the lamp inside the socket without any additional cooling. While we may not reach even 90% efficiency, once we do I'm not sure if it would pay to expend vast sums to gain that last 10%. If gradual process improvements can make it happen, wonderful, but if it never does it probably won't matter.
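Here's a rough sketch of the arithmetic behind that chart, for anyone who wants to try other efficiency values (again assuming 330 lm/W = 100%; the exact limit doesn't change the trend):

```python
# Input power and waste heat needed to match a 100 W incandescent (~1700 lumens)
# at various LED efficiencies, assuming 330 lm/W corresponds to 100% efficiency.
TARGET_LUMENS = 1700.0
MAX_EFFICACY = 330.0  # lm/W (assumed 100% reference)

for eff in (0.10, 0.30, 0.50, 0.75, 0.90, 1.00):
    efficacy = eff * MAX_EFFICACY           # lm/W at this efficiency
    input_power = TARGET_LUMENS / efficacy  # W drawn by the LEDs
    waste_heat = input_power * (1 - eff)    # W the heat sink must get rid of
    print(f"{eff:>4.0%}: {input_power:5.1f} W in, {waste_heat:5.1f} W of waste heat")
# At 30% (about where we are now): ~17 W in, ~12 W of heat, hence the big sink or fan.
# At 75%: ~6.9 W in, ~1.7 W of heat. At 90%: ~5.7 W in, ~0.6 W of heat.
```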

Also note that direct replacement of lower-wattage incandescent lamps will hit the point of diminishing returns well before 90% efficiency. I used direct incandescent lamp replacement as a benchmark because it probably represents the most rigorous cooling requirements. With a purpose-built fixture, you can easily use passive cooling even with 10% efficient LEDs and still deliver thousands of lumens of light.
 
Interesting way to look at things. I would hope that when we hit 250 lm/W, we would stop this nonsense of packing things into existing lighting fixtures (the screw base and A-type bulb are totally the wrong format for LEDs) and just replace the whole fixture with one built from the ground up as an LED lighting fixture. Hopefully, we can also get past the notion that lighting a room means a central fixture, and instead try more diffuse overhead lighting solutions.
 
I would hope that when we hit 250 lm/W, we would stop this nonsense of packing things into existing lighting fixtures (the screw base and A-type bulb are totally the wrong format for LEDs) and just replace the whole fixture with one built from the ground up as an LED lighting fixture. Hopefully, we can also get past the notion that lighting a room means a central fixture, and instead try more diffuse overhead lighting solutions.
I agree wholeheartedly on both points, especially on the diffuse overhead lighting idea. I'm sure we'll have purpose-built LED fixtures in time, just as we now have linear fluorescent fixtures. In the meantime, though, the quickest way to get people to adopt LEDs will be direct screw-in replacements, even though that makes dealing with the waste heat the most difficult. Once people start to see how long the LEDs last, it will occur to them that maybe you don't need to design a fixture with replacement in mind. Besides that, LEDs naturally lend themselves to much flatter, far less obtrusive fixtures. That will be when purpose-built LED fixtures finally take off.
 
I would hope that when we hit 250 lm/W, we would stop this nonsense of packing things into existing lighting fixtures
Considering the fact that the entire point of the modern incandescent fixture is to facilitate regular replacement of the light source, it would seem perfectly logical... but humans are not primarily rational beings. I suspect the Edison screw still has a few decades left.
 
One catch to looking at efficacy numbers is that they always quote the 350 mA values, while 700-1000 mA is more typical in use.

I agree we need a standard base for LED lighting, something that would position a small light source precisely in an optic.
 
jtr1962 said:
Sure, power requirements drop a bit, but at 90% efficiency we can pretty much fit the lamp inside the socket without any additional cooling. While we may not reach even 90% efficiency, once we do I'm not sure if it would pay to expend vast sums to gain that last 10%. If gradual process improvements can make it happen, wonderful, but if it never does it probably won't matter.

Where that will pay off is in power headroom.

Try rearranging your data, but instead with this criterion: determine the maximum luminous output that is achievable with a given heatsink -- say, a small passive one. From that standpoint, an improvement from 90% to 95% efficiency yields no perceivable advantage at a given power level, and affords minimal power savings for a given luminous flux -- but the maximum luminous flux for a given waste-heat limit DOUBLES.
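To put rough numbers on that (a quick sketch, using the same assumed 330 lm/W = 100% reference and an arbitrary 10 W budget for a small passive heat sink):

```python
# Maximum light output for a fixed waste-heat budget. The 330 lm/W "100%"
# reference and the 10 W heat budget are both illustrative assumptions.
MAX_EFFICACY = 330.0   # lm/W at 100% efficiency (assumed)
HEAT_BUDGET_W = 10.0   # what a small passive heat sink might handle (assumed)

def max_lumens(efficiency):
    input_power = HEAT_BUDGET_W / (1 - efficiency)   # most power we can feed it
    return input_power * efficiency * MAX_EFFICACY   # lumens at that power

for eff in (0.90, 0.95):
    print(f"{eff:.0%} efficient: {max_lumens(eff):,.0f} lumens from the same heat sink")
# 90% -> ~29,700 lm; 95% -> ~62,700 lm, a bit more than double.
```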

When you look at it that way, then you realize that there is considerable incentive to squeeze out that 90% or more efficiency, when you are looking at really high-powered applications like stadium lighting or film studio/location lighting, say.

After all, why should we stay with the output of a 100W incandescent as "room lighting"? We do, in part, because of the heat buildup as well as the power consumption. Just as people will begin to discover that LEDs give them many more color choices than they had before, they might also start wanting much brighter light as a matter of course, once they are no longer heat-limited as we are with incans.
 
Try rearranging your data, but instead with this criterion: determine the maximum luminous output that is achievable with a given heatsink -- say, a small passive one. From that standpoint, an improvement from 90% to 95% efficiency yields no perceivable advantage at a given power level, and affords minimal power savings for a given luminous flux -- but the maximum luminous flux for a given waste-heat limit DOUBLES.
If you look at it that way then, yes, if you're trying to replace a 200 watt incandescent with LEDs there is some advantage in going to 95% efficiency. The reason I used 100 watts as the benchmark is that I seldom see anyone using bulbs larger than that. Point taken, though. I did mention that lower-wattage lamps have a lower practical efficiency limit, so it follows that higher-wattage ones would have a higher limit.

When you look at it that way, then you realize that there is considerable incentive to squeeze out that 90% or more efficiency, when you are looking at really high-powered applications like stadium lighting or film studio/location lighting, say.
I thought of that but didn't mention it in my post. My rationale was that studio/stadium lighting is purpose-built and inherently fairly large due to the optics. Because of that, higher efficiency wouldn't necessarily give a size advantage. Let's consider a 100,000 lumen stadium light, for example. At 90% efficiency (~300 lm/W) it uses 333 watts and produces 33 watts of waste heat. The size of even a passive heat sink in this case doesn't really represent a design constraint given the size of the needed reflector. Don't get me wrong, I'm all for efficiency increases. I just don't feel that once we reach about 90% we should expend vast sums to gain that last 10% more quickly, but rather just let gradual process improvements get us there instead. There is indeed something inherently cool (no pun intended) about near 100% efficient lighting. You could have a 10,000 lumen flamethrower (take that, you incandescent guys ;) ) in a keychain lamp if your battery could handle the load (you would still need about 30 watts to power it even at 100% efficiency; nothing can be done about that).
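A quick sketch checking that arithmetic (the round 300 lm/W and 330 lm/W figures are the ones used above):

```python
# Input power and waste heat for the two examples above.
def power_and_heat(lumens, efficacy_lm_per_w, efficiency):
    input_power = lumens / efficacy_lm_per_w             # W drawn
    return input_power, input_power * (1 - efficiency)   # (input W, waste-heat W)

print(power_and_heat(100_000, 300, 0.90))  # stadium light: ~(333 W, 33 W)
print(power_and_heat(10_000, 330, 1.00))   # keychain "flamethrower": ~(30 W, 0 W)
```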

After all, why should we stay with the output of a 100W incandescent as "room lighting"? We do, in part, because of the heat buildup as well as the power consumption. Just as people will begin to discover that LEDs give them many more color choices than they had before, they might also start wanting much brighter light as a matter of course, once they are no longer heat-limited as we are with incans.
I agree wholeheartedly. Each time I've changed to more efficient lighting, some of the extra efficiency was used to gain extra output. For a few years in the early 1980s I used two 200W incandescents (~7800 lumens total) in my workroom. When I redid the room I switched to two shoplights, each with two 4-foot T12 tubes (~10000 lumens total). Power consumption was reduced to about 150W counting ballast losses, but light output increased. When I went to T8 shoplights a few years ago, power consumption actually increased (I put in a third shoplight) to about 190W, but lumen output rose dramatically (~17000 lumens). I could imagine that with 300 lm/W LEDs I might aim for 150W of power usage. This would give me 45000 lumens. The room is 77 square feet, BTW, so with 45000 lumens it would be extremely bright.
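Just for fun, here's that progression in lm/W terms, using the same numbers:

```python
# System efficacy of each stage of the workroom lighting history above.
stages = [
    ("2 x 200 W incandescent",     400, 7_800),
    ("2 x T12 shoplights",         150, 10_000),
    ("3 x T8 shoplights",          190, 17_000),
    ("hypothetical 300 lm/W LEDs", 150, 45_000),
]
for name, watts, lumens in stages:
    print(f"{name:<28} {lumens / watts:5.0f} lm/W ({lumens:,} lm from {watts} W)")
# Roughly 20 -> 67 -> 89 -> 300 lm/W, with light output rising each time.
```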

I've often thought that, among other things, LEDs might make lighting rooms to levels of up to 10% of full sunlight practical. For example, a 150 square foot room at 10,000 lux (~10% of solar maximum) would require about 140,000 lumens. At 300 lm/W this requires roughly 465 watts plus driver losses, perhaps 500 watts total. Given that many people already use chandeliers with that much wattage, this isn't an onerous requirement. And since most of the time you wouldn't want the room that bright, you could dim the LEDs for a considerable power savings. Even lighting to 1000 lux, which is typical of a brightly lit office, would only require about 50 watts. No more need for people's homes to be dimly lit like caves just to save energy.
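The arithmetic behind those figures, as a sketch (the ~93% driver efficiency is just a number picked to land near the "perhaps 500 watts total" estimate, not a spec):

```python
# Lumens and power needed to light a room to a given illuminance, assuming the
# light is spread evenly over the floor area. Driver efficiency is an assumption.
SQFT_TO_M2 = 0.0929
LED_EFFICACY = 300.0      # lm/W, the hypothetical future LED used above
DRIVER_EFFICIENCY = 0.93  # assumed, to account for "driver losses"

def room_lighting(lux, area_sqft):
    lumens = lux * area_sqft * SQFT_TO_M2
    watts = lumens / LED_EFFICACY / DRIVER_EFFICIENCY
    return round(lumens), round(watts)

print(room_lighting(10_000, 150))  # about 139,000 lm and ~500 W: 10% of full sun
print(room_lighting(1_000, 150))   # about 14,000 lm and ~50 W: bright-office levels
```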
 
I've often thought that, among other things, LEDs might make lighting rooms to levels of up to 10% of full sunlight practical. For example, a 150 square foot room at 10,000 lux (~10% of solar maximum) would require about 140,000 lumens. At 300 lm/W this requires roughly 465 watts plus driver losses, perhaps 500 watts total. Given that many people already use chandeliers with that much wattage, this isn't an onerous requirement. And since most of the time you wouldn't want the room that bright, you could dim the LEDs for a considerable power savings. Even lighting to 1000 lux, which is typical of a brightly lit office, would only require about 50 watts. No more need for people's homes to be dimly lit like caves just to save energy.
In order to get that sort of brightness, however, IMO it will be a must to have a very diffuse light source designed to minimize glare -- possibly even luminescent panels lining the entire surface of the ceiling (like natural sky light), or something recessed with louvers to prevent line-of-sight to the emitters (just as the sun, when it is at its brightest, is up in the sky, not down in people's line of sight). Lighting a room to 10,000 lux from point sources behind lamp shades (like traditional indoor lighting) or hanging from chandeliers could lead to some pretty bad glare issues. I do think that 10% solar is a pretty ideal upper limit. Any brighter than that, and it will be necessary to wear sunglasses inside :crackup:

IMHO the biggest benefit of LEDs is that they give some freedom to move away from the "bright point source" design philosophy in more areas than just home lighting, because LEDs don't need to be scaled up to be efficient (unlike incan/HID). The ability to have a large number of small emitters with customized optics for things like spot lighting, offering greater control over light distribution, is a great idea. Likewise, using blue LEDs to drive phosphor panels, or eventually even OLEDs, to produce diffuse lighting fixtures that would be more like wallpaper than bulky office fluorescent fixtures is a great idea as well.
 
To boost LED popularity to the truly massive levels (many billions of power LEDs) necessary to fund the increasingly difficult push towards 200 lm/W and higher, we need only create affordable LED-based T8 and T5 retrofit "tubes". (I realize some are out now, but they're nowhere near affordable!) As mentioned above, people are much more willing to swap bulbs than change out entire fixtures. I think that once these retrofits begin to be seen in use, purpose-built LED office lighting will take off in a big way.

I think this will probably happen somewhere around the price point of $15.00 per "tube". Then the longer usable life of LEDs (about 2 to 3 times that of the fluorescent replaced), combined with their ability to withstand practically unlimited switching, will make their use "worth it" for many purchasers. I know I, for one, will start using these in the facility I service when they reach this range. I'm not looking to change out our fixtures, as this would be a HUGE expense. We already have modern high-efficiency electronic ballasts.
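Here's a very rough break-even sketch of that argument. Only the $15 price point and the 2-3x life figure come from this discussion; the fluorescent tube price and relamping labor are made-up placeholders for illustration:

```python
# Cost over one LED-tube lifetime vs. buying fluorescent tubes over the same
# period. Fluorescent price and labor per relamp are hypothetical placeholders.
LED_TUBE_PRICE = 15.00    # $ (the suggested price point)
LED_LIFE_MULTIPLE = 2.5   # "about 2 to 3 times" the fluorescent's life
FLUOR_TUBE_PRICE = 3.00   # $ per T8 tube (assumed)
RELAMP_LABOR = 2.00       # $ per tube change (assumed)

fluorescent_total = LED_LIFE_MULTIPLE * (FLUOR_TUBE_PRICE + RELAMP_LABOR)
led_total = LED_TUBE_PRICE + RELAMP_LABOR
print(f"Fluorescent: ~${fluorescent_total:.2f}, LED tube: ~${led_total:.2f}")
# With these placeholder numbers the LED tube is still somewhat more expensive
# over its life, so whether $15 is the tipping point depends heavily on local
# tube prices, relamping labor, and how much the switching tolerance is worth.
```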

T8 tubes run at a bit under 300 mA, perfect for the efficacies I need even with today's LEDs. Also needed in the "tube" is simple circuitry to deal with the starting voltage from the ballast while making the ballast think it has a proper lamp installed. There will probably have to be a continuous thin-wall aluminum heatsink covered by a tubular plastic cover.

Will these prices be possible anytime soon? :shrug: I don't know. :thinking: I think we need an LED with the 350 mA efficacy of the CREE XR-E, SSC P4, or Rebel 0100, but optimized for affordability rather than maximum possible current handling. :thumbsup:
 
This idea of 100% efficiency... that's based on a 1:1 ratio of electrons to photons, correct? I'm wondering why it wouldn't be possible to make an LED that produces high-frequency electromagnetic waves which would then strike a luminescent medium and emit two lower-frequency photons for every one high-energy photon? That wouldn't be creating or destroying mass/energy AFAIK, so it seems feasible. Of course, two hurdles: make high-frequency radiation with an LED, and then create a luminescent material that it would strike and energize rather than just pass right through (giving the person you're illuminating cancer in the process).

EDIT: It occurs to me... in the name of higher quality light, would it be possible to have a mixed-luminescent material? I mean, one composed of a mixture of different materials that would react to the high-energy light by emitting photons at different wavelengths? If you blended the material well and got the ingredients balanced, that could theoretically give you the potential to make an emitter with full visible-spectrum coverage... right?
 
This idea of 100% efficiency... that's based on a 1:1 ratio of electrons to photons, correct? I'm wondering why it wouldn't be possible to make an LED that produces high-frequency electromagnetic waves which would then strike a luminescent medium and emit two lower-frequency photons for every one high-energy photon?
Actually, 100% efficiency means that for every watt of input power you get one watt of radiated light energy. It doesn't necessarily mean one photon per electron. Anyway, what you're suggesting is similar in principle to the way phosphors work. The problem with generating light at a higher frequency and then converting to a lower frequency is that there are inherent losses (termed Stokes losses). For the present-day blue-plus-YAG-phosphor LEDs, the Stokes losses are on the order of 20%. I think the minimum loss in theory is around 12%. That means if we had a 100% efficient blue emitter, we could at best make an 88% efficient white LED. If the primary emitter is at a higher frequency, say in the deep UV range like fluorescent lights, the Stokes losses are inherently higher, perhaps well in excess of 30%. By the laws of physics, the only way we can make a 100% efficient white emitter is to mix light from 100% efficient red, green, and blue emitters (or perhaps even four or five emitters for greater color rendering).
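For the curious, the per-photon version of that Stokes loss is simple to work out, since photon energy scales as 1/wavelength; the wavelengths below are ballpark figures, not measured values:

```python
# Per-photon Stokes loss when a phosphor absorbs a short-wavelength photon and
# re-emits at a longer wavelength: fraction lost = 1 - (pump nm / emitted nm).
def stokes_loss(pump_nm, emit_nm):
    return 1.0 - pump_nm / emit_nm

print(f"450 nm blue -> 560 nm YAG yellow : {stokes_loss(450, 560):.0%} lost")  # ~20%
print(f"254 nm UV   -> 555 nm phosphor   : {stokes_loss(254, 555):.0%} lost")  # >50%
# These are per-converted-photon figures; the overall package loss also depends
# on how much blue passes through the phosphor unconverted.
```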
 
Hrmmm... that's good to know! Thanks for explaining things. :thumbsup: So your theory is that eventually we'll have multiple highly efficient colored emitters instead of single "white" emitters? That's interesting... that means eventually we'll all have multi-reflector lights! Or they'll shrink emitters down so far you could fit all the component colors on one little die... wouldn't that be nice! Can you imagine an RGB emitter balanced for color rendering the size of a current LED? I like this technology thing! :naughty:
 
EDIT: It occurs to me... in the name of higher quality light, would it be possible to have a mixed-luminescent material? I mean, one composed of a mixture of different materials that would react to the high-energy light by emitting photons at different wavelengths? If you blended the material well and got the ingredients balanced, that could theoretically give you the potential to make an emitter with full visible-spectrum coverage... right?
This is exactly how fluorescent tubes already work. A low-pressure mercury vapor discharge is used to generate UV radiation. Cheap tubes will use a single phosphor, but higher-end fluorescent tubes will use three or even four phosphors at different wavelengths to improve color rendering.

Current white LEDs use a combination of a blue die plus a yellow-green phosphor. Except for an absence of deep red (which gives many of these emitters a greenish tint), the color rendering on these is pretty good. A solution would be to use a two-phosphor mix, one that converts blue to yellow-green and another that converts blue to red. With such a combination, it would be possible to make a neutral white LED with very good color rendition.

So your theory is that eventually we'll have multiple highly-efficient colored emitters instead of single "white" emitters? That's interesting... that means eventually we'll all have multi-reflector lights! Or that means they'll shrink emitters down so far you could fit all the component colors on one little die.... wouldn't that be nice!
Multi-reflectors are pretty tacky. My guess is that they'll use multiple LED dice, perhaps 4 different colors (maybe red, amber, green, and blue) in a 2x2 layout. Custom-designed diffusers and optics built into the emitter package itself could then blend the beam smoothly. The one limitation in this sort of scenario, as opposed to a multi-phosphor LED, is that even color mixing would be a tradeoff with throw, since the diffuser needed to prevent the "rainbow effect" would severely impact the light's capacity for throw.

An advantage of multi-emitters over multi-phosphor, however, is that the ratios of the different colors could be changed on demand. The ability to have cool white light, warm white, or even things like monochromatic light from the same light source would be quite nice.

Ultimately, though, the biggest reason I believe we don't have a lot of options on the multi-emitter front is that right now, using blue + phosphor is by far the most efficient way to do it, because blue LEDs are way ahead of the others in terms of efficiency.
 
Yes, I think even the InGaN greens are only about 1/3 as efficient as the corresponding blue dice?

The rainbow effect could also be practically eliminated if technology advanced to the point that, let's say, red, amber, green, aqua, and blue dice were made very tiny. Then many tens, or even hundreds, could be grouped on one emitter such that they would take up only 1 mm by 1 mm. The resulting beam would then be pretty much what we've got now, with a negligible rainbow effect if enough sub-dice were used! :D
 
I was just wondering, since it seems like we have a lot of experts here :), what efficiency does a laser diode have at making light out of electricity?
Have a nice weekend guys and girls!
 
I always wonder if evan9162 and jtr1962 are related, as their 'numbers' are strangely alike. I know one thing they have in common: both of their LED articles are hardcore and take some time to digest. :thumbsup: for evan9162 and jtr1962. Thanks!
 