jtr1962
Flashaholic
The last year has seen enormous leaps in white LED efficiency after about two years of stagnation in the 40 lm/W area. The next year promises to be just as exciting, if not more so, as production LEDs break the 100 lm/W barrier. Besides being an important psychological milestone, roughly 100 lm/W is the point where LEDs match any other available white light source. What interests me even more is the accompanying reduction in waste heat as LEDs get more efficient. The earliest white LEDs were less than 5% efficient, so for all practical purposes the waste heat equaled the input power. When 35 lm/W was reached, that meant about 10% efficiency. You could use a figure of 90% of input power for the waste heat, but it didn't hurt accuracy much to keep assuming that waste heat equaled input power.
The Cree XR-E late last year started to change all that. With the debut of 100 lm/W LEDs, only about 70% of input power will appear as waste heat. It's starting to make sense to use wall-plug efficiency when calculating waste heat in order to properly size heat sinks. Anyway, it occurred to me a few weeks ago that while more efficient LEDs are a good thing, we will eventually reach a point of diminishing returns. I think the following set of calculations for the amount of power and cooling needed to replace a 100 watt incandescent lamp (1700 lumens) should make that clear:
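Since the numbers follow directly from the 1700 lumen target and the assumed 330 lm/W ceiling, here's a rough sketch of the arithmetic in Python for anyone who wants to play with it. The efficiency steps are just ones I picked for illustration:

TARGET_LUMENS = 1700.0   # light output of a 100 watt incandescent lamp
MAX_LM_PER_W  = 330.0    # assumed 100% efficiency point

def replacement_numbers(efficiency):
    # Input power and waste heat needed to hit TARGET_LUMENS at a given
    # wall-plug efficiency (0.0 to 1.0).
    lm_per_watt = efficiency * MAX_LM_PER_W
    input_power = TARGET_LUMENS / lm_per_watt
    waste_heat  = input_power * (1.0 - efficiency)
    return lm_per_watt, input_power, waste_heat

print("eff%   lm/W   input W   waste W")
for eff in (0.10, 0.20, 0.30, 0.50, 0.75, 0.90, 1.00):
    lpw, p_in, p_waste = replacement_numbers(eff)
    print(f"{eff:4.0%}  {lpw:5.0f}  {p_in:8.1f}  {p_waste:8.1f}")

At 30% efficiency (today's 100 lm/W LEDs) that works out to about 17 W in and 12 W of heat; at 75% it's under 7 W in and under 2 W of heat; at 90% it's under 0.6 W of heat.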
In all cases I used 330 lm/W as 100% efficiency. The exact value doesn't matter much, though; what matters are the percentage efficiency numbers. As we can see, where we are now requires either a very large passive heat sink or a smaller fan-cooled one to effectively replace a 100 watt incandescent lamp. Note that as efficiency increases further, cooling requirements drop dramatically. Once we get to 75% efficiency, a figure I feel we can reach, we can use small passive heat sinks not much larger than the bulb base. What I find even more interesting is that there is little real advantage to going much above 90% efficiency. Sure, power requirements drop a bit, but at 90% efficiency we can pretty much fit the lamp inside the socket without any additional cooling. We may never even reach 90% efficiency, but if we do, I'm not sure it would pay to expend vast sums to gain that last 10%. If gradual process improvements can make it happen, wonderful, but if they never do it probably won't matter.
Also note that direct replacement of lower-wattage incandescent lamps will hit the point of diminishing returns well before 90% efficiency. I used direct incandescent lamp replacement as a benchmark because it probably represents the most demanding cooling requirements. With a purpose-built fixture you can easily use passive cooling even with 10% efficient LEDs and still put out thousands of lumens, as the quick example below shows.
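As a quick illustration of that last point (the 3000 lumen target here is just an example I picked, not from any particular fixture):

# Purpose-built fixture with today's roughly 10% efficient LEDs
lumens, efficiency = 3000, 0.10           # illustrative 3000 lm target
lm_per_watt = efficiency * 330            # about 33 lm/W
watts_in = lumens / lm_per_watt           # roughly 91 W of input power
waste_heat = watts_in * (1 - efficiency)  # roughly 82 W to dissipate
print(f"{watts_in:.0f} W in, {waste_heat:.0f} W of heat")

That's a lot of heat, but spread over a large fixture housing rather than a bulb-sized base it can still be shed without a fan.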