Moore's Law vs. LED tech

SmokeDiver · Newly Enlightened · Joined Jun 17, 2006 · Messages: 62 · Location: Chicagoland
Here's a question: has anybody out there in CPF land done a sort of Moore's Law look at LED technology? Meaning, is there a way to reasonably predict where the state of the art in LED technology will be in 6, 12, or 18-24 months? Using Fenix lights as an example, am I better off buying the P3D now, or waiting 4 months or so for the next generation in LED efficiency to come out and give me that extra 30 lumens? Once again, not to pick on Fenix, but it is sort of a benchmark light in that it is cheap, of good or better quality, and every time I seem to be content with my latest purchase, another light a few steps above comes out. Should I wait until a 200-lumen LED comes out? If so, is it worth the extra $$ to have it now, vs. the time (usually 4-6 months) for the technology to filter down to Fenix or DX? Three years from now, am I going to have an EDC with a CR123A that can put out 300 lumens for an hour?

http://www.intel.com/technology/mooreslaw/

http://en.wikipedia.org/wiki/Moore's_law
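For what it's worth, the kind of extrapolation the OP is asking about is easy to sketch. The 18-month doubling time and 100-lumen starting point below are made-up assumptions for illustration, not measured LED trends:

```python
def projected_lumens(current_lumens, months_out, doubling_months=18):
    """Extrapolate output assuming it doubles every `doubling_months`
    (a Moore's-law-style assumption, not a measured LED trend)."""
    return current_lumens * 2 ** (months_out / doubling_months)

# Starting from a hypothetical 100-lumen light today:
for months in (6, 12, 18, 36):
    print(f"{months:2d} months out: ~{projected_lumens(100, months):.0f} lumens")
```

Under that (big) assumption, a 100-lumen light today becomes a 200-lumen light in 18 months and a 400-lumen light in three years; whether LEDs actually follow such a curve is exactly the question.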
 
How many lumens do you want? The Fenixes are aimed at the EDC crowd, not the ridiculously-rough-use crowd. I don't really need more than, say, 50 lumens for most tasks, or more than a few hours of runtime. Gimme a couple of levels and I'm a happy guy. Sure, lights will get brighter and longer-running eventually, but if there's one on the market now that fits your needs, go for it.
 
SmokeDiver said:
...Should I wait until a 200 lumen LED comes out? If so, is it worth the extra $$ to have it now, vs. the time (usually 4-6 months) for the technology to filter down to Fenix, or DX?...http://en.wikipedia.org/wiki/Moore's_law
I think Moore's Law is more an article of faith than a physical law, so its application to LED technology instead of chip fabrication technology is a leap of faith, not science. But yes, LEDs will get better.

Only you can say if it's worth waiting or not. The best-value LED flashlight (or laptop) you could ever buy is the one you get the day before you die; the worst one is the one you get today.
 
LEDs actually have a maximum theoretical efficiency (i.e., one photon out per input electron). I'm not sure where they are at the moment, maybe 30-40%? Eventually, in a few years' time, we'll be at 70-80%, so improvements beyond that point will necessarily be incremental. This is unlike computers, where the maximum computational power is essentially limited only by the ability to engineer new substrates to perform the computation on.

At the moment though we're at the point where radical LED improvements are possible (witness Cree vs Luxeon) which could very well blow away your existing lights in terms of run time, brightness, heat output, tint etc.

It's like anything, keep waiting long enough for something better to come along and you won't get the chance to enjoy what's available NOW.
 
All the lights I bought 18 months ago are obsolete now with the Cree LEDs. I imagine better bins will improve on what's available now, too. It's just like computers in the '90s. Not that my old lights are useless, though. It just depends how fast you want to keep up with the Joneses.

For me I think when they double the lumens I'll buy another, for instance when a P2D type comes along with 200 lumens. Otherwise you may as well chase your tail. You'll never catch it. :laughing:
 
Although efficiency can't continue to improve forever, there are other areas where LEDs can, and will, improve:

1) Power output -- right now most emitters can only handle about 3W of input, and the biggest limiting factor is heat. As LEDs get more efficient, though (say, approaching theoretical limits), even small improvements in efficiency will lead to big drops in the amount of waste heat generated inside the emitter.

2) Color rendering -- right now, LEDs are weak in red output. Also, their strong blue output makes them poor choices for use in fog, as blue light is the most prone to scattering. Using two phosphors, or multiple emitters in the same package, to achieve improved color rendering and a more neutral white output leaves a lot of room for improvement.

3) Price -- right now, high-power LEDs are quite expensive. However, the materials used to make them aren't inherently expensive; it's just that the process is expensive and the economy of scale isn't there. As LEDs start to be adopted for general lighting, expect them to become a lot cheaper, like so many other semiconductor devices (think microchips in $2 toys). Eventually these will probably cost next to nothing, instead of $5 an emitter.
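To put a rough number on the waste-heat point in (1): for the same light output, each step up in efficiency cuts the heat the emitter has to shed, and the cuts get proportionally bigger near the top. A sketch, using ~242 lm/W (a commonly cited ceiling for white light) as the efficacy of a perfect emitter; the 200-lumen figure is just an example:

```python
def waste_heat_watts(lumens_out, wall_plug_eff, max_efficacy=242.0):
    """Heat generated inside the emitter for a fixed light output.
    max_efficacy (~242 lm/W) is a rough ceiling for white light."""
    radiant_watts = lumens_out / max_efficacy      # optical power needed
    input_watts = radiant_watts / wall_plug_eff    # electrical power drawn
    return input_watts - radiant_watts             # the rest becomes heat

# Same 200-lumen output at increasing efficiency:
for eff in (0.3, 0.5, 0.7, 0.9):
    print(f"{eff:.0%} efficient: {waste_heat_watts(200, eff):.2f} W of heat")
```

Going from 30% to 50% roughly halves the heat; going from 70% to 90% cuts what little is left by almost 4x again, which is why high efficiency unlocks higher drive currents.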
 
Whether or not LEDs are going to continue to do a "Moore's Law," they don't have to improve very much to surpass everything else in efficiency. After that it's just a question of color, and LEDs can produce light of any color.
 
nerdgineer said:
Only you can say if its worth waiting or not. The best value LED flashlight (or laptop) you could ever buy is the one you get the day before you die, the worst one is the one you get today.
And I'm sure you can work out which you'll get most use out of.
 
I think that it is important to understand the difference between functional and technical obsolescence. A light is technically obsolete when a different light is produced that has superior properties, i.e., brighter, smaller, longer runtime, etc. For instance, the computer I (and probably most everybody else) am using right now is technically obsolete, based solely on the fact that there are faster, more powerful computers available.

A product is functionally obsolete when we can no longer use that product for some reason, or because the cost to do so is higher than adopting new technology. For example: if a new flashlight came out that was so efficient that the cost of batteries for my old flashlight was more expensive than the new light, the old light would be functionally obsolete. If, for some reason, 123 batteries could no longer be made due to regulatory changes, many of my lights would be functionally obsolete. Otherwise, I'm keeping my old lights; they are just as useful as they were before.

The slide rule was made technically obsolete in a very short time in the 1970's, but a slide rule is still functional. Sometimes even more so than an electronic calculator. If you had to make calculations in the rain, under water, or in an environment not conducive to electronic devices, a slide rule would be a good tool to have. The SUV did not functionally obsolete the horse, but Novatac did functionally obsolete all of the HDS accessories by changing the threads.

While LEDs themselves may have a theoretical limit, power sources do not, so it is possible for flashlights to stay on the curve of Moore's Law. Market forces may not drive flashlight technology like the semiconductor market does, but I'm sure we will continue to see significant improvements for years to come.
 
I agree with aggiegrads: eventually, the greatest leaps in flashlight technology will come from batteries. If you have an 80% efficient light, another 5% increase in efficiency is much less significant than a 20% increase in battery capacity. That isn't quite true yet, though, because heat is still a big problem for LEDs. Still, once we get the efficiency high enough that very little heat is given off, we can start to improve how much power we can run through a single LED. You can get a 100W incan bulb, but an LED? I don't think so.
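That trade-off is easy to put numbers on: runtime at constant brightness scales with (battery energy × efficiency). A sketch; the 5 Wh cell, 100-lumen output, and ~242 lm/W white-light ceiling are made-up illustration values, and only the ratios matter:

```python
def runtime_hours(capacity_wh, wall_plug_eff, lumens, max_efficacy=242.0):
    """Runtime at constant output: battery energy / electrical draw.
    max_efficacy (~242 lm/W) approximates a perfect white emitter."""
    electrical_watts = (lumens / max_efficacy) / wall_plug_eff
    return capacity_wh / electrical_watts

base        = runtime_hours(5.0, 0.80, 100)  # hypothetical cell, 80% LED
better_led  = runtime_hours(5.0, 0.85, 100)  # +5 points of LED efficiency
better_cell = runtime_hours(6.0, 0.80, 100)  # +20% battery capacity

print(f"LED upgrade:  +{better_led / base - 1:.1%} runtime")
print(f"Cell upgrade: +{better_cell / base - 1:.1%} runtime")
```

Once the LED is already at 80%, the 5-point bump only buys about 6% more runtime, while the 20% bigger cell buys the full 20%, which is the point being made above.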
 
Moore's law doesn't apply to LEDs at all. Given the limits on LED efficiency, it's fairly safe to say we've had the one big leap, from Luxeon IIIs to the XR-E / SSC P4. We won't ever see an overnight doubling in efficiency again; from here it will all be incremental, at most 20% or so at a time. That's not to say they won't get much better than they are now, but the step change we have just seen, which shocked many here, is the only one like it, so upgrade now and don't worry too much about future improvements.
 
chris_m said:
Moore's law doesn't apply to LEDs at all.
I agree. Moore's law is based on the size of an individual transistor. Basically, as you reduce the size of the transistor by new process, you increase the number of transistors you can put on an IC for the same cost. With LEDs, we don't WANT 'more' devices on a chip, we want bigger, more efficient, better power dissipation, better high-temp operation, and stuff like that.
 
There is sorta a limit. I don't think there's really a limit to how big an LED you can make to suck down all sorts of gigawatts, but there is a limit to how much light you can get per electron, and it depends on the color of the light.

For white light the limit is 242 lumens/watt; for the wavelength people see best (green, 555 nm), it's 683 lumens/watt.

http://en.wikipedia.org/wiki/Luminous_efficacy
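Those ceilings make the best case for any given input power easy to compute. A sketch using the two numbers above, assuming a perfectly (100%) efficient emitter:

```python
WHITE_LM_PER_W = 242.0  # rough ceiling for broadband white light
PEAK_LM_PER_W = 683.0   # monochromatic 555 nm green, the eye's peak

def max_lumens(input_watts, efficacy=WHITE_LM_PER_W):
    """Best-case output: a 100%-efficient emitter at the given efficacy."""
    return input_watts * efficacy

print(max_lumens(1.0))                 # 242 lm from 1 W of perfect white
print(max_lumens(3.0))                 # 726 lm from a perfect 3 W emitter
print(max_lumens(1.0, PEAK_LM_PER_W))  # 683 lm if you only need green
```

So even a physically perfect 3 W white emitter tops out around 726 lumens; real emitters sit at some fraction of that, set by their wall-plug efficiency.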
 
Alteran said:
I agree with aggiegrads: eventually, the greatest leaps in flashlight technology will come from batteries. If you have an 80% efficient light, another 5% increase in efficiency is much less significant than a 20% increase in battery capacity. That isn't quite true yet, though, because heat is still a big problem for LEDs. Still, once we get the efficiency high enough that very little heat is given off, we can start to improve how much power we can run through a single LED. You can get a 100W incan bulb, but an LED? I don't think so.
Thanks for the vote of confidence.

Yup, as the efficiency of LEDs gets better, the problem of heat dissipation gets smaller. So even if we reach the maximum efficiency of LEDs, there is no theoretical limit to output, and great advances could still be made in overall output as battery technology evolves.
 
sparkysko said:
There is sorta a limit. I don't think there's really a limit to how big of an LED you can make to suck all sorts of gigawatts, but there is a limit to how much light you can get per electron. It depends on the color of the light.

For a white light the limit is 242 lumens/watt, for maximum light that people can see it's 683 lumens/watt

http://en.wikipedia.org/wiki/Luminous_efficacy

I don't mean to be needlessly picky, but I am. Sorry. :p

Anyhow, I wanted to say that any light we cannot see is 0 lumens, because lumens factor in how receptive we are to light. So if we cannot see it, it doesn't have a single lumen, even if it's enough ultraviolet to blind us in seconds. :aaa: For anybody who happens to know, is that why integrating spheres need to be calibrated? So they can accurately rate the lumens of a light by factoring in our sensitivity to that wavelength (or wavelengths) of light?
 
I believe the technical definition of lumens includes a definition of the wavelengths over which the light is to be measured (to match eyeball wavelength sensitivity). Most common light meters, etc. sort of cover a similar spectrum so we generally ignore that, but it's there if you need to be exact.
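That wavelength weighting can be sketched numerically. The real sensitivity curve is a published table (the CIE photopic V(λ) function); the Gaussian below is just a crude stand-in to show the idea:

```python
import math

def v_lambda_approx(nm):
    """Crude Gaussian stand-in for the CIE photopic sensitivity curve
    V(lambda), peaking at 555 nm (NOT the official tabulated values)."""
    return math.exp(-0.5 * ((nm - 555.0) / 45.0) ** 2)

def lumens(radiant_watts, nm):
    # 683 lm/W at the 555 nm peak, scaled down by eye sensitivity.
    return 683.0 * v_lambda_approx(nm) * radiant_watts

print(lumens(1.0, 555))  # 683 lm: the photopic maximum
print(lumens(1.0, 650))  # deep red: the eye is far less sensitive
print(lumens(1.0, 365))  # UV: effectively zero lumens, as noted above
```

One watt of green at 555 nm scores the full 683 lumens, the same watt of deep red scores far fewer, and a watt of UV scores essentially zero, no matter how much radiant power it carries.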
 