What's next for LED technology?

Vermonter73

Enlightened
Joined
Jul 25, 2006
Messages
335
Anyone heard good news or rumors about what's next in LEDs? I heard something about a new Cree that's 160 lumens per watt, but I forget the details.
 

blasterman

Flashlight Enthusiast
Joined
Jul 17, 2008
Messages
1,802
My guess is we'll start seeing more multi-emitter type LEDs such as the MC-E, P7, Bridgelux, etc. More efficiency would be nice, but progress so far seems to be in slow motion for neutral and warm-white tints, and that's a requirement for commercial use.

Right now more efficiency is a solution in search of a problem, because there are other issues around large-scale implementation of LEDs, such as power requirements, heat envelopes, etc.

I see a similar story with CPUs. Eventually heat and inefficient pipelines put a damper on ever-faster clock speeds, so Intel and AMD opted for more cores per die.
 

IMSabbel

Enlightened
Joined
Dec 4, 2004
Messages
921
Sorry, the multi-die analogy is 100% wrong. In fact, it's nearly the complete opposite...


I think that over the next 18 months we will begin to see the upscaling of LEDs: bigger dies, multi-die packages, complete systems, etc.

There has been quite a bit of news recently in that regard (just look at the threads on the first page here), and it will continue.

Peak efficiency, on the other hand, I don't expect to improve much or quickly. Efficiency under heavy load might, though.
 

Gunner12

Flashaholic
Joined
Dec 18, 2006
Messages
10,063
Location
Bay Area, CA
I'm hoping for anti-droop technology, so that efficiency stays higher at higher currents. They can already hit over 100 lumens per watt; now it's a matter of keeping that efficiency at higher power. That might require a new phosphor and better heat transfer.
 

znomit

Enlightened
Joined
Aug 1, 2007
Messages
979
Location
New Zealand
I'm hoping for anti-droop technology, so that efficiency stays higher at higher currents. They can already hit over 100 lumens per watt; now it's a matter of keeping that efficiency at higher power. That might require a new phosphor and better heat transfer.

Why don't they make LED bulbs with the phosphor on the glass and the die where the filament would be, like a fluorescent? The phosphor would run cooler, and any given spot of it would be converting far less blue light to white, so perhaps it would be much less droopy?
 

AvPD

Enlightened
Joined
Oct 2, 2007
Messages
343
Location
Adelaide, Australia
Looks like the next breakthroughs will be in manufacturing them cheaply, there was a story about eliminating the need for a sapphire layer (or something similar) last year.
 

Gunner12

Flashaholic
Joined
Dec 18, 2006
Messages
10,063
Location
Bay Area, CA
Why don't they make LED bulbs with the phosphor on the glass and the die where the filament would be, like a fluorescent? The phosphor would run cooler, and any given spot of it would be converting far less blue light to white, so perhaps it would be much less droopy?

That could work, though like a fluorescent lamp it would have a pretty large emitting area, which would mean a floodier beam. Also, I think the die itself loses efficiency as it heats up.
 

blasterman

Flashlight Enthusiast
Joined
Jul 17, 2008
Messages
1,802
Sorry, the multi-die analogy is 100% wrong. In fact, it's nearly the complete opposite...

Then explain in detail why I'm wrong rather than saying I am.

"Do not assume the 'Flashaholic' designation under my name assumes I give a flip about flashlights, because I don't."
 

IMSabbel

Enlightened
Joined
Dec 4, 2004
Messages
921
Then explain in detail why I'm wrong rather than saying I am.

"Do not assume the 'Flashaholic' designation under my name assumes I give a flip about flashlights, because I don't."

Well, if you want:



The point is that for CPUs, electrical power and processing power have no direct relationship.
Changes in architecture multiplied the processing power within the same thermal envelope.
Multi-cores are useful because there are limits to how far you can push processing power for a single thread; the returns diminish. You simply CANNOT make one core faster without spending excessive resources. You can double the transistor (and thus size) budget and gain only single-digit percentage improvements. And notice that the _size_ and the _performance_ of a core are completely decoupled from the power consumption and from each other. These are all design choices.
Also, multiple cores are produced on one die (where feasible; there are exceptions). At the device level, that's one chip. You could just as well count each bond wire of an XR-E as "a core" by that logic.


LEDs, on the other hand, are perfectly "parallel", AND their performance is proportional to their electrical consumption. So there is no inherent advantage to multi-die or bigger dies. Currently, multi-die packages are common because they allow selection from mass-produced elements, but this could change at any point. There are NO technical drawbacks to big dies on the application side, just currently in production (well, there are, but not before getting WAY bigger than the 1-2 mm^2 we have now).


To sum up:

CPUs: Highly complex systems that depend on many variables. Single cores run into diminishing returns.

LEDs: You just need mm^2 of emitter surface. Bigger dies, multiple dies, it doesn't matter much; you can just choose whatever is easiest to produce.

The choice to go multi-core was made for different reasons, with different results, in different ways.
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,506
Location
Flushing, NY
Cost is probably the biggest driving factor in LED research now. While cost per lumen is coming down, it's still at least 20X the cost of alternatives. This means LED must be sold on its other merits. In order to get the cost down, it's necessary to get the power density up without sacrificing efficiency. Regarding efficiency, while it may be "good enough" at this point I wouldn't call it a solution in search of a problem. Every increase in efficiency means less waste heat to deal with for a given number of lumens, or more lumens for a given amount of waste heat. And of course more lumens for any given package size.

Multi-die seems to be mostly a design decision based on costs and yield. It's obviously more costly to put several dice in one package than one. However, if the yield of one larger die is very poor, then the economics swing in favor of multi-die. Once we learn how to make larger dice with decent yield, multi-die LEDs will fall out of favor.
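To put rough numbers on the yield argument, here's a sketch using the classic Poisson yield model, Y = exp(-D·A). The defect density figure below is made up purely for illustration, not taken from any real fab:

```python
import math

def die_yield(defect_density: float, area_mm2: float) -> float:
    """Poisson yield model: P(die is defect-free) = exp(-D * A)."""
    return math.exp(-defect_density * area_mm2)

D = 0.3  # defects per mm^2 -- an invented figure, for illustration only

# One large 4 mm^2 die: a single defect anywhere scraps the whole die.
yield_big = die_yield(D, 4.0)    # exp(-1.2), roughly 0.30

# Four 1 mm^2 dies: each passes or fails on its own, and the good ones
# can be binned and combined into a multi-die package.
yield_small = die_yield(D, 1.0)  # exp(-0.3), roughly 0.74
```

Under this toy model, most of the wafer's area survives as good 1 mm^2 dies while the majority of 4 mm^2 dies get scrapped, which is exactly why the economics favor multi-die packages until large-die yields improve.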
 

blasterman

Flashlight Enthusiast
Joined
Jul 17, 2008
Messages
1,802
The point is that for CPUs, electrical power and processing power have no direct relationship.
The same is true of LEDs.
LEDs, on the other hand, are perfectly "parallel", AND their performance is proportional to their electrical consumption.
Only in parallel, just like computer processors. Doubling the current to a single LED chip *doesn't* deliver twice the light, and doubling the clock of a processor requires more than doubling the current fed to it. Using more LED chips in a single package *does* deliver twice the light at double the power, and two processor cores *do* double processing power. The added technology requirement is minimal. Intel and AMD dove into multi-core CPUs because per-die economics dictated they move in that direction, and the people buying the new multi-core processors had enough supporting technology to take advantage of SMP. Just like CPUs, LEDs have become sufficiently efficient, with the color problems solved, that more chips is really the way to go. That is, unless you know of some 400-lumen, single-chip, warm-white Cree I can buy.
So there is no inherent advantage to multi-die or bigger dies.
Perhaps I made a semantic mistake in my first post, but I was referring to MC-E and Bridgelux-type architectures, and these do have significant advantages over single chips if you simply add up the cost to reach a desired lumen level. Except for Intel strapping multiple P4 Xeons into a single package and calling it 'dual core' for a bit, the analogy is sound. Multiple LED chips in one package are likely the trend for quite a while, because single-chip technology doesn't seem to be going anywhere beyond occasional news blurbs from the engineering wing of some semiconductor maker looking to notch up the stock price at the end of a quarter on hype. Seriously, many of you are naive to think otherwise.

If I want to produce and sell an 800-lumen fixture that can replace an incandescent or HID bulb on a large scale, it's obvious which direction the industry is going, and it isn't waiting for some mythical technology that might hit the market 5 years from now. There's a distinct quantum limit to how many visible photons current LED technology can extract, and we're on the diminishing-returns curve. Again, lighting up the tops of trees with purple/white light from a battery-powered flashlight *isn't* the priority of the semiconductor industry.
 

SemiMan

Banned
Joined
Jan 13, 2005
Messages
3,899
Nothing I hate more than pointless analogies.... :)

Computers went to multiple cores for POWER/HEAT reasons. For most desktop usage, one core running at 5 GHz will do more than two cores running at 2.5 GHz. However, as was pointed out, running a core at 5 GHz actually takes more power than two cores running at 2.5 GHz. While switching power for logic is a function of P = C*V^2*f, running at 5 GHz versus 2.5 GHz requires a higher V, which results in much higher overall power. Add in the generally higher leakage needed to achieve those speeds, and you have two cores at half speed requiring much less power than one full-speed core. For servers, multiple cores make sense because task switching becomes a significant overhead, hence server processors and architectures use many cores.
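A back-of-the-envelope sketch of that P = C*V^2*f point. The capacitance and voltage figures below are invented purely to show the shape of the trade-off; real parts vary widely:

```python
def switching_power(c_farads: float, v_volts: float, f_hz: float) -> float:
    """Dynamic switching power of CMOS logic: P = C * V^2 * f."""
    return c_farads * v_volts**2 * f_hz

C = 1e-9  # effective switched capacitance per core (illustrative value)

# Hitting 5 GHz requires a higher supply voltage than 2.5 GHz does;
# 1.4 V and 1.0 V are hypothetical numbers chosen for illustration.
p_one_fast = switching_power(C, 1.4, 5.0e9)      # one core at 5 GHz: ~9.8 W
p_two_slow = 2 * switching_power(C, 1.0, 2.5e9)  # two cores at 2.5 GHz: ~5.0 W
```

Because V enters squared, the two slower cores deliver the same aggregate clock cycles for roughly half the switching power, before even counting the extra leakage at the higher voltage.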

What does this have to do with LEDs? Little, I think. If you put 10 watts into one 2mm x 2mm die, you will have the same power density as 10 watts into four 1mm x 1mm dies. There is no inherent power/efficiency advantage of multiple dies over a single larger die. The reason for multiple dies is purely one of cost, as has been pointed out as well. Currently, yield for LED chips is not fantastic, so it is much cheaper to make four good 1mm x 1mm dies than one good 2mm x 2mm die.

Now if you are comparing a single 1mm x 1mm die driven at 1 A against two 1mm x 1mm dies at 350 mA for the same light output, there is truth to that.
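For anyone curious what that last comparison looks like numerically, here's a toy droop model. The coefficients are invented to show the shape of the effect and don't describe any real emitter:

```python
def lumens(current_a: float, lm_per_a: float = 100.0, k_droop: float = 0.4) -> float:
    """Toy droop model: efficacy falls off linearly with drive current,
    so light output is sub-linear in current. Coefficients are made up."""
    return lm_per_a * current_a * (1.0 - k_droop * current_a)

one_die_overdriven = lumens(1.0)     # one die at 1 A     -> 60.0 lm from 1.0 A
two_dies_relaxed = 2 * lumens(0.35)  # two dies at 350 mA -> ~60.2 lm from 0.7 A
```

Same ballpark light output for 30% less current: that is the sense in which paralleling dies beats overdriving a single one.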

Of course, if you have seen the PhlatLight announcement... from a company that is not the fringe arm of some semiconductor maker but a serious player in projection and LCD backlighting, then you will see that companies are making progress on large dies. There are also companies on the processing side moving LEDs away from 2-inch specialty wafers to larger-scale processes, which will increase dies per wafer and likely the yield on larger chips too.

Putting the phosphor on the "envelope" rather than on the LED is not a new concept. There have been announcements of that.

Semiman
 

bshanahan14rulz

Flashlight Enthusiast
Joined
Jan 29, 2009
Messages
2,819
Location
Tennessee
I kind of like Nichia's idea of using the standard 5mm package, but (from what I hear) with an enhanced encapsulant material that is better at conducting heat away from the die. Wouldn't it be cool if the consumer could replace a faulty LED with another one without worrying about finding the right size MCPCB and a way to reflow it, etc.?

As for the research side of the industry, I can't wait to see what they pull out of their sleeves. Perhaps they'll find a way to make true white LED chips by doping with some chemical only found in human earwax and some flavors of ramen.
 

lonesouth

Newly Enlightened
Joined
Feb 4, 2009
Messages
172
Location
Florida
Do LEDs follow Moore's law the same way computers do? Of course, I realize it's more a generalized statement than a hard-and-fast law, but it seems they roughly should.

If so, then it would follow that you should see some combination of lower cost, higher output, lower power consumption, etc. Can someone give some insight into how a top LED today, say an R2, compares to a top-of-the-line LED from 18 months ago? Would that be a Luxeon III?

Food for thought.
 

lonesouth

Newly Enlightened
Joined
Feb 4, 2009
Messages
172
Location
Florida
The R2 was released around 18 months ago.

There's food for thought. So what is the latest and greatest in the lab today? It should be roughly double the output or efficiency, or half the size for the same output, if LED technology follows Moore's law more or less...

I guess I'm in real trouble if I'm looking 18 months down the road and expecting double the output in the same size? That new Fenix Q5 I just bought is so dim...
 

AvPD

Enlightened
Joined
Oct 2, 2007
Messages
343
Location
Adelaide, Australia
Do LEDs follow Moore's law the same way computers do?

After some quick Googling, I found that Haitz's law applies to LEDs.

Haitz's law states that every decade, the price of light-emitting diodes falls by a factor of 10, while the performance (measured in flux per package) increases by a factor of 20, for a given wavelength (color) of light.

[Attached image: nphoton.2006.44-f1.jpg — Haitz's law chart]
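As a quick sanity check on what those factors compound to, here's a rough projection. The starting values are arbitrary relative units, not real prices:

```python
def haitz_projection(price: float, flux: float, decades: int):
    """Haitz's law: per decade, cost per lumen falls ~10x and
    flux per package rises ~20x (for a given wavelength)."""
    return price / 10**decades, flux * 20**decades

# Starting from arbitrary relative units of 100 (price) and 1 (flux):
price_then, flux_then = haitz_projection(price=100.0, flux=1.0, decades=2)
# Two decades out: price down 100x, flux up 400x
```

The flux factor compounds much faster than the price factor, which is why lumens per dollar improves so dramatically over time.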
 

PhotonWrangler

Flashaholic
Joined
Oct 19, 2003
Messages
14,432
Location
In a handbasket
Why don't they make LED bulbs with the phosphor on the glass and the die where the filament would be, like a fluorescent? The phosphor would run cooler, and any given spot of it would be converting far less blue light to white, so perhaps it would be much less droopy?

They do now. Lumination's (formerly GELcore) VioLED has an LED die mounted flat against the PCB, covered by a plastic dome. I've seen a couple of these with the domes off, and there's significant space between the dome and the LED, so this is what you're describing.
 