CREE about to break 300 lumens per watt in R&D?

BigRiz (Newly Enlightened) · Joined Sep 9, 2010 · Messages: 50
Before you keep on reading, be warned that this is pure speculation; there are no hard facts here.

If you're still here, consider this chart showing the history of R&D efficacy announcements:

[Chart: cree_lpw.png, history of Cree's R&D efficacy announcements]


As you can see, Cree has historically announced a new efficacy breakthrough about once a year, mostly between February and May, and each step is normally a little above 20 lumens/W. The last announcement was in February of last year, at 276 lm/W. So what's next? It seems it will soon be time for another announcement... or will they wait until they manage to break the round number of 300?
 
It is possible that they are getting close to hitting a brick wall, which would mean the efficiency increases will slow down. Personally, I care less about the R&D figures, which are typically taken at about a 1 watt output, so they don't really reflect real-world operating efficiencies. I would like to see inexpensive 200 lumens/watt LEDs that put out, say, 500 lumens per LED at that efficiency (minus supporting electronics). Once lighting solutions using LEDs start to break 120-150 lumens/watt in actual use, I think things will really start to ramp up, because then LEDs will start to compete with other high-efficiency lighting solutions in commercial use.
 
300 lm/W may happen this year, but I'm honestly skeptical it will. The luminous efficacy of the emitted spectrum of typical blue-pumped phosphor whites is in the area of 300 to 350 lm/W. I've read that phosphor conversion efficiency maxes out around 82% due to Stokes losses, which puts the maximum theoretical efficacy for phosphor-based whites at 246 to 287 lm/W. I've also read (can't find the source right now) that Nichia stated 300 lm/W is possible with improved phosphors, although you would need a blue emitter with efficiency close to 100% WPE to reach that goal. As things stand right now, I estimate the blue emitter used for Cree's current record more than likely had a WPE above 90%. It will be difficult to push WPE all that much higher, although I'll grant we may yet squeak past 300 lm/W.
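The ceiling arithmetic above is simple enough to sketch in a few lines (a sketch only: the 300-350 lm/W spectrum LERs, the ~82% phosphor conversion limit, and the 276 lm/W record are the figures quoted in this post, not measured data):

```python
# Efficacy ceiling for a phosphor-converted white LED:
#   efficacy = LER of the emitted spectrum x phosphor conversion efficiency x blue-die WPE
def max_efficacy(ler, phosphor_eff, blue_wpe=1.0):
    return ler * phosphor_eff * blue_wpe

# With a perfect (100% WPE) blue pump, per the figures above:
print(max_efficacy(300, 0.82))   # ~246 lm/W
print(max_efficacy(350, 0.82))   # ~287 lm/W

# Backing out the blue WPE implied by Cree's 276 lm/W record,
# assuming the favorable end of the LER range:
print(round(276 / max_efficacy(350, 0.82), 2))   # ~0.96, i.e. "above 90%"
```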

I'm personally more interested in the efficiencies of production LEDs. We're well on our way to solving the droop issue at the production level. Moreover, we've exceeded 60% WPE in production blue emitters. The steps needed to go from 60% to 90%+ may or may not prove feasible in mass production, although I do feel we'll exceed 80% in production by the end of the decade. Combined with better phosphors, that could mean phosphor whites achieving ~250 lm/W.

The only roadmap to potentially achieve as high as 400 lm/W is with white light made of red, green, and blue emitters. For that to happen, we need to greatly increase the WPE of green and red emitters. If we had green and red emitters at the same WPE as blue emitters, we could achieve 240+ lm/W in production with RGB white emitters right now. We could also potentially reach 300+ lm/W as WPE exceeds 80%.
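As a rough sanity check on those numbers (a sketch only: the ~400 lm/W LER for a well-balanced RGB white blend is taken from the roadmap figure above, and the 60%/80% WPE points from the production figures mentioned earlier; a real RGB source would have per-color WPEs and mixing losses):

```python
# For an RGB white source, efficacy is roughly the spectrum's LER
# times the (assumed uniform) wall-plug efficiency of the emitters.
def rgb_efficacy(ler, wpe):
    return ler * wpe

LER_RGB = 400  # assumed LER of an RGB white blend (the 400 lm/W roadmap figure)
print(rgb_efficacy(LER_RGB, 0.60))  # ~240 lm/W at today's ~60% production blue WPE
print(rgb_efficacy(LER_RGB, 0.80))  # ~320 lm/W once WPE exceeds 80%
```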

On another note, we're at the point where small increases in efficiency yield big decreases in waste heat production. If you go from a fixture efficiency of 100 lm/W to 150 lm/W, you decrease waste heat by 50%.
 

Hi jtr,
I do not follow your logic for the reduction in waste heat.

Assuming an LER value of 330 lumens per watt is used to calculate wall-plug efficiency (WPE):
100 / 330 = 0.30 (30% WPE)
150 / 330 = 0.45 (45% WPE)

For a 10 watt fixture, a 30% WPE yields:
3 watts radiant flux
7 watts waste heat

For a 10 watt fixture, a 45% WPE yields:
4.5 watts radiant flux
5.5 watts waste heat

While there is a 50% increase in radiant flux (3 to 4.5 watts) going from 100 lm/W to 150 lm/W, the waste heat decrease is only 21.4% (7 to 5.5 watts).
 
What jtr was saying was that as the WPE gets higher than 50%, and particularly as it gets closer to 100%, the percentage change in the waste heat gets larger.
If the overall efficiency is 80%, then the waste heat is 20%. An overall efficiency of 90% has a waste heat of 10%.
So an increase from 80 to 90 has cut the waste heat by more than a factor of 2, since the overall power needed for the same light output will be reduced too.
We are approaching (in the lab, anyway) a range of WPE where the calculation is qualitatively different.
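The more-than-a-factor-of-2 claim checks out numerically; a quick sketch for a fixed light output (10 W of radiant flux, an arbitrary illustrative figure):

```python
# Electrical input and waste heat needed to deliver a fixed radiant flux at a given WPE.
def input_and_waste(radiant_w, wpe):
    input_w = radiant_w / wpe
    return input_w, input_w - radiant_w

in80, waste80 = input_and_waste(10, 0.80)  # 12.5 W in, 2.5 W waste
in90, waste90 = input_and_waste(10, 0.90)  # ~11.1 W in, ~1.1 W waste
print(round(waste80 / waste90, 2))  # 2.25: waste heat cut by more than a factor of 2
```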
 
Hi jtr,
I do not follow your logic for the reduction in waste heat.

Assuming an LER value of 330 lumens per watt is used to calculate wall-plug efficiency (WPE):
100 / 330 = 0.30 (30% WPE)
150 / 330 = 0.45 (45% WPE)

For a 10 watt fixture, a 30% WPE yields:
3 watts radiant flux
7 watts waste heat

For a 10 watt fixture, a 45% WPE yields:
4.5 watts radiant flux
5.5 watts waste heat

While there is a 50% increase in radiant flux (3 to 4.5 watts) going from 100 lm/W to 150 lm/W, the waste heat decrease is only 21.4% (7 to 5.5 watts).
For any given lumen output waste heat would decrease by 50%. In your example you're keeping power input constant, not lumen output. For example, let's say the desired output is 3000 lumens. In both cases let's assume the LER of the spectrum is 300 lm/W (this is about right for the higher CRIs used in interior lighting). Here are the calculations:

100 lm/W

Input power = 30 watts
Radiant flux = 10 watts
Waste heat = 20 watts

150 lm/W

Input power = 20 watts
Radiant flux = 10 watts
Waste heat = 10 watts

200 lm/W

Input power = 15 watts
Radiant flux = 10 watts
Waste heat = 5 watts

You decrease waste heat by 50% going from a fixture efficiency of 100 lm/W to 150 lm/W, and again by 50% going from 150 lm/W to 200 lm/W.
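The three cases above can be generated in a loop (same assumptions: 3000 lm target, LER of 300 lm per radiant watt):

```python
LUMENS = 3000  # desired output
LER = 300      # lumens per radiant watt of the spectrum

radiant_w = LUMENS / LER  # 10 W of light, regardless of electrical efficacy
for efficacy in (100, 150, 200):
    input_w = LUMENS / efficacy
    waste_w = input_w - radiant_w
    print(f"{efficacy} lm/W: input {input_w:g} W, waste {waste_w:g} W")
# 100 lm/W: input 30 W, waste 20 W
# 150 lm/W: input 20 W, waste 10 W
# 200 lm/W: input 15 W, waste 5 W
```

Each 50 lm/W step here halves the waste heat, which is the point being made.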
 
What jtr was saying was that as the WPE gets higher than 50%, and particularly as it gets closer to 100%, the percentage change in the waste heat gets larger.
If the overall efficiency is 80%, then the waste heat is 20%. An overall efficiency of 90% has a waste heat of 10%.
So an increase from 80 to 90 has cut the waste heat by more than a factor of 2, since the overall power needed for the same light output will be reduced too.
We are approaching (in the lab, anyway) a range of WPE where the calculation is qualitatively different.
Yes, exactly. You decrease waste heat for any given light output both by reducing the input power requirement and by converting that power to light more efficiently. I made a chart a long time ago for 100 watt incandescent replacements (~1700 lumens output) which nicely illustrates this concept:

[Chart: LED_Cooling_Comparison.gif, cooling comparison for ~1700 lumen LED lamps]


Note that as we go beyond about 200 lm/W, cooling requirements become very easy to meet, even in a very small form factor.
 
Nah, the people who run the press announcements were gonna tell us this last month, but this thread let them know that they could wait until this month. "BigRiz said they wouldn't be expecting anything until March, delete that blog post you were about to make!"
 
The only roadmap to potentially achieve as high as 400 lm/W is with white light made of red, green, and blue emitters. For that to happen, we need to greatly increase the WPE of green and red emitters. If we had green and red emitters at the same WPE as blue emitters, we could achieve 240+ lm/W in production with RGB white emitters right now. We could also potentially reach 300+ lm/W as WPE exceeds 80%

This is the first time I've seen RGB triple-LED lights mentioned. Nice out-of-the-box thinking jtr1962! Before I open my mouth and insert my foot tho;

What's the ballpark WPE of red and green emitters in production these days? I'll bet that dialing down the blue emitter for white balance forfeits more lumens than are lost in the present phosphor conversion...

Anybody??
 

Philips announced a high-efficiency working concept (200 LPW system efficiency if I recall) using RGB emitters to produce white. Trick was that the green was actually a blue die stimulating narrowly-tuned phosphors that down-converted the light to green.
 

Maybe they will figure out how to make "mood" bulbs using RGB technology: just dial in your color, and if you want warm white or cool white, simply turn the dial. My issue with all these high-efficiency designs is ultimately the lowest cost at which they could ever be produced with lasting quality. I don't think the average person wants a fancy bulb that costs much in excess of $20; even if it is super efficient, it will still take years to pay off in use versus a cheap $2 CFL.
 

If RGB LED lighting is ever feasible, it will probably be within mainline lighting arrays first; where "mood" can be a selling point. Imho, LEED proponents are not cost driven and 99.44% of all "Green" initiatives don't break-even before they break-down. But, hey; [tongue in cheek] what are government grants for anyway?

Likewise, within CPF, higher WPEs translate into more OTF lumens from a given thermal-resistance flashlight platform and/or longer run-times in "Turbo mode" (i.e. nirvana). Flashaholics are defined by their expensive collections of blindingly bright flashlights purchased with economic abandon. Consequently, cost justification relative to CFLs is blasphemy here. :grin2:

Sarcastic Glossary of Jargon;

- RGB - the primary light colors; Red, Green, and Blue.
- LED - Light Emitting Diode; Okay, I'm being snide here..
- CPF - the Candle-Power Forums community; Flashaholics not anonymous.
- Imho - In My (debate-ably) Humble Opinion; I don't know why I'm in a good mood today..
- LEED - Leadership in Energy and Environmental Design. Think; Greenpeace for lighting folks.
- OTF - out [of] the front; Lumen measurement entirely outside of the flashlight.
- WPE - wall-plug efficiency. OTF lumens / battery power pumped into LED driver.
- CFL - Compact Fluorescent Light; those squiggly light-bulbs everyone uses now.
 
Yeah, too many of these "Green" initiatives are hype these days. I do find it interesting that white (blue) LEDs have gone from being less efficient than red or green ones at the start to seemingly the most efficient these days, leaving other colors in the dust. It makes me wonder whether a fraction of that research effort, spent on increasing red and green LED efficiency, could make an RGB lamp using native colors without phosphor coatings possible, and in doing so actually exceed any white (blue) LED efficiency claim.
 

The substrate for blue and green LEDs is gallium nitride (GaN). The substrate for red, orange, and yellow LEDs is gallium arsenide (GaAs).

Around 1980, my college semiconductor class taught that GaAs was better because its crystalline lattice better matched the supporting layers, but GaAs is very sensitive to defects. OTOH, GaN could be layered, but there was a huge LPW droop at significant power levels. The world chose layering in lieu of purity, I guess. I chose not to be a semiconductor engineer.

Unless LEED or someone else pushes large matrices of super-efficient (i.e. low-power) LEDs, I imagine that manufacturers will stick with a smaller quantity of GaN and phosphor-conversion LEDs in lieu of purifying GaAs for RGB use. (If there are any real semiconductor engineers in the audience, please correct me here.)
Fwiw, I've belonged to CPF for almost 10 years and I'm just now earning my Flashaholic badge with this post. Yea!!! :party:
 
Yeah, too many of these "Green" initiatives are hype these days. I do find it interesting that white (blue) LEDs have gone from being less efficient than red or green ones at the start to seemingly the most efficient these days, leaving other colors in the dust. It makes me wonder whether a fraction of that research effort, spent on increasing red and green LED efficiency, could make an RGB lamp using native colors without phosphor coatings possible, and in doing so actually exceed any white (blue) LED efficiency claim.

I am sure this has never occurred to anyone :)
 
Actually, the substrates are sapphire, silicon carbide, and silicon. The junction compositions are indium gallium nitride and aluminum indium gallium phosphide.

Defects are just a small part of the problem: you also need a bandgap at the required wavelength.
 
The substrate for blue and green LEDs is gallium nitride (GaN). The substrate for red, orange, and yellow LEDs is gallium arsenide (GaAs).

Around 1980, my college semiconductor class taught that GaAs was better because its crystalline lattice better matched the supporting layers, but GaAs is very sensitive to defects. OTOH, GaN could be layered, but there was a huge LPW droop at significant power levels. The world chose layering in lieu of purity, I guess. I chose not to be a semiconductor engineer.

Unless LEED or someone else pushes large matrices of super-efficient (i.e. low-power) LEDs, I imagine that manufacturers will stick with a smaller quantity of GaN and phosphor-conversion LEDs in lieu of purifying GaAs for RGB use. (If there are any real semiconductor engineers in the audience, please correct me here.)
Fwiw, I've belonged to CPF for almost 10 years and I'm just now earning my Flashaholic badge with this post. Yea!!! :party:

I sort of have a hard time believing that using colored LEDs to make white is more efficient than using three white LEDs, when two of the colored emitters are blue dies and one of those uses phosphor to make green from blue.
 
