Theoretical limit for the efficacy of red, green, blue and white LEDs

Twinbee

According to Wikipedia, green light at 555nm has a maximum efficacy of 683 lumens per watt.
What are the limits for red and blue LEDs and those which produce a full spectrum white light?

Anders Hoveland

What are the limits for red and blue LEDs and those which produce a full spectrum white light?
Well, for red light, it depends on how much red saturation you are willing to sacrifice when it comes to color rendering. The red color receptors in the eye have their maximum sensitivity at 564nm, but at this wavelength there is still a huge overlap with the green receptors (564nm actually appears slightly greenish yellow). The wavelength must be much longer to appear red. The red line in the spectrum of a fluorescent tube is at 611.6nm (actually a very orange red), for example. "Deep red" does not really begin until about 660nm, but because the eye's sensitivity falls off rapidly at these longer wavelengths, there is a big sacrifice in efficacy. A good trade-off between red saturation and efficacy is probably somewhere in the range of 630-650nm.

Blue light is more complicated. The rod receptors in the eye have their peak sensitivity at 498nm (cyan), and are roughly 60-70% more sensitive to the longer blue wavelengths than to the shorter ones. The blue cone receptors, however, have their peak sensitivity at 437nm (which actually appears indigo to violet). 460nm sits roughly in the middle of the overlap between the rods and the blue cones, which may be why most white LEDs use a 460nm emitter. Also, by 460nm the green receptors have very little sensitivity left, so blue saturation in the color rendering stays high. My personal opinion is that the presence of longer blue wavelengths is very important for creating a softer quality of light that is easier on the eyes, but from the standpoint of efficacy and maximum color rendering, one can see why 460nm is desirable. Below about 450nm, it becomes a little difficult for the eye to focus on the light. The rod receptors are much more sensitive than the blue cone receptors, but it is the cones that allow us to see color.
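
To put rough numbers on both trade-offs above: the luminous efficacy of a monochromatic source is just 683 lm/W scaled by the photopic luminosity function, K(λ) = 683 × V(λ). A minimal sketch in Python (the V(λ) entries are rounded CIE 1924 table values, and the sample wavelengths are my own choices):

# Monochromatic luminous efficacy: K(lambda) = 683 lm/W * V(lambda).
# V values are rounded entries from the CIE 1924 photopic table.
V = {
    440: 0.023, 450: 0.038, 460: 0.060, 480: 0.139,  # blues
    555: 1.000,                                      # green peak
    610: 0.503, 620: 0.381, 630: 0.265, 640: 0.175,  # orange-red to red
    650: 0.107, 660: 0.061,                          # deep red
}

for wl in sorted(V):
    print(f"{wl} nm: {683 * V[wl]:5.0f} lm per optical watt")

So a 630-650nm red tops out somewhere between about 180 and 73 lm/W, and a 450-460nm blue at only about 26-41 lm/W, before any electrical losses in the emitter itself.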

Anders Hoveland

What are the limits for a full spectrum white light?
This, of course, is a more complicated question, one that is more open to opinion.

Uchida and Taguchi estimate the theoretical luminous efficacy of a white LED with good color rendering to be 300 lm/W. Coltrin found it is theoretically possible to synthesize light achieving 408 lm/W with a color rendering index above 90 using four discrete wavelengths (though they noted that no existing LEDs, and no all-phosphor formulation, can adequately hit those exact wavelength requirements). Murphy found that the maximum efficacy is in the range 250-370 lm/W, and that the color temperature of the light has only a modest impact on the achievable efficacy. This, of course, is not really "full spectrum" light.
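
For a sense of how such figures are computed: the luminous efficacy of radiation (LER) of any spectrum is its V(λ)-weighted power fraction times 683 lm/W. A hedged sketch with four discrete lines (the wavelengths and the power split are illustrative guesses of mine, not the optimized set from the Coltrin paper):

# LER of a 4-line spectrum: 683 lm/W * sum_i f_i * V(lambda_i),
# where the power fractions f_i sum to 1. The wavelengths and
# fractions below are illustrative assumptions only.
V = {460: 0.060, 530: 0.862, 575: 0.912, 615: 0.441}  # rounded CIE V(lambda)
f = {460: 0.20, 530: 0.25, 575: 0.25, 615: 0.30}      # assumed power split

ler = 683.0 * sum(f[wl] * V[wl] for wl in V)
print(f"LER of this mix: {ler:.0f} lm/W")  # ~400 lm/W, before electrical losses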

The carbon arc lamp can achieve efficacies up to 36-40 lm/W, and is essentially a 4000K blackbody radiator. Of course, a real blackbody spectrum is less efficient because of all the invisible infrared and UV it produces. Ceramic metal halide is a fairly full-spectrum white source (96 CRI), and modern commercial fixtures can achieve 110 lm/W.


It has been calculated that a 2800 K blackbody emission, truncated to the band where photopic sensitivity is at least 5% of its peak, would have a luminous efficacy of 343 lm/W. That is, only the visible part of such a spectrum would be generated.
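
That figure can be roughly reproduced by integrating Planck's law against V(λ) over just that band. A sketch, assuming a coarse 10nm table of rounded V(λ) values (so expect the result only to land in the neighborhood of the 343 lm/W quoted above):

import numpy as np

# LER of a 2800 K Planck spectrum truncated to where V(lambda) >= 5% of peak.
wl_tab = np.arange(420, 701, 10)  # nm
V_tab = np.array([0.004, 0.012, 0.023, 0.038, 0.060, 0.091, 0.139, 0.208,
                  0.323, 0.503, 0.710, 0.862, 0.954, 0.995, 0.995, 0.952,
                  0.870, 0.757, 0.631, 0.503, 0.381, 0.265, 0.175, 0.107,
                  0.061, 0.032, 0.017, 0.0082, 0.0041])  # rounded CIE 1924

wl = np.linspace(420, 700, 2801)  # fine uniform grid, nm
V = np.interp(wl, wl_tab, V_tab)

h, c, k, T = 6.626e-34, 2.998e8, 1.381e-23, 2800.0
lam = wl * 1e-9                   # metres
planck = lam**-5 / (np.exp(h * c / (lam * k * T)) - 1.0)  # arbitrary scale

band = V >= 0.05                  # keep only the >=5% sensitivity band
# On a uniform grid the spacing cancels out of the ratio of sums.
ler = 683.0 * (V * planck)[band].sum() / planck[band].sum()
print(f"LER of the truncated 2800 K blackbody: ~{ler:.0f} lm/W")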

Also, scotopic vision (night vision) is a little different from photopic vision (vision under ordinary ambient lighting).
At low light levels, the human eye is most sensitive at 507nm, which translates into a theoretical maximum efficacy of 1699 lm/W (in scotopic lumens).


This link is also interesting:
http://agi32.com/blog/2014/01/05/thoughts-on-color-rendering/

Twinbee

Thanks - that's some pretty interesting info! Though I'd still like to know the peak theoretical efficacy for red and blue LEDs if you know or could find out please.

For the white LED, I'm not so worried about CRI (70+ is fine), since I'm not convinced that in double-blind trials people can really tell the difference between 70 CRI and 100 CRI as long as the hue, brightness and saturation are identical. Even if they can, I assume indirect lighting (such as cove lighting) will help to spread the wavelengths out a bit. But I'd be fascinated to be proven wrong about the whole CRI thing.

Cree claim they have reached 300 lm/W, which is very close to the 400 lm/W limit you suggested. At that point, does this mean heat generation will be zero or negligible? If not, then I don't think 400 lm/W is really the theoretical peak efficacy of white light.
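
One way to frame the heat question: a headline lm/W number factors into the spectrum's lumens per optical watt (its LER) times the electrical-to-optical "wall-plug" efficiency, and everything short of 100% wall-plug efficiency leaves as heat. A back-of-envelope sketch (the 340 lm/W LER is an assumed figure for a typical cool-white LED spectrum, not a published Cree number):

# Split a headline efficacy into spectrum (LER) and wall-plug efficiency (WPE).
claimed_lm_per_W = 300.0  # electrical lm/W from the press claim
assumed_ler = 340.0       # lm per optical watt of the spectrum (assumption)

wpe = claimed_lm_per_W / assumed_ler
print(f"Implied wall-plug efficiency: {wpe:.0%}")        # ~88%
print(f"Input power still lost as heat: {1 - wpe:.0%}")  # ~12%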

SemiMan

Thanks - that's some pretty interesting info! Though I'd still like to know the peak theoretical efficacy for red and blue LEDs if you know or could find out please.

For the white LED, I'm not so worried about CRI (70+ is fine), since I'm not convinced that in double-blind trials people can really tell the difference between 70 CRI and 100 CRI as long as the hue, brightness and saturation are identical. Even if they can, I assume indirect lighting (such as cove lighting) will help to spread the wavelengths out a bit. But I'd be fascinated to be proven wrong about the whole CRI thing.

Cree claim they have reached 300 lm/W, which is very close to the 400 lm/W limit you suggested. At that point, does this mean heat generation will be zero or negligible? If not, then I don't think 400 lm/W is really the theoretical peak efficacy of white light.

Count on Anders to spout facts that he likely barely understands in order to make himself look smart without actually answering the question at hand ... :)

This is a roughly accurate graph:

http://www.photonics.com/images/Web/Articles/2009/3/8/RadiometryFigure4.jpg

Semiman

Anders Hoveland

I'm not convinced that in double-blind trials people can really tell the difference between 70 CRI and 100 CRI as long as the hue, brightness and saturation are identical.
70 CRI is only suitable for outdoor lighting. For office lighting, it really needs to be at least 80 CRI, and even then things can look a little washed-out or greyish. If you want things to look "colorful", it needs to be 90 CRI.

Many people do not notice, at least not on a conscious level, though they may have some intuitive sense that all the life in the room has been sucked out. I personally do not feel 80 CRI is quite adequate for lighting in my home; it may be for some people, but not for many others. 85 CRI feels more adequate.

Twinbee

SemiMan, thank you very much for that graph - I can extrapolate the lm/W from that! After re-reading Anders' posts more carefully and researching the topic elsewhere, though, I found his posts very useful in getting to the heart of the problem: we have to be careful about what we define as red and blue, and about the compromises involved (balancing efficacy against colour purity). So maybe he was waiting for me to respond before giving a proper answer for the red and blue efficacies.

Anyway, since you've both helped me a great deal to get to the answer, here are the glorious results in full:

75 lm/W for deep red at 650nm
260 lm/W for near-red at 620nm (sacrificing some colour purity)
683 lm/W for green at 555nm
100 lm/W for near blue at 480nm (sacrificing some colour purity)
25 lm/W for deep blue at 450nm

These figures used that graph from SemiMan, along with the common wavelengths typically used for red, green and blue LEDs taken from here and here. They also tie in fairly nicely with Anders' wavelengths and posts, along with what Wiki says here if you divide the "Typical efficacy" by the "Typical efficiency".

Assuming we build an LED colour bulb using the 650nm red, 555nm green and 450nm blue LEDs above, I wonder what the lumens per watt would turn out to be if we wanted to obtain white from that bulb. As far as I know, we can't simply average the lm/W of the three LEDs, as we may need to significantly strengthen the blue and red LEDs to balance out the powerful green LED; a rough calculation is sketched below.
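
Here is a minimal sketch of that calculation, assuming we target the D65 white point (x, y) = (0.3127, 0.3290) and treat each LED as perfectly monochromatic. The colour-matching values are rounded CIE 1931 standard-observer entries:

import numpy as np

# CIE 1931 2-degree colour-matching functions (xbar, ybar, zbar) at each
# primary wavelength (rounded standard table values).
cmf = {
    650: (0.2835, 0.1070, 0.0000),
    555: (0.5125, 1.0000, 0.0057),
    450: (0.3362, 0.0380, 1.7721),
}
wls = (650, 555, 450)

# Target white: D65 chromaticity, with XYZ taken proportional to (x, y, z).
x, y = 0.3127, 0.3290
target = np.array([x, y, 1.0 - x - y])

# Each column of M maps one primary's radiant watts to XYZ; solve for the
# radiant powers P (arbitrary overall scale) that mix to the target white.
M = np.array([cmf[wl] for wl in wls]).T
P = np.linalg.solve(M, target)

lumens = 683.0 * P * M[1]     # per-primary luminous flux (ybar row)
ler = lumens.sum() / P.sum()  # lm per radiant watt of the mixture

for wl, p, lm in zip(wls, P, lumens):
    print(f"{wl} nm: {p / P.sum():5.1%} of the watts, "
          f"{lm / lumens.sum():5.1%} of the lumens")
print(f"Luminous efficacy of radiation of this white: {ler:.0f} lm/W")

With these rounded values it comes out around 270 lm per optical watt, with the green primary carrying roughly 85% of the lumens even though it gets only about a third of the radiant power; multiply by each emitter's real electrical efficiency to get the bulb's actual lm/W.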

Twinbee

70 CRI is only suitable for outdoor lighting. For office lighting, it really needs to be at least 80 CRI, and even then things can look a little washed-out or greyish. If you want things to look "colorful", it needs to be 90 CRI.

Is this because if we don't use pure enough red and blue wavelength components, then the resulting 'white' will be tinged green? Surely if that's the case, we can just reduce the amount of green in the light? Do you have any comparison images so I can see the differences for myself?

I know this is off-topic now, but do you know what the CRI would be if we used just 3 LEDs to create a 6000K white: deep red at 650nm, green at 555nm, and deep blue at 450nm? I assume because the spectrum is 'spiked', the CRI would be very low, though I can't help but feel the light would look fine.

Anders Hoveland

Is this because if we don't use pure enough red and blue wavelength components, then the resulting 'white' will be tinged green?
White LED light typically has a poor red-green color contrast due to the nature of how the light is generated. Blue light from the diode emitter causes a yellowish phosphor to glow. The emission spectrum of the phosphor is a hump-like distribution of wavelengths centered around yellow to greenish yellow. There are some red wavelengths, and some green wavelengths, but it is mostly yellow. Red objects are going to be rendered very orange-colored, and green objects are going to appear yellowish.

This is for low CRI LEDs, particularly the ones that only use a single type of phosphor in their formulation. 80 CRI LEDs are better, but still have this fundamental problem.
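
A toy model illustrates the point. If you approximate a single-phosphor white LED as a narrow blue pump plus one broad hump centred near yellow (every shape and number below is invented for illustration, not a measured spectrum), most of the optical power lands in the yellow-green, with comparatively little in the red:

import numpy as np

# Toy model of a phosphor-converted white LED spectrum: a narrow blue pump
# plus one broad yellowish phosphor hump. All shapes and numbers here are
# made up for illustration only.
wl = np.linspace(400, 750, 701)                          # nm
pump = 1.0 * np.exp(-0.5 * ((wl - 450) / 10) ** 2)       # blue emitter
phosphor = 1.3 * np.exp(-0.5 * ((wl - 570) / 45) ** 2)   # yellowish hump
s = pump + phosphor

bands = {"blue 430-490": (430, 490), "green 490-560": (490, 560),
         "yellow 560-620": (560, 620), "red 620-700": (620, 700)}
for name, (lo, hi) in bands.items():
    sel = (wl >= lo) & (wl < hi)
    print(f"{name} nm: {s[sel].sum() / s.sum():.0%} of the optical power")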

Tri-color fluorescent tubes have a different set of issues, however.


I know this is off-topic now, but do you know what the CRI would be if we used just 3 LEDs to create a 6000K white: deep red at 650nm, green at 555nm, and deep blue at 450nm? I assume because the spectrum is 'spiked', the CRI would be very low, though I can't help but feel the light would look fine.
A typical RGB white LED is 67-81 CRI. However, CRI in this case is not a very good indicator of color accuracy, and there are problems rendering both yellows and skin tones accurately with this approach while achieving any level of red saturation. An RGBA white LED (basically an RGB that adds an amber emitter) is a little better, and can achieve 92 CRI.
An RGB setup by itself, because of its poor color rendering, would probably not be suitable for actual room lighting. Most RGB LED stage spotlights (for special effects) combine some white emitters as well.

"Spikiness" in the spectrum does not necessarily mean low CRI, it really depends how the wavelengths are distributed. It just happens to be the case that most artificial sources of lighting have spiky spectrums, and when a spectral emission is smooth, that generally means it has more coverage across different wavelengths in the spectrum.

Also, an RGB setup typically has about half the efficacy of a comparable white LED, because current green LED emitters are relatively inefficient. This is somewhat ironic, because green should be the most efficient color in terms of lumens, since the human eye is so much more sensitive to it. This "green gap" is a major problem manufacturers are working on. Another approach now coming into use is phosphor-converted green: just a blue emitter with a greenish phosphor, which actually achieves extremely high lumens per watt. When it comes to producing green from an LED, it is currently much more efficient to produce a hump of various greenish wavelengths via a phosphor than to produce a single specific wavelength peak directly. The downside is a lack of color purity, but that does not matter in white setups for general illumination.
(for some further information, you might read these articles: Osram increases efficiency in green LEDs, The "Green Gap", and see this thread: EQ-white )

It can also be noted that there is the Philips Luxeon Z Lime emitter (actually more of a greenish-yellow color), which has a very high 190 lm/W efficacy. It is useful for color-tunable LED fixtures where better color rendering is desired, since phosphor-converted LEDs have broader spectral coverage than bare LED emitters, which emit only a single peak wavelength. These emitters were used in the Philips Hue LED bulb.
(more information here: Lime-Green LEDs Encourage Color-Tunable Lighting )

Sorry for getting off-topic a little bit, but one really can't discuss the maximum theoretical efficiency of white LEDs without mentioning the subject of phosphor-converted green LEDs.


Do you have any comparison images so I can see the differences for myself?
Unfortunately, there is not really any way to post an accurate colored picture of exactly what the color rendering looks like.
This might give you a rough idea of the type of difference CRI can make though:
http://www.donsbulbs.com/bulbs/g623/glossary/cri.examples.jpg

Also see the two pictures at the bottom of this page: http://lowel.tiffen.com/edu/color_temperature_and_rendering_demystified.html

http://2bora.com/wp-content/uploads/cri-lamps-new.jpg
http://www.fosilum.si/static/uploaded/htmlarea/cri-comparison.jpg
http://www.naturalux.com/CRI Examples.html
http://www.fusionlamps.net/assets/img/CRI-comparison-apples.jpg

uk_caver

Assuming we build an LED colour bulb using the 650nm red, 555nm green and 450nm blue LEDs above, I wonder what the lumens per watt would turn out to be if we wanted to obtain white from that bulb. As far as I know, we can't simply average the lm/W of the three LEDs, as we may need to significantly strengthen the blue and red LEDs to balance out the powerful green LED.
With those wavelengths, what would be the ratio of lumens of red/green/blue light needed to produce something like 'white'?