Red or red-orange for increasing CRI?

lolzertank

Enlightened
Joined
Dec 29, 2008
Messages
555
Location
The Land of Silicon
The main reason I tried this instead of the Seouls is that 12 Seouls and optics don't fit in a Mag head.

Here's some pics of the light...

Just a 3C Mag.
IMG_9051.jpg


Lots of LEDs...
IMG_9049.jpg


14 LED little brother joins in.
IMG_9051.jpg
 
Last edited:

blasterman

Flashlight Enthusiast
Joined
Jul 17, 2008
Messages
1,802
Oh yeah....in regards to taking pictures:

Years ago I used to have to make custom film profiles for digital imaging, and an easy target to shoot and see a LOT of color variation is a big box of Crayola crayons.

If you can, shoot RAW and white balance off the grey crayon. The difference between the CRIs should be obvious then.
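
If you want to do that white balance step yourself rather than in the raw converter, the idea is just per-channel gains computed from the grey patch. Here's a rough Python sketch; the decode settings and patch coordinates are placeholders, not a recipe:

Code:
# Balance a linear RAW decode off a grey patch (e.g. the grey crayon).
# Assumes img is an HxWx3 float array in linear RGB, such as the output
# of rawpy's postprocess(gamma=(1, 1), no_auto_bright=True).
import numpy as np

def white_balance_from_patch(img, box):
    """box = (y0, y1, x0, x1): pixel rectangle covering the grey patch."""
    y0, y1, x0, x1 = box
    patch_mean = img[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
    gains = patch_mean[1] / patch_mean   # normalize so the green gain is 1
    return np.clip(img * gains, 0.0, 1.0)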
 

JoakimFlorence

Newly Enlightened
Joined
Jun 4, 2016
Messages
137
Is it better to use red or red-orange LEDs to increase the CRI of a cluster of neutral white LEDs?

Thanks!
I can answer this question, since this is a subject I looked into in great detail.

Actually, you could use a wide range of different wavelengths. Any of them would increase CRI, especially for colors in the red area, and whether you go with a longer (red) or shorter (red-orange) wavelength has its own advantages and disadvantages.

The first thing that needs to be said is that if you simply add red to white, you are not going to have white color light. You are going to have pink-tinted light. The human eye is very sensitive to small color tints and color shifts.
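
Just to put a rough number on that, here is a quick Python sketch of what an uncompensated 9% dose of 650nm does to a 4000K white point, using the standard CIE 1931 chromaticity values (the 9% split is only an example figure):

Code:
# How far does adding 650 nm red pull a 4000 K white off the locus?
def xy_to_XYZ(x, y, Y=1.0):
    """Convert CIE xy chromaticity plus luminance Y to XYZ."""
    return (Y * x / y, Y, Y * (1 - x - y) / y)

white = xy_to_XYZ(0.3805, 0.3768, Y=0.91)  # ~4000 K blackbody, 91% of flux
red   = xy_to_XYZ(0.7260, 0.2740, Y=0.09)  # 650 nm spectral locus, 9% of flux

X, Y, Z = (w + r for w, r in zip(white, red))
x, y = X / (X + Y + Z), Y / (X + Y + Z)
print(f"mix: x={x:.4f}, y={y:.4f}")  # ~(0.422, 0.364): below the blackbody
                                     # locus, i.e. a visibly pink tint

That is why, in practice, a design like this starts from a cooler white and aims the combined output at a lower color temperature on the locus, rather than dosing red into a white that is already on tint.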
But let's ignore that part for a moment and get to answering your question.

You can use a wavelength as short as 615nm or as long as 660nm; it depends somewhat on the CRI of the white LED you start with. If you are using a very low 70 CRI white and the color temperature is somewhat high (>4000K), then something like 615nm is going to give a greater CRI and efficiency benefit than it would in other situations.

The human eye is much less sensitive to longer wavelengths, so the amount of red radiant power you have to add for the same visual effect is going to be significantly (several times) greater at 660nm than at 615nm.

I have played around with this and found that 650nm still gives excellent red color saturation, and is almost twice as efficient as 660nm, due to the rapid drop-off of human wavelength sensitivity. In fact the only reason to choose 660nm is if you are trying to go for an unnatural saturation of red.
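
If anyone wants to check that, the sensitivity numbers are tabulated; here's a tiny Python sketch using the standard CIE photopic V(lambda) values (rounded):

Code:
# Lumens per optical watt for candidate red wavelengths.
V = {615: 0.441, 635: 0.217, 650: 0.107, 660: 0.061}  # CIE photopic V(lambda)

for nm, v in sorted(V.items()):
    # 683 lm/W is the theoretical maximum, at 555 nm
    print(f"{nm} nm: {683 * v:4.0f} lm/W optical")

print(f"650 vs 660 nm: {V[650] / V[660]:.2f}x")  # ~1.75x: 'almost twice'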

635nm is also an excellent compromise and gives pretty good color rendering overall, although it is probably not going to bring you above 94 or 95 CRI.

Something else I have noticed that is very important: if most or all of the red light is coming from the red LED and you use a wavelength that is too long, it is going to make skin tones look unnatural. Skin tones will not be as brightly illuminated (meaning they can still appear a little dead and grey) and will also acquire an unnatural pinkish tinge rather than a healthy orange glow.

You see, the problem is that you cannot increase the amount of 660nm light to a level that will properly illuminate skin tones without it overdoing the red saturation on other red colors and throwing off the color rendering balance. The ideal red-orange for a healthy-glowing skin tone is about 620nm.
When you have a "high CRI" white LED of about 91 CRI, the spectrum will be rich in 620nm orange-red light, so it will illuminate warmer orange-red colors well, giving their appearance balance, warmth, and life. But deep red colors will still not be rendered quite properly and will take on a more orange-tinted red than they should.

If you are using an 87 CRI white LED, then 650nm is an appropriate choice to go with it, especially at color temperatures of 4000K or above. At a lower color temperature (say 3000K), or if using a lower CRI (say 80 CRI) white LED, you may want to go with 640-645nm red, or perhaps even a mix of 620 and 650nm LEDs.

So it depends on several factors: the CRI of the white LED you are using, the desired final color temperature range, and the CRI level you need to reach.

I think "neutral white" is usually 4000K (sometimes 4000-4500) so in that situation I think 535nm is probably the wavelength of red you would want to use. To more specifically answer your question. It should be able to take your 80-85 CRI and bring it up to the equivalent of maybe around 93 or 94.

The amount of red you will want to add is really not that great. Maybe in the range of 7 to 11 percent of the total light.
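
To give a feel for the arithmetic, a quick sketch (the 1000 lumen white output is just an example figure):

Code:
# How much 650 nm light does a ~9% red share actually take?
white_lumens = 1000.0          # example white cluster output
red_fraction = 0.09            # middle of the 7-11% range above

red_lumens = white_lumens * red_fraction / (1 - red_fraction)
V_650 = 0.107                  # CIE photopic V(650 nm)
red_watts = red_lumens / (683 * V_650)

print(f"{red_lumens:.0f} lm of red = {red_watts:.2f} W optical at 650 nm")
# -> about 99 lm, i.e. roughly 1.4 W of 650 nm radiant power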
 
Last edited:

alpg88

Flashlight Enthusiast
Joined
Apr 19, 2005
Messages
5,339
This thread was created in 2009, when MC-Es and P7s ruled and high CRI LEDs did not exist. Now there are 95+ CRI LEDs, so there's no need to mess with colored LEDs to increase it.
 

JoakimFlorence

Newly Enlightened
Joined
Jun 4, 2016
Messages
137
This thread was created in 2009, when MC-Es and P7s ruled and high CRI LEDs did not exist. Now there are 95+ CRI LEDs, so there's no need to mess with colored LEDs to increase it.
For the hobbyist, yes, that is correct (or mostly correct). However, it's still interesting from an engineering point of view: it could theoretically be used to increase power efficiency a little, or even (arguably) to subtly improve certain desirable properties of the light.

I think the main reason virtually no commercial LED makers use a multi-wavelength emitter design is the added manufacturing complexity and expense. (Ideally one would want to integrate the different emitters on the same chip.) Then there is the difficulty of figuring out exactly how to combine the different wavelengths and in what ratios, something that is not well understood, or at least not understood by many people, because it is more complicated. And lastly, there was a well-recognised industry issue with the output of red LEDs dropping as they warmed up, leading to color tint imbalances in the resulting white light; this required added electronics to regulate and adjust the ratios. I'm sure there are some creative ideas for passive strategies to address this, but they might not work perfectly in all situations. The human eye is very sensitive to subtle color tints that are off from white.
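
For what it's worth, the active version of that regulation is conceptually simple. Here is a toy Python sketch of the idea; the droop rate, loop gain, and sensor are all made up for illustration, not taken from any real driver:

Code:
# Toy model: red output droops as the die heats up, so a controller
# watches the red share of total flux and trims the red current.
TARGET = 0.09                   # desired red fraction of total flux

def red_droop(temp_c):
    """Illustrative assumption: red flux falls ~0.5% per degC above 25 C."""
    return max(0.0, 1.0 - 0.005 * (temp_c - 25.0))

i_red, temp, white_flux = 0.10, 25.0, 1.0  # flux taken as proportional to current
for _ in range(5):
    temp += 10.0                                 # die warms up
    red_flux = i_red * red_droop(temp)
    ratio = red_flux / (red_flux + white_flux)   # e.g. from a filtered photodiode
    i_red *= 1.0 + 2.0 * (TARGET - ratio)        # proportional trim
    print(f"T={temp:.0f}C  ratio={ratio:.3f}  i_red={i_red:.3f}")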

I've looked into the ratios, and basically a multi-emitter design involves somewhere around 82 to 88 percent "white" (phosphor) light, plus tiny amounts of other wavelengths (one, two, three, four, or sometimes even five of them). For a designer, that might not look very practical: having to add numerous little extra emitters that do not contribute much light output would greatly increase the manufacturing expense for what most people would see as minimal gains.

Furthermore, if the main objective is only to increase color rendering as much as possible and efficiency is a secondary concern, then the overall efficiency gains are going to be very small. Most are not going to want to go to the trouble if the efficiency is only 4 or 5 percent higher than an all-phosphor approach.
(In that case, only a tiny amount of additional 650nm red light would be added, with most of the red light still originating from the phosphor.)
Most schemes that add a separate red emitter are therefore not going to be ones where maximum possible CRI is the primary concern. Typically, what we've seen so far is that the primary objective was a significant increase in efficiency, with the added CRI (and improved red color rendering) seen as a bonus.

Thoughts on Cree TrueWhite technology (July 29, 2016)
 
Last edited:

HarryN

Flashlight Enthusiast
Joined
Jan 22, 2004
Messages
3,977
Location
Pleasanton (Bay Area), CA, USA
In LED strip lighting, there actually is quite an increase in the use of strips that have 4 colors - RGBW.

The controls that run them are pretty inexpensive and can create some neat effects.

A number of companies offer them; superbrightleds and ledsupply are two examples.

The controls range from simple PWM via knobs to DMX and more.
 

JoakimFlorence

Newly Enlightened
Joined
Jun 4, 2016
Messages
137
In LED strip lighting, there actually is quite an increase in the use of strips that have 4 colors - RGBW.
A post in another 7-year-old thread mentioned that RGB (white) LEDs have a CRI of only 67-81, but that RGBA can achieve a CRI of up to 92:
Theoretical limit for the efficacy of red, green, blue and white LEDs

I still really don't think RGBA is the best way to achieve good light quality. Despite the seemingly decent CRI number, it can still have some significant color rendering issues.

The reason that extra amber emitter boosts the RGB's CRI so much is that reddish-orange light (590 to maybe 620nm) is very important for the proper rendering and lively illumination of skin tones and warm colors, and with only one red wavelength it's difficult to do that while simultaneously making red colors appear a deep, pure red.

Furthermore, in the absence of any phosphor white emitter, a 570nm amber helps compensate for the green when trying to properly render those orangish-yellow, yellow, and yellow-green colors.

The reason an RGB is never going to attain color rendering as satisfactory as a fluorescent tube is that the green in fluorescent phosphor is a yellow-green wavelength, whereas the G in RGB is going to be a deeper green. This is for two reasons: first, yellow-green LEDs have much lower efficiencies than green LEDs, and second, the primary intended use of an RGB LED is not just to produce white light; it will sometimes be expected to produce colored light, and the colors are not going to look very good if the only green in the palette is a yellow-green.
But that part is probably beyond the scope of this discussion since we are focusing on red-orange wavelengths.

I have tried adding 650nm red LED light to low CRI white LED light, and while it does improve the feel and appearance somewhat, interestingly it still does not render skin tones or warm wood colors at their brightest or liveliest. This is because the eye has diminishing sensitivity to longer, deeper red wavelengths, and if one were to use enough 650nm light to accomplish the goal, the proportion of red added to the light would be too great. It's much easier to balance reddish-orange light into white light (or orange-yellow light) than it is to try to do the same with pure red light. The other problem is that 650nm light is still "too red", and when it is the primary source of light within the red-to-orange range, it makes skin tones appear too pink, in a way that is not very flattering. Use 590 to 615nm (red-orange) light instead, and skin tones take on a healthy, tanned, bronzed appearance.

It's fine to add 635nm red to a white LED when the desired color temperature of the resulting light is higher than about 3650K; that gives an acceptable balance. But when the desired color temperature is lower, like 3000K or especially 2700K, it becomes more important to use a longer red wavelength like 650nm. I would also recommend using a white LED that is not too much higher than the desired color temperature: if the light you want is 3000K, use a 3500K or 4000K white LED; do not use 5000K and then try to make that work.

This is for a standard 84-86 CRI white LED emitter. If using a lower CRI white LED, I suspect it becomes more important to add or choose a shorter red wavelength. The thing is, a standard 84-86 CRI white does have enough combined orange and red-orange light in its spectrum to balance out added red light from a red emitter. But if you move into 70 CRI territory, there is going to be a serious deficit of light in the red-orange area, and there will not be enough orange light to balance out added pure red light. You actually do not want most of the total red light in the spectrum to come from the deep red end.
But sure, if you're willing to use both 650nm (red) and 615nm (red-orange) emitters, then I'm sure you could use a 70 CRI white emitter.

70 CRI white + 615nm + 650nm is not going to have any advantage in light quality over 85 CRI white + 650nm, but it would have even higher energy efficiency, especially at lower color temperatures (2700 and 3000K).
85 CRI white + 650nm is, I think, going to give you that excellent saturated deep red (R9) rendering you can otherwise only get from those rare 98 CRI LEDs (the ones using blue pump chips; I am NOT referring to those that use a violet chip), but without the big loss in efficiency. A 98 CRI LED is going to be about 27 percent less efficient than a 90 CRI LED, mainly because for an LED phosphor to produce a distribution centered at 650nm, it also has to produce a lot of even longer-wavelength light that the eye can barely see.
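
To make that comparison concrete, here is the back-of-the-envelope version. Every number below is an assumption chosen for illustration (the efficacies, the red emitter's wall-plug efficiency, and the 9% red share), not a measurement:

Code:
# Hybrid (85 CRI white + 650 nm) vs a hypothetical 98 CRI phosphor LED.
eff_85 = 165.0                  # lm/W, assumed 85 CRI white
eff_98 = 160.0 * (1 - 0.27)     # ~27% below an assumed 160 lm/W 90 CRI part

V_650, wpe_650 = 0.107, 0.60    # photopic V(650 nm); assumed wall-plug eff.
eff_red = 683 * V_650 * wpe_650 # ~44 lm/W for the deep red channel

share = 0.09                    # red fraction of the total flux
# Flux-weighted harmonic mean: electrical watts add up per lumen delivered.
eff_mix = 1 / ((1 - share) / eff_85 + share / eff_red)

print(f"98 CRI phosphor: {eff_98:.0f} lm/W,  hybrid: {eff_mix:.0f} lm/W")
# -> roughly 117 vs 132 lm/W under these assumptions

Note that the deep red channel itself only manages a few dozen lm/W, which is what limits how big the hybrid's advantage can get.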
 
Last edited:

JoakimFlorence

Newly Enlightened
Joined
Jun 4, 2016
Messages
137
Earlier I said that 615nm isn't necessary with an 85 CRI white. That isn't entirely true: if the desired light is 2700K, then a little added 615nm will be helpful (although not very much needs to be added).
The rule is that the exact qualities of the red light become more important at low color temperatures. The other rule is that the lower the color temperature, the more important it is for CRI that longer-wavelength red light be present.

However, if you are going to use a 90 CRI white emitter, then you can dispense with adding any additional orange-red (615nm) light; at that point it is no longer necessary.
 