JoakimFlorence
With normal white LEDs, which just use a blue emitter in combination with a phosphor, there is an inherent trade-off between CRI and light efficiency.
How much is this trade-off, exactly?
70 CRI is typically around 9 to 12% more lumen efficient than 80 CRI, all else being equal.
Going from 80 to 90 CRI, the total lumen output will drop by about 10 to 15%.
Going from 90 to 95 CRI, the lumen output will drop by 15 to 18%.
Reference: https://www.clearlighting.com/why-high-cri-led-strips-dont-provide-more-lumens/
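As a rough illustration of how those penalties stack up (using midpoints of the ranges quoted above, not measured data):

```python
# Rough illustration of how the quoted efficiency penalties compound,
# using midpoints of the ranges cited above rather than measured data.
relative_output = 1.00          # take an 80 CRI LED as the baseline
relative_output *= 1 - 0.125    # 80 -> 90 CRI: roughly a 10-15% drop
relative_output *= 1 - 0.165    # 90 -> 95 CRI: roughly a further 15-18% drop
print(f"95 CRI output vs. 80 CRI baseline: {relative_output:.0%}")  # ~73%
```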
Why does higher CRI result in a drop in lumen efficiency?
Mainly it has to do with generating red wavelengths of light. Generating red light from a phosphor is not very efficient. This is especially the case for deeper (longer) wavelengths of red light, when we are talking about getting more into the pure "ruby" red, rather than just orange-red.
There are two main reasons for this. First, the eye has lower sensitivity to deeper (longer) wavelengths: at equal radiant power, 630nm appears more than twice as bright as 650nm.
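For reference, here are approximate CIE photopic luminosity values behind that comparison (the 620 and 660nm entries come up again further down):

```python
# Approximate CIE 1924 photopic luminosity function values V(lambda),
# i.e. relative perceived brightness per radiant watt.
V = {620: 0.381, 630: 0.265, 640: 0.175, 650: 0.107, 660: 0.061}
print(V[630] / V[650])  # ~2.5: 630 nm looks about 2.5x as bright as 650 nm
print(V[660] / V[650])  # ~0.57: 660 nm is only a bit over half as bright as 650 nm
```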
Second, a typical LED phosphor generates a broad band of wavelengths. To produce a peak of red wavelengths around 640 or 650nm, the phosphor would also end up converting a lot of the input energy into near-infrared light, which is invisible to the eye.
There is one last reason as well. Blue light contains more energy than red light (for an equal number of photons), so creating blue light and down-converting it to red carries an inherent efficiency loss (the Stokes shift). Converting 450nm to 650nm loses about 31% of the energy.
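That 31% figure follows directly from photon energy being inversely proportional to wavelength; a minimal check:

```python
# Photon energy scales as 1/wavelength, so down-converting a 450 nm blue
# photon to a 650 nm red photon gives up the energy difference as heat.
blue_nm, red_nm = 450, 650
stokes_loss = 1 - blue_nm / red_nm
print(f"Stokes-shift loss, 450 -> 650 nm: {stokes_loss:.0%}")  # ~31%
```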
These efficiency losses could be mitigated if one just used a red LED emitter to produce the red light for the white LED.
This might sound very simple, but it does result in some complications.
Firstly, the light has to be well mixed. Secondly, the light has to be precisely color balanced. This is difficult for two reasons: the human eye is very sensitive to slight tint deviations from "pure white", and red LED emitters lose a small but significant amount of lumen output as they heat up, more than the blue emitter does. A large part of that loss is because rising temperature shifts the red emitter's output to slightly longer wavelengths, and in the red region of the spectrum eye sensitivity drops off very quickly as wavelength increases. The result is that the color tint balance tends to drift.
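To make the wavelength-shift part of that concrete, here is a hypothetical illustration (the 2nm shift is an assumed round number, not a datasheet value, and this ignores the emitter's separate quantum-efficiency droop with temperature):

```python
# Hypothetical illustration: a small thermal red-shift of a 650 nm emitter
# costs a noticeable fraction of its lumens, because V(lambda) falls steeply
# in the deep red. The 2 nm shift is an assumed figure, not a datasheet value.
def v_interp(nm, table):
    """Linearly interpolate the photopic luminosity table."""
    lo = max(k for k in table if k <= nm)
    hi = min(k for k in table if k >= nm)
    if lo == hi:
        return table[lo]
    t = (nm - lo) / (hi - lo)
    return table[lo] + t * (table[hi] - table[lo])

V = {640: 0.175, 650: 0.107, 660: 0.061}
shift_nm = 2  # assumed thermal red-shift
loss = 1 - v_interp(650 + shift_nm, V) / v_interp(650, V)
print(f"Lumen loss from a {shift_nm} nm red-shift at 650 nm: {loss:.0%}")  # ~9%
```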
Most white LEDs use a mix of two phosphors. (In practice it is most commonly the same type of phosphor with slightly different doping concentrations, which shifts the wavelength output.) There is a "red" phosphor and a "green" phosphor. (More accurately, these are usually a mostly yellow-orange phosphor and a mostly yellow-green phosphor.) Used together, they give a flatter, more extended distribution of wavelengths across the spectrum, ideally from red to green. But often the phosphor combination does not do this very well, and the total phosphor output tends to be "yellowish" in its distribution. (This is why lower-quality LEDs often had a reputation for producing "faux white" light that somehow didn't seem quite right.)
When trying to make white light by combining a red LED emitter with a blue emitter and an adjusted phosphor composition, you can't simply use a "green" phosphor. That would create an imbalance and result in lower CRI and poorer color rendering. There are two possible approaches. You could use a very yellowish-green phosphor and combine it with a fairly orange-red LED wavelength, such as 610-615nm. Or you could use both red and green phosphors but simply use less of the red phosphor; in that case you could (and should) select a "red" phosphor that is more yellow-orange, since the LED emitter would be providing the deeper red wavelengths.
What exact wavelength of red LED should one pick?
This depends on several factors: efficiency, the desired CRI and red color rendering ability, and the color temperature of the light. (Also, simplicity and cost may make a single red wavelength preferable to two different red wavelengths)
For 2700K or 3000K, the best red wavelength is 650nm. Above roughly 3650-3800K (call it 4000K), it is better to go with 630-635nm. If you are aiming for the absolute maximum lumen efficiency, 620nm is the best choice. If using two different red wavelengths, 650nm plus 610-620nm is the most optimal combination.
(Also, if adding a green LED emitter, 520nm is the most optimal)
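Summarizing the above as a simple lookup (a sketch of this post's suggestions, with an assumed ~3700K break point; not an industry rule):

```python
# Rule-of-thumb red-emitter choice summarizing the suggestions above. The
# ~3700 K break point is approximate (the post gives 3650-3800 K), and the
# numbers reflect this post's recommendations, not an industry standard.
def recommended_red_nm(cct_kelvin, prioritize_efficiency=False):
    if prioritize_efficiency:
        return 620   # shallower red for maximum lumen efficiency
    if cct_kelvin < 3700:
        return 650   # warm white (2700-3000 K): deeper red
    return 633       # ~4000 K and up: the 630-635 nm band

print(recommended_red_nm(2700))                              # 650
print(recommended_red_nm(4000))                              # 633
print(recommended_red_nm(4000, prioritize_efficiency=True))  # 620
```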
What about using 660nm? Will that give better red color rendering?
Red colors will indeed look more saturated under 660nm, but in a way that goes beyond how reds look under any natural light. Also consider that 660nm has only a bit over half the lumen efficiency of 650nm. Using 660nm is really unnecessary and not optimal; 650nm is completely acceptable and will still render reds with the same saturation they have under incandescent light or natural sunlight.
What about using the red LED to generate most of the red light in the spectrum? Why does one still need to rely on the phosphor to produce red light?
The problem with relying on red LEDs to produce most of the red light is that it will make skin tones look far too pinkish. Not only too pinkish, but also, paradoxically, too pale and not very warm. Why? Because with deeper red wavelengths, only so much can be added before it starts throwing off the color balance of the light. To give skin a warm, healthy "tanned" appearance, you need plenty of orange-red light, somewhere in the range of around 605 to 615nm (or an added red emitter of 615-620nm, when combined with phosphor).
What percentage should the red emitter light be out of the total white light?
The red emitter should contribute only around 5 to 12% of the total light in the spectrum, most likely around 6 to 8%.
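Note that this percentage refers to the light (lumens), not the radiant power; because the eye is so insensitive at 650nm, the red emitter's share of the radiant power is much larger. A rough sketch, assuming a 650nm red emitter and an assumed, typical ~330 lm/W luminous efficacy of radiation for the phosphor-white portion of the spectrum:

```python
# Rough sketch: convert "~7% of the light" (lumens) into a share of radiant
# power. 330 lm/W is an assumed, typical luminous efficacy of radiation for
# the phosphor-white part of an ~80 CRI warm-white spectrum.
LER_RED_650 = 683 * 0.107   # lm per radiant watt at 650 nm (~73 lm/W)
LER_WHITE   = 330           # lm per radiant watt, assumed for the phosphor white

red_lumen_share = 0.07                        # ~7% of total lumens from the red emitter
red_watts   = red_lumen_share / LER_RED_650   # radiant watts per lumen of total output
white_watts = (1 - red_lumen_share) / LER_WHITE
print(red_watts / (red_watts + white_watts))  # ~0.25: about a quarter of the radiant power
```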
What CRI could I expect to achieve from this approach?
Adding a red LED emitter to an 80-85 CRI type phosphor composition could easily result in 94 CRI.
If a better phosphor composition is used, 95 or even 96 CRI.
A single greenish-yellow phosphor combined with a 610-615nm orange-red emitter can achieve 93 CRI.
What about 97 or 98 CRI? Is that possible?
For the 2700 to 2900K range, it is possible, but you would need to be using an excellent phosphor composition.
Most probably there would need to be two separate wavelength red emitters in that case.
Currently, 97 or 98 CRI is usually regarded as impractical because of the significant efficiency loss, but using separate LED emitters could, in theory, allow it to be just as efficient as regular LEDs.
With all-phosphor LEDs, there is a very significant loss in efficiency going from 95 to 98 CRI.
If we are talking about 4000K or 5000K, I think it would top out at 96 CRI as the theoretical maximum. Exceeding that would require some other approaches.