North American Lighting Sector Calls for Energy Star to Address LED Lamp Light Quality

slebans

Enlightened
Joined
Mar 1, 2010
Messages
457
Location
Moncton, NB Canada
North American Lighting Sector Calls for Energy Star to Address LED Lamp Light Quality
http://www.ledinside.com/news/2013/1/north_american_lighting_sector_energy_star_20130123

Those people at Soraa certainly are busy beavers. Basically, they want the EPA to relax Energy Star efficacy requirements for high-CRI bulbs.

In a nutshell:
"Fundamental physics research shows that there is a ~2% penalty in luminous efficacy per point of CRI. So, going from a CRI of 80, where most LEDs operate, to a CRI of 90, there is a ~20% penalty in lm/W," said Dr. Shuji Nakamura, Professor of Materials Department and Co-Director for the Solid State Lighting & Energy Center at the University of California, Santa Barbara.
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
For what it's worth, the majority of people can't tell the difference between 80 and 90 CRI, and an 80 CRI LED is superior in terms of light quality to an 80 CRI CFL.

On another note, I really wish the LED industry as a whole would get away from "warm white". There is another ~20% penalty going with incandescent-like CCTs as opposed to sunlight CCTs. The majority of people would find 4000K to 5000K perfectly acceptable once they got used to it. Besides, if you think about it, natural light is only "warm white" for a short time around sunrise and sunset. It's not really natural for us to be under this type of light for hours at a time. The fact that we've gotten used to it because that was the only type of light available with legacy lighting technologies doesn't mean we should continue to emulate it with LEDs. Get rid of both the warm white and 90+ CRI and you're looking at around 30% more efficiency. You could always have high-CRI LEDs (in all CCTs, not just warm white) for applications that really need it. Lighting someone's living room or bedroom or kitchen just isn't an application where lighting quality is super critical.
 

SemiMan

Banned
Joined
Jan 13, 2005
Messages
3,899
For what it's worth, the majority of people can't tell the difference between 80 and 90 CRI, and an 80 CRI LED is superior in terms of light quality to an 80 CRI CFL.

On another note, I really wish the LED industry as a whole would get away from "warm white". There is another ~20% penalty going with incandescent-like CCTs as opposed to sunlight CCTs. The majority of people would find 4000K to 5000K perfectly acceptable once they got used to it. Besides, if you think about it, natural light is only "warm white" for a short time around sunrise and sunset. It's not really natural for us to be under this type of light for hours at a time. The fact that we've gotten used to it because that was the only type of light available with legacy lighting technologies doesn't mean we should continue to emulate it with LEDs. Get rid of both the warm white and 90+ CRI and you're looking at around 30% more efficiency. You could always have high-CRI LEDs (in all CCTs, not just warm white) for applications that really need it. Lighting someone's living room or bedroom or kitchen just isn't an application where lighting quality is super critical.

JTR, I often agree with you, but differ here a bit.

My kitchen is lit with glorious near perfect 4K light (Xicato Artist series). My work areas, laundry rooms, with 5000K daylight .... but the bedrooms are 2700 and 3K as are many of the living areas.

Higher CCT is great for work/active areas, but when you are just relaxing and the light levels are lower, lower CCT will be preferred. That is not simply a matter of getting used to it but of human preference, and of allowing the biological clock to register that, yes, night is upon us.

I think relaxing Energy Star for higher CRI does make sense. Most will choose the cheaper, lower-CRI, higher-output bulb, but if you are lighting a painting, or perhaps the bathroom, or even the kitchen, then high CRI is a nice option.

Semiman
 

Lynx_Arc

Flashaholic
Joined
Oct 1, 2004
Messages
11,212
Location
Tulsa,OK
I say put the pressure on LED makers to deliver decent light quality with decent energy savings in order to get the Energy Star logo. They can make LED lights without it until they get more efficient. I don't really think we should encourage manufacturers to make lights that are inferior in energy savings at the same color output just to give LEDs an advantage over CFLs; that would give manufacturers less incentive to make the highest-quality lamps, and we could see too many of the makers doing just enough to barely meet the specs instead of exceeding them.
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
What is the basis for this opinion?
Because natural sunlight is 4000K to 5000K most of the day, and most people prefer sunlight over any kind of artificial light. The only reason high CCTs haven't been more popular in my opinion is because most high CCT sources have had poor color rendering. LED can change that. Despite what the numbers say, a CRI 80 LED looks much better than a CRI 80 CFL. If you really want to go all out, then something like the high-CRI Nichia 219 in the 4500K bin is very close to artificial sunlight at most northern latitudes. Certainly it's not something I would think the majority of people would find harsh or lacking. I modded an LED bulb with 219s. It's in my bedroom now. I find it perfectly relaxing. I have another identical bulb I modded with 6500K XPGs. I use that one when I want to be more alert. Actually, when I use both in combination I find the light to be pretty much like natural daylight, both in terms of CCT and CRI.

Face it, incandescent-type CCTs are more an acquired taste and also a cultural thing (i.e. 5000K is popular for residential lighting in Asia, particularly Japan). If you like it, fine, but I'm getting increasingly annoyed that LED retrofits are going the same route as CFLs, where it's hard to find anything other than warm white. It's only in the last few years that non-warm white CFLs have become more than niche items. I don't want to have to wait ten years for the same thing to happen with LEDs. It's actually more imperative that we get away from warm white as some kind of standard with LEDs than it was for CFLs. At least with CFLs there wasn't an efficiency penalty for warm white. In fact, quite the opposite actually. Warm white CFLs are often 5 to 10 lm/W more efficient than daylight CFLs. With LEDs you suffer a ~20% efficiency penalty with warm white. And if you also want high CRI, then the total penalty is roughly 30% over high CCT, medium CRI. The best warm white retrofits are getting over 90 lm/W. That tells me we could be getting ~120 lm/W at cooler colors. Moreover, you can even do high-CRI, high CCT with LED fairly easily. This would fill a niche which is largely unfilled by CFLs (other than expensive, so-called full-spectrum CFLs).
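
The arithmetic behind that last extrapolation, written out as a minimal sketch (Python). Treating the ~20% penalty as multiplicative and dividing it out is my reading of the post, and the 90 lm/W starting point is the poster's figure, not a datasheet value:

Code:
warm_white_lmw = 90.0                      # "best warm white retrofits", per the post

# Divide out the ~20% warm-CCT penalty to estimate the cool-CCT equivalent:
cool_cct_lmw = warm_white_lmw / (1 - 0.20)
print(f"~{cool_cct_lmw:.0f} lm/W")         # ~113 lm/W; the ~120 figure rounds up a bit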

Bottom line, there are biological and practical reasons for making a transition away from warm white. I'm not saying to not make warm white at all, but rather to also focus making quality lighting in higher CCTs, and to have equal numbers of such bulbs on store shelves. Or better yet, make variable CCT bulbs. Sure, I get it that people might not care for 6500K, 65 CRI in their bedroom. But 4000K to 5000K, 80 or 85 CRI, seems to hit a sweet spot. Most of those who like incandescents find it pretty decent. Those who can't stand incandescents don't find it overly yellow.
 

brickbat

Enlightened
Joined
Dec 25, 2003
Messages
890
Location
Indianapolis
...most people prefer sunlight over any kind of artificial light... ... Most of those who like incandescents find it [4000K to 5000K] pretty decent....

Again, is there some study or white paper that backs this? Or is it your personal opinion, based just on your own tastes?
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
JTR, I often agree with you, but differ here a bit.

My kitchen is lit with glorious near perfect 4K light (Xicato Artist series). My work areas, laundry rooms, with 5000K daylight .... but the bedrooms are 2700 and 3K as are many of the living areas.

Higher CCT is great for work/active areas, but when you are just relaxing and the light levels are lower, then lower CCT will be preferred and that is not simply a matter of getting used to but human preference and lack of allowing the biological clock to say that yes, night is upon us.
I'm surprised you of all people would use 2700K or 3000K at all considering that you often mention how lousy it is for tasks like reading (and many people read in their bedrooms or living rooms). My take on lighting is when I want artificial light, it generally means I'm doing something, which in turn means I need to see as well as possible. If I'm really relaxing, as in watching TV, I'll probably have the lights off. The Xicato Artist 4000K (or perhaps 3500K if you're going for a relaxed ambience) seems like it would be perfect for bedrooms and living areas as well. I actually find the 4500K Nichia 219s to be pretty warm and relaxing. Then again, I guess it's all relative. My eyes are used to daylight or high CCT artificial light. Anything much less than about 5000K starts to seem warm. I can't tolerate anything under about 3500K at all without getting major headaches.

Don't warm white LEDs still have a spike around 450 to 470 nm? It's not as pronounced as in cooler whites, but it seems it's still enough to keep people thinking it's still daytime. Same with warm white CFLs, which actually have quite a few spikes under 500 nm. I'm also thinking even if we use warm white for the reasons you say (to relax at night), at the same time we're often staring at a brightly lit TV screen just loaded with short wavelengths. That being the case, you might as well just use light which works better with our visual systems.
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
Again, is there some study or white paper that backs this? Or is it your personal opinion, based just on your own tastes?
I found this just with a cursory search. Not a study per se, but there is mention of which CCT fluorescents are popular in which markets. This in turn would correlate with preferences. Even here in North America, 4100K is the preferred choice. In Asia practically the entire market is 5000K to 6500K, even for residential.

I also read a white paper some years back which had people rate different light sources. I remember the very low and high CCTs sources didn't fare too well. Surprisingly, RGB LED sources did very well despite the fact that CRI was only about 25 to 50. It may have been because they made certain colors more vivid.

There are also any number of studies showing the human visual system functions best under light with a CCT similar to sunlight (see page 3), provided of course the spectrum is reasonably full. This is actually the overriding reason for my diatribe against warm white: it's not really well matched to our visual systems. It has nothing to do with my own preference or taste. We're actually sacrificing efficiency to produce a type of light which is demonstrably inferior from a biological perspective. If we want to talk preferences or tastes, well, lots of people can prefer or get used to things which are suboptimal or even bad for them. That doesn't mean that the reverse isn't true. Give them something better, and the preferences will quickly change. I've seen my fair share of people who went with 3500K or 4100K or 5000K, stuck with it, and then one day said to me that they never realized how yellow and awful those old bulbs were.

Like I said, use whatever you prefer, but please let's start having some choices here. 95% of the LED lighting I see in stores these days is warm white. It's as if the LED manufacturers are so pleased with themselves for being able to imitate incandescent lights that they feel no need to offer anything else to your average consumer. It's a shame because of all lighting technologies, LED is the one which can come closest to imitating natural light.
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
Relevant to this topic. Last paragraph:

Avoid over-specifying warm colour temperature or CRI because you will pay for it in power.

Pick the colour temperature that your customer desires, work out the CRI that your customer needs, and then check for special spectral requirements.
 

SemiMan

Banned
Joined
Jan 13, 2005
Messages
3,899
I'm surprised you of all people would use 2700K or 3000K at all considering that you often mention how lousy it is for tasks like reading (and many people read in their bedrooms or living rooms). My take on lighting is when I want artificial light, it generally means I'm doing something, which in turn means I need to see as well as possible. If I'm really relaxing, as in watching TV, I'll probably have the lights off. The Xicato Artist 4000K (or perhaps 3500K if you're going for a relaxed ambience) seems like it would be perfect for bedrooms and living areas as well. I actually find the 4500K Nichia 219s to be pretty warm and relaxing. Then again, I guess it's all relative. My eyes are used to daylight or high CCT artificial light. Anything much less than about 5000K starts to seem warm. I can't tolerate anything under about 3500K at all without getting major headaches.

Don't warm white LEDs still have a spike around 450 to 470 nm? It's not as pronounced as in cooler whites, but it seems it's still enough to keep people thinking it's still daytime. Same with warm white CFLs, which actually have quite a few spikes under 500 nm. I'm also thinking even if we use warm white for the reasons you say (to relax at night), at the same time we're often staring at a brightly lit TV screen just loaded with short wavelengths. That being the case, you might as well just use light which works better with our visual systems.


I don't read with 2700-3000K; it is bad for the eyes. I have reading lamps for just such purposes. That said, my living room and bedrooms still have warm as their primary lights.

I believe the Kruithof curve has merit, and for spaces like bedrooms and living rooms, which at night are typically lit at 100 lux and below, consideration must be given to warmer light as being more pleasant.

We are pretty much on the same side. I often refer to our lighting "candle culture" and the need for change. With the exception of living rooms, bedrooms, and many dining establishments where low light levels are used, 2700-3K really does not have much of a place. It still surprises me that retail in many cases is enamored with 3K, though I see that changing. High CRI 4K would make so many more colors pop in a retail environment ... and for that reason, in some sectors higher CCT high CRI ceramic metal halide is the light of choice.

At the end of the day though, the customer makes the decision. I think this group's claim, that real or perceived light-quality issues could prevent adoption for a significant percentage of the market, is in fact true ... and not just Anders :) Of course, you can use red LED plus phosphor white and get high CRI and efficiency, though not the deep reds.

Better-CRI white LEDs have fairly small blue spikes, and yes, CFLs have spikes too, but remember it is not the height of the spike but the total area under it that defines power. I would look at Lumileds data sheets for this rather than Cree's, as they have curves for higher-CRI warm whites with relatively small spikes at 2700-3000K, and you can tell the total area is quite a bit less.
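
To put made-up numbers on that height-versus-area point, the sketch below (Python) integrates two invented spikes with identical peak heights but different widths; the wider one carries 2.5 times the power. The numbers are arbitrary, not from any datasheet.

Code:
import numpy as np

wl = np.arange(400.0, 501.0)                      # wavelength grid, nm
narrow = np.exp(-0.5 * ((wl - 455) / 8) ** 2)     # tall, narrow blue spike
wide = np.exp(-0.5 * ((wl - 455) / 20) ** 2)      # same peak height, 2.5x wider

print(np.trapz(narrow, wl))   # ~20 (arbitrary units)
print(np.trapz(wide, wl))     # ~50: equal height, 2.5x the integrated power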

I am not sure the total light from a TV would be enough to stimulate circadian rhythm. The receptors are spread around the eye, not just the central area, and I am not sure it is yet understood whether there is a weighting factor based on area. I do believe that my computer monitor < 18" from my eyes is more than enough to cause sleep issues. Philips has televisions and monitors with blue surrounds and blue backlight to aid in eye focusing. No idea if it worked or was just marketing.

I wonder if your issue with headaches with light less than 3500K is related to focus/depth of field issues? Circadian rhythm and pupil response appear to be the same mechanism.

I know when I stare at my little laptop display in a dark room for too long I have issues for about the next 15 minutes focusing. Remember when they used to tell people to do CAD in a dark room?


Semiman
 

Marcturus

Enlightened
Joined
Sep 27, 2009
Messages
337
Location
230V~
I found this just with a cursory search. Not a study per se, but there is mention of which CCT fluorescents are popular in which markets. This in turn would correlate with preferences. Even here in North America, 4100K is the preferred choice. In Asia practically the entire market is 5000K to 6500K, even for residential.
It's a wonderful site, but what you quote is one of its weakest parts because these "preferences" might just be based on marketing experience, sales figures, or regional standards.

I also read a white paper some years back which had people rate different light sources. I remember the very low and high CCTs sources didn't fare too well. Surprisingly, RGB LED sources did very well despite the fact that CRI was only about 25 to 50. It may have been because they made certain colors more vivid.
As the details of the setup and light sources matter, just remembering some isn't enough, sorry. (probably wasn't the light sources that got rated, but the illumination, btw.)

There are also any number of studies showing the human visual system functions best under light with a CCT similar to sunlight (see page 3), provided of course the spectrum is reasonably full. This is actually the overriding reason for my diatribe against warm white: it's not really well matched to our visual systems. It has nothing to do with my own preference or taste. We're actually sacrificing efficiency to produce a type of light which is demonstrably inferior from a biological perspective. If we want to talk preferences or tastes, well, lots of people can prefer or get used to things which are suboptimal or even bad for them. That doesn't mean that the reverse isn't true. Give them something better, and the preferences will quickly change. I've seen my fair share of people who went with 3500K or 4100K or 5000K, stuck with it, and then one day said to me that they never realized how yellow and awful those old bulbs were.

I guess we are familiar with your personal lighting tastes, and who would not welcome more choices in higher CCT lighting? I have no problem with your preferences or your extended personal experiences involving an unidentified sample of people and lamps. Anything beyond this, I find the kind of evidence provided of somewhat, um, questionable quality. What you point to, in the NREL link, goes back to this:

Franta, G.; Anstead, K. (1994). "Daylighting Offers Great Opportunities." Window & Door Specifier-Design Lab, Spring; pp. 40-43.

Have you actually read this? I admit I did not, and I intend to search for it as little as I read Anders H.'s posts.
 

Anders Hoveland

Enlightened
Joined
Sep 1, 2012
Messages
858
I really wish the LED industry as a whole would get away from "warm white". There is another ~20% penalty going with incandescent-like CCTs as opposed to sunlight CCTs.
Perhaps what you fail to realise is that, at least in regular phosphor white LEDs, the "warm" color is directly related to CRI. Without the phosphor, it is just a narrow blue frequency spike without any CRI. It is the addition of the phosphor that adds the broad spectrum of other frequencies. While you may prefer a more bluish colored light, with normal LEDs it will come at the expense of making the colors it illuminates look more greyish.

I am really annoyed with the concept of people being expected to put up with inferior quality of light for the sake of "efficiency". I wonder, if a highly efficient green LED at 555 nm were developed, would we all be expected to put up with greenish colored light because that is the frequency to which the human eye is most sensitive?

In fact, if we measure efficiency by the average amount of light that colored objects reflect, it may be that lower-CRI light sources are actually less efficient. The real efficiency of a lamp is not how bright it appears, but how much light is reflected from the objects being viewed. In a room with mostly red colored objects, it is possible that an LED may actually be less efficient than a halogen lamp!
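
For illustration, here is a toy numerical sketch of this reflected-light idea (Python). The curves are crude Gaussians I invented, not measured data for any real lamp, so only the direction of the result means anything:

Code:
import numpy as np

wl = np.arange(380.0, 781.0)                        # wavelength grid, nm

def gauss(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

v_lambda = gauss(555, 45)                           # rough photopic sensitivity stand-in
red_scene = gauss(650, 35)                          # reflectance of a mostly-red room

halogen_like = wl / 780.0                           # crude ramp rising into the red
led_like = gauss(455, 12) + 0.6 * gauss(560, 55)    # blue pump spike + phosphor hump

for name, spd in (("halogen-like", halogen_like), ("LED-like", led_like)):
    spd = spd / np.trapz(spd, wl)                   # equal radiant power for both lamps
    apparent = np.trapz(spd * v_lambda, wl)         # how bright the bare lamp looks
    returned = np.trapz(spd * red_scene, wl)        # radiant power the red room reflects
    print(f"{name:12s} apparent {apparent:.3f}  reflected-from-red {returned:.3f}")

With these invented curves the LED-like source appears noticeably brighter per radiated watt, yet the mostly-red scene returns substantially less power under it, which is exactly the inversion described above.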


Theoretically, perfect CRI is possible at any color temperature from LEDs, without sacrificing efficiency. It is just a matter of complexity and expense. The strategy of combining multiple LED chips of different frequencies, in addition to phosphors, would be the most energy efficient, and provide the most complete spectrum.
 
Last edited:

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
It's a wonderful site, but what you quote is one of its weakest parts because these "preferences" might just be based on marketing experience, sales figures, or regional standards.
Regardless, the link shows a substantial demand for higher CCT lighting which currently just isn't being met by LED manufacturers. Like I said, I think they got so enamoured with being able to produce warm white which is reasonably close to incandescent that they thought their job was done (at least in regards to spectrum). Remember, there's a huge market for 5000K to 6500K high-CRI fluorescents. Granted, some of this is gimmickry, as in so-called "full-spectrum" lighting to reduce SAD. A lot of it, though, is due to the growing realization that people function best in light resembling the type of light they evolved under.

As the details of the setup and light sources matter, just remembering some isn't enough, sorry. (probably wasn't the light sources that got rated, but the illumination, btw.)
I might have the study on one of my hard drives, along with lots of others. I don't really have time to look for it right now. Funny though how people asking me for sources aren't willing to produce any of their own to contradict what I'm saying. Perhaps it's because such sources don't exist, for the simple reason that "warm white" is more an acquired taste than something which could be explained in terms of physiology. The Kruithof curve is the closest thing I've ever seen to explaining lighting choices, but even here note that it's strictly based on what type of light people find pleasing, not what is best for seeing. Also note that there are huge variations in preferences. At indoor lighting levels that are low by today's standards, say 40 lux, the preference ranges from 3000K to 6000K, for example. At somewhat higher lighting levels the lower bound exceeds 3500K. The curve only brackets the typical incandescent CCT range of 2700K to 3000K at around 10 lux. Most indoor spaces are far more brightly lit than this (i.e. 10 lux is roughly 100 lumens in a 100 square foot room).
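
A quick check of that parenthetical (Python), assuming only that the light is spread evenly over the floor:

Code:
SQFT_TO_M2 = 0.0929           # 1 square foot in square metres

lumens = 100.0
area_m2 = 100.0 * SQFT_TO_M2  # 100 sq ft is about 9.29 m^2

lux = lumens / area_m2        # lux = lumens per square metre
print(f"{lux:.1f} lux")       # ~10.8 lux, so "roughly 10 lux" checks out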

I guess we are familiar with your personal lighting tastes, and who would not welcome more choices in higher CCT lighting? I have no problem with your preferences or your extended personal experiences involving an unidentified sample of people and lamps. Anything beyond this, I find the kind of evidence provided of somewhat, um, questionable quality. What you point to, in the NREL link, goes back to this:

Franta, G.; Anstead, K. (1994). "Daylighting Offers Great Opportunities." Window & Door Specifier-Design Lab, Spring; pp. 40-43.

Have you actually read this? I admit I did not, and I intend to search for it as little as I read Anders H.'s posts.
No, I didn't read the paper you mentioned, but that's only one of many sources listed in the paper I linked to. It's not hard to find dozens of sources and also plenty of anecdotal evidence stating that most life forms which evolved in sunlight prefer it over other types of light. Unfortunately, for most of recorded history humans didn't have available to them a light source which could imitate sunlight. First we used fire, and then glowing metal. Neither has existed long enough for humans to adapt evolutionarily to them, particularly because until recently artificial lighting was very expensive, and hence used as sparingly as possible. It's only in the last 70 years, give or take, that we could make fairly inexpensive light sources which could reasonably approximate the CCT of natural daylighting (I'm not counting the arc lamps used possibly as far back as the ancient Egyptians because those were hardly ubiquitous, and the power sources, basically primitive batteries, were huge). Arguably, it's only in maybe the last 25 years or so that we could inexpensively make artificial lighting whose spectrum somewhat resembled sunlight. In short, until about a generation ago, the lighting options which resembled natural lighting either had deficiencies, or simply were too expensive/unwieldy for common use (i.e. arc lamps). For practical reasons then, people stuck with either candles, or starting in the early 20th century, incandescent lamps. While this wasn't enough time to biologically adapt to these sources, it's certainly more than enough time to get used to them, and to even consider them "normal". This, I believe, explains the LED manufacturers' (and the CFL manufacturers') reasoning in pursuing warm white.

As SemiMan said, in the final analysis 2700K to 3000K really doesn't have much of a place when you look at things from a reasoned, rather than an emotional, perspective. Yes, this is what lots of people are used to, but the irony here is that this preference is likely based on only about 100 years of artificial lighting. People are very malleable. I'll bet good money most of the demand for "warm white" comes from the over-40 or over-50 crowd who grew up in an era when homes, and even a lot of schools, were lit with only incandescent. Those younger than that have lived under a much larger variety of artificial light sources, and therefore would be at least willing to give something different a try. What's at stake here is that we can save power equivalent to several power plants just by specifying warm white as little as possible. Perhaps in addition to making LED lighting in all common CCTs, there should also be a little information printed on the warm white varieties explaining the energy-use issue and also the biological reasons why cooler lighting might actually be superior. Again, SemiMan is on the right track: a lot more colors just pop under 4000K or 5000K, as opposed to being lost in a haze of yellow. People have to try it and see it for themselves before they actually get it. From a practical perspective, it also makes decorating easier. Colors basically look the same whether it's day or night. I know all too well from the days of incandescents that trying to have any type of decor where the colors look good day and night is next to impossible. You need to pick one or the other, and optimize for it.

Anyway, believe what you will. I'm merely trying to explain here why what the LED industry is doing makes no sense. I'm glad at least that the idea that you always need 90+ CRI is being questioned. I hope the next step is questioning the need for warm-white. The LED manufacturers seem to think sales would suffer if they didn't focus efforts there. I think people's tastes are mostly malleable enough that they wouldn't. The problem I think is too much weight was given to what lighting designers say. I've noticed far too many lighting designers still regard incandescent as the ultimate light source. We need to question and study that assumption very carefully.
 
Last edited:

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
Perhaps what you fail to realise is that, at least in regular phosphor white LEDs, the "warm" color is directly related CRI. Without the phosphor, it is just a narrow blue frequency spike without any CRI. It is the addition of the phoshor that adds the broad spectrum of other frequencies. While you may prefer a more bluish colored light, with normal LEDs it will come at the expense of making the colors it illuminates look more greyish.
Not true at all. Do things look greyish under sunlight? You could easily make an LED with a similar CCT and nearly as good color rendering. It's all a matter of choosing and balancing the phosphors. Sure, even with cooler CCTs, efficiency will suffer if you increase CRI, but high CCT, high CRI is still about 15% more efficient than low CCT, high CRI.

What you fail to realize is that while there are certain times you do indeed want high-quality lighting, there are many more times when you just don't need it. Do we need 90+ CRI in an office or a grocery store (except maybe in the meat department)? Absolutely not. CRI 75 or 80 is more than adequate to see the colors which need to be seen. When you get to streetlighting and parking lots, CRI 65 is more than adequate because absolute light levels matter more here than whether or not you can distinguish every color. High CRI should certainly be an option, but the caveat is that more often than not you simply don't need it. Even when you go to CRI 75 or 80 with LEDs, the end result looks worlds better than an equivalent CRI tri-phosphor fluorescent. At least the LED doesn't have huge gaps in the spectrum with virtually no emission.

I am really annoyed with the concept of people being expected to put up with inferior quality of light for the sake of "efficiency". I wonder, if highly efficient green colored LED at 555nm was developed, would we all be expected to have to put up with greenish colored light because that is the frequency where the human eye is most sensitive too?

In fact, if we measure efficiency by the average ammount of light that colored objects reflect, it may acually be that lower CRI light sources are actually less efficient. Because the real efficiency of a lamp is not how bright it appears, but how much light is reflected from the objects being viewed. In a room with mostly red colored objects, it is possible that an LED may actually be less efficient than a halogen lamp!
And how many people live in rooms with mostly red objects? Decor these days seems to tend towards lots of neutral colors. Those actually look better under 4000K or 5000K. And no, nobody, including me, is suggesting that we would ever use 555 nm green should we ever develop an LED which could efficiently emit at that frequency. Light quality would be just as bad as under LPS. Part of seeing is being able to distinguish objects of different colors. For most purposes a CRI of 65 to 80 is more than adequate for that task. It's rare you actually need anything higher even if you seem to think otherwise. BTW, I purposefully changed the font color of this paragraph to the closest option to 555 nm green just to illustrate how awful such lighting would be, possibly even worse than LPS.
 

Anders Hoveland

Enlightened
Joined
Sep 1, 2012
Messages
858
Regardless, the link shows a substantial demand for higher CCT lighting which currently just isn't being met by LED manufacturers. Like I said, I think they got so enamoured with being able to produce warm white which is reasonably close to incandescent that they thought their job was done (at least in regards to spectrum). Remember, there's a huge market for 5000K to 6500K high-CRI fluorescents. Granted, some of this is gimmickry, as in so-called "full-spectrum" lighting to reduce SAD. A lot of it, though, is due to the growing realization that people function best in light resembling the type of light they evolved under.
I completely agree. I will be very interested in buying LEDs once they make higher CCT with excellent (95+) CRI. However, I suspect what most people think of as the color of "sunlight" is actually closer to 4500K, perhaps even less under low-level light conditions.

As SemiMan said, in the final analysis 2700K to 3000K really doesn't have much of a place when you look at things from a reasoned, rather than an emotional, perspective. Yes, this is what lots of people are used to, but the irony here is that this preference is likely based on only about 100 years of artificial lighting. People are very malleable. I'll bet good money most of the demand for "warm white" comes from the over-40 or over-50 crowd who grew up in an era when homes, and even a lot of schools, were lit with only incandescent.
I am sure there is some truth to this. But I actually find my energy-saving halogen bulbs a little too "cold" and glaring when I am trying to relax in the evening. They are rated 2960K. I prefer 2800-2900K, which I can get from the old "energy saving" regular light bulbs (which did not have a very long rated lifespan) or from 2000-3000 hour rated halogen bulbs. This is only my preference for some situations. In others I do indeed prefer the whiter 2960K. But it is rather strange. Psychologically, somehow I perceive a room lit by the higher color temperature halogen as almost bothersomely yellowish, even with a slightly green tint, whereas with 2800K bulbs the warm orangish glow does not seem to bother me at all. I know this does not seem to make sense. Everyone has their own preferences, apparently; that is why we should have more options. I tried LED bulbs, 2700K and 3000K CCT, and was unsatisfied. They just have a pinkish-orange tint that annoys me. I will try LEDs again when they become available in higher CRI. For now I will be keeping my incandescents and halogens.
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
We are pretty much on the same side. I often refer to our lighting "candle culture" and the need for change. With the exception of living rooms, bedrooms, and many dining establishments where low light levels are used, 2700-3K really does not have much of a place. It still surprises me that retail in many cases is enamored with 3K, though I see that changing. High CRI 4K would make so many more colors pop in a retail environment ... and for that reason, in some sectors higher CCT high CRI ceramic metal halide is the light of choice.
I'm glad we're on the same page here. IMO, the biggest hurdle is getting people to try higher CCT. Once you can convince them to do so, they rarely go back.

At the end of the day though, the customer makes the decision. I think this group's claim, that real or perceived light-quality issues could prevent adoption for a significant percentage of the market, is in fact true ... and not just Anders :) Of course, you can use red LED plus phosphor white and get high CRI and efficiency, though not the deep reds.
I think the larger problem is lighting designers speaking for the population as a whole. As you may already know, far too many lighting designers put incandescent on a pedestal.

I wonder if your issue with headaches with light less than 3500K is related to focus/depth of field issues? Circadian rhythm and pupil response appear to be the same mechanism.

I know when I stare at my little laptop display in a dark room for too long I have issues for about the next 15 minutes focusing. Remember when they used to tell people to do CAD in a dark room?
I'm nearly 100% sure the headaches are caused by the inability to auto white balance because the headaches occur even if I'm not trying to do any type of visually intensive work. Like a camera, the mind is capable of white balancing a scene, but everyone has their limits. Below or above a certain CCT, there is just no white point. Evidently my brain doesn't like this, and I get headaches as a result.
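
Since the camera analogy keeps coming up, here is a minimal sketch of the simplest automatic white balance scheme, the classic "gray world" method (Python). It is only an illustration of the analogy, not a model of human vision, and the sample "warm wall" image is made up:

Code:
import numpy as np

def gray_world_awb(image):
    """image: H x W x 3 float RGB array in [0, 1]. Returns a balanced copy."""
    means = image.reshape(-1, 3).mean(axis=0)   # per-channel averages
    gains = means.mean() / means                # scale each channel toward gray
    return np.clip(image * gains, 0.0, 1.0)

# A flat white wall under a very warm, red-heavy light:
warm_wall = np.full((4, 4, 3), [0.9, 0.6, 0.3])
print(gray_world_awb(warm_wall)[0, 0])          # -> [0.6 0.6 0.6], cast removed

Note that as the color cast gets more extreme, the gain needed on the starved channel grows without bound and the correction falls apart, loosely analogous to the missing white point described above.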

BTW, I have issues with most laptop displays, but not only when viewing them in dark room. The pixels are just too large (same with most LCD monitors). The end result is "screen door" effect. I hope we get 250 or 300 pixel per inch displays relatively soon.
 

Anders Hoveland

Enlightened
Joined
Sep 1, 2012
Messages
858
Do things look greyish under sunlight? You could easily make an LED with a similar CCT and nearly as good color rendering. It's all a matter of choosing and balancing the phosphors. Sure, even with cooler CCTs, efficiency will suffer if you increase CRI, but high CCT, high CRI is still about 15% more efficient than low CCT, high CRI.
I meant that regular LEDs with higher CCT make things look more greyish, because toward the higher-frequency end of the spectrum the output is mostly just a narrow bluish spike. Obviously this is not the case with sunlight. To say it simply, the blue light in sunlight has a broader spectrum than the blue light in LEDs. I am not sure what you mean here.

What you fail to realize is that while there are certain times you do indeed want high-quality lighting, there are many more times when you just don't need it. Do we need 90+ CRI in an office or a grocery store (except maybe in the meat department)? Part of seeing is being able to distinguish objects of different colors. For most purposes a CRI of 65 to 80 is more than adequate for that task. It's rare you actually need anything higher even if you seem to think otherwise.
I do not really want someone else deciding what I do and do not "need". Sure, people may not absolutely need it, but that does not mean it is not a good thing. Many people spend most of their lives inside an office with artificial lighting. You do not think the quality of light is important? (Okay, admittedly it is still better than most of the awful fluorescent lighting in most offices now.) I cannot imagine spending 8 hours each day in a workspace with a CRI of 65; it would suck the life out of everything. People might not need it, but many still want it. A workspace, and especially the home, is not supposed to be about just meeting the minimum of what people need. It should not be about deprivation.

IMO, the biggest hurdle is getting people to try higher CCT. Once you can convince them to do so, they rarely go back.
Well, many people have tried higher CCT and been turned off to the whole LED thing because of it. Some people just do not like bluish colored light in their home. This may have something to do with the von Kries hypothesis at lower light levels, or it may have something to do with evolutionary legacy, from old times when humans relied on fires for nighttime illumination. I have read some interesting studies on racial differences in preferences for color temperature, the hypothesis being that people of European descent are more likely to prefer lower color temperatures that more closely resemble fire. Although of course there are plenty of whites who prefer cooler color temperatures also. I would also suspect that there is a significant gender gap, with most of the ones preferring higher CCT being males.
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
I meant that regular LEDs with higher CCT make things look more greyish, because toward the higher-frequency end of the spectrum the output is mostly just a narrow bluish spike. Obviously this is not the case with sunlight. To say it simply, the blue light in sunlight has a broader spectrum than the blue light in LEDs. I am not sure what you mean here.
"Regular" LEDs actually are mostly deficient in reds, particular R9. And yes, sunlight has a broader blue spectrum than LEDs, and also has lots of UV. Like deep red on the other end, deep blue and UV rarely come into play when it comes to rendering colors. You can drop off anything below 450 nm and above 650 nm, and still have 98+ CRI if your spectrum is reasonably full and balanced.

I do not really want someone else deciding what I do and do not "need". Sure, people may not absolutely need it, but that does not mean it is not a good thing. Many people spend most of their lives inside an office with artificial lighting. You do not think the quality of light is important? (Okay, admittedly it is still better than most of the awful fluorescent lighting in most offices now.) I cannot imagine spending 8 hours each day in a workspace with a CRI of 65; it would suck the life out of everything. People might not need it, but many still want it. A workspace, and especially the home, is not supposed to be about just meeting the minimum of what people need. It should not be about deprivation.
People actually had CRI 62 halophosphate fluorescents in offices from the 1940s until probably the late 1970s and they managed just fine. Besides, nowadays CRI 65 would be specified for parking lots or streetlights, not indoor lighting. In all likelihood, offices would be lit with CRI 80 to 85 LEDs, which frankly look as good to my eyes as CRI 90+ fluorescents. The end result would actually be more than acceptable. It would be an improvement over what we have today, which is CRI 75 to 85 triphosphor lamps. The only argument for higher CRI in the workplace is if it makes people more efficient. Again, present-day workplace lighting standards are based on studies which show that you need good, but not perfect, lighting to avoid workplace fatigue. Those old flickering halophosphate fluorescents certainly had a negative effect on productivity. On the other hand, there would be no measurable productivity difference between CRI 80 and CRI 95 LED lighting, but there would be 15 or 20% higher energy usage. Or put another way, ask some workers if they're willing to take a salary cut for slightly better lighting, then get back to me. Make sure you have a good pair of running shoes on when you do this. ;)

And a big problem today which you inadvertently mentioned is far too many people treat their wants like needs. Sure, 95+ CRI lighting might be wonderful to put everywhere, but few people would even notice the difference over 80 to 85 CRI. Even fewer would actually benefit because there are scant few situations where there is a real need to distinguish very fine gradations of color. In short, it would be a waste, just as it would be a waste designing every automobile with the capability to go 250 mph. Those who really need high CRI already know it, and are willing to pay a premium in terms of efficiency for it.

Well, many people have tried higher CCT and been turned off to the whole LED thing because of it. Some people just do not like bluish colored light in their home.
4000K to 5000K isn't blue; it's usually perceived as pure white. You're referring here to cheap retrofits using 5mm LEDs. Nobody takes these seriously as a lighting option. They're mostly gone now that we have some standards for LED lighting in place. At least they weren't around long enough to give LED a bad name.
 

Anders Hoveland

Enlightened
Joined
Sep 1, 2012
Messages
858
Did you not read the article?

North American Lighting Sector Calls for Energy Star to Address LED Lamp Light Quality
http://www.ledinside.com/news/2013/1/north_american_lighting_sector_energy_star_20130123

"Recently, the lighting sector has urged the U.S. Environmental Protection Agency's (EPA) ENERGY STAR program to address light quality, specifically color rendering, in its new lamp specification, in a bid to ensure the long-term success and widespread market adoption of LED lamps. Many consumers will make their first LED lamp purchases in the next few years, and the market is entering a critical window for making a positive impact on consumers' first impressions of LED technology," said Eric Kim, CEO of Soraa. "However, for LED lamps to achieve significant market share, consumers must be confident that these lamps can give them the light quality they need and want. McKinsey's 2011 Lighting the Way report suggests that consumer and commercial lighting purchase decisions are driven as much by light quality, as they are by the cost of the light bulb.

The slow market adoption of CFLs over the last 20 years demonstrates that simply because a product produces enough light, saves energy and is cost-effective, broad market adoption of that technology is not ensured. To persuade consumers to purchase LEDs instead of incandescent lamps, LED lamps must be seen as high-quality products worth the initial higher price differential. Therefore, LED lamps must closely replicate the color rendering and color appearance of the incandescent and halogen lamps that they replace," said Carlos Alonso-Niemeyer, Energy Efficiency Program Manager of NSTAR a Northeast Utilities Company."

If LED bulbs do not deliver the quality of light consumers want when they first try them, it could do irreparable damage to the reputation of this technology.


"Regular" LEDs actually are mostly deficient in reds, particular R9. And yes, sunlight has a broader blue spectrum than LEDs, and also has lots of UV. Like deep red on the other end, deep blue and UV rarely come into play when it comes to rendering colors. You can drop off anything below 450 nm and above 650 nm, and still have 98+ CRI if your spectrum is reasonably full and balanced.
Just try looking at a red rose under halogen light, and then compare that to LED light. Tell me you do not see a difference. Those deep reds and teal greenish colors in the leaves are poorly rendered by the LED spectrum. Even the "enhanced" LED lamps with red LED chips are using 635nm frequency for higher efficiency (since the human eye is more sensitive), which still does not do a great job of rendering deep red colors. Using common 660nm red chips instead would give better color rendering.
 
Last edited: