LED bulbs with selectable color temp

CRI definitely isn't ideal with these bulbs, but it's still way better than most CFLs. At the hotel this past week, the Costco bulb had a significantly better CRI than whatever garbage bulb the hotel had in there.
 
Well, since I posted that, I ended up moving and replaced a bunch of CFLs of various color temps with these Feit 60W bulbs. I set them all to 2700K. So far they are pleasant and I have had no problems. It's certainly an improvement over 4000K CFLs...

I will update if anything remarkable happens. Sliding the switch does change the color temp, and it seems about right. Man, if you had told me when I first joined this forum that something like this would be available today, I'd have been skeptical, but it is pretty cool. $20 for basically upgrading the lighting in half the house is not a bad deal at all. (I suspect some of that might still be state subsidies; LED light bulbs have been cheaper here in MD than in VA for quite some time.)

I don't have any dimmers, so I can't comment on dimming performance. As I stated before, I was impressed with the Feit 60W 2700K bulbs in my last place; on Lutron Maestro C-L dimmers they performed better than any other bulb I tested. I hope these are similar.

Had my first failure today, after less than a month; I didn't move in until the first of June. I also discovered something about how these bulbs work. Last night I was downstairs working on the furnace/AC (irrelevant to the story; I was hooking up a Nest thermostat and had to rerun the cable), and this morning I went back down to clean up. I don't spend a lot of time down there, as that's not "my" space; nobody is living there, but it is not mine per the rental agreement, although I do have to go down for laundry. That *is* relevant, as it means the new bulbs down there have only a couple of hours on them, maybe 10 at most. Yes, I replaced them because they were awful (again, there were a lot of high color temp spiral CFLs here), and the place looks so much less industrial with good 2700K lighting. Consider it my gift to the landlady and future tenants.

Anyway, one of the bulbs was out this morning, which was disappointing, and I noticed that it would come back on, brighter and brighter, as I slid the switch. What I can only assume is that there is one LED or set of LEDs for 2700K and another for 6500K, and the intermediate positions are achieved by mixing the two.
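
If that guess is right, the slider logic could be something like the sketch below: blend the two strings linearly in mired (1e6/CCT) space, which is the usual first-order way to mix two white sources. This is purely speculation on my part; the 2700K/6500K endpoints and the equal-lumen assumption are mine, not anything from a Feit datasheet.

```python
# Toy model of a two-channel selectable-CCT bulb: one string of 2700K
# emitters and one of 6500K, mixed by duty cycle. Blending is assumed
# to be linear in mireds (1e6/CCT), a common first-order approximation;
# the endpoints are my guess, not anything from a Feit datasheet.

WARM_K, COOL_K = 2700.0, 6500.0

def mired(cct_k: float) -> float:
    """Convert a color temperature in kelvin to mireds."""
    return 1e6 / cct_k

def channel_duties(target_k: float) -> tuple[float, float]:
    """Return (warm_duty, cool_duty), each 0..1, for a target CCT.

    Assumes both strings put out equal lumens at full duty.
    """
    t = (mired(WARM_K) - mired(target_k)) / (mired(WARM_K) - mired(COOL_K))
    t = min(max(t, 0.0), 1.0)  # clamp to the achievable range
    return 1.0 - t, t

for cct in (2700, 3500, 5000, 6500):
    warm, cool = channel_duties(cct)
    print(f"{cct}K -> warm {warm:.2f}, cool {cool:.2f}")
```

It would also explain the failure mode: with the 2700K string dead, the bulb stays dark at the warm end of the slider and comes up brighter and brighter as more duty shifts to the 6500K string.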

I didn't throw it out, in case I find an application for 6500K or feel froggy and want to disassemble it, but the latter likely won't happen as I'm very busy ATM. Good thing I did clean up this morning; someone is supposed to come look at the basement this evening, so at least I got a head start and it looks decent now.

I still have yet to test one on a dimmer. If I did have a dimmer anywhere, it would be in my bedroom, but that room does not have a ceiling fixture, and I'd actually like to pick out a new lamp for it; I don't currently have one that I like, other than an old Robert Sonneman floor lamp that would be more appropriate next to a reading chair.
 
CRI definitely isn't ideal with these bulbs, but it's still way better than most CFLs. At the hotel this past week, the Costco bulb had a significantly better CRI than whatever garbage bulb the hotel had in there.

Feit claims "90+" but I have no way of testing it. Has anyone measured it yet?

I guess you'd have to run it at each color temp, as well...
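
For anyone with access to a spectrometer, turning a captured spectrum into an Ra number is easy these days with the open-source colour-science Python package; a minimal sketch is below. The CSV filename and its wavelength,power layout are hypothetical (whatever your meter exports, with no header row), and you'd repeat the measurement at each slider setting.

```python
# Compute CIE Ra from a measured spectral power distribution with the
# colour-science package (pip install colour-science). The CSV file of
# wavelength_nm,relative_power rows (no header) is a hypothetical
# spectrometer export; run once per color temperature setting.
import csv

import colour

def load_spd(path: str) -> colour.SpectralDistribution:
    """Read a two-column CSV into a SpectralDistribution."""
    data = {}
    with open(path, newline="") as f:
        for wavelength, power in csv.reader(f):
            data[float(wavelength)] = float(power)
    return colour.SpectralDistribution(data, name=path)

sd = load_spd("feit_2700k.csv")  # hypothetical filename
print(f"CRI (Ra): {colour.colour_rendering_index(sd):.1f}")
```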
 
Does CRI on LEDs change with brightness? You might have to test at different brightness levels as well.
 
Well, since I posted that, I ended up moving and replaced a bunch of CFLs of various color temps with these Feit 60W bulbs. I set them all to 2700K. …
I've been using the GU24-base version of the A19 Feit HCRI single/fixed-CT bulbs in the recessed ceiling fixtures in my kitchen for a year or two, and the improvement over the CFLs they replaced has been substantial and enjoyable. The replacements I have on hand are the newer 3x selectable-CT version, and I expect to like them as well. I can see how a steak is cooked without resorting to a flashlight now ;-)
 
Feit claims "90+" but I have no way of testing it. Has anyone measured it yet?

I guess you'd have to run it at each color temp, as well...
I'm usually pretty good at this (better than most people). Just by looking at the light, I'd estimate it to be 90 CRI.
(Possibly 91, but it very much seems below 93, judging by skin tone.)

The color temperature setting probably would not result in much, if any, change in CRI. We can assume it probably uses a mix of 2700K and 5000K emitters with the same CRI specifications.

In my opinion, 90 CRI is "good enough" for home lighting, maybe even "kind of nice", but really nothing fantastic or amazing.
This seems to be considered the "economy range" of "higher CRI".

Does CRI on LEDs change with brightness? You might have to test at different brightness levels as well.
Basically no, CRI does not change with different LED brightness settings (with a few very small caveats, for those who like to nitpick and demand absolutely complete accuracy).
 
I'm usually pretty good at this (better than most people). Just by looking at the light, I'd estimate it to be 90 CRI. …

I'm just happy that we seem to be getting more options labeled 90+ on store shelves; up until a couple of years ago it seemed that 80 was the best you could hope for without ordering online.
 
Does CRI on LEDs change with brightness? You might have to test at different brightness levels as well.
An update:
It appears I was somewhat wrong about that. (So I just wanted to elaborate for full accuracy)
Apparently the CRI level of LEDs can change at different current (power) levels. Why would this happen?
At higher intensity levels a greater percentage of the blue light from the emitter is able to pass through the phosphor, while at lower intensity levels the percentage of the blue light able to pass through is a little bit lower. This can shift color temperature a little bit.
There is also the phenomenon of the phosphor reabsorbing some of the light it emits and down-converting it into longer wavelengths. A small percentage of wavelengths as long as about 520 to 525nm (emerald green) can still cause excitation of the phosphor, and then there would be more emission of longer wavelengths (red light).
But countering this effect, it also takes a larger percentage of red light to be able to increase CRI at lower color temperatures.
And another effect is that an increase in heat also causes a shift in the spectral emission of the phosphor a little more towards longer wavelengths (more towards the orange). (This used to be a bigger problem in older LED phosphor formulations)

(Edit: One other possible reason - I do not know if this is the case - the slight increase in CRI, especially at lower color temperatures, could have something to do with the spectral distribution of the blue LED emitter. At lower current levels, the peak is lower and the percentage of much longer blue wavelengths - trailing off towards the cyan region - is higher, although the percentage is not very high to begin with. Remember, at lower current levels there's a tendency for the smallest voltage bandgap - or longest wavelength - domain areas of the LED to fill up first, however small they may be as a percentage of the total.)

The effect is not really huge, but it can be significant. For a 90 CRI rated LED, the CRI might typically increase by about 2 points (possibly 3 at most) if the LED is driven at maybe 20 to 28 percent of its rated (recommended) current (power) level.
(We might expect this to be accompanied by a decrease in lumen efficiency, but a counteracting effect is that the LED emitter itself will also be a little more efficient driven at only 20 to 28 percent of its rated current level)

But this effect will be much less for LEDs that are dimmed with pulse width modulation (PWM). And I believe that is very much the situation for these selectable color temperature bulbs, since I strongly suspect they already use PWM to modulate the mix of lower and higher color temperature light to produce the desired intermediate color temperature.

So I am not really sure how relevant this phenomenon is for this particular situation.

In addition to that, the typical level of dimming, especially for a "60 Watt equivalent" bulb which is already not very bright, is probably only going to be about 40 to 70 percent, realistically. So even if it were not for PWM, we would expect any change in the CRI level to be very minimal, I think.
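
To make the PWM point concrete, here's a trivial sketch of the difference between PWM and analog (constant-current-reduction) dimming; the rated current is a made-up placeholder. The point is just that under PWM the emitter never operates away from its rated current, so the current-dependent spectral shifts above should not come into play.

```python
# Two ways to dim an LED to 40% output. With PWM the emitter runs at
# its full rated current whenever it is "on", so the spectrum (and CRI)
# stays pinned at the rated-current values; with constant-current
# reduction (CCR) the emitter runs continuously at reduced current,
# where the shifts described above can creep in. The rated current
# here is a placeholder, not a real Feit spec.

RATED_CURRENT_MA = 150.0  # hypothetical rated drive for one string

def pwm_drive(target_fraction: float) -> tuple[float, float]:
    """Return (instantaneous mA, duty cycle) for PWM dimming."""
    return RATED_CURRENT_MA, target_fraction

def ccr_drive(target_fraction: float) -> tuple[float, float]:
    """Return (instantaneous mA, duty cycle) for analog dimming.

    Assumes output is roughly proportional to current, which is only
    approximately true for real emitters.
    """
    return RATED_CURRENT_MA * target_fraction, 1.0

for name, fn in (("PWM", pwm_drive), ("CCR", ccr_drive)):
    current, duty = fn(0.40)
    print(f"{name}: {current:.0f} mA at {duty:.0%} duty for ~40% output")
```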
 
An update: It appears I was somewhat wrong about that. Apparently the CRI level of LEDs can change at different current (power) levels. …
Great analysis, Joakim. From what I've read, the different phosphor components have different saturation or droop points, where any further increase in LED intensity doesn't produce an equivalent intensity in down-converted output. It seems that the current formula for red phosphors is the most prone to this saturation, so a phosphor-white LED that was pushed beyond a certain brightness would see a percentage drop in red, shifting the color temp towards blue and dropping the CRI. I don't know if there's a fix for this on the horizon yet.

On a related note, LED videowalls are always PWM'd to help maintain the color balance across the entire range of perceived brightness.
 
From what I've read, the different phosphor components have different saturation or droop points, where any further increase in LED intensity doesn't produce an equivalent intensity in down-converted output. It seems that the current formula for red phosphors is the most prone to this saturation, so a phosphor-white LED that was pushed beyond a certain brightness would see a percentage drop in red, shifting the color temp towards blue and dropping the CRI. I don't know if there's a fix for this on the horizon yet.
This is going off topic, and I'm just speculating, but I suspect it may have more to do with the ratio of absorption at higher intensity levels.
This makes sense if you understand how these types of Stokes-shift phosphors work. If the excitation wavelength (in this case blue) is either too close to or too far from a selected emission wavelength, the conversion probability will be significantly lower. In this case I suspect what may be happening is that the chance of blue light being absorbed by the red phosphor is higher than by the green phosphor, so at the normal rated power (blue light intensity) levels they have to use a larger percentage of green phosphor, and reduce the percentage of red phosphor, to compensate. But as the blue intensity increases, a point is reached where the majority of the red phosphor sites are already excited. At that point the probability that a blue photon will be absorbed by the green phosphor, rather than the red, increases.
So the issue comes down to the absorption probability (what they sometimes term "cross section") difference between the phosphors. And there is really no easy fix for that, other than to use a great excess of phosphor relative to the intensity of the light, which usually entails a remote phosphor design (layer of phosphor with a larger area separated some distance from the LED emitter).

(In most standard white LEDs, the green and red phosphors have the same chemical composition, but are just doped at different concentrations to shift the wavelength distribution. Thinking about using a different chemical composition probably wouldn't be too helpful because typically it is desirable for the red phosphor to be as efficient as possible, and higher efficiency usually is accompanied by a higher absorption cross section)
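
Just to put illustrative numbers on that competition idea (all made up, not real phosphor parameters): give the red phosphor a larger absorption cross section but a saturable site population, and its share of the absorbed blue light falls as the blue flux rises.

```python
# Illustrative-only sketch of the absorption competition described
# above: two phosphors with different cross sections compete for blue
# photons, and the red phosphor saturates first as blue flux rises.
# Every number here is a made-up placeholder, not real phosphor data.

def absorbed_fractions(blue_flux: float,
                       sigma_red: float = 3.0,
                       sigma_green: float = 1.0,
                       red_sites: float = 1.0) -> tuple[float, float]:
    """Fractions of absorbed blue light going to red vs green phosphor.

    Red absorption uses a simple saturable form: its effective cross
    section falls as excited sites approach the available population
    (red_sites). Green is treated as unsaturable for simplicity.
    """
    sigma_red_eff = sigma_red / (1.0 + blue_flux / red_sites)
    total = sigma_red_eff + sigma_green
    return sigma_red_eff / total, sigma_green / total

for flux in (0.2, 1.0, 5.0):
    red, green = absorbed_fractions(flux)
    print(f"blue flux {flux:>3}: red {red:.2f}, green {green:.2f}")
```

In this toy model the red share drops from about 0.71 at low flux to about 0.33 at high flux, which is the direction of the blue shift and CRI drop described above.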
 
I was introduced to white LED chromaticity several years ago when working to second-source small SMT LEDs used for backlighting. It turned out to be a bit of a jungle, with different but overlapping "bins" among devices.

LED data sheets, at least the good ones, show chromaticity shift versus current. Of course this requires correct interpretation of the x-y points/ranges. I don't have a good feel for the visible manifestation of this shift; I would actually need to observe it.
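
For translating those x-y points into something more intuitive, McCamy's approximation gives an approximate CCT directly from CIE 1931 chromaticity; it is generally considered adequate for sources reasonably close to the blackbody locus, roughly 2800K to 6500K.

```python
# McCamy's approximation: convert a CIE 1931 (x, y) chromaticity, as
# read off a datasheet shift-vs-current plot, into a correlated color
# temperature. Reasonable for near-blackbody sources from roughly
# 2800K to 6500K.

def mccamy_cct(x: float, y: float) -> float:
    """Approximate CCT in kelvin from CIE 1931 chromaticity."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# Sanity check: the D65 white point should land near 6504 K.
print(f"{mccamy_cct(0.3127, 0.3290):.0f} K")  # -> ~6505 K
```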

Dave
 