What colour temperature is subjectively closest to "pure white"?

Twinbee

Newly Enlightened
Joined
Aug 15, 2006
Messages
77
My question falls more into the realm of subjectivity, but it should be possible to reach a consensus on what most people would consider the most 'white'.

Standard incandescents have been around for ages, and I'm sure 99% of people wouldn't consider them anywhere near white.

A few years ago, I thought 6400K would be the closest to pure white. But I'm beginning to wonder if this hue veers slightly towards blue after all. I remember reading about a not-strictly-scientifically-controlled study by someone who showed a range of hues to students; they said that temperatures around 4000-5000K were closest to what they perceived as true white.

I also came across this page, which also suggests 5000k is true white. Other sources will vary from anything between 4000 and maybe 7000 for their definition.

I bet you guys may know better though. Sources appreciated too.
 

StarHalo

Flashaholic
Joined
Dec 4, 2007
Messages
10,927
Location
California Republic
Re: What colour temperature is subjectively "pure white"?

There is no pure white; you're always going to bias one light source against another. So if you've been in 4300K lighting and move to 6500K, it'll look cold; if you've been in 8000K lighting and move to 6500K, it'll look warm, etc.

The manufacturer that, for me personally, has always gotten closest to an all-around usually-looks-white is Jetbeam, though they don't list the color temp of their emitters (it's probably somewhere around 5000K). But again, it'll look warm or cool depending on what you've been viewing before.
 

ElectronGuru

Flashaholic
Joined
Aug 18, 2007
Messages
6,055
Location
Oregon
Re: What colour temperature is subjectively "pure white"?

My testing shows 4300K to be the mark, but 4200-4600 is close enough in most applications. 5000K is still pretty blue. 3000K is quite yellow.

One of the issues is that this scale was designed around a single technology (glowing hot metal). So a non-glowing-metal source (LEDs, for example) can measure at exactly 4300K and still be tinted with a color that isn't on the color temp scale at all, like green.
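To make the "glowing hot metal" point concrete: the Kelvin scale is defined by the spectrum of an ideal blackbody radiator (Planck's law), so a color temperature only pins down a point on that one curve. A quick sketch of the physics, using standard SI constants (the helper function names are my own, just for illustration):

```python
import math

# Physical constants (SI units)
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance of an ideal blackbody (Planck's law)."""
    a = 2.0 * h * c**2 / wavelength_m**5
    b = h * c / (wavelength_m * k * temp_k)
    return a / (math.exp(b) - 1.0)

def wien_peak_nm(temp_k):
    """Wavelength of peak emission (Wien's displacement law), in nm."""
    return 2.897771955e-3 / temp_k * 1e9

# A 5500K blackbody peaks near 527 nm, but its broad, smooth spectrum
# is what we perceive as roughly neutral daylight white.
print(round(wien_peak_nm(5500)))   # -> 527
# A 3000K filament peaks deep in the infrared, hence the warm tint.
print(round(wien_peak_nm(3000)))   # -> 966
```

An LED can match a blackbody's overall chromaticity at, say, 4300K while its actual spectrum looks nothing like this curve, which is how a "correct" CCT number can still come with a green tint.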
 

Twinbee

Newly Enlightened
Joined
Aug 15, 2006
Messages
77
There is no pure white; you're always going to bias one light source against another, so if you've been in 4300K lighting and move to 6500K, it'll look cold

Yes, but that's just the sensory contrast of going from one situation to another. If you stay in one environment, then there should be a colour temperature that's closest to what most people would consider pure white (as you imply yourself afterwards).

My testing shows 4300K to be the mark, but 4200-4600 is close enough in most applications.
Interesting, and it does seem to tally up. I'll read that page more closely soon, but in a nutshell, how did you arrive at that figure? Intuitively, it would seem that asking people which hue they consider closest to white (and then averaging the results) would be the only way, but I could be mistaken.

So a non-glowing-metal source (LEDs for example) can measure at exactly 4300K and still be tinted with a color not on the color temp scale, like green.
Yes good point.
 

StarHalo

Flashaholic
Joined
Dec 4, 2007
Messages
10,927
Location
California Republic
Yes, but that's just the sensory contrast of going from one situation to another. If you stay in one environment, then there should be a colour temperature that's closest to what most people would consider pure white (as you imply yourself afterwards).

Right, if you're in a 3000K room, a 4500K light will look pure white. The problem is, once you move to a 6500K room, that same light will now look warm.
 

Twinbee

Newly Enlightened
Joined
Aug 15, 2006
Messages
77
The problem is, once you move to a 6500K room, that same light will now look warm.
...until you get accustomed to it. Let's assume you're going to be in the same room for at least say, 5-10 minutes.

Or let's assume that I fill my house with the same colour temperature bulbs. Going from one room to another now won't produce any difference. The question is, for a true white hue, do I fill my house with 3500k bulbs, 7000k bulbs, or something in between?
 

StarHalo

Flashaholic
Joined
Dec 4, 2007
Messages
10,927
Location
California Republic
...until you get accustomed to it. Let's assume you're going to be in the same room for at least say, 5-10 minutes.

Or let's assume that I fill my house with the same colour temperature bulbs. Going from one room to another now won't produce any difference. The question is, for a true white hue, do I fill my house with 3500k bulbs, 7000k bulbs, or something in between?

I should complete the sentence as follows: "Right, if you're in a 3000K room, a 4500K light will look pure white. The problem is, once you move to a 6500K room, that same 4500K light will now look warm." So there is no one color temp you can make a flashlight that will look white even a majority of the time, because the environment you're using it in is always changing.

As for lighting your house, you have to be careful with how cool you get, otherwise it just looks like you're lighting your living room with garage worklights; there needs to be a bit of warmth to house lighting to prevent it from looking too sterile. I find ~3500K works nicely in office and bathroom lighting, and ~3000K is about right for bedrooms and living rooms.
 

TedTheLed

Flashlight Enthusiast
Joined
Feb 22, 2006
Messages
2,021
Location
Ventura, CA.
I'd say since we've been walking the earth for the past 250,000 years or so the light we are most used to seeing things in has been the sun shining in the morning and the afternoon.

the sun is whitest at noon, directly overhead, at 5500K. A little before and after noon the sun is a bit warmer, or yellower, which is the way we see it most of the time.

so I'd say light your house at 4800K or so for the most 'normal' / slightly warm appearance of everything.
 

Twinbee

Newly Enlightened
Joined
Aug 15, 2006
Messages
77
So there is no one color temp you can make a flashlight that will look white even a majority of the time.

Oh I see. I was talking, though, about just room lighting rather than flashlights. You're right about how 3000k may make the house look warmer. However, I've often thought that instead of making the light warmer, you could keep it neutral and simply make the furniture colours warmer. That way, other things can be seen in all colours (even blues), whilst the general surroundings will always be 'warm'.

I'd say since we've been walking the earth for the past 250,000 years or so the light we are most used to seeing things in has been the sun shining in the morning and the afternoon.
I've sometimes thought along similar lines, but I'd be wary, especially as the colour temperature varies throughout the day. So we can't really get an exact figure that way.
 

StarHalo

Flashaholic
Joined
Dec 4, 2007
Messages
10,927
Location
California Republic
I've often thought instead of making the light warmer, have it as neutral, and simply make the furniture colours warmer.

I've used this trick in office lighting before, but it's really more to do with paint than furniture. The trick is to use a warm wall color (anything with a touch of red, which is a broad range of colors), then replace any white accents with an off-white, like a bone or light cream shade. It makes the workspace stand out, but so do skin tones and clothing colors, so I still don't recommend the technique for family/gathering places.
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,506
Location
Flushing, NY
Humans have a built-in color balance in their brains. As a result, a fairly wide range of color temperatures look "white" once you get used to them. The real question should be what color temperature looks white to us without this automatic color balance? Speaking from experience, this is likely to be the most pleasant color temperature to live under because our brains won't be constantly trying to correct for white.

The answer is that it varies from person to person, but it is a given that nobody can auto color balance at the extremes of the scale. Because of this, they are to be avoided. Generally, CCTs less than 3500K, or more than about 7000K, fall outside the range. I personally find that I get headaches under incandescent or similar yellowish light for prolonged periods, precisely because my brain is trying (and failing) to correct for white balance. Same thing for very bluish LEDs. Probably right around the middle of the scale, 4500K to 5500K, is where things look "white" without your brain doing any correction. And this makes sense from a biological perspective, because this falls within the range which sunlight has during most of the day. Confounding things a bit is the Kruithof curve: lower light levels can skew the midpoint CCT down. Under typical room lighting conditions, ~4000K might look white, while 5000K looks slightly blue.

I personally have standardized on 5000K everywhere. It hardly makes things look clinical to me. Rather, it helps me see as well as possible, and that's really the point of lighting. If creating an "ambience" by using low CCT lighting means vision ends up being compromised, then it's not a good idea. That's why I generally don't understand the use of 2700K or 3000K in the home. The end result looks like you put a sodium vapor streetlight indoors. All lighting should seek to mimic the sun. That's what humans see best under. If you want to have accent lights here or there to give a hint of warmth, that's fine, but they shouldn't be the primary light source. Better yet, just use warm colors if a warm appearance is what you're after. Honestly though, I'm not a big fan of warm tones anywhere, especially after this heat wave.
 

StarHalo

Flashaholic
Joined
Dec 4, 2007
Messages
10,927
Location
California Republic
It makes sense to our man-logic that you should fill a room as amply as possible with scientifically-approved photography-studio-ready light that is ideal for reading and working. But if you think back to those spaces you remember best, such as areas in a museum, restaurant spaces that really stood out, etc, you'll find they pretty much all had warm lighting. I would agree with the above posters that mankind is accustomed to outdoor ambient sky lighting, but I would also submit that wherever man gathered in the evening, there was always warm light; first from the open fire and then from the lantern. The cozy and welcoming soft light offered the safety and camaraderie that our ancestors sought out, and is still instinctively linked in our minds to the above notions like "welcoming" and "soft".

Not every space needs to be lit like a laboratory..

[attached image: portraitwinner.jpg]
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,506
Location
Flushing, NY
Well, she would look good no matter what light she was under. :thumbsup:

I've heard that warmer lighting flatters skin tones. If you're talking about relatively pale northern Europeans who might look almost pasty white under sunlight, then that's probably true. If you're talking about Asian and/or Mediterranean people, then not at all. Especially Asians, since some have a yellowish tone to their skin to start with; when that's the case, yellowish lighting ends up making them look orange. Mediterraneans with olive skin don't look much better under yellowish light either.

Climate also has something to do with it. Hotter regions tend to like cooler lighting and vice versa. Last thing I need when it's 90 degrees out is to come into a room where the lighting reminds me of being near a fire.
 

blasterman

Flashlight Enthusiast
Joined
Jul 17, 2008
Messages
1,802
Years ago I did a lot of very color-intensive video analysis, and after about a year I learned to differentiate down to about 1cc if I was relaxed enough. So critical was our work that any slight deviation in my color correction threw off the rest of production, and it was obvious. I needed to stay 'zeroed' at all costs, and I quickly learned what things caused me to drift off. Even going to lunch and spending 45 minutes in the bright sun screwed me up. Or wearing colored sunglasses for more than a few minutes. Even ambient room light causes your reference point to move.

The problem with using artificial lighting as a CCT reference is that, other than the sun or maybe xenon, common sources like fluorescent tubes, halides, and increasingly LEDs do not have flat spectra. These light sources are typically blue-green generators with just enough yellow and red to qualify for a CRI stamping.

Given a typical artificial light source, like an LED or fluorescent tube, if you lined up 10 different color temps and put them in a room with any type of ambient light, I'm pretty sure a random sampling of people would pick sources around 4100k as being neutral white. If you upped the CRI to 95, I'd bet the picks would start to creep up towards 5000k.

We often refer to 4100k as unofficial white, but I think this has less to do with any technical index than with the fact that common color spikes in artificial light sources skew what we think is white. Get rid of the spikes and our preference would move up towards the CCT of natural sunlight.
 

Twinbee

Newly Enlightened
Joined
Aug 15, 2006
Messages
77
Interesting posts again - I thought this could be the best forum to post under :)

I tend to think one can 'get used' to a near-white hue as being more or less white, but that it would still have a subconscious 'off-white' cast, without that person even knowing consciously. Could be wrong though. When a shade is so close to white that it's hard to distinguish, that's half the problem anyway.

jtr1962, I tend to think that 4-5k light is 'best overall' for similar reasons to you. Perhaps some people think 4500k+ light is 'clinical' because of the poor CRI / green (and maybe other) spectrum spikes - is that a possibility? In any case, it's nice to have variety sometimes (I was even thinking of using 2-4 different colour temperatures in one room to create more interesting lighting - this pic shows what that might look like, despite the strange subject matter).

blasterman, very interesting again, and pretty close to ElectronGuru's estimate. Curious also the way you say it'll creep up to 5000k with a higher CRI. Would the lack of the spectrum spikes really do that?

In the end, I'm a little surprised scientists haven't actually performed this experiment on the public, because it's pretty trivial to test for. It would also stand as a reference point for what monitors should use for white. Okay, that almost sounds naive. Surely they "must have" done this experiment to set a standard reference white for the world to use (or more likely, some other more accurate metric than colour temperature, but then one can always convert to colour temperature afterwards...).
 

Twinbee

Newly Enlightened
Joined
Aug 15, 2006
Messages
77
Just to add to the confusion: apparently, most standards use 6500k as a white/grey level, despite the other evidence that hues closer to 4-5000k are generally seen as more white-ish. I wonder what gives...

Here's some quotes from:
http://en.wikipedia.org/wiki/Color_temperature

"The NTSC and PAL TV norms call for a compliant TV screen to display an electrically black and white signal (minimal color saturation) at a color temperature of 6,500 K"

"Digital cameras, web graphics, DVDs, etc. are normally designed for a 6,500 K color temperature. The sRGB standard commonly used for images on the internet stipulates (among other things) a 6,500 K display whitepoint."
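For what it's worth, the 6,500 K in those standards corresponds to the D65 whitepoint, and you can recover the figure from the whitepoint's chromaticity coordinates. A small sketch using McCamy's well-known cubic approximation for CCT (the function name is my own):

```python
def mccamy_cct(x, y):
    """McCamy's approximation: CCT in kelvin from CIE 1931 (x, y) chromaticity."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# The sRGB/D65 whitepoint is (x, y) = (0.3127, 0.3290):
print(round(mccamy_cct(0.3127, 0.3290)))  # about 6505 K, i.e. the standards' ~6500 K
# The printing industry's D50 whitepoint (0.3457, 0.3585) lands near 5000 K:
print(round(mccamy_cct(0.3457, 0.3585)))  # close to 5000 K
```

The approximation is only meaningful for sources reasonably near the blackbody locus, which ties back to the earlier point that a green-tinted LED can carry a CCT number without actually looking white.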
 

lyyyghtmaster

Newly Enlightened
Joined
May 24, 2006
Messages
148
Location
Tucson, AZ
Yes, I've wondered this too about display screen lighting- it seems a screen at 6500K looks warmer than a room at 6500K, even if using the same light source (triphosphor or white LED for example). Could it be because the screen is a large, diffuse source similar to viewing something outdoors against a bluish sky?

I tend to use 6500K indoors in all spaces (living and work) for working, etc but I also have warmer sources in all the living spaces for use when I want to set a mood or it gets late and I want to wind down. Or I could use a CCT-controllable LED fixture to do this, but I haven't actually done so yet.:shakehead

I can't stand 6500K CFLs with a green tint! And most (all?) manufacturers make them that way. TCP used to make a beautiful 6500K product that looked slightly lavender compared to other manufacturers' products but was closest to sunlight in terms of having no color tint. Then they went to the typical greenish-tint-phosphor fare.:fail: Probably because it gives slightly higher lumens, due to green looking brighter. And/or maybe the greenish-tint phosphor blend is cheaper???

I think that the tendency of most artificial sources to need to be warmer than continuous-spectrum sources (which I've noticed myself too!) is due largely to the lack of red, and particularly deep red, that correctly renders skin/wood tones. To try to compensate for this, we use warmer CFLs, LEDs, and metal halides than we otherwise might, in a fruitless attempt to correct for the lack of deeper red so wonderfully prevalent in hotwire sources and sunlight. High-CRI "natural white" LEDs are noticeably better at this, as are true "full-spectrum" fluorescents. Grocers often use high-CRI sources to display meat under.

It's too bad the CRI index is so incomplete, taking only 8 color comparisons as the basis of its calculations. And deep red ain't one of them:mecry: Luckily it seems that higher-CRI sources also tend to have more deep red anyway!;)

Or notice this: even in the early morning, before the sun rises, the skylight does a surprisingly good job of rendering wood tones, even compared to our "neutral" artificial sources! And golly, that skylight has to be 20 000 K or more! Sure the spectral power distribution is skewed heavily towards the blue. But since it's coming from a mostly-continuous-spectrum incandescent source, the deeper red is still present in significant amounts!

I recognize that adding deep red makes the source efficacy lower since the red is harder to see, for the same power level, than that orangey drivel that passes for red in a CFL. But in some cases definitely worth the tradeoff!!!

I've noticed that normal-CRI phosphor-converted white LEDs, of all color temps, tend to render non-fluorescent orange objects as yellower than they are under most other sources. Of course this is also due to the preponderance of yellow over red in the spectrum. This can also be observed on varnished wood.

The Kruithof curve results may be due to the fact that in nature higher-CCT sources tend to be brighter (the sun) than lower ones so our brains "expect" that.
 

ElectronGuru

Flashaholic
Joined
Aug 18, 2007
Messages
6,055
Location
Oregon
It's too bad the CRI index is so incomplete, taking only 8 color comparisons for the basis of its calculations of CRI.

The sources I've found indicate that CRI compares the test light against an incan light source:

"Test Method: The CRI is calculated by comparing the color rendering of the test source to that of a "perfect" source which is a black body radiator for sources with correlated color temperatures under 5000 K"
 

blasterman

Flashlight Enthusiast
Joined
Jul 17, 2008
Messages
1,802
Probably because this has slightly higher lumens due to green looking brighter

+100 :grin2:

A lot of 4100k 'office lights' I've seen cheat a bit and add some extra red to the mix to give the bulb a slightly rosy cast. This tends to make it aesthetically more pleasing than a perfectly neutral bulb.

You have the same issue with LEDs. When you take out the excessive green of cool-whites and balance it with more amber/orange/red, the efficiency plummets. Many of the high-efficiency neutral LEDs I've seen recently are very sterile in this respect. They trim down the spectrum as much as possible to hit the minimum CRI and a 4100k CCT.
 