
Thread: Does Lower LED temp equal better Color Rendition?

  1. #1

    Default Does Lower LED temp equal better Color Rendition?

    Hi everyone, I seem to be very confused. From what I gather, LEDs with lower color temperatures are better at rendering colors at night, especially when it's raining/foggy. But I thought LEDs inherently LACK full color spectrum output. So why would temperature matter at all?

    Reason being: I have a 5 million candlepower HALOGEN spotlight...it's very bright. But the point is, I've noticed that when it's used on rainy nights, it brings out the lovely color of my garden/bushes, whereas with my LEDs the foliage looks washed out. So, what I'm really asking is: will a lower temperature LED flashlight help? If so, how many Kelvins should I aim for? Thanks!
    Last edited by Zdenka; 05-23-2009 at 07:03 AM.

  2. #2
    Flashaholic*
    Join Date
    Jan 2009
    Location
    Scotland
    Posts
    985

    Default Re: Does Lower LED temp equal better Color Rendition?

    For good colour rendition what you need is full-spectrum output (i.e. a broad range of wavelengths) rather than a particular tint.

    Current LEDs use phosphors to take some of the blue output and re-radiate the energy over a broader band, but there are still plenty of gaps in the spectrum and this is why LEDs don't render colours as well as incandescents.

    So-called "warm" LEDs (paradoxically, lower colour temperature) tend to achieve their tint by having more phosphor to distribute more of the energy over the longer wavelengths. This results in a smoother spectrum which is probably why they appear to have better colour rendition. It also results in slightly lower efficiency.
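    [Editor's note] To make the full-spectrum point concrete, here is a rough Python sketch of Planck's law, which an incandescent filament approximately follows. The 3000 K filament temperature is an assumed, typical halogen value, not something from this thread.

```python
import math

# Planck's law for blackbody spectral radiance at temperature T (kelvin)
# and wavelength lam (metres). An incandescent filament approximately
# follows this curve, which is smooth and gap-free across the visible
# range -- unlike a phosphor-converted LED's spectrum.
H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
KB = 1.381e-23   # Boltzmann constant, J/K

def planck(lam, T):
    """Spectral radiance in W / (m^2 * sr * m)."""
    return (2.0 * H * C**2 / lam**5) / (math.exp(H * C / (lam * KB * T)) - 1.0)

# A ~3000 K halogen filament: radiance rises smoothly toward the red end
# of the visible band (the blackbody peak at 3000 K is in the infrared,
# near 966 nm by Wien's displacement law), with no gaps anywhere.
for nm in (450, 550, 650):
    print(nm, "nm:", planck(nm * 1e-9, 3000))
```

    The smoothness, not the overall tint, is what gives incandescents their colour-rendering edge.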

  3. #3
    Flashaholic Cheapskate's Avatar
    Join Date
    Feb 2008
    Location
    Ireland
    Posts
    346

    Default Re: Does Lower LED temp equal better Color Rendition?

    Maybe all this has something to do with the particular absorption characteristics of chlorophyll, such that warmer tints highlight foliage better because a greater proportion of the light is reflected?

    Since chlorophyll's peak absorption is at shorter wavelengths, which includes blue, that might mean foliage appears darker and less defined under cooler light because it absorbs a greater proportion of the light and thus reflects less. I am just guessing here.



    In B&W infrared photos, foliage appears almost white and thus very bright. I presume this is because it is strongly reflective at the longer wavelengths.

    So-called warmer LEDs, though I wouldn't call a pale urine colour 'warm', are arguably more unbalanced in their spectral output than neutral white ones. So to say they have better colour rendition is probably inaccurate.

  4. #4
    Flashaholic* LukeA's Avatar
    Join Date
    Jun 2007
    Location
    near Pittsburgh
    Posts
    4,401

    Default Re: Does Lower LED temp equal better Color Rendition?

    Quote Originally Posted by Cheapskate View Post
    So-called warmer LEDs, though I wouldn't call a pale urine colour 'warm', are arguably more unbalanced in their spectral output than neutral white ones. So to say they have better colour rendition is probably inaccurate.
    If you think all warm LEDs have a "pale urine color," then you haven't seen an A or D bin XR-E.

    Also, from XR-E datasheet:

    "• Typical CRI for Cool White & Neutral White (3,700 K – 10,000 K CCT) is 75.
    • Typical CRI for Warm White (2,600 K – 3,700 K CCT) is 80. "

    I'm not arguing that CRI is a perfect measure of color rendition, but it is applicable to this.

  5. #5

    Default Re: Does Lower LED temp equal better Color Rendition?

    Not speaking with authority...just offering an opinion -
    As the "bandwidth" of LEDs tends to be narrower than, say, an incan bulb's, I'd be inclined to say that regardless of the tint, LEDs are not going to render a broad spectrum of colors accurately.

    I *think* what is happening is that phosphor is added to an LED to cover a specific slice of the spectrum. "Warm" LEDs, with a "peak" in the lower-frequency colors such as reds, browns and greens, render organics better than "white" or "cooler" LEDs, but will look murky on the White Wall and on other inorganics that are of a higher color temperature, since they don't provide the perception or illusion of contrast that cooler LEDs do.

    Conversely, "white" and "cooler" LEDs that look great on the White Wall and, often, indoors show their weakness in illuminating foliage by rendering organics a ghostly monochrome.

    73
    dim

  6. #6
    Moderator
    Kiessling's Avatar
    Join Date
    Nov 2002
    Location
    Germany, Old World
    Posts
    16,137

    Default Re: Does Lower LED temp equal better Color Rendition?

    So much very very good info is just lost in the CPF database. We had great threads about this with contributions of our most knowledgeable members, with pictures and whatnot.

    Try to find those and you will find your answers.

    bernie
    There is a type of perfection that transcends the quest for lumens. Buying a $250 1-cell light for "lum factor" is like buying a $250 single malt Scotch for the alcohol content.
    - paulr


    It's always darkest just before it goes pitch black.
    My shoes are too tight. But it doesn't matter, because I have forgotten how to dance.

  7. #7

    Default Re: Does Lower LED temp equal better Color Rendition?

    Thanks for the feedback, guys. So, it's fair to say that LEDs with warmer tints [ie lower temp] would render organics/foliage more naturally, although at a small sacrifice in luminosity. But they still would NOT match the color rendition of a halogen lamp, because LEDs aren't FULL color spectrum.

    On the other hand, a neutral white LED [ie higher temp] would be BETTER at rendering the colors of indoor objects such as tables, chairs, papers etc. I notice how my halogen spotlight can make objects appear more yellowish than white/neutral tint LEDs do.
    Last edited by Zdenka; 05-23-2009 at 07:08 AM.

  8. #8
    Flashaholic* Tekno_Cowboy's Avatar
    Join Date
    Apr 2008
    Location
    Minnesota
    Posts
    1,679

    Default Re: Does Lower LED temp equal better Color Rendition?

    Check out some of the links in my modding thread, linked in my sigline. I've got some very informative threads on the subject linked there.
    Due to my current schedule being pretty darn hectic, I will not be accepting new modding projects until things settle down.

  9. #9
    Flashaholic Cheapskate's Avatar
    Join Date
    Feb 2008
    Location
    Ireland
    Posts
    346

    Default Re: Does Lower LED temp equal better Color Rendition?

    Halogens and other incandescents may have higher CRIs, in that they can render more colours, but I don't think they render colours very accurately. They have a huge bias towards orange/red in their output.

    Try taking a photo with daylight-balanced film, or using a digital camera with the white balance set to daylight. The resulting images will be an orange mess, colour-wise.

  10. #10
    Flashaholic*
    Join Date
    Sep 2007
    Location
    MA
    Posts
    2,932

    Default Re: Does Lower LED temp equal better Color Rendition?

    My answer is 'yes' for some colors, but as mentioned the subject is complex. This thread is a good start on CRI, and has some good graphs and such so everyone can try to use the same lingo. Color, however, is somewhat subjective, not to mention that "accurate" and "pleasing" are two paths that different people switch between at different points and for different purposes.

    Right now in my flashlights I'm preferring the neutral tints around 4000K, but now that we have the lumens, I'd like to see CRI addressed. I hope to see some affordable, mainstream, high-CRI LEDs soon. I've stopped buying LED lights with a color temp higher than 5000K, and I hope manufacturers take note.

    On the path to a CRI of 95, let's start lighting the trail.

  11. #11

    Default Re: Does Lower LED temp equal better Color Rendition?

    Quote Originally Posted by Moonshadow View Post
    For good colour rendition what you need is full-spectrum output (i.e. a broad range of wavelengths) rather than a particular tint.

    Current LEDs use phosphors to take some of the blue output and re-radiate the energy over a broader band, but there are still plenty of gaps in the spectrum and this is why LEDs don't render colours as well as incandescents.

    So-called "warm" LEDs (paradoxically, lower colour temperature) tend to achieve their tint by having more phosphor to distribute more of the energy over the longer wavelengths. This results in a smoother spectrum which is probably why they appear to have better colour rendition. It also results in slightly lower efficiency.
    Well, current white LEDs are actually fairly close to full-spectrum output; many fluorescent lights, for example, emit in only a few narrow spectral bands.

    Unfortunately there's one notch in the spectrum around 500nm, in the gap between where the blue light of the LED falls off and where the phosphor's curve picks up. Ideally there should be a peak around 500nm. The visible spectrum runs from ~400-700nm, but the eye is really most sensitive to 550nm +/- 50nm. Also, most of the color-sensitive cells in the retina are red/green sensitive - too much blue light does not help one see contrast.

    So even the warm LEDs have too much blue, balanced by a lot of red... they still have a notch in the very important green part of the spectrum. A good CRI (one way to quantify color) is possible by balancing the blue (LED) + red (phosphor), but a spectrographic analysis reveals why these lights may not be ideal under certain conditions.

    This thread has many spectrographic charts for various light sources:
    http://www.candlepowerforums.com/vb/...=220118&page=5

    It depends how one wants to quantify color, but with a little work - perhaps a blue filter and a different phosphor - it should be possible to get a better spectrum from an LED.
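    [Editor's note] The notch described above can be seen in a toy model: a narrow blue pump peak plus a broad phosphor band, here sketched in Python with assumed peak positions, widths and heights (illustration only, not measured data for any real emitter).

```python
import math

def gaussian(x, centre, width, height):
    """A simple Gaussian bump -- used here as a stand-in for an emission band."""
    return height * math.exp(-((x - centre) ** 2) / (2.0 * width ** 2))

def led_spectrum(nm):
    """Toy phosphor-converted white LED: blue pump + broad phosphor band.
    All parameters are assumed values for illustration."""
    blue = gaussian(nm, 450, 12, 1.0)      # narrow InGaN-style pump peak
    phosphor = gaussian(nm, 580, 60, 0.7)  # broad YAG-style phosphor band
    return blue + phosphor

# Find the valley between the two peaks -- the cyan/green "notch":
valley_nm = min(range(460, 560), key=led_spectrum)
print("valley near", valley_nm, "nm")
```

    With these assumed parameters the valley lands in the high 400s; real emitters differ, but the two-hump shape with a dip between blue and yellow is the characteristic feature.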
    Last edited by FloggedSynapse; 05-23-2009 at 09:28 AM.

  12. #12
    Flashaholic*
    Join Date
    Jun 2005
    Location
    Home of chocolate and chalets
    Posts
    546

    Default Re: Does Lower LED temp equal better Color Rendition?

    I remember a very informative thread over in the McGizmo section where they were discussing the color rendering of the LED for the LunaSol (Osram Dragon LED?). Dig it out and you will find more information than you ever dared to think of, and maybe also more than you wished for .

  13. #13

    Default Re: Does Lower LED temp equal better Color Rendition?

    Disclaimer: CRI is a broken measure of color rendering accuracy. There are efforts to establish better indices but AFAIK none of them are really widely used. So I'll mention CRI a few times here, but don't take that as an endorsement; datasheets usually show a spectrograph, as well as a CRI value, and if you can relate your needs/wants to the spectrograph instead, you should.
    Quote Originally Posted by Moonshadow View Post
    So-called "warm" LEDs (paradoxically, lower colour temperature) tend to achieve their tint by having more phosphor to distribute more of the energy over the longer wavelengths. This results in a smoother spectrum which is probably why they appear to have better colour rendition. It also results in slightly lower efficiency.
    There's no such general tendency that I'm aware of. Most neutral- or warm-white LEDs are still single-phosphor, and pick up at most 5 or so points on the CRI scale. They have slightly reduced efficiency, because the absorption/re-emission process is inherently lossy, and more of the light is undergoing it. The difference between them and cool-white is, IMHO, insignificant; they render some colors differently, but none of them can really be claimed to render colors more accurately in general. That's not to say "it's all subjective", as certain circumstances may favor particular wavelengths and make one or the other better.

    OTOH, there are 2-phosphor, and I think even some 3- or 4-phosphor, LEDs designed for good color rendering; all the neutral/warm P4s are this way, and some Nichia 083s in cool and warm. They're even less efficient, but move from CRI of 70-80 up to 90+. These are actually substantially more accurate at rendering various colors.

    And of course, ability to see through rain, fog, etc. is an entirely different matter from accurate color rendering -- the blue spike in conventional white LEDs causes high levels of backscatter, and while it's dramatically reduced in warmer LEDs, incans are better yet, and a high-CCT incan should school a warm-white LED with somewhat lower CCT (I can't test it myself, as my warmest LED just about matches my Mag61 with full batteries). If penetration through rain is key, an amber LED would be better -- but of course it has much worse color rendition.
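    [Editor's note] On the backscatter point, a quick sketch of the classic wavelength dependence of scattering. Note the hedge in the comments: the 1/lambda^4 law strictly applies to particles much smaller than the wavelength (haze, air molecules), while large fog droplets scatter much more uniformly, so this is intuition rather than a fog model. The wavelengths chosen are assumed, typical values.

```python
# Rayleigh scattering intensity scales as 1/wavelength^4. Strictly this
# holds for particles much smaller than the wavelength (air molecules,
# fine haze); big fog droplets scatter far more uniformly across colours
# (the Mie regime), so treat this as rough intuition, not a fog model.
def rayleigh_ratio(nm_short, nm_long):
    """How much more strongly the shorter wavelength scatters."""
    return (nm_long / nm_short) ** 4

# Royal-blue LED pump (~450 nm) vs deep red (~650 nm):
print(rayleigh_ratio(450, 650))  # roughly 4.35x more scatter for blue
```

    Even as a rough guide, it suggests why a blue-heavy beam throws more light back at you from suspended particles than a redder one.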

  14. #14
    *Flashaholic* Illum's Avatar
    Join Date
    Apr 2006
    Location
    Central Florida, USA
    Posts
    13,050

    Default Re: Does Lower LED temp equal better Color Rendition?

    Quote Originally Posted by LukeA View Post
    If you think all warm LEDs have a "pale urine color," then you haven't seen an A or D bin XR-E.
    Oh? Is this analogous to an X bin Luxeon back in 2006?

  15. #15
    Flashaholic* LEDAdd1ct's Avatar
    Join Date
    Jul 2007
    Location
    Hudson Valley
    Posts
    3,494

    Default Re: Does Lower LED temp equal better Color Rendition?

    Ah yes, the "urine" LEDs, best described at the LED Museum. Brings back memories...
    "...and the diode multiplied and grew in brightness. And God saw that it was good."
