To get back to the thread title: I would tend to think that high CRI is much more useful for throw. As cummins4x4 mentioned, warmer color tends to throw further, especially in misty/foggy air, and I suspect that high-CRI emitters should beat regular neutral/warms in that department too.
I'm not convinced that CRI per se has anything to do with penetrating fog or in general air filled with particulates of some sort. It depends on which wavelengths happen to penetrate better and the spectrum of the emitter in question. Furthermore, which wavelengths penetrate better is largely determined by the size of the particulates--for typical fog or fine white smoke, I do believe that longer wavelengths of visible light penetrate better, so there probably is some truth to what you said regarding overall tint (although the specific spectrum is really what counts). As for CRI, it's really just a broad measure of color rendering accuracy with respect to what the color temperature happens to be, which seems like apples & oranges to me. :thinking: For example, a typical high-CRI emitter of a given color temperature may have more penetrating red than lower-CRI emitters have, but it may have more cyan and green as well, which gets scattered by the particulates, causing glare. :shrug: Whatever the effect is, it's probably not as significant as the overall tint, generally speaking.
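As a rough back-of-the-envelope check on the tint claim: for particles much smaller than the wavelength (haze, fine smoke), Rayleigh scattering scales as 1/&lambda;^4, so deep blue gets scattered several times more strongly than red. For actual fog droplets (several microns across, the Mie regime) the wavelength dependence largely washes out, so the advantage shrinks. A minimal sketch, with a hypothetical helper function:

```python
# Rayleigh-regime comparison: scattering strength scales as 1/lambda^4.
# This only applies to particles much smaller than the wavelength (haze,
# fine smoke); fog droplets are in the Mie regime, where the wavelength
# dependence is weak, so treat this as an upper bound on the effect.

def relative_rayleigh_scatter(nm, ref_nm=555.0):
    """Scattering strength relative to a reference wavelength (illustrative only)."""
    return (ref_nm / nm) ** 4

blue = relative_rayleigh_scatter(450.0)   # typical cool-white pump wavelength
red = relative_rayleigh_scatter(630.0)    # deep red

print(f"blue/red scattering ratio: {blue / red:.1f}")  # ~3.8x more scattering at 450 nm
```

So in the small-particle regime, the specific spectrum really does matter, which fits the point above: it's the wavelengths, not the CRI number, that determine penetration.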
What I gathered from the article and from conversations here is that if I want to make detailed observations about what I am looking directly at, then I want the high density of red sensitive cones in the center of my eye to have some red light bouncing off of the object being viewed.
Right, and red light (preferably deep into the red range) has the additional benefit of sparing your rods for scotopic vision (i.e. super-low light vision), which is why amateur astronomers use it; we use our cones, too, for observing many objects, so the red light should be limited to the minimum brightness necessary for visual acuity for whatever we're doing.
If I want the maximum visual stimulation from the faintest light (detecting the presence of a distant object), then blue/green (teal) light should be the most effective. - More true for peripheral vision; in the extremes you'll have a relative blind spot in your center of focus.
The sensitivity of the rods follows a bell-shaped curve that peaks near 500 nm, which by coincidence happens to be right around where most nebulae emit their light. It also happens to fill a big gap in the sensitivity curves of the various cones, which may or may not be a coincidence. And if that weren't enough, this is very near where white LEDs typically have a deep "valley" in their spectra (example taken from the Cree XP-G data sheet):
So relatively speaking (no absolutes because there are overlaps between these curves), white LEDs are good for preserving one's scotopic (rod) vision, among white light sources, but not so great at helping people see things at extremely low levels of illumination, all else being equal. By the way, many--though not all--high-CRI LEDs have greater output in this range of the spectrum (it's one reason their CRIs are higher than average).
So cool white might seem brighter, and actually allow you to detect objects further away, having more lumens devoted to the blue/green part of the spectrum.
Actually, cool white LEDs typically have even less output near the peak of the rod sensitivity curve. All they really have is more light concentrated into a rather narrow peak in the blue range near 450 nm. While this peak is within the rod sensitivity curve, it's more than halfway down in sensitivity at that point. Looking at the various XP-G spectra above, it seems that it's a wash between the tints as to which supports scotopic vision the best (per lumen). For comparison, here is one of the better high-CRI spectra I've seen with respect to this topic:
There is a substantial difference, numerically speaking, although the overall curve still looks similar with a big dip in the cyan range--it's the inherent nature of the pairing of a blue LED die with a phosphor or blend of phosphors, and getting more output in the cyan and red ranges is costly in terms of efficiency.
As for why cool white appears "brighter," I think that's more of a psychological than a physiological issue; it's why TV screens often come from the factory with too much blue in the picture (it makes them appear "brighter"), and why some white clothing and even paper is dyed slightly blue as well. Our regular (photopic) vision is actually most sensitive to other wavelengths, as this curve shows, so it's not because we're more sensitive to blue light or anything like that.
While warm white might seem dimmer, and actually allow you to make out more detail of what you are looking directly at... (part of the "lower lumens" of warm LED output is due to their putting out light in the frequencies around red,
That last part might be true under dim illumination, but I think it's more of a matter of color contrast under most ordinary circumstances--this is because typical cool white LEDs are so deficient in red while others are not, although high-CRI cool white LEDs (a rare breed for some reason) do fine with color contrast. Interestingly, this is where high CRI really makes a big difference, whereas with neutral and warm white it barely makes any difference at all in ordinary use (i.e. the differences are usually subtle), yet hardly anybody makes flashlights with high-CRI cool white LEDs...puzzling.... :thinking::shrug:
while the lumens are biased towards blue due to our eyes' sensitivity to blue/green.)
Well, lumens specifically are based on this curve (also referenced above), which is used to convert radiant flux (in watts) to luminous flux (in lumens). The scotopic curve is different, and I believe it is not considered in this context. Also, while our rods may be most sensitive to blue/green (or cyan, as I call it), those are not the wavelengths that LEDs are good at emitting; in fact, there's a really deep and fairly wide "valley" right there, as mentioned earlier.
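To make the conversion concrete: luminous flux is the spectral power weighted by the photopic curve and scaled by 683 lm/W, while the scotopic weighting peaks near 507 nm (with a maximum of about 1700 scotopic lm/W). Here's a rough Python sketch that treats both curves as Gaussians (an assumption; the real CIE tables aren't Gaussian) and runs a toy two-hump cool-white LED spectrum through them:

```python
import math

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Crude Gaussian stand-ins for the CIE luminosity functions (assumption:
# the tabulated curves are not true Gaussians, but this is close enough
# for a qualitative sketch).
def v_photopic(nm):   # peaks at 555 nm
    return gaussian(nm, 555.0, 42.0)

def v_scotopic(nm):   # peaks near 507 nm
    return gaussian(nm, 507.0, 40.0)

# Toy cool-white LED spectrum: narrow blue pump plus a broad yellow
# phosphor hump, with the deep cyan "valley" in between (relative power).
def led_spectrum(nm):
    return 1.0 * gaussian(nm, 450.0, 10.0) + 0.8 * gaussian(nm, 600.0, 60.0)

def integrate(f, lo=380, hi=780, step=1):
    return sum(f(nm) * step for nm in range(lo, hi + 1, step))

phot = 683.0 * integrate(lambda nm: led_spectrum(nm) * v_photopic(nm))
scot = 1700.0 * integrate(lambda nm: led_spectrum(nm) * v_scotopic(nm))

print(f"S/P ratio of toy cool-white spectrum: {scot / phot:.2f}")
```

Real spectra and the tabulated CIE curves would shift the number, but the qualitative point survives: how much a given lumen count stimulates (or bleaches) your rods depends entirely on where the power sits in the spectrum, not on the lumen figure itself.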
Also, natural objects (rocks, bark, dirt, fur, feathers, etc.) allegedly tend to reflect warm tones better than cool tones.
Generally true, although I think that even greens and blues show up better under neutral or even warm white tints because the balance between various wavelengths is more even, which improves color contrast. Cool white LEDs are sort of a special case because they put out so much blue in a narrow range of wavelengths, which to my eyes tends to overwhelm and wash out other colors, and often causes glare. Note again that high-CRI cool white LEDs do not exhibit this problem so much because at least they have more red output, which helps render "earthy" tones better.
For deliberately creating night blindness by bleaching out someone's rhodopsin (visual purple), the brightest teal light would do the most bleaching... So a cool white blaster would be optimal as a defensive light, and the green laser "dazzlers" even more so.
For white LEDs, it seems that high CRI would theoretically be better (as long as they have greater output in the cyan range, that is), all else being equal (which is rare in practice). That said, there may be other factors involved, so the only way to really know would be to experiment.