Many studies have been done on visual acuity versus color temperature. Here are a few:
http://www.naturalux.com/High Color Temperature Lighting School Children_Highlighted.pdf
https://www.researchgate.net/public...he_Color_Temperature_of_the_Surround_Lighting
You want higher CCTs for headlights/streetlights to improve visual acuity, depth perception, and peripheral vision. Peripheral vision is important at urban driving speeds of 15 to 30 mph, but it's still useful even at highway speeds.
If we're talking about indoor lighting, then in many cases it doesn't matter as much if you're optimizing for seeing. Here it simply boils down to aesthetic preference. Obviously the ideal CCT can vary even for the same individual depending upon the application, but I've found it doesn't vary by much, at least for me. I consider 5000K ideal. No matter what I'm lighting, once I get down to 4500K it starts to seem a little too warm, but still tolerable (i.e. the Nichia 219s I modded one of my LED bulbs with). On the other hand, getting much over 6500K with any light source is too cool. I seem to tolerate higher CCT with LED sources as opposed to fluorescent, but 6500K is my limit even with LED. I think for most individuals there is a tolerance band. I highly doubt anyone who prefers, say, 2700K can tolerate 5000K at all, regardless of setting. And someone like me who prefers 5000K won't find any setting where they would want 3000K, or even 4000K. Low CCT causes me fatigue, along with diminished ability to see clearly.
Color rendering is another issue. The orthodoxy for headlights/streetlights, which I disagree with, is that CRI doesn't matter so long as it's at least around 70. For example, the blue spike mentioned is more of an issue with lower CRI LEDs. By definition the spike is much smaller as you reach CRI in the 90s, for the simple reason that it has to be. There's no way to have a large blue spike and fudge the rest of the spectrum while still obtaining a 90+ CRI. I suspect the concern with efficiency is the reason we gloss over CRI for headlights, and especially streetlights. However, anecdotally I've found equivalent seeing at lower lux levels using higher CRI LEDs of the same CCT as lower CRI ones. In theory, then, you could use high CRI, lower the illumination levels, get the same visual benefit, and use about the same amount of power. Even better, the blue spike will now be much smaller due to both the spectrum itself and the lowered intensity. I'm unaware of any studies on this subject, but I think it lends itself to one. I'd like to see the industry move solely to high CRI LEDs, both for indoor and outdoor lighting. At this point, even with the efficiency penalty relative to low CRI, you're still at ~150 lm/W for CRI 90+.
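To make that tradeoff concrete, here's a quick back-of-the-envelope sketch. Every number in it is an illustrative assumption (the efficacies, the geometry factor, and especially the 20% lux discount for high CRI), not measured data; the ~150 lm/W figure for CRI 90+ is the only one from the post itself.

```python
# Back-of-the-envelope sketch of the high-CRI streetlight tradeoff.
# All inputs are illustrative assumptions, not measured values.

def power_for_lux(target_lux, efficacy_lm_per_w, lux_per_lm=0.01):
    """Watts needed to reach a target illuminance, given luminous efficacy
    and a fixed geometry factor (lux delivered on the road per lumen emitted)."""
    lumens_needed = target_lux / lux_per_lm
    return lumens_needed / efficacy_lm_per_w

# Low-CRI baseline: a CRI ~70 LED at an assumed 180 lm/W targeting 10 lux.
low_cri_watts = power_for_lux(10, 180)

# High-CRI alternative: CRI 90+ at ~150 lm/W. If equivalent seeing really
# holds at, say, 20% lower lux, the power draw comes out roughly the same.
high_cri_watts = power_for_lux(10 * 0.8, 150)

print(f"low CRI:  {low_cri_watts:.2f} W")   # ~5.56 W
print(f"high CRI: {high_cri_watts:.2f} W")  # ~5.33 W
```

The point isn't the specific wattages, which depend entirely on the assumed numbers, but that the efficiency penalty of high CRI can be offset by a modest reduction in illumination level if the equivalent-seeing claim holds.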
Those who complain about lousy color rendering of ~5000K LEDs need to try CRI 95+ versions. You'll be able to cook steak just fine, and you'll also be able to render cooler colors accurately, which is something low CCT LEDs fail miserably at.