I'm no expert; I'll merely post my understanding of the issue.
Color temperature is basically a measure of how the light output appears to us: it's the temperature of an ideal blackbody radiator whose color matches the light source, so higher numbers look bluer ("cool") and lower numbers look more yellow/red ("warm"). White-output LEDs are blue LEDs with a phosphor coating over the die that, when excited by the light from the blue LED, emits a "white" light. A single, thin layer of phosphor coating results in a standard cool-white, high-color-temperature light. Very blue.
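If you want to put a number on it: color temperature can be estimated from a light's measured CIE chromaticity. This is my own illustration, not something from the post above; it's a minimal sketch in Python using McCamy's standard approximation, with textbook reference chromaticities rather than data for any particular LED.

```python
# Rough sketch (my addition): McCamy's approximation for correlated color
# temperature (CCT) from CIE 1931 (x, y) chromaticity coordinates.
# The chromaticity values below are standard reference points (D65 daylight
# and incandescent-like illuminant A), not measurements of a specific LED.

def mccamy_cct(x: float, y: float) -> float:
    """Approximate CCT in kelvin from CIE 1931 xy chromaticity (McCamy 1992)."""
    n = (x - 0.3320) / (y - 0.1858)
    return -449.0 * n**3 + 3525.0 * n**2 - 6823.3 * n + 5520.33

print(round(mccamy_cct(0.3127, 0.3290)))  # ~6500 K: cool, bluish white (D65)
print(round(mccamy_cct(0.4476, 0.4074)))  # ~2850 K: warm, yellowish white (illuminant A)
```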
Manufacturers then fine-tune the phosphor coating (or add additional layers; honestly, I'm not quite sure which) to convert more of the emitted blue light into "white" light. This conversion process is nowhere near 100% efficient, so a lower-color-temperature LED is likely to be less efficient.
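Part of that loss is unavoidable physics: even a perfect phosphor throws away energy when it down-converts a blue photon to a longer wavelength (the Stokes shift), since photon energy scales as 1/wavelength. Here's a little arithmetic sketch of my own to show the size of that effect; the wavelengths are typical illustrative values, not specs for any particular emitter.

```python
# Toy arithmetic (my addition, not from the post): energy kept when a blue
# pump photon is converted to a longer-wavelength photon. E = h*c / wavelength,
# so the retained fraction is simply pump_wavelength / emitted_wavelength.

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def photon_energy(wavelength_nm: float) -> float:
    """Energy of a single photon in joules."""
    return H * C / (wavelength_nm * 1e-9)

blue = photon_energy(450.0)    # pump photon from the blue die
yellow = photon_energy(570.0)  # typical yellow-green phosphor emission
red = photon_energy(630.0)     # red-shifted emission used for warmer light

print(f"blue -> yellow keeps {yellow / blue:.0%} of the photon's energy")  # ~79%
print(f"blue -> red    keeps {red / blue:.0%} of the photon's energy")     # ~71%
```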
A lower color temperature (basically the addition of more red, green, etc.) appears more natural to us, as our eyes have adapted to the sun/atmosphere combination, which behaves like an incandescent source. So the addition of more red (and green, etc.) light makes it easier for the eye to determine color, shape, and so on.
Color Rendering Index (CRI) is a scale that grades how well a given light renders color compared to a reference source, such as daylight or an incandescent/blackbody source at the same color temperature. The higher the CRI rating, the easier it is for our eyes to distinguish colors, especially very similar colors.
From my understanding, the addition of more phosphor layers (and thus the conversion of a greater percentage of blue light into red, green, etc.) causes several things: an increase in CRI, a decrease in color temperature, and a decrease in efficiency (remember, you have a certain amount of blue light to begin with, and each additional layer of conversion/filtering reduces it). These trade-offs can't really be avoided, although I'm sure there's an exception to every rule (wouldn't it be nice if there were R5-bin, 5000 K XP-Gs in large enough production numbers).
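To make that trade-off concrete, here's a very rough back-of-the-envelope model of my own, with made-up conversion fractions (not real data for any emitter): treat the white LED as a mix of unconverted blue photons and phosphor-converted photons. Converting a larger fraction of the blue light, which is what gives you the warmer, higher-CRI output, means more photons pay the Stokes-shift penalty, so less radiant power comes out for the same pump.

```python
# Very rough, made-up numbers (my illustration): relative radiant output of a
# white LED as a function of how much of the blue pump light is phosphor-
# converted, ignoring all losses other than the Stokes shift.

def stokes_retention(pump_nm: float, emit_nm: float) -> float:
    """Fraction of a pump photon's energy kept after conversion to emit_nm."""
    return pump_nm / emit_nm

def relative_output(converted_fraction: float, emit_nm: float, pump_nm: float = 450.0) -> float:
    """Energy out per unit of blue pump energy: unconverted blue plus converted light."""
    return (1 - converted_fraction) + converted_fraction * stokes_retention(pump_nm, emit_nm)

# Hypothetical cool white: ~70% of photons converted, mostly yellow-green.
# Hypothetical warm white: ~90% converted, shifted further toward red.
print(f"cool white: {relative_output(0.70, 565.0):.0%}")  # ~86% of pump energy out
print(f"warm white: {relative_output(0.90, 600.0):.0%}")  # ~78% of pump energy out
```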
Note: again, I'm not an expert at any of this. This is just my understanding, and it may or may not be helpful or correct.