CRI is tied to color temperature in the sense that it is measured relative to a black body radiator of a specific color temperature. A warm or cool color temperature by itself tells you nothing about the CRI. The graph below, which I pulled from a Nichia presentation, shows the cost in lumens they see as a function of color temperature (a warmer CCT at the same CRI comes at the expense of lumens).
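For anyone curious how that "relative to a reference" measure works mechanically, here is a minimal sketch of the Ra math. It assumes you already have the color shifts (delta E) of the eight CIE test color samples between the test LED and a reference illuminant of the same CCT; the example shift values are made up for illustration, not a real measurement.

```python
# Minimal sketch of the general CRI (Ra) calculation, assuming the eight
# delta E color shifts versus a reference source of the same CCT are
# already known (they come from a spectrometer plus the CIE procedure).
def special_cri(delta_e):
    # Each sample's special index: 100 means no color shift at all.
    return 100.0 - 4.6 * delta_e

def general_cri(delta_es):
    # Ra is simply the average of the first eight special indices.
    assert len(delta_es) == 8
    return sum(special_cri(de) for de in delta_es) / 8.0

# Placeholder shifts -- illustrative numbers only.
example_shifts = [3.0, 4.5, 2.0, 5.5, 4.0, 6.0, 3.5, 7.0]
print(round(general_cri(example_shifts)))  # ~80 with these made-up shifts
```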
To make sense of the graph, I include the additional graph showing how Nichia has chosen to define "Typical", "Moderate" and "High" CRI:
So, for example, the lumen loss going from a typical CRI LED at 5000 K to a high CRI LED at the same 5000 K would take you from 100% down to 70%. Comparing that same typical 5000 K LED to a warmer, high CRI LED at 4200 K, you would go from 100% down to 65%.
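To put numbers on that, here is a quick back-of-the-envelope calculation. The two relative-flux factors are my reading of the Nichia chart, and the baseline lumen figure is hypothetical, so treat the outputs as rough estimates only.

```python
# Relative flux factors as read off the Nichia graph (approximate).
TYPICAL_5000K = 1.00   # baseline: typical CRI at 5000 K
HIGH_CRI_5000K = 0.70  # high CRI at the same 5000 K CCT
HIGH_CRI_4200K = 0.65  # high CRI and warmer, 4200 K

baseline_lumens = 300  # hypothetical typical-CRI emitter flux at a given drive current

print(baseline_lumens * HIGH_CRI_5000K)  # ~210 lm for the high CRI 5000 K part
print(baseline_lumens * HIGH_CRI_4200K)  # ~195 lm for the high CRI 4200 K part
```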
I would guess that other LED manufacturers are dealing with similar phosphors, and accordingly the same physics Nichia outlines here. If that is the case, then with Cree LEDs I think you could read the blue or red bars as a rough guide to relative lumens at a given CCT. :shrug:
To my knowledge, Cree does not bin by CRI, nor do they address CRI to the extent that some of the other LED manufacturers do at this point. I recently put a warm white Cree XP-G in my integrating sphere and measured 4480 K CCT, CRI = 79, and 150 lumens (out the front at ~660 mA). The tint was certainly warm, but to call this a high CRI light source would be misleading, IMHO. By the standards detailed above, I would consider that Cree sample a moderate CRI LED.
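If you want to bucket your own sphere measurements the same way, something like the snippet below works. Note the Ra cutoffs (roughly 90 for high and 75 for moderate) are my own guess at where the lines fall, not figures taken straight from the Nichia chart.

```python
# Rough CRI bucketing in the spirit of the Nichia chart. The cutoffs are my
# own assumed boundaries (high >= ~90, moderate >= ~75), not published bin edges.
def cri_bucket(ra):
    if ra >= 90:
        return "High CRI"
    if ra >= 75:
        return "Moderate CRI"
    return "Typical CRI"

# The warm white XP-G sample from my sphere run:
print(cri_bucket(79))  # -> "Moderate CRI", matching how I classified it above
```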
The OP asks specifically about high CRI LEDs. What is a high CRI LED, and which of the LEDs currently being used would qualify as such? :shrug: