The difference in forward voltage is approximately 0.1 V at typical operating currents, so call the typical forward voltage at 700 mA 3.4 V. That gives the MCE an advantage of 0.1/3.4, or roughly 3%.
In a typical flashlight, with each die running at 2.5 W, I likely want my thermal resistance to be on the order of 40 C/W or less. We could even call it 50 C/W or less.
The XRE has a 4 C/W improvement in thermal resistance. That gives it an advantage of 4/40 or 4/50, somewhere in the 8-10% range, which is far better than any small reduction in forward voltage.
The higher die temperature on the MCE, 10 C at most, will only reduce the forward voltage by another 20-30 mV, so maybe that 3% creeps up to 4%, but since the effect feeds back on itself, the real gain is smaller. Only about 25% of the input power is converted to light, and since the MCE's efficiency edge comes entirely from its slightly lower forward voltage, the compounded advantage is maybe 3.5% * (1 + 3.5%)... want to call that 3.6%... heck, call it 4.1%.
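The back-of-envelope numbers above can be sketched in a few lines of Python. To be clear, the 3.4 V, 0.1 V, 40-50 C/W, and 3.5% inputs are this post's assumptions, not datasheet values:

```python
# Rough comparison sketch; all inputs are assumptions from the
# discussion above, not measured or datasheet values.

vf_typical = 3.4   # assumed forward voltage at 700 mA, volts
vf_delta = 0.1     # assumed MCE forward-voltage advantage, volts
print(f"MCE voltage advantage: {vf_delta / vf_typical:.1%}")  # ~2.9%

rth_delta = 4.0    # XRE thermal-resistance improvement, C/W
for rth_budget in (40.0, 50.0):  # assumed per-die thermal budget, C/W
    print(f"XRE thermal advantage at {rth_budget:.0f} C/W: "
          f"{rth_delta / rth_budget:.0%}")  # 10% and 8%

# Compounding the voltage advantage with the small temperature effect:
base = 0.035       # mid-point of the 3-4% voltage advantage
print(f"Compounded: {base * (1 + base):.2%}")  # ~3.62%
```

Either way the MCE's electrical advantage stays in the 3-4% range, while the XRE's thermal advantage starts at 8-10%.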
Any way you look at it, for the same overall heat sink, the 4XRE dies will be running at a lower overall temperature. Whether one designs thermally to minimize die temperature (maximizing output and life) or to minimize the heat sink for a given die temperature, the 4XRE wins just on bulk heat sink.
In a practical implementation, the 4XRE will likely have a better thermal path to the bulk heat sink, which means the 4 C/W advantage may actually be 5-6 C/W, or possibly 15%, vastly better than the MCE implementation. Let's not forget the 4XRE has a larger die-to-board bonding area, so in most practical implementations that may be worth another 1 C/W or more... so now we are at 6-8 C/W, getting up toward 20%.
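To put that thermal-resistance gap in degrees: at the assumed 2.5 W per die, every 1 C/W of advantage is 2.5 C of die-temperature saving, so the 4-8 C/W range above translates directly into a cooler 4XRE die:

```python
# Die-temperature saving implied by the thermal-resistance gap.
# 2.5 W per die and the 4-8 C/W range are assumptions from the post.

power_per_die = 2.5  # watts
for rth_advantage in (4.0, 6.0, 8.0):  # C/W, XRE advantage over MCE
    saving = rth_advantage * power_per_die
    print(f"{rth_advantage:.0f} C/W advantage -> {saving:.0f} C cooler die")
# -> 10, 15, and 20 C respectively
```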
So my analysis stands: the MCE will require more heat sink, and/or the XRE will require less, to meet a maximum die temperature specification at a given drive current.
Semiman