Quick question:
How do thermal resistance ratings change when efficiency increases so much? Let's say you go from 40lm/W to 80lm/W. You've reduced the amount of power that's wasted as heat - so in theory, the thermal resistance rating should go down.
Let's say the light you're producing has a maximum efficacy of 300lm/W.
Going from 40lm/W to 80lm/W takes you from 87% of the power going to heat down to 73% going to heat. So for 1W of input, the less efficient LED dissipates 0.87W of heat versus 0.73W for the more efficient one.
If the thermal resistance for the less efficient LED was rated at 17C/W (per watt of input power), then the "real" thermal resistance per watt of actual heat is something like 20C/W (since not all of the input power becomes heat: 17 / 0.87 ≈ 19.5).
So, the more efficient LED should now have a thermal resistance rating of roughly 14-15C/W per input watt (19.5 x 0.73 ≈ 14.3), since more of the power is going to light instead of heat.
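
To make that arithmetic concrete, here's a rough sketch of the calculation (Python, using the 300lm/W maximum efficacy and the hypothetical 17C/W rating from above, and assuming all input power not emitted as light becomes heat):

MAX_EFFICACY = 300.0          # lm/W, assumed theoretical maximum from above

def heat_fraction(efficacy):
    """Fraction of input power dissipated as heat."""
    return 1.0 - efficacy / MAX_EFFICACY

rated_rth_40 = 17.0                                # C/W per watt of *input* power
real_rth = rated_rth_40 / heat_fraction(40.0)      # ~19.6 C/W per watt of actual heat

rated_rth_80 = real_rth * heat_fraction(80.0)      # ~14.4 C/W per watt of input power
print(round(real_rth, 1), round(rated_rth_80, 1))  # 19.6 14.4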
Is it possible that the Luxeon III LEDs are really built exactly the same way as a Lux I, but that the Lux III LEDs are sorted to be more efficient, thus having a "lower" thermal resistance?
So the 'real' thermal resistance is constant - that is, each watt of power actually dissipated as heat always causes the same junction temperature rise. But the overall (rated) thermal resistance depends on LED efficiency: the more light created per watt, the less of the input power is converted to heat, so the temperature rise per input watt is lower. So without changing the packaging at all, simply improving LED efficiency improves the effective thermal resistance.
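
Or, as a quick sketch of that same relationship (the 19.6C/W 'real' figure is just carried over from the hypothetical example above):

REAL_RTH = 19.6        # C per watt of actual heat, assumed constant for the package
MAX_EFFICACY = 300.0   # lm/W, assumed maximum from the example

def junction_rise(input_power_w, efficacy):
    """Junction temperature rise for a given input power and efficacy."""
    heat_w = input_power_w * (1.0 - efficacy / MAX_EFFICACY)
    return heat_w * REAL_RTH

print(junction_rise(1.0, 40.0))   # ~17.0 C rise per input watt
print(junction_rise(1.0, 80.0))   # ~14.4 C rise per input watt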
Make sense?