What's the thermal efficiency of an LED?

Curious_character

Flashlight Enthusiast
Joined
Nov 10, 2006
Messages
1,211
If I put 1 watt of electrical power into an LED (say a Luxeon III), what fraction of the power is radiated at any wavelength, and what fraction is converted to heat to be conducted out the pad and leads, and through the package to the air?

I keep hearing that "the new LEDs (Cree and Seoul) are much more efficient than the Luxeon, so they don't require as much heat sinking." Well, if the Luxeon radiates 10% of its input power (at any wavelength) and the Cree radiates 20%, then the Cree is twice as efficient as the Luxeon, but still produces 89% as much heat for a given power input. But what are the real numbers?
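The arithmetic behind that "89% as much heat" claim can be sketched in a few lines of Python (the 10% and 20% radiant efficiencies are the hypothetical figures from the post above, not measured values):

```python
# Heat produced per watt of input, given a hypothetical radiant efficiency.
def heat_fraction(radiant_efficiency):
    """Fraction of electrical input power converted to heat."""
    return 1.0 - radiant_efficiency

luxeon_heat = heat_fraction(0.10)  # 0.90 W of heat per watt in (hypothetical)
cree_heat = heat_fraction(0.20)    # 0.80 W of heat per watt in (hypothetical)

ratio = cree_heat / luxeon_heat
print(ratio)  # ~0.889 -- twice the efficiency, but still ~89% as much heat
```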

c_c
 

2xTrinity

Flashlight Enthusiast
Joined
Dec 10, 2006
Messages
2,386
Location
California
I think the biggest difference is that the Crees have improved both thermal conductivity and tolerance to being run at higher temperatures. I agree, though, that until we get into the ~50% radiant-efficiency range, efficiency improvements won't translate into much less heat. Eventually that will flip around -- improving efficiency from, say, 80% to 90% would be only a moderate gain in lumens per watt, but would cut the waste heat in half.
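That flip-around point can be checked with the same kind of arithmetic (80% and 90% are hypothetical future efficiencies, not real device numbers):

```python
# Hypothetical future LEDs: 80% vs. 90% radiant efficiency.
waste_80 = 1.0 - 0.80  # 0.20 W of heat per watt in
waste_90 = 1.0 - 0.90  # 0.10 W of heat per watt in

light_gain = 0.90 / 0.80 - 1.0   # only 12.5% more light per watt...
heat_ratio = waste_90 / waste_80  # ...but half the waste heat

print(light_gain, heat_ratio)  # 0.125, 0.5
```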

I don't have any definitive numbers, but I'd estimate the Crees are around 25% radiant efficiency. They get around 85 lumens per watt, and I believe a blue LED + phosphor producing a similar spectrum would put out around 300 lumens per watt if it were 100% efficient. (A direct conversion based on this isn't strictly appropriate, but it gives a decent approximation.)
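That back-of-the-envelope estimate works out like this (the 85 lm/W figure and the assumed ~300 lm/W ideal efficacy for this spectrum are both from the post above, not measured data):

```python
# Rough radiant-efficiency estimate from luminous efficacy.
measured_lm_per_w = 85.0  # quoted figure for the Cree
ideal_lm_per_w = 300.0    # assumed efficacy of a 100%-efficient LED of this spectrum

radiant_efficiency = measured_lm_per_w / ideal_lm_per_w
print(radiant_efficiency)  # ~0.28, in the same ballpark as the ~25% estimate
```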
 