How much of the power consumed by an LED goes towards light vs. heat?

Duster1671

Enlightened
Joined
Oct 16, 2017
Messages
244
Let's say we have an LED that's being driven at 10W of power (e.g. 3.3A at 3V). We know a lot of that ends up as heat, but how much is converted to light?

Is this something that's discernible from the information on data sheets?

Is there a way to calculate the amount of energy emitted from the LED as light based on the luminous flux?
 

LEDphile

Enlightened
Joined
Mar 8, 2021
Messages
316
If you know the spectral power distribution, you can convert the luminous flux (lumens) to radiant flux (watts) and from that obtain the radiometric efficiency of the LED (optical watts out per electrical watt in). But that's somewhat tedious and doesn't really help in practical applications. For heatsink design it's generally sufficient to choose a value that's in the ballpark (modern white LEDs have a radiometric efficiency around 50%, though this varies with color; a pessimistic, or perhaps realistic, designer might design around 33% of the power going out as light). And for figuring cooling loads in LED-lit spaces, it's sufficient to consider that all the energy entering the space, whether as heat or light, ends up as heat to be managed.
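As a minimal sketch of that ballpark approach (the 40% radiometric efficiency below is just an assumed mid-range value, not a datasheet figure):

```python
# Ballpark heatsink and room-cooling loads for an LED, given an assumed
# radiometric efficiency (fraction of input power leaving the LED as light).

input_power_w = 10.0           # electrical power into the LED (e.g. ~3 V x 3.3 A)
radiometric_efficiency = 0.40  # assumed: ~0.5 for good modern white, ~0.33 if pessimistic

heatsink_load_w = input_power_w * (1.0 - radiometric_efficiency)
room_cooling_load_w = input_power_w  # light absorbed in the space becomes heat too

print(f"Heat to remove at the LED/heatsink: {heatsink_load_w:.1f} W")
print(f"Heat added to the lit space:        {room_cooling_load_w:.1f} W")
```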
 

idleprocess

Flashaholic
Joined
Feb 29, 2004
Messages
7,197
Location
decamped
Radiated power is reported for IR and UV applications, less so visible light applications.

We can ballpark this, however. A blue-pumped phosphor LED is thought to have a maximum theoretical efficacy of about 300 lumens per watt, itself representing an efficiency of <44% (300/683 ≈ 0.439). With production LEDs peaking at around 200 lumens/watt, that's something like 29% efficient ([200/300] × 0.439 ≈ 0.293).
 

LEDphile

Enlightened
Joined
Mar 8, 2021
Messages
316
We can ballpark this, however. A blue-pumped phosphor LED is thought to have a maximum theoretical efficacy of about 300 lumens per watt, itself representing an efficiency of <44% (300/683 ≈ 0.439). With production LEDs peaking at around 200 lumens/watt, that's something like 29% efficient ([200/300] × 0.439 ≈ 0.293).
Be very careful with those generalizations - I've seen measurements of 90 CRI 4000K white LEDs giving ~155 lumens per watt and 50% radiometric efficiency, but a 4000K 80 CRI LED was more like 170 lumens/watt for the same 50% radiometric efficiency. The "theoretical maximum of 300 lumens/watt for phosphor-converted white" makes some pretty strong assumptions about the spectrum of the output that simply aren't true for all white LED types on the market today - the spectral power distribution from that 80 CRI 4000K white LED mentioned above has a theoretical maximum of 335 lumens/watt.

The "efficiency of 44% for 300 lumens/watt" is erroneous, as it assumes a monochromatic source, and not the broadband spectral power distribution you get from a white LED. To convert from photometric flux to radiometric flux (which is required to determine efficiency), you need to apply the luminosity weighting function to the spectral power distribution and take the integral.
 
Last edited:

Duster1671

Enlightened
Joined
Oct 16, 2017
Messages
244
Okay, it sounds like "radiometric efficiency" is the value I'm after.

Luminous efficacy expresses the ratio of luminous flux to total consumed power (lm/W), and radiometric efficiency is the fraction of the consumed power that actually comes out of the LED as light (radiant power)?

So let's take LEDphile's example of a 90 CRI 4000K white LED with 155 lm/W luminous efficacy and 50% radiometric efficiency. We drive the LED at 1 W and it produces 155 lumens of output. A 50% radiometric efficiency means that 0.5 W is converted to light and 0.5 W is converted to heat, correct?
 
Last edited:

idleprocess

Flashaholic
Joined
Feb 29, 2004
Messages
7,197
Location
decamped
Be very careful with those generalizations
It's a back-of-the-napkin figure offered as-is, with no express or implied warranty.

The "efficiency of 44% for 300 lumens/watt" is erroneous, as it assumes a monochromatic source, and not the broadband spectral power distribution you get from a white LED.
The phys.org source linked from Wikipedia, discussing the state of white LEDs in 2010, appears to be what makes the theoretical-maximum 260-300 lm/W claim:
By the numbers

Briefly looking at the history of a few different types of light sources helps provide some context for the LED's recent rapid progress. The incandescent bulb, which was developed in 1879, had an initial luminous efficacy of 1.5 lm/W, which improved to 16 lm/W over the next 130 years. Fluorescent bulbs, first developed in 1938, achieved a luminosity increase of 50 to 100 lm/W over the next 60 years. The progress of white LEDs is much more pronounced: since their commercialization in 1996, white LEDs' luminous efficacy has increased from 5 lm/W to today's commercial white LED of 150 lm/W, the highest luminous efficacy of all white light sources. The theoretical limit for white LEDs is about 260-300 lm/W.

Now, the Nichia researchers have taken the white LED's luminous efficacy a step further, achieving values as high as 265 lm/W at 5 mA of current, or 249 lm/W at 20 mA, values that nearly reach the theoretical limit. However, the downside of this specific design is that the luminous flux is quite low, about 14.4 lm. By modifying the design, the researchers demonstrated two other white LEDs: one with values of 203 lm and 183 lm/W at 350 mA, and one (as mentioned above) with values of 1913 lm and 135 lm/W at 1 A. This last white LED was fabricated by connecting four high-power blue LED dies in series.
Emphasis added. This specific claim is not spelled out in the abstract; however, no mention is made of using RGB or other methods to generate white, so the implication is that this is the same blue-pumped yellow-phosphor technology discussed in the citation, which remains in widespread use today.
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
Radiated power is reported for IR and UV applications, less so visible light applications.

We can ballpark this, however. A blue-pumped phosphor LED is thought to have a maximum theoretical efficacy of about 300 lumens per watt, itself representing an efficiency of <44% (300/683 ≈ 0.439). With production LEDs peaking at around 200 lumens/watt, that's something like 29% efficient ([200/300] × 0.439 ≈ 0.293).
That's not the power conversion efficiency, however. It's simply comparing the luminous efficacy of the emitted spectrum to a monochromatic source at 555 nm, which would have an efficacy of 683 lm/W. It's actually a pretty meaningless metric, especially if you're interested in how much of the power going into an LED comes out as heat.
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
Let's say we have an LED that's being driven at 10W of power (e.g. 3.3A at 3V). We know a lot of that ends up as heat, but how much is converted to light?

Is this something that's discernible from the information on data sheets?

Is there a way to calculate the amount of energy emitted from the LED as light based on the luminous flux?
A good rule of thumb that I often use is to assume 270 to 330 lm/W for the emitted light, dependent upon CRI. A source with low 80s CRI might be on the high end of that range, while a CRI 95 source is on the low end. For example, let's take a 100W LED equivalent which puts out 1600 lumens, and uses 14 watts. Let's further assume the CRI is typical of the generic versions of these lights, namely in the mid 80s. The amount of input power emitted as light is roughly 1600/330 = 4.85 watts. Let's round up to 5 watts for simplicity. The remainder is heat, so you have 9 watts of heat. The overall efficiency of the bulb in terms of converting input power to light energy is 5/14=36%.

Or as another example, I bought a screw-in garage light emitting 12,000 lumens and using 80 watts. Again, CRI is mid 80s, so 12,000/330, or about 36 watts, comes out as light. Efficiency of the lamp is 36/80=45%.
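Here's that rule of thumb as a quick sketch, with the luminous efficacy of radiation (the 270-330 lm/W figure, picked by CRI) as the assumption:

```python
# Rule-of-thumb split of an LED lamp's input power into light and heat, using
# an assumed luminous efficacy of radiation (LER) for the emitted spectrum.

def light_and_heat(lumens, input_power_w, ler_lm_per_w=330.0):
    """ler_lm_per_w: ~330 for low-80s CRI, down toward ~270 for CRI 95."""
    light_w = lumens / ler_lm_per_w       # optical watts actually emitted
    heat_w = input_power_w - light_w      # everything else ends up as heat
    return light_w, heat_w, light_w / input_power_w

# The two examples above (results land slightly below the rounded figures
# in the post because nothing is rounded here):
for lm, watts in [(1600, 14), (12000, 80)]:
    light_w, heat_w, eff = light_and_heat(lm, watts)
    print(f"{lm} lm / {watts} W in: {light_w:.1f} W light, {heat_w:.1f} W heat, {eff:.0%} efficient")
```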

Typical conversion efficiencies for most available LED lamps running off line power run from the high 20s to perhaps 50%. Remember those losses include driver losses and diffuser losses as well as losses in the LED emitters themselves. We actually have commercial LED emitters with efficacies as high as 230 lm/W for versions with CRI in the 70s. That translates to a power-to-light conversion efficiency of perhaps 70%.
 

xxo

Flashlight Enthusiast
Joined
Apr 30, 2015
Messages
3,010
Couldn't you figure this by power output vs consumption? Lumens are convertible to joules/second or watts (apparently varying with color/wavelength). If you can find the equivalent lumens (in watts) being produced by the LED and the watts being drawn from the battery, you can calculate the efficiency and the total heat produced (assuming that any losses will eventually turn into heat)?
 

idleprocess

Flashaholic
Joined
Feb 29, 2004
Messages
7,197
Location
decamped
Couldn't you figure this by power output vs consumption?
Radiometric power would make it a simple watts-out versus watts-in calculation; however, that's generally only supplied for monochromatic devices aimed at OEMs. One could integrate the spectral graphs and approximate this figure, assuming the Y-axis values can be translated to watts.
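A hedged sketch of that approach, assuming the datasheet gives a relative (unitless) spectral curve plus a rated luminous flux at some drive power; the sample curve and ratings below are invented, not from any datasheet:

```python
# Scale a relative (unitless) datasheet SPD to absolute watts using the rated
# luminous flux, then estimate radiant flux and radiometric efficiency.
import numpy as np

wl_nm = np.arange(380.0, 781.0, 1.0)

# Invented relative SPD (arbitrary units) - digitize the real datasheet curve.
relative_spd = (0.9 * np.exp(-0.5 * ((wl_nm - 450.0) / 10.0) ** 2)
                + 1.0 * np.exp(-0.5 * ((wl_nm - 600.0) / 60.0) ** 2))

# Gaussian approximation of the CIE photopic curve (use the real table for real work).
V = 1.019 * np.exp(-285.4 * (wl_nm / 1000.0 - 0.559) ** 2)

rated_lumens = 400.0     # assumed datasheet flux at the rated operating point
rated_power_w = 3.0      # assumed electrical power at that operating point

# Lumens per arbitrary unit of the curve, then the scale factor to W/nm.
lm_per_unit = 683.0 * np.sum(relative_spd * V) * 1.0     # 1 nm bins
scale = rated_lumens / lm_per_unit

radiant_flux_w = np.sum(relative_spd * scale) * 1.0
print(f"Estimated radiant flux: {radiant_flux_w:.2f} W")
print(f"Estimated radiometric efficiency: {radiant_flux_w / rated_power_w:.0%}")
```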
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
Couldn't you figure this by power output vs consumption? Lumens are convertible to joules/second or watts (apparently varying with color/wavelength). If you can find the equivalent lumens (in watts) being produced by the LED and the watts being drawn from the battery, you can calculate the efficiency and the total heat produced (assuming that any losses will eventually turn into heat)?
You can approximate it fairly well using the method I described. Not 100% accurate, but good enough to get within the ballpark of how much waste heat you're dealing with.
 

KITROBASKIN

Flashlight Enthusiast
Joined
Mar 28, 2013
Messages
5,450
Location
New Mexico, USA
If an LED emitter is driven as hard as possible while generating hardly any heat, how efficiently is it operating at that level? (not including losses in the rest of the circuit)
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
If an LED emitter is driven as hard as possible while generating hardly any heat, how efficiently is it operating at that level? (not including losses in the rest of the circuit)
That depends upon the overall drive level and heat sinking. If you're only putting, say, half a watt into an LED it might not appear to generate much heat even if it's horribly inefficient. Good heatsinking can mask inefficient LEDs.

Another thing to note is that there are diminishing returns from driving LEDs harder. Eventually the lumens-versus-current curve rolls over, and more current actually results in LESS light, not more. You can test this with a light meter and a variable power supply. Assuming the batteries or driver aren't already placing hard limits on drive current, it's a good idea to see where this rollover point is, then back off on the current a bit so you're perhaps 20% under it. There's little to gain from going beyond that.
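A minimal sketch of that bench test, assuming you've already logged (current, light-meter reading) pairs from a variable supply; the sample readings are invented:

```python
# Find the drive current where output stops increasing (the rollover point),
# then back off ~20% as suggested above. Sample measurements are invented.

measurements = [  # (drive current in A, relative light-meter reading)
    (0.5, 210), (1.0, 400), (1.5, 560), (2.0, 680),
    (2.5, 760), (3.0, 800), (3.5, 810), (4.0, 790),
]

peak_current, peak_reading = max(measurements, key=lambda m: m[1])
suggested_current = 0.8 * peak_current   # ~20% below the rollover point

print(f"Output peaks near {peak_current:.1f} A (meter reading {peak_reading})")
print(f"Suggested drive current: ~{suggested_current:.1f} A")
```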
 