#### Duster1671

##### Newly Enlightened

- Joined: Oct 16, 2017
- Messages: 110

Is this something that's discernible from the information on data sheets?

Is there a way to calculate the amount of energy emitted from the LED as light based on the luminous flux?



> We can ballpark this, however. Blue-pumped phosphor LED is thought to have a maximum theoretical efficiency of 300 lumens per watt, itself representing an efficiency of <44%. With production LEDs peaking at around 200 lumens/watt that's something like 29% efficient

Be very careful with those generalizations - I've seen measurements of 90 CRI 4000K white LEDs giving ~155 lumens per watt and 50% radiometric efficiency, but a 4000K 80 CRI LED was more like 170 lumens/watt for the same 50% radiometric efficiency. The "theoretical maximum of 300 lumens/watt for phosphor-converted white" makes some pretty strong assumptions about the spectrum of the output that simply aren't true for all white LED types on the market today - the spectral power distribution from that 80 CRI 4000K white LED mentioned above has a theoretical maximum of 335 lumens/watt.

The "efficiency of 44% for 300 lumens/watt" is erroneous, as it assumes a monochromatic source, and not the broadband spectral power distribution you get from a white LED. To convert from photometric flux to radiometric flux (which is required to determine efficiency), you need to apply the luminosity weighting function to the spectral power distribution and take the integral.
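For illustration, that integral can be sketched in a few lines of Python. The two-Gaussian "white LED" spectrum below is entirely made up, and V(λ) is replaced by a commonly used Gaussian fit rather than the tabulated CIE data, so treat the numbers as ballpark only:

```python
import math

def v_lambda(nm):
    # Gaussian fit to the CIE photopic luminosity function V(lambda);
    # the real V(lambda) is a tabulated standard, but this is close enough
    # for a ballpark
    um = nm / 1000.0
    return 1.019 * math.exp(-285.4 * (um - 0.559) ** 2)

def gaussian(nm, center, sigma):
    return math.exp(-0.5 * ((nm - center) / sigma) ** 2)

def toy_spd(nm):
    # hypothetical blue-pump + phosphor white-LED spectrum (illustration only)
    blue = 1.0 * gaussian(nm, 450, 10)       # narrow blue pump
    phosphor = 0.5 * gaussian(nm, 580, 60)   # broad phosphor emission
    return blue + phosphor

def luminous_efficacy_of_radiation(spd, lo=380, hi=780):
    # LER = 683 * integral(SPD * V) / integral(SPD), via a Riemann sum
    num = sum(spd(nm) * v_lambda(nm) for nm in range(lo, hi + 1))
    den = sum(spd(nm) for nm in range(lo, hi + 1))
    return 683.0 * num / den

ler = luminous_efficacy_of_radiation(toy_spd)
print(f"LER of toy spectrum: {ler:.0f} lm/W")
# a source emitting this spectrum at 155 lm/W of input power would have a
# radiometric efficiency of 155 / LER
print(f"155 lm/W implies {155 / ler:.0%} radiometric efficiency")
```

For this toy spectrum the LER lands in the high 200s of lm/W, which is why production parts in the 150-170 lm/W range come out near 50% radiometric efficiency.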



Okay, it sounds like "radiometric efficiency" is the value I'm after.

Luminous efficiency expresses a ratio of luminous flux to total consumed power, and radiometric efficiency is the percentage of the consumed power accounted for by the luminous flux?

So let's take LEDphile's example of a 90CRI 4000K white LED with 155 Lm/W luminous efficiency and 50% radiometric efficiency. We drive the LED at 1W and it produces 155 Lumens of output. A 50% radiometric efficiency means that 0.5W is converted to light and 0.5W is converted to heat, correct?
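A sketch of that interpretation in code, with the numbers copied from the example above (this just restates the question's arithmetic, not a confirmed answer):

```python
luminous_efficacy = 155.0   # lm per watt of input power (from the example)
radiometric_eff = 0.50      # fraction of input power emitted as light
input_power = 1.0           # W drive level

lumens = luminous_efficacy * input_power      # 155 lm of output
light_watts = radiometric_eff * input_power   # 0.5 W radiated as light
heat_watts = input_power - light_watts        # 0.5 W left as heat
# luminous efficacy of the radiation itself (lm per *radiated* watt)
ler = lumens / light_watts                    # 310 lm/W
```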


> Be very careful with those generalizations

It's a back-of-the-napkin figure, offered as-is with no express or implied warranty.

> The "efficiency of 44% for 300 lumens/watt" is erroneous, as it assumes a monochromatic source, and not the broadband spectral power distribution you get from a white LED.

The linked phys.org source on Wikipedia, which makes the theoretical-maximum 260-300 lm/W claim, looks to be discussing the state of white LEDs in 2010:

> **By the numbers**
>
> Briefly looking at the history of a few different types of light sources helps provide some context for the LED's recent rapid progress. The incandescent bulb, which was developed in 1879, had an initial luminous efficacy of 1.5 lm/W, which improved to 16 lm/W over the next 130 years. Fluorescent bulbs, first developed in 1938, achieved a luminosity increase of 50 to 100 lm/W over the next 60 years. The progress of white LEDs is much more pronounced: since their commercialization in 1996, white LEDs' luminous efficacy has increased from 5 lm/W to today's commercial white LED of 150 lm/W, the highest luminous efficacy of all white light sources. **The theoretical limit for white LEDs is about 260-300 lm/W.**
>
> Now, the Nichia researchers have taken the white LED's luminous efficacy a step further, achieving values as high as 265 lm/W at 5 mA of current, or 249 lm/W at 20 mA, values that nearly reach the theoretical limit. However, the downside of this specific design is that the luminous flux is quite low, about 14.4 lm. By modifying the design, the researchers demonstrated two other white LEDs: one with values of 203 lm and 183 lm/W at 350 mA, and one (as mentioned above) with values of 1913 lm and 135 lm/W at 1 A. This last white LED was fabricated by connecting four high-power blue LED dies in series.

Emphasis added. This specific claim is not made clear in the abstract; however, no mention is made of using the likes of RGB or other methods to generate white, so the implication is that this is the same blue-pumped-yellow technology discussed in the citation that remains in widespread use today.

That's not the power conversion efficiency, however. It's simply comparing the luminous efficacy of the emitted spectrum to a monochromatic light source at 555 nm, which would have an efficacy of 683 lm/W. It's actually a pretty meaningless metric, especially if you're interested in how much of the power going to an LED comes out as heat.

We can ballpark this, however. Blue-pumped phosphor LED is thought to have a maximum theoretical efficiency of 300 lumens per watt, itself representing an efficiency of <44%. With production LEDs peaking at around 200 lumens/watt, that's something like 29% efficient ((200/300) × 0.439 ≈ 0.293).
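The arithmetic behind that ballpark, spelled out (all three input figures are the post's assumptions, and note the caveats elsewhere in the thread about the 300 lm/W ceiling):

```python
MAX_LM_PER_W = 683.0        # lm/W for monochromatic 555 nm light
theoretical_lm_w = 300.0    # assumed ceiling for blue-pumped phosphor white
production_lm_w = 200.0     # roughly the best production emitters

# fraction of 683 lm/W the ideal white spectrum achieves
spectrum_fraction = theoretical_lm_w / MAX_LM_PER_W            # ~0.439
# how close production parts get to that ceiling, times the fraction above
wall_plug_efficiency = (production_lm_w / theoretical_lm_w) * spectrum_fraction
# ~0.293, i.e. the "something like 29% efficient" figure
```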

A good rule of thumb that I often use is to assume 270 to 330 lm/W for the emitted light, dependent upon CRI. A source with low 80s CRI might be on the high end of that range, while a CRI 95 source is on the low end. For example, let's take a 100W LED equivalent which puts out 1600 lumens, and uses 14 watts. Let's further assume the CRI is typical of the generic versions of these lights, namely in the mid 80s. The amount of input power emitted as light is roughly 1600/330 = 4.85 watts. Let's round up to 5 watts for simplicity. The remainder is heat, so you have 9 watts of heat. The overall efficiency of the bulb in terms of converting input power to light energy is 5/14=36%.


Or as another example, I bought a screw-in garage light emitting 12,000 lumens and using 80 watts. Again, CRI is mid 80s, so 12,000/330, or about 36 watts, comes out as light. Efficiency of the lamp is 36/80=45%.
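Both worked examples follow the same rule of thumb, so they can be wrapped in a small helper (the 270-330 lm/W range and both sets of figures come from the posts above; the function name is mine):

```python
def led_power_budget(lumens, input_watts, lm_per_radiated_watt=330.0):
    """Rule-of-thumb split of an LED lamp's input power into light and heat.

    lm_per_radiated_watt is the assumed luminous efficacy of the emitted
    light itself: ~330 lm/W for mid-80s CRI, down toward ~270 lm/W for CRI 95.
    """
    light_watts = lumens / lm_per_radiated_watt
    heat_watts = input_watts - light_watts
    efficiency = light_watts / input_watts
    return light_watts, heat_watts, efficiency

# 100 W-equivalent bulb: 1600 lm from 14 W
light, heat, eff = led_power_budget(1600, 14)
# ~4.8 W light, ~9.2 W heat, ~35% (the post rounds the light term to 5 W,
# giving 36%)

# screw-in garage light: 12,000 lm from 80 W
light2, heat2, eff2 = led_power_budget(12000, 80)
# ~36 W light, ~45% efficient
```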

Typical conversion efficiencies for most available LED lamps running off line current run from the high 20s to perhaps 50%. Remember those losses include driver losses and diffuser losses, as well as losses in the LED emitters themselves. We actually have commercial LED emitters with efficacies as high as 230 lm/W for the versions with CRI in the 70s. That translates to a power-to-light conversion efficiency of perhaps 70%.

> Couldn't you figure this by power output vs consumption?

Radiometric power would make it a simple watts-out vs. watts-in calculation; however, that's generally only supplied for monochromatic devices aimed at OEMs. One could integrate spectral graphs and approximate this figure, assuming the Y-axis value can be translated to watts.

You can approximate it fairly well using the method I described. Not 100% accurate, but good enough to get within the ballpark of how much waste heat you're dealing with.

That depends upon the overall drive level and heat sinking. If you're only putting, say, half a watt into an LED it might not appear to generate much heat even if it's horribly inefficient. Good heatsinking can mask inefficient LEDs.

Another thing to note is that there are diminishing returns from driving LEDs harder. Eventually the lumens-versus-current curve rolls over and reaches a peak, beyond which more current actually results in LESS light, not more. You can test this with a light meter and a variable power supply. Assuming the batteries or driver aren't already placing hard limits on drive current, it's a good idea to see where this peak is, then back off on the current a bit so you're perhaps 20% under it. There is little to gain from going beyond that.
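If you log light-meter readings at several drive currents, finding that rollover point is just a matter of locating where output stops rising. A minimal sketch (the current/lux pairs below are made-up illustration data, not measurements):

```python
def find_rollover(currents, outputs):
    """Return the drive current at which measured light output peaks.

    currents: drive currents (A), ascending
    outputs:  matching light-meter readings (arbitrary units, e.g. lux)
    """
    best_i = max(range(len(outputs)), key=outputs.__getitem__)
    return currents[best_i]

# hypothetical sweep: output sags past ~2.5 A
currents = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]
lux      = [210, 400, 560, 680, 750, 745, 700]

peak = find_rollover(currents, lux)
target = 0.8 * peak   # back off ~20% below the peak, per the advice above
```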

Wise words. Threads like this are indicative of some of the talent that drops by CandlePowerForums on occasion.