I think there is a bit of a fallacious argument being made here. While it may be true that for a "pure white" source the theoretical maximum efficacy is 320 lm/W, the same is not true of sources that aren't "pure white", whatever that means. For a green light of 555 nm, the max efficacy is 683 lm/W.

So for any 'white' LED with, say, 130 lm/W, the only thing you can say with confidence about its optical power output is that it is definitely more than 130/683 and probably less than 130/320 times the electrical input power.

Hi DIWdiver,

The science/math supporting my logic is explained in detail in the "waste heat" thread contained in my original reply to the OP. I'm not going to repeat it here - if you are really interested I would suggest reading the thread and the links contained within it.

The OP stated he was interested in calculating waste heat in reference to a Cree XP-G2 driven at 700mA yielding an efficacy of 143 lumens per watt. He did not supply the CCT so I again pointed him to the "waste heat" thread, specifically the posts dealing with the Cree XM-L. The logic contained within can be used to fully characterize the XP-G2 LED he is working with.

The following discussion will not include AC/DC power conversion losses.

The LEDs are standard Royal Blue source pumps with YAG phosphor.

*(The values below are all off the top of my head. The actual values are in the "waste heat" thread I referenced earlier.)

The theoretical max efficacy of a white LED built on the Royal Blue/YAG architecture above is around 340 lm/W.

The theoretical max efficacy of white light produced by an RGBA LED is around 440 lm/W.

So much for theory. What about characterizing the maximum efficacy of an actual RB/YAG LED? Here we use the Luminous Efficacy of Radiation (LER). The LER value does not reflect the source itself - only its actual optical output. To calculate LER we use a spectroradiometer to measure the optical power at each individual wavelength of the output (depending on the unit's resolution). We then multiply the power at each wavelength by the luminous efficacy value for that wavelength (from the CIE 1931 photopic curve) and sum the results.

I showed this type of integration the other day for the calculation of PPFD, but the same logic can be used here to calculate efficacy. Instead of multiplying by uMoles/Watt we would multiply by the luminous efficacy value for the specific wavelength(s).

http://www.candlepowerforums.com/vb...on-regarding-LED-lights&p=4501025#post4501025
(using CIE 1931)

For example, if the optical output at 555 nm (LER 683 lm/W) yielded 1/10th of a watt, then:

683 * .1 = 68.3 lumens

If the optical output at 660 nm (LER 41.66 lm/W) yielded 1/10th of a watt, then:

41.66 * .1 = 4.166 lumens.

If you continue this calculation across the entire wavelength range of the optical output, you will derive the total, and therefore maximum, number of lumens possible from this specific combination of wavelengths at these absolute power levels. Dividing that total lumen output by the total radiant watts yields the LER of the spectrum.
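As a quick sketch of that summation: the "spectrum" below is purely hypothetical, just the two example bins above (555 nm and 660 nm, 0.1 W each). A real spectroradiometer export would have hundreds of bins at the instrument's resolution, but the arithmetic is the same.

```python
# Sketch of the per-wavelength lumen summation described above.
# Hypothetical two-bin spectrum; real data would span the visible range.

# Luminous efficacy (lm/W) at each wavelength, i.e. the CIE 1931
# photopic curve scaled by 683 lm/W at its 555 nm peak.
efficacy_lm_per_w = {555: 683.0, 660: 41.66}

# Measured optical power (W) in each wavelength bin (hypothetical).
optical_power_w = {555: 0.1, 660: 0.1}

total_lumens = sum(efficacy_lm_per_w[nm] * p for nm, p in optical_power_w.items())
total_watts = sum(optical_power_w.values())

ler = total_lumens / total_watts  # lumens per optical watt
print(total_lumens)  # 72.466 lm (68.3 + 4.166)
print(ler)           # ~362.33 lm/W for this two-bin spectrum
```

Note that the LER here (~362 lm/W) is a property of the emitted spectrum only; it says nothing yet about how efficiently the LED converted electricity into that spectrum.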

In my reply to the OP, I estimated the LER value for a 4000K XP-G2 to be around 320 lumens per watt. If we divide his actual efficacy value by the LER value, then we are calculating the wall-plug efficiency (WPE) of the LED.
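Plugging in the OP's numbers (143 lm/W measured efficacy, my ~320 lm/W LER estimate), the WPE and the waste-heat split come out roughly as below. The 3.0 V forward voltage is an assumed placeholder for illustration, not a datasheet value - substitute the actual Vf at 700 mA.

```python
# WPE and waste-heat estimate from efficacy and LER.
efficacy = 143.0   # lm per electrical watt (OP's XP-G2 @ 700 mA)
ler = 320.0        # lm per optical watt (my estimate for 4000K)

wpe = efficacy / ler           # fraction of electrical power leaving as light

vf = 3.0                       # forward voltage (V) - ASSUMED for illustration
i = 0.700                      # drive current (A)
p_electrical = vf * i          # ~2.1 W in
p_optical = p_electrical * wpe
p_heat = p_electrical - p_optical  # what the heatsink must handle

print(round(wpe, 3))     # 0.447
print(round(p_heat, 2))  # 1.16 W of waste heat
```

So with these numbers roughly 45% of the input power leaves as light and the remaining ~55% must be dissipated as heat, which is exactly the quantity the OP was after.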

I have studied dozens of LM-79 reports that detail all of the information I have discussed. The only caveat is that the reports are for bulbs or luminaires. The lens/diffuser modifies the characterization of the output. In other words, the bare LED die will have an LER value slightly different from that of the bulb/luminaire containing said LED.

In summary, the difference between the LER value for this specific optical output and the stated efficacy of the LED is due to the inefficiencies within the LED itself and the Stokes losses associated with the YAG phosphor.

If we had a bare(no phosphor) LED with 100% WPE then the efficacy of the LED would be equal to the calculated LER value.

Just my $.02.