[ QUOTE ]
jtr1962 said:
[ QUOTE ]
js said:
Just a mathematical point about a 400 percent increase. A 100 percent increase would be 2x as efficient. Thus a 400 percent increase means 5x as efficient. IIRC, current Luxeons are like 10 percent efficient (???) is that right? Or maybe it's higher. But say it's 10 percent. This 400 percent increase would then mean an LED that was 50 percent efficient at converting electrical energy to visible light.
[/ QUOTE ]
I'll add that Cree's best blue chip, the XT-27, produces 27 mW of 460 to 470 nm light with a typical input power of 64 mW (20 mA x 3.2 V). This chip is already 42% efficient, and it is actually being mass-produced (although I don't know which LEDs use it). Anyway, a 400% increase there is impossible; at best you might get a 100% or so increase.
Another point is that phosphor-based white LEDs will never approach 100% efficiency because of Stokes losses in down-converting the pump light to longer wavelengths. In fact, these losses are larger with a UV LED and RGB phosphors (the likely approach to future white LEDs) than with a blue LED and a yellow phosphor, because of the bigger gap between input and output wavelengths. Therefore, potentially the only way we'll get white light with near 100% efficiency will be the RGB color-mixing approach. This of course assumes that we can make red, green, and blue LEDs all with near 100% efficiency. Red and blue are already almost halfway there, but the best green I've seen (Cree's 525 nm, 9 mW chip) is only about 14% efficient. The 400% increase touted in the article could potentially bring this to 70%, and the red and blue to nearly 100%. This would mean white light with ~90% efficiency. With a CRI of 80 and a color temperature of 4000K, this translates to roughly 360 lm/W. For a CRI of 100 for critical applications we would get roughly 180 lm/W. Both numbers are roughly 3 to 3.5 times what the best comparable fluorescents do.
[/ QUOTE ]
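As an aside, jtr1962's 360 and 180 lm/W figures fall out of a simple product: wall-plug lm/W = radiant efficiency x the luminous efficacy of the emitted spectrum (LER). Here's a minimal sketch; the LER values (roughly 400 lm/W for a CRI-80, 4000 K white spectrum and roughly 200 lm/W for CRI 100) are my own ballpark assumptions, not numbers from his post:
[CODE]
# Rough sketch: wall-plug lm/W = radiant efficiency x luminous efficacy
# of the emitted spectrum (LER). The LER figures here are assumptions.
def wall_plug_lm_per_w(radiant_efficiency, ler_lm_per_w):
    return radiant_efficiency * ler_lm_per_w

print(wall_plug_lm_per_w(0.90, 400))  # ~360 lm/W for CRI 80, 4000 K (assumed LER)
print(wall_plug_lm_per_w(0.90, 200))  # ~180 lm/W for CRI 100 (assumed LER)
[/CODE]
Now, on to my main point about units.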
There are two sets of light units: the physical and the physiological. One measures light output in watts, the other in lumens. The lumen is a unit arrived at by defining an "ideal eye" (the V-lambda curve), which is more or less an average of many people's eyes' response to light stimulus.
IIRC, the eye is most sensitive to light at 555 nm, where 1 watt of radiant power corresponds to 683 lumens as seen by the eye. 400 and 700 nm are more or less the lower and upper limits of "visible" light.
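To make that concrete, here's a minimal sketch of the conversion; the V-lambda values in the little table are rounded from the standard CIE photopic curve, so treat them as approximations:
[CODE]
# Lumens = 683 lm/W x V(lambda) x radiant watts (photopic vision).
# V-lambda values are rounded from the standard CIE photopic curve.
V_LAMBDA = {465: 0.08, 510: 0.50, 555: 1.00, 610: 0.50, 650: 0.11, 700: 0.004}

def lumens(wavelength_nm, radiant_watts):
    """Convert single-wavelength radiant power to lumens."""
    return 683.0 * V_LAMBDA[wavelength_nm] * radiant_watts

print(lumens(555, 1.0))  # 683.0 -- the theoretical maximum
[/CODE]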
So the issue is that just because an LED transforms 42 percent of the electrical energy it receives into photons, i.e. light energy, this in no way translates to a 42 percent luminous efficiency. Do you follow me?
Because if it did, then incandescent filaments would have to be considered MUCH more efficient than people here on the forum give them credit for. Even if we just naively said "all light between 400 and 700 nm counts as 'visible'," a hard-driven tungsten filament still radiates on the order of 10 percent of its input power in that band, several times the 2 percent or so luminous efficiency usually quoted for incandescents. The blackbody peak of a hard-driven filament actually sits in the near infrared, around 900 nm, not far beyond the red limit.
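If anyone wants to check that figure, you can integrate Planck's law numerically. Here's a rough sketch; the 3300 K filament temperature is an assumption, and the filament is treated as an ideal blackbody rather than a real gray-ish tungsten emitter:
[CODE]
import math

# Fraction of blackbody radiation falling between 400 and 700 nm,
# found by numerically integrating Planck's law over that band.
H, C, K = 6.626e-34, 2.998e8, 1.381e-23   # Planck, speed of light, Boltzmann
SIGMA = 5.670e-8                           # Stefan-Boltzmann, W/(m^2 K^4)

def planck(lam, T):
    """Spectral exitance in W per m^2 per m of wavelength."""
    return (2 * math.pi * H * C**2 / lam**5) / (math.exp(H * C / (lam * K * T)) - 1)

def visible_fraction(T, lo=400e-9, hi=700e-9, steps=1000):
    dlam = (hi - lo) / steps
    band = sum(planck(lo + (i + 0.5) * dlam, T) for i in range(steps)) * dlam
    return band / (SIGMA * T**4)   # total exitance is sigma * T^4

print(visible_fraction(3300))  # ~0.12 for a hard-driven filament
[/CODE]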
However, this is not the way the V-lambda curve and lumens calculations work. At approximately 510 and 610 nm, for example, the V-lambda curve has a value of 0.5. This means that even if a 610 nm LED completely converted electrical energy into photons at that wavelength, it would still reach only 50 percent of the theoretical maximum luminous efficiency.
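In terms of the lumens() sketch above:
[CODE]
# Even a perfectly efficient 1 W source at 610 nm yields only half the lumens:
print(lumens(610, 1.0))  # ~342 lm, versus 683 lm at 555 nm
[/CODE]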
I've been thinking about this for some time, and it's kind of a drag, because who would want a light source that emitted only 555 nm yellow-green, right? Well, some people certainly would, I suppose, especially if it were 100 percent efficient.
But, you see my point. The 683 lm/W theoretical maximum is a highly artificial number from a conceptual standpoint.
Still, this is the system of calculation used when we talk about luminous efficiency; I, for one, don't know of any other.
So to return to the topic at hand: the LED spoken of in the quoted text above as being 42 percent efficient at 460 to 470 nm would actually only have a luminous efficiency of about 4 percent of the theoretical maximum, because the V-lambda curve is only around 0.1 at that wavelength. That means light at that wavelength needs to be about 10 times more intense than light at 555 nm to create the same level of optical sensation at the eye.
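Plugging jtr1962's numbers into the lumens() sketch above (with V at 465 nm rounded to 0.08, the result comes out a touch under 4 percent):
[CODE]
# Cree blue chip: 27 mW out at ~465 nm from 64 mW in (jtr1962's figures).
blue_lm = lumens(465, 0.027)        # ~1.5 lm actually delivered
ideal_lm = 683.0 * 0.064            # ~43.7 lm if all 64 mW emerged at 555 nm
print(blue_lm / ideal_lm)           # ~0.03: a few percent of the ideal
[/CODE]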
This is certainly unfair to this LED, and I do not mean to slight it. My intent was to point out that if we are talking in these kinds of units (mW instead of lumens), then incandescents are pretty kick-*** efficient, since almost all of the electrical power is converted to electromagnetic radiation of some wavelength or other. Granted, most of it is in the IR region, but hey, it's still photons. And a good deal of it is in the deep red (but still "visible") region, where it counts for only 10 percent or less in lumens calculations.
OK. Nuf said.