> Incorrect statement. If it achieves 150 lm/W and the theoretical max is 300 lm/W, then it achieves 50% of theoretical maximum efficacy, not an efficiency of 50% (watts out / watts in). I wish I could find a good table of wavelength vs lumens per watt. Lumens are defined by the response of the eye compared to 1 W of light energy. In daytime vision 555 nm is ~683 lumens per watt IIRC, and it's lower for all the other wavelengths, tapering off towards infrared and UV, obviously.

Theoretical maximum for white light is somewhere around 300 lm/W, so a Cree XM-L driven at 0.35 A will have 50% efficiency. Of course it goes down with increasing current.
> I agree, but can I reword it as "Electromagnetic radiation within certain ranges of frequencies can be detected by the human body as either 'light' or 'heat' with varying sensitivities"?

Close, but still confusing, possibly misleading. The only sense organ in the body that directly detects electromagnetic frequencies is the eyes; 'heat' as felt by the skin is an indirect sense from the heating of the skin itself. I.e., a thermometer doesn't sense electromagnetic frequencies, but if you shine a 1 W laser on it, it's still gonna go up.
Lotta confusion going on in here with numbers and what is and isn't heat.
> Incorrect statement. If it achieves 150 lm/W and the theoretical max is 300 lm/W, then it achieves 50% of theoretical maximum efficacy, not an efficiency of 50% (watts out / watts in).
:nana:
The point of this experiment was to answer the frequently-asked question "I'm feeding x watts to my LED, how much heat will I have to remove."
> Not sure where you are coming from, but if the Luminous Efficacy of Radiation (LER) value for the XM-L at 350 mA is 300 lm/W and the LED emits 150 lm/W, then the electrical efficiency of the LED is 50% (watts in to watts out).

Ahh, ok. I thought you were referencing a theoretical ideal LED vs the actual LED, not the lm/W value for its emission spectrum vs its actual output. That makes more sense and is correct then, though there would be some error introduced by the color temperature, but that's not as significant.
Stephen Lebans
> Incorrect statement. If it achieves 150 lm/W and the theoretical max is 300 lm/W, then it achieves 50% of theoretical maximum efficacy, not an efficiency of 50% (watts out / watts in).
I'm afraid that I'm not the one who is confused here.
300 lm/W is roughly the theoretical maximum for white light (100% of the energy converted to white light). So a white LED with 150 lm/W emits 50% of its energy as white light, and the other 50% goes to waste heat.
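The arithmetic in the posts above can be sketched in a few lines of Python. The ~300 lm/W ceiling is the approximate figure for a white spectrum quoted in this thread, not an exact constant (the true value depends on the exact spectrum):

```python
# Sketch: wall-plug efficiency of a white LED estimated from its measured
# luminous efficacy, using the ~300 lm per optical watt figure for white
# light discussed above (an approximation, not a universal constant).

LER_WHITE = 300.0  # lm per optical watt, approximate for white light

def wall_plug_efficiency(measured_lm_per_w: float,
                         ler: float = LER_WHITE) -> float:
    """Fraction of electrical input converted to light (watts out / watts in)."""
    return measured_lm_per_w / ler

def waste_heat_watts(electrical_watts: float, measured_lm_per_w: float) -> float:
    """Everything that isn't emitted as light must be removed as heat."""
    return electrical_watts * (1.0 - wall_plug_efficiency(measured_lm_per_w))

eff = wall_plug_efficiency(150.0)  # e.g. an XM-L around 0.35 A
print(f"efficiency: {eff:.0%}")                                      # -> 50%
print(f"heat from 3 W input: {waste_heat_watts(3.0, 150.0):.1f} W")  # -> 1.5 W
```

This directly answers the "x watts in, how much heat out" question posed earlier: whatever fraction doesn't leave as light leaves as heat.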
That link is dead. Here's a direct link to the .pdf you're referring to:
http://lib.semi.ac.cn:8080/tsh/dzzy/wsqk/SPIE/vol5530/5530-88.pdf
> Someone needs to come up with a driver that regeneratively converts heat back into electricity. I hope if someone reads this, then does it, that I AT LEAST get some credit, if not a nice fat check in the mail. Thanking you in advance....

There is an entire body of research devoted to converting low-level waste heat into electricity via improved thermoelectrics. The laws of thermodynamics say that even in the best case you won't be able to convert much waste heat from an LED, because conversion efficiency increases with source temperature, and an LED by definition can't run at a high enough temperature to make converting waste heat worthwhile. On the flip side, if someone were to invent relatively efficient thermoelectric converters which could survive being placed in close proximity to a lamp filament, then in theory you could recover a large percentage of that waste heat. In practice this wouldn't make much sense: even with waste heat recovery, the lamp overall would still be less efficient than LED or fluorescent sources. The main application for waste heat recovery is power plants, where even a 1% efficiency increase translates into millions of dollars.
> I'm pretty sure that the efficiency of the converter is less relevant than the cost of the converter relative to the savings it provides. In other words, if someone comes out with a miniature converter that costs $3 and saves 100 mA per hour on a 10 watt LED, you've got progress.

The theoretical maximum efficiency of a heat engine is 1 - Tc/Th, where Tc is the cold-side temperature and Th is the hot-side temperature. LEDs generally need to keep the die at less than 100 °C (373 K) for decent life. Assuming you use some fancy setup where the cold side can be close to room temperature (~290 K), the maximum possible efficiency of a thermoelectric is ~22%.

Now, I've been studying the development of thermoelectrics for a long time as a personal interest of mine, mostly for their refrigeration capabilities as opposed to heat recovery (although the same device can do both). Most studies I read point to achieving 50 to 60% of Carnot efficiency as "ambitious and probably unlikely". Assuming we reach this lofty goal (and people have been trying for the last 50 years), you're talking about recovering ~13% of the waste heat at the upper limit of the LED's operating temperature. Since the goal here is more efficient conversion of power to light, you can achieve about the same increase by simply bringing the die temperature close to room temperature via a better thermal path.

In fact, power savings is mainly a concern for general lighting, where the LEDs must operate at lower die temperatures for long life, and these lower die temperatures are even less conducive to heat recovery. For example, at a 60 °C maximum die temperature, which is often a goal for long-life general lighting, the maximum possible Carnot efficiency drops to not much over 10%. A practical device might not recover more than 5%.

OK, 5% might still make a difference, but only if the power savings over the life of the LED exceed the cost of the device. If we have a 100 watt LED, then we save 5 watts. Over the LED's 100,000 hour life that's 500 kW·h, or about $50 at today's average electric rate of 10 cents per kilowatt-hour.

Can we make an efficient thermoelectric capable of recovering 5 watts at such a low temperature differential, along with the associated heat sinks/pipes/fans to get the cold side as close to ambient as possible, all for $50 or less? I doubt it. Might as well just use that same heatsinking setup on the LED itself: if you can drop the die temperature by that same ~25 °C difference over which the thermoelectric is recovering heat, you increase output (i.e. efficiency) by about 6 or 7 percent. In short, regardless of the die temperature, it makes more sense to just cool the LED better than to try to recover waste heat. Don't forget that waste heat recovery systems by definition use huge heat sinks to get the cold side as close to ambient as possible.
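The back-of-envelope numbers in that post can be reproduced with a short Python sketch. The 60%-of-Carnot fraction, the 100 W / 100,000 hour example, and the 10 ¢/kWh rate are all assumptions taken from the post, not properties of any real device:

```python
# Sketch of the Carnot-limit argument above: how much LED waste heat a
# thermoelectric could recover, and what the recovered energy is worth.
# The 60%-of-Carnot figure and 10 cents/kWh rate are the post's assumptions.

def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Ideal heat-engine efficiency between two absolute temperatures."""
    return 1.0 - t_cold_k / t_hot_k

ROOM_K = 290.0  # cold side held near room temperature

for die_c in (100.0, 60.0):
    carnot = carnot_efficiency(die_c + 273.0, ROOM_K)
    practical = 0.60 * carnot  # the "ambitious" 60% of Carnot from the post
    print(f"{die_c:.0f} C die: Carnot {carnot:.1%}, practical ~{practical:.1%}")

# Lifetime savings for the 100 W example: recover 5% of input power
# over 100,000 hours, valued at 10 cents per kWh.
saved_kwh = 100.0 * 0.05 * 100_000 / 1000.0
print(f"lifetime savings: {saved_kwh:.0f} kWh, ${saved_kwh * 0.10:.0f}")
```

Running it confirms the thread's figures: ~22% Carnot at a 100 °C die, ~13% at 60% of Carnot, roughly 13% vs ~8% for a 60 °C die, and 500 kWh (about $50) of recoverable energy over the LED's life.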
> i followed your link. thx for sharing.
> i have some questions please.
> does that sort of efficiency curve hold for XP-G and for temperatures from 25 deg to 60 deg?
> and does the lost output all become heat?

No, actually the output of the amber Luxeon I tested varies MUCH more with temperature than an XP-G would. An XP-G would only increase output by perhaps 8-9% if die temperature were reduced from 60 °C to 25 °C.
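The ~8-9% figure quoted for an XP-G over a 35 °C swing works out to roughly 0.25% per °C. A hypothetical linear model makes that concrete; real datasheet flux-vs-temperature curves are somewhat nonlinear, so treat this only as a rough sketch:

```python
# Hypothetical linear model of the ~8-9% output change over a 35 C die
# temperature swing mentioned above (roughly 0.24 %/C for an XP-G class
# white LED). Real datasheet curves are nonlinear; this is only a sketch.

PCT_PER_C = 8.5 / 35.0  # midpoint of the 8-9% figure, in percent per degree C

def relative_output(die_temp_c: float, ref_temp_c: float = 25.0) -> float:
    """Relative luminous flux versus a 25 C reference die temperature."""
    return 1.0 - PCT_PER_C / 100.0 * (die_temp_c - ref_temp_c)

print(f"at 60 C: {relative_output(60.0):.3f} of 25 C output")  # -> 0.915
```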
> are we going to see thermoelectrics in motor vehicles?

If we can get Carnot efficiency up, then there is talk of replacing the A/C compressor with thermoelectrics. There is also some talk of increasing mpg by converting waste heat from the engine into electricity. I'm personally dubious of this second possibility because the car would need an electric motor to make full use of this generated power, and in my opinion internal combustion engines will be obsolete for ground transport within a decade anyway due to improved batteries for EVs. Nevertheless, using thermoelectrics for A/C will remain viable regardless of the vehicle's source of motive power, and I think we'll see this.