LEDs waste 75% as heat

dellayao

Newly Enlightened
Joined
Apr 10, 2011
Messages
12
It is "ideal", so it is hard to achieve. We can't avoid some of the power converting into heat. :duh2:
 

CKOD

Enlightened
Joined
Aug 3, 2010
Messages
708
Lotta confusion going on in here with numbers and what is and isn't heat.

Theoretical maximum for white light is somewhere around 300lm/W. So Cree XM-L driven at 0.35A will have 50% efficiency. Of course it goes down with increasing current.
Incorrect statement. If it achieves 150 lm/W and the theoretical max is 300 lm/W, then it achieves 50% of the theoretical maximum efficiency, not an efficiency of 50% (watts out / watts in). I wish I could find a good table of wavelength vs lumens per watt. Lumens are defined by the eye's response compared to 1 W of light energy. In daytime (photopic) vision, 555 nm is 683 lumens per watt, and it's lower for all other wavelengths, tapering off towards infrared and UV obviously.
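That wavelength-vs-lumens table can be roughed out in a few lines. This is only a sketch: the Gaussian below is a hand-picked stand-in for the CIE photopic curve V(λ) (the width `sigma_nm=45` is my guess, not official data); real work would use the tabulated CIE 1931 values.

```python
import math

# The defined peak: 683 lm per radiant watt at 555 nm (photopic vision).
PEAK_LM_PER_W = 683.0

def v_approx(wavelength_nm, peak_nm=555.0, sigma_nm=45.0):
    """Crude Gaussian stand-in for the CIE photopic curve V(lambda).
    sigma_nm is a hand-picked guess; use the tabulated CIE data for real work."""
    return math.exp(-((wavelength_nm - peak_nm) ** 2) / (2 * sigma_nm ** 2))

def lm_per_radiant_watt(wavelength_nm):
    """Approximate lumens produced per watt of radiant power at one wavelength."""
    return PEAK_LM_PER_W * v_approx(wavelength_nm)

for wl in (450, 500, 555, 600, 650):
    print(f"{wl} nm: ~{lm_per_radiant_watt(wl):.0f} lm per radiant watt")
```

It peaks at exactly 683 lm/W at 555 nm and falls off toward blue and red, which is the shape that matters here.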

I agree, but can I reword it as "Electromagnetic radiation within certain ranges of frequencies can be detected by the human body as either 'light' or 'heat', with varying sensitivities"?
Close, but still confusing, possibly misleading. The only sense organ in the body that directly detects electromagnetic frequencies is the eye; 'heat' as felt by the skin is an indirect sense arising from the heating of the skin itself. I.e. a thermometer doesn't sense electromagnetic frequencies, but if you shine a 1 W laser on it, its reading is still gonna go up.
Various frequencies of the electromagnetic spectrum are absorbed by the body at differing rates, so heating varies. Everything from UV down to a few GHz will be absorbed at least partially by the skin, causing local heating, felt as warmth. UV and above is ionizing and bad, and starts to just zing right through you, breaking DNA along the way and causing cancer, but not inducing much heating. Stuff below a few GHz penetrates the body to varying depths, causing internal heating, but that's not felt because there aren't heat sense organs inside your body, so by the time you do feel it, damage via heating can be done. And lower frequencies have wavelengths larger than you and pass through without a care in the world.


Yes, longwave IR cameras are traditionally called "heat vision", but they just see the IR emissions from the warm object. Kind of like how incandescent lamps emit light, you emit IR light too, just in a ~310 K blackbody spectrum. IR falls into a niche where we can't see it, but we can feel its heating effects because it's absorbed by the skin very quickly, and it's too high a frequency for us to do direct emission via an antenna (getting there! THz transmitters aren't the stuff of fiction anymore).

I would post a certain picture of jackie chan right now, but I know the moderators on here wouldnt be too fond of it :nana:
 

slebans

Enlightened
Joined
Mar 1, 2010
Messages
457
Location
Moncton, NB Canada
Lotta confusion going on in here with numbers and what is and isn't heat.


Incorrect statement. If it achieves 150 lm/W and the theoretical max is 300 lm/W, then it achieves 50% of the theoretical maximum efficiency, not an efficiency of 50% (watts out / watts in). I wish I could find a good table of wavelength vs lumens per watt. Lumens are defined by the eye's response compared to 1 W of light energy. In daytime (photopic) vision, 555 nm is 683 lumens per watt, and it's lower for all other wavelengths, tapering off towards infrared and UV obviously.
:nana:

Not sure where you are coming from, but if the Luminous Efficacy of Radiation (LER) value for the XM-L at 350 mA is 300 lm/W and the LED achieves 150 lm/W, then the electrical efficiency of the LED is 50% (watts in to watts out).
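In other words, a one-line ratio (sketch only; the function name is just for illustration, and the numbers are the XM-L figures from this thread):

```python
def electrical_efficiency(achieved_lm_per_w, ler_lm_per_w):
    """Wall-plug efficiency: achieved luminous efficacy over the spectrum's LER."""
    return achieved_lm_per_w / ler_lm_per_w

# XM-L at 350 mA, per the figures above: LER 300 lm/W, achieved 150 lm/W
print(electrical_efficiency(150.0, 300.0))  # 0.5, i.e. 50%
```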

Stephen Lebans
 

Potato42

Newly Enlightened
Joined
Feb 9, 2010
Messages
106
The point of this experiment was to answer the frequently-asked question "I'm feeding x watts to my LED, how much heat will I have to remove?"

Excellent experiment, MikeAusC. It's nice to have a rough guideline for this sort of thing. Do you have any idea how similar the results would be with other LEDs? Would the results correspond to the relative efficiencies of the LED tested vs the XM-L T6?
 

CKOD

Enlightened
Joined
Aug 3, 2010
Messages
708
Not sure where you are coming from, but if the Luminous Efficacy of Radiation (LER) value for the XM-L at 350 mA is 300 lm/W and the LED achieves 150 lm/W, then the electrical efficiency of the LED is 50% (watts in to watts out).

Stephen Lebans
Ahh, OK. I thought you were referencing a theoretical ideal LED vs the actual LED, not the lm/W value for its emission spectrum vs its actual output. That makes more sense and is correct then, though there would be some error introduced by the color temperature, but that's not as significant.
 

jirik_cz

Flashlight Enthusiast
Joined
Jul 29, 2007
Messages
1,605
Location
europe
Incorrect statement. If it achieves 150 lm/W and the theoretical max is 300 lm/W, then it achieves 50% of the theoretical maximum efficiency, not an efficiency of 50% (watts out / watts in). I wish I could find a good table of wavelength vs lumens per watt. Lumens are defined by the eye's response compared to 1 W of light energy. In daytime (photopic) vision, 555 nm is 683 lumens per watt, and it's lower for all other wavelengths, tapering off towards infrared and UV obviously.

I'm afraid that I'm not the one who is confused here :D

300 lm/W is roughly the theoretical maximum for white light (100% of the energy converted to white light). So a white LED with 150 lm/W emits 50% of its energy as white light and 50% goes to waste heat.
 

slebans

Enlightened
Joined
Mar 1, 2010
Messages
457
Location
Moncton, NB Canada
I'm afraid that I'm not the one who is confused here :D

300 lm/W is roughly the theoretical maximum for white light (100% of the energy converted to white light). So a white LED with 150 lm/W emits 50% of its energy as white light and 50% goes to waste heat.

It is not accurate to state that "300 lm/W is the theoretical maximum for white light". For a detailed explanation, go to the DOE website and read the current LED Roadmap report. Or for a specific (but somewhat dated) reference see:
Color Rendering and Luminous Efficacy of White LED Spectra

Stephen Lebans
 

jirik_cz

Flashlight Enthusiast
Joined
Jul 29, 2007
Messages
1,605
Location
europe
Stephen, do you mean this NIST document? (Your link doesn't work.)

Most of the current white LEDs are actually blue LEDs with a YAG phosphor. According to this document, the theoretical maximum for an LED with YAG phosphor, 6800 K CCT and CRI 81 is 294 lm/W.
 

slebans

Enlightened
Joined
Mar 1, 2010
Messages
457
Location
Moncton, NB Canada
That link is dead. Here's a direct link to the .pdf you're referring to:

http://lib.semi.ac.cn:8080/tsh/dzzy/wsqk/SPIE/vol5530/5530-88.pdf

The link was fine, but it seems the "Add Link" feature within the posting interface did not correctly parse the URL I entered:
http://lib.semi.ac.cn:8080/tsh/dzzy/wsqk/SPIE/vol5530/5530-88.pdf

I am still relatively new to this interface so perhaps I am doing something wrong. In the future I will test the links within the Preview window to ensure they are parsed correctly.

Thank you for posting the correction.

Stephen Lebans
 
Last edited:

beerwax

Enlightened
Joined
Mar 12, 2011
Messages
447
So, if we had a new LED, type W, that converted 90% of the input electrical energy to radiation of a non-visible wavelength, it would generate little heat but would be rated 'very inefficient'. I see no reason not to like Mike's concept.
 

onetrickpony

Enlightened
Joined
Mar 10, 2011
Messages
262
Someone needs to come up with a driver that regeneratively converts heat back into electricity. I hope if someone reads this, then does it, that I AT LEAST get some credit, if not a nice fat check in the mail. Thanking you in advance....
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
Someone needs to come up with a driver that regeneratively converts heat back into electricity. I hope if someone reads this, then does it, that I AT LEAST get some credit, if not a nice fat check in the mail. Thanking you in advance....
There is an entire body of research devoted to converting low-level waste heat into electricity via improved thermoelectrics. The laws of thermodynamics say that even in the best case you won't be able to convert much waste heat from an LED, because conversion efficiency increases with source temperature. An LED by definition can't run at a high enough temperature to make converting waste heat worthwhile. On the flip side, if someone were to invent relatively efficient thermoelectric converters which could survive being placed in close proximity to a lamp filament, then you could potentially recover a large percentage of waste heat in theory. In practice this wouldn't make much sense. Even with waste heat recovery, overall the lamp would still be less efficient than LED or fluorescent sources. The main application for waste heat recovery is power plants, where even a 1% efficiency increase translates into millions of dollars.
 

onetrickpony

Enlightened
Joined
Mar 10, 2011
Messages
262
There is an entire body of research devoted to converting low-level waste heat into electricity via improved thermoelectrics. The laws of thermodynamics say that even in the best case you won't be able to convert much waste heat from an LED, because conversion efficiency increases with source temperature. An LED by definition can't run at a high enough temperature to make converting waste heat worthwhile. On the flip side, if someone were to invent relatively efficient thermoelectric converters which could survive being placed in close proximity to a lamp filament, then you could potentially recover a large percentage of waste heat in theory. In practice this wouldn't make much sense. Even with waste heat recovery, overall the lamp would still be less efficient than LED or fluorescent sources. The main application for waste heat recovery is power plants, where even a 1% efficiency increase translates into millions of dollars.

I'm pretty sure that the efficiency of the converter is less relevant than the cost of the converter relative to the savings it provides. In other words, if someone comes out with a miniature converter that costs $3 and saves 100 mA of draw on a 10 watt LED, you've got progress.
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
I'm pretty sure that the efficiency of the converter is less relevant than the cost of the converter relative to the savings it provides. In other words, if someone comes out with a miniature converter that costs $3 and saves 100 mA of draw on a 10 watt LED, you've got progress.
The theoretical maximum efficiency of a heat engine is 1 - Tc/Th, where Tc is the cold side temperature and Th is the hot side temperature. LEDs generally need to keep the die at less than 100°C (373 K) for decent life. Assuming you use some fancy setup where the cold side can be close to room temperature (~290 K), the maximum possible efficiency of a thermoelectric is ~22%. Now, I've been studying the development of thermoelectrics for a long time as a personal interest, mostly for their refrigeration capabilities as opposed to heat recovery (although the same device can do both). Most studies I read describe achieving 50 to 60% of Carnot efficiency as "ambitious and probably unlikely". Assuming we reach this lofty goal (and people have been trying for the last 50 years), you're talking about recovering ~13% of the waste heat at the upper limit of the LED's operating temperature.

Since the goal here is more efficient conversion of power to light, you can achieve about the same increase by simply bringing the die temperature close to room temperature via a better thermal path. In fact, power savings is mainly a concern for general lighting, where the LEDs must operate at lower die temperatures for long life. These lower die temperatures are even less conducive to heat recovery. For example, at a 60°C maximum die temperature, which is often a goal for long-life general lighting, the maximum possible Carnot efficiency drops to not much over 10%. A practical device might not recover more than 5%. OK, 5% might still make a difference, but only if the cost of the device doesn't exceed the power savings over the life of the LED. If we have a 100 watt LED, then we save 5 watts. Over the LED's 100,000 hour life that's 500 kW-hr, or about $50 at today's average electric rate of 10 cents per kilowatt-hour.

Can we make an efficient thermoelectric capable of recovering 5 watts at such a low temperature differential, along with the associated heat sinks/pipes/fans to get the cold side as close to ambient as possible, all for $50 or less? I doubt it. Might as well just use that same heatsinking setup on the LED itself. If you can drop the die temperature by that same ~25°C temperature difference over which the thermoelectric is recovering heat, you increase output (i.e. efficiency) by about 6 or 7 percent. In short, it makes more sense to just bring die temps as close to ambient as possible. In fact, regardless of the die temperature, it makes more sense to cool the LED better than to try to recover waste heat. Don't forget that waste heat recovery systems by definition use huge heat sinks to get the cold side as close to ambient as possible.
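For the record, that arithmetic sketches out like this (the temperatures, the 60%-of-Carnot factor, the 5% recovery figure, and the $0.10/kWh rate are all this post's assumptions):

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    """Theoretical maximum heat-engine efficiency: 1 - Tc/Th."""
    return 1.0 - t_cold_k / t_hot_k

# Die at 100 C (373 K), cold side held near room temperature (~290 K)
eta_max = carnot_efficiency(373.0, 290.0)    # ~22%
eta_goal = 0.6 * eta_max                     # 60% of Carnot, the "lofty goal": ~13%

# Hypothetical 100 W LED recovering 5% of input over a 100,000 hour life
watts_saved = 100.0 * 0.05                   # 5 W
kwh_saved = watts_saved * 100_000 / 1000.0   # 500 kWh
dollars_saved = kwh_saved * 0.10             # $50 at 10 cents/kWh
print(f"Carnot limit {eta_max:.1%}, realistic goal {eta_goal:.1%}, savings ${dollars_saved:.0f}")
```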

On another note, thermoelectrics which operate at a good fraction of Carnot efficiency might make sense for cooling the LED die. As a refrigerator, the Carnot coefficient of performance is Tc/(Th - Tc). Note how this can be much larger than 1 if Th - Tc is small. For example, if we bring the LED die down from 50°C to 25°C, we'll obtain the aforementioned 6-7 percent increase in output. The maximum Carnot COP doing this would be 11.92, call it ~12. Assume the 100 watt LED is 50% efficient at converting power to light, so the heat load would be 50 watts. Operating at half Carnot efficiency we would need 50/6 = 8.33 watts. That's about 8% more power in order to obtain a 6-7% increase in light. It's almost worthwhile. If you can approach Carnot efficiency then you actually end up using less power to cool the LED than the increase in output. Still probably not worthwhile from an economics standpoint, but at least here you stand to actually increase overall LED efficiency by using a thermoelectric (in theory anyway). A while back I did an experiment which vividly demonstrates the effects of cooling LEDs. Even with the much more temperature-sensitive output of an amber LED, I concluded that cooling LEDs with today's thermoelectrics has no practical value.
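The refrigeration numbers check out the same way (again a sketch; the 50%-efficient LED and the half-Carnot figure are this post's assumptions):

```python
def carnot_cop(t_cold_k, t_hot_k):
    """Theoretical maximum refrigerator coefficient of performance: Tc/(Th - Tc)."""
    return t_cold_k / (t_hot_k - t_cold_k)

# Pump the die from 50 C (323 K) down to 25 C (298 K)
cop_max = carnot_cop(298.0, 323.0)       # 11.92
cop_half = cop_max / 2.0                 # half Carnot, ~6

heat_load_w = 50.0                       # 100 W LED at 50% wall-plug efficiency
tec_power_w = heat_load_w / cop_half     # ~8.4 W of pumping power, ~8% extra
print(f"Carnot COP {cop_max:.2f}, TEC power at half Carnot {tec_power_w:.2f} W")
```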
 

beerwax

Enlightened
Joined
Mar 12, 2011
Messages
447
I followed your link. Thanks for sharing.
I have some questions, please:
Does that sort of efficiency curve hold for the XP-G, and for temperatures from 25°C to 60°C?
And does the lost output all become heat?

Are we going to see thermoelectrics in motor vehicles?
 

monkeyboy

Flashlight Enthusiast
Joined
Mar 7, 2006
Messages
2,327
Location
UK
What about radiant heat loss from the resistor? Is this enough to throw off the results?

If we know the temperature of the resistor and the exposed area, we could use the black-body approximation with the Stefan-Boltzmann law to calculate an order of magnitude...
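A minimal sketch of that estimate (the resistor's area, surface temperature, and emissivity below are made-up illustrative values):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 * K^4)

def net_radiant_loss_w(area_m2, t_surface_k, t_ambient_k, emissivity=0.9):
    """Net power radiated by a grey body to its surroundings."""
    return emissivity * SIGMA * area_m2 * (t_surface_k ** 4 - t_ambient_k ** 4)

# Hypothetical resistor: 5 cm^2 exposed area at 100 C in a 25 C room
p_rad = net_radiant_loss_w(5e-4, 373.0, 298.0)
print(f"~{p_rad:.2f} W radiated")  # a fraction of a watt: small, but not zero
```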

OK I've lost interest already :ohgeez:
 

jtr1962

Flashaholic
Joined
Nov 22, 2003
Messages
7,505
Location
Flushing, NY
I followed your link. Thanks for sharing.
I have some questions, please:
Does that sort of efficiency curve hold for the XP-G, and for temperatures from 25°C to 60°C?
And does the lost output all become heat?
No, actually the output of the amber Luxeon I tested varies MUCH more with temperature than an XP-G would. An XP-G would only increase output by perhaps 8-9% if die temperature were reduced from 60°C to 25°C.

Are we going to see thermoelectrics in motor vehicles?
If we can get Carnot efficiency up, there is talk of replacing the A/C compressor with thermoelectrics. There is also some talk of increasing mpg by converting waste heat from the engine into electricity. I'm personally dubious of this second possibility because the car would need an electric motor to make full use of the generated power, and in my opinion internal combustion engines will be obsolete for ground transport within a decade anyway due to improved batteries for EVs. Nevertheless, using thermoelectrics for A/C will remain viable regardless of the vehicle's source of motive power, and I think we'll see it.

On another note, I've been waiting since the early 1990s for improved thermoelectrics. What's available commercially now isn't a whole lot better than what was available then. I've read some papers, such as this one, which talk of great improvements over today's devices, but I've yet to see any reach production. Page 2, for example, shows that a single-stage cooler based on their approach could reach 130 K. I would love to get my hands on a thermoelectric with that kind of performance. With what exists nowadays, I'm lucky to approach 200 K (and that's with a two-stage setup, water cooling, and very little heat load). The best I've done with bulk cooling is to get my temperature chamber down to -58°F (-50°C = 223 K).
 

Bright+

Newly Enlightened
Joined
Dec 5, 2008
Messages
170
If you were to express how efficient LEDs are at producing VISIBLE light in PERCENT, you would have to measure how many watts of radiant energy come out within the visible spectrum. Anything outside that band is waste, just like the infrared from incandescent lamps.
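A sketch of that calculation, weighting a spectrum by the eye's response. The Gaussian here is a crude stand-in for the CIE photopic curve V(λ), and the two-peak "white LED" spectrum is made up; real work would use tabulated CIE data and a measured spectrum.

```python
import math

def v_approx(wl_nm):
    """Crude Gaussian stand-in for the CIE photopic curve V(lambda)."""
    return math.exp(-((wl_nm - 555.0) ** 2) / (2 * 45.0 ** 2))

def ler_lm_per_w(spectrum):
    """Luminous efficacy of radiation for a {wavelength_nm: radiant_watts} dict."""
    total_radiant_w = sum(spectrum.values())
    weighted = sum(watts * v_approx(wl) for wl, watts in spectrum.items())
    return 683.0 * weighted / total_radiant_w

# Hypothetical spectrum: blue pump at 450 nm plus a broad phosphor hump
spectrum = {450: 0.3, 550: 0.4, 600: 0.3}
ler = ler_lm_per_w(spectrum)
print(f"LER of this spectrum: ~{ler:.0f} lm per radiant watt")
```

Dividing the LED's achieved lm/W by this LER gives the radiant-watts-out per electrical-watt-in figure discussed earlier in the thread.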
 