
Thread: LEDs waste 75% as heat

  1. #1
    Flashaholic*
    Join Date
    Jul 2010
    Location
    Sydney, Australia
    Posts
    995

    Default LEDs waste 75% as heat

    The ideal LED would convert all the electrical energy into light, but it's clear that a lot gets converted to heat.

    Unlike filament bulbs, most of the heat needs to be conducted out of the back of the LED to keep the LED cool - the hotter it gets, the less efficient it is.

    I've seen a few attempts on CPF at working out how much heat an LED puts out, but I think the only reliable method is substitution - find how much power into a resistor will raise the same heatsink to the same temperature. 100% of the electrical energy that goes into a resistor gets converted to heat - it's very hard to make an inefficient electrical heater!

    I found two identical unanodised aluminium finned heatsinks - 10 x 5 x 3.5 cm.

    On one, I mounted an XM-L T6 on a 20mm star from www.ledsales.com.au. On the other, I mounted two 10 watt 1 ohm resistors in aluminium extruded housings with two screw mounts.


    LED 8.42 watts
    - efficiency was 18% - heat output from resistors was 6.86 watts = 82%
    - heatsink temperature = 56.7 degC, ambient = 25.5 degC
    - LED 3.12v, 2.70 amp
    - Resistor 3.71v, 1.85 amp


    LED 2.95 watts
    - efficiency was 31% - heat output from resistors was 2.04 watts = 69%
    - heatsink temperature = 35.5 degC, ambient = 23.7 degC
    - LED 2.95v, 1.00 amp
    - Resistor 2.02v, 1.01 amp


    So over a normal operating range, modern high-power LEDs waste around 75% of the power going into them.
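
    For reference, here's a quick sketch of the arithmetic behind those two figures (all numbers taken straight from the measurements above - the substitution assumption being that the resistor power which produces the same heatsink temperature equals the LED's heat output):

    Code:
    # Substitution arithmetic: LED heat is taken as the resistor power that
    # produced the same heatsink temperature; efficiency is what's left over.
    def substitution_efficiency(led_v, led_a, res_v, res_a):
        p_led = led_v * led_a      # electrical power into the LED
        p_heat = res_v * res_a     # resistor power at the matched temperature
        return p_led, p_heat, 1.0 - p_heat / p_led

    for label, led_v, led_a, res_v, res_a in [
        ("High drive", 3.12, 2.70, 3.71, 1.85),
        ("Low drive",  2.95, 1.00, 2.02, 1.01),
    ]:
        p_led, p_heat, eff = substitution_efficiency(led_v, led_a, res_v, res_a)
        print(f"{label}: LED {p_led:.2f} W, heat {p_heat:.2f} W, "
              f"efficiency ~{eff:.1%}")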

    Of course this testing method ignores radiant heat loss from the LED - when I hold my hand 1cm in front of the LED it gets a lot warmer than holding it 1cm in front of a resistor putting out the same heat. EDIT- as pointed out by jtr1962, the warmth I feel would be due to the light being converted to heat when it strikes my skin - the energy has to go somewhere.
    Last edited by MikeAusC; 05-04-2011 at 02:17 AM.

  2. #2
    *Flashaholic*
    Join Date
    Nov 2003
    Location
    Flushing, NY
    Posts
    5,886

    Default Re: LEDs waste 75% as heat

    Quote Originally Posted by MikeAusC View Post
    Of course this testing method ignores radiant heat loss from the LED - when I hold my hand 1cm in front of the LED it gets a lot warmer than holding it 1cm in front of a resistor putting out the same heat.
    Good work but don't confuse the phenomenon you're describing with radiant heat loss. In order to have enough radiant heat loss to feel it with your hand 1 cm away, the LED die would need to be a few thousand degrees. What you're feeling is the light energy from the LED absorbed by your hand, and turned into heat. I first noticed this with a Rebel where if I put black electrical tape very close to the LED dome, it would become hot enough to start smoking. Even my lighter colored finger quickly gets too hot to hold above the dome. It's an interesting phenomenon.

    Incidentally, the 10 watt resistors end up slightly increasing the surface area from which heat is dissipated compared to when the LED is mounted, so your experiment might be overestimating the amount of power required for a given heat sink temperature rise. At 1 amp, a T6 XM-L should be outputting about 370 lumens. This might be equivalent to around 1.1 watts of light energy, so the heat would be 1.85 watts instead of 2.04 watts. A slight refinement then might be to put insulation over the resistor body so that there is as little heat dissipation there as practical. Other than that, great experiment and interesting results!
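
    As a rough sketch of that lumens-to-watts estimate (the ~335 lm per optical watt figure for a cool-white XM-L spectrum is an assumption here, not a datasheet value):

    Code:
    # Rough conversion from a lumen figure to radiant watts, and the heat
    # left over. The LER value below is an assumed figure for illustration.
    LER_ASSUMED = 335.0        # lumens per optical watt (assumption)

    lumens = 370.0             # approx. output of a T6 XM-L at 1 A
    p_electrical = 2.95        # W, from the 1 A measurement above

    p_light = lumens / LER_ASSUMED     # ~1.1 W radiated as light
    p_heat = p_electrical - p_light    # ~1.85 W left as heat
    print(f"light ~{p_light:.2f} W, heat ~{p_heat:.2f} W")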

  3. #3
    Flashaholic*
    Join Date
    Jul 2010
    Location
    Sydney, Australia
    Posts
    995

    Default Re: LEDs waste 75% as heat

    I had thought of using the Dichroic Reflector that's used with 50mm Halogen Bipin lamps - they're designed to reflect the light forward, but let the heat pass through the reflector, to avoid setting fire to the object being lit up. If you look at the bulb filament from the back of the reflector, you can see it only lets a small amount of the light through.

    So you could arrange the LED so the radiant heat which passes through the dichroic reflector also heats up the heatsink, while the reflected light radiates into space.
    Last edited by MikeAusC; 03-23-2011 at 08:56 PM.

  4. #4
    Flashaholic*
    Join Date
    Jul 2010
    Location
    Sydney, Australia
    Posts
    995

    Default Re: LEDs waste 75% as heat

    Quote Originally Posted by jtr1962 View Post
    . . . .What you're feeling is the light energy from the LED absorbed by your hand, and turned into heat. I first noticed this with a Rebel where if I put black electrical tape very close to the LED dome, it would become hot enough to start smoking. Even my lighter colored finger quickly gets too hot to hold above the dome. . . .
    Good point - I'd forgotten that even light will get converted to heat. If you have a light inside an opaque container, the only way the energy can escape is as heat from the outer surface of the container.

  5. #5
    Flashaholic* srfreddy
    Join Date
    Sep 2010
    Location
    New England
    Posts
    918

    Default Re: LEDs waste 75% as heat

    Quote Originally Posted by MikeAusC View Post
    ... to avoid setting fire to the object being lit up.
    We wouldn't want that now, would we? An "ideal" LED could never convert all electricity into light.

  6. #6
    *Flashaholic*
    Join Date
    Nov 2003
    Location
    Flushing, NY
    Posts
    5,886

    Default Re: LEDs waste 75% as heat

    Quote Originally Posted by MikeAusC View Post
    I had thought of using the Dichroic Reflector that's used with 50mm Halogen Bipin lamps - they're designed to reflect the light forward, but let the heat pass throught the reflector, to avoid setting fire to the object being lit up. If you look at the bulb filament from the back of the reflector, you can see it only lets a small amount of the light through.

    So you could arrange the LED so the radiant heat which passes through dichroic reflector also heats up the heatsink, but the reflected light radiates into space.
    Interesting idea. I'm reasonably sure, though, that the amount of energy leaving an LED as radiant heat can be measured in microwatts, and is thus entirely negligible for the purposes of this experiment. Remember that radiant heat is proportional to absolute temperature to the fourth power. If the LED die were at incandescent lamp temperatures, it might radiate a couple of watts. However, at perhaps 50°C, it'll only radiate about 1/10000th the power.
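
    A back-of-envelope Stefan-Boltzmann check makes the point (the 2 mm x 2 mm die area and emissivity of 1 are assumptions purely for illustration):

    Code:
    # Rough Stefan-Boltzmann estimate of how little a ~50 degC LED die radiates.
    # Die area and emissivity are illustrative assumptions, not measured values.
    SIGMA = 5.67e-8               # W / (m^2 K^4)
    area = 2e-3 * 2e-3            # m^2, assumed 2 mm x 2 mm die
    t_die, t_amb = 323.0, 298.0   # K (~50 degC die, ~25 degC ambient)

    p_rad = SIGMA * area * (t_die**4 - t_amb**4)   # net radiated power
    ratio = (3000.0 / t_die) ** 4                  # T^4 scaling vs. a filament
    print(f"net radiation at ~50 degC: {p_rad * 1000:.2f} mW")
    print(f"a 3000 K surface radiates ~{ratio:,.0f}x more per unit area")

    That works out to well under a milliwatt - negligible next to the several watts being conducted into the heatsink.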

    In any case, your assessment that only about 25% of the input power is converted to visible light seems quite correct and reasonable for an LED operated at medium to high currents. You can approach or even exceed 50% at very low currents, but at the expense of using more LEDs. I expect by the end of the decade we'll be pushing efficiencies of 75% to 80%. Such efficiencies have already been reached in the lab.

  7. #7

    Default Re: LEDs waste 75% as heat

    What sort of thermal path was there from the resistors to the aluminium extrusion housing them?

  8. #8
    Flashaholic*
    Join Date
    Jul 2010
    Location
    Sydney, Australia
    Posts
    995

    Default Re: LEDs waste 75% as heat

    These were commercial high-power resistors and I assume they are cemented into the aluminium housing. They have a flat base so I used Arctic Silver compound, just like under the star.

  9. #9

    Default Re: LEDs waste 75% as heat

    This is hard to measure, and your methods are somewhat shot-in-the-dark for accuracy.

    Just look it up. Lumens are a measure of total light output, rather than an intensity that increases with focusing. Unfortunately, rather than having a fixed relationship to power, the lumen scale is weighted for human eye response. 5 mW of green laser appears 10x brighter than 5 mW of red laser; a green LED putting out 100 mW of green light will score about 10x the lumens of a red LED putting out 100 mW of red light.


    Given a particular wavelength or white color temp, you can look up how many watts of light energy equal 1 lumen. So look up the spec sheet's "typical" figures for current, voltage, and lumens, and you can calculate the efficiency straight out.
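
    A minimal sketch of that datasheet method (the "typical" numbers and the LER value below are placeholders, not taken from any particular spec sheet):

    Code:
    # Wall-plug efficiency from datasheet typicals plus an assumed luminous
    # efficacy of radiation (LER) for the emitter's spectrum.
    def wall_plug_efficiency(volts, amps, lumens, ler_lm_per_optical_watt):
        p_electrical = volts * amps
        p_optical = lumens / ler_lm_per_optical_watt
        return p_optical / p_electrical

    # Placeholder "typical" figures for illustration only:
    print(f"{wall_plug_efficiency(3.0, 0.35, 150, 300):.0%}")   # ~48%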

  10. #10

    Default Re: LEDs waste 75% as heat

    I tried to figure a margin of error for MikeAusC's work, but figured it didn't matter or affect the conclusion.
    I think he clearly demonstrated that while LEDs seem quite efficient (25 percent of energy to light seems a long way from a candle or a heated wire), there's still plenty of room for improvement. So we are probably not at the peak just yet.

    There's no substitute for practical experiment.

  11. #11
    Flashaholic*
    Join Date
    Jul 2010
    Location
    Sydney, Australia
    Posts
    995

    Default Re: LEDs waste 75% as heat

    The point of this experiment was to answer the frequently-asked question "I'm feeding x watts to my LED, how much heat will I have to remove."

    If anyone has a more accurate basis for answering this, please enlighten me.
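
    For what it's worth, the rule of thumb that falls out of the measurements in post #1 boils down to this (the 25% efficiency figure is the rough mid-range value from that test, not a constant for every LED or drive level):

    Code:
    # Rough rule of thumb from this thread: at medium-to-high drive currents,
    # roughly 75% of the electrical input ends up as heat to be removed.
    def heat_to_remove(p_in_watts, efficiency=0.25):
        return p_in_watts * (1.0 - efficiency)

    print(heat_to_remove(10.0))   # ~7.5 W of heat for 10 W in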

  12. #12

    Default Re: LEDs waste 75% as heat

    20%-25% is about right for LEDs. Maybe a bit better now.

    There are theoretical limits which would keep LEDs far from reaching 100% efficiency no matter how perfect they are, at least with AlInGaP and InGaN technology.

  13. #13
    Enlightened Lego995743
    Join Date
    Apr 2011
    Location
    Williamsburg VA
    Posts
    43

    Default Re: LEDs waste 75% as heat

    it is impossible for an led to be 100% light because light is heat energy.

  14. #14
    Flashaholic* Walterk
    Join Date
    Jan 2010
    Location
    Netherlands
    Posts
    746

    Default Re: LEDs waste 75% as heat

    Very clarifying. I no longer have to wonder:
    I'm feeding x watts to my LED, how much heat will I have to remove.

  15. #15

    Default Re: LEDs waste 75% as heat

    Quote Originally Posted by Lego995743 View Post
    it is impossible for an led to be 100% light because light is heat energy.
    That... what??

  16. #16

    Default Re: LEDs waste 75% as heat

    Quote Originally Posted by Lego995743 View Post
    it is impossible for an led to be 100% light because light is heat energy.
    Yeah, um, no. It's not. It's light energy. Photons, which, according to quantum mechanics, leave and arrive as particles but travel as waves. Yada, yada, yada.

    EDIT: And, in fact, if you want to SEE heat energy, you can throw on some infrared goggles.
    Last edited by onetrickpony; 04-19-2011 at 04:29 PM.

  17. #17
    Flashaholic
    Join Date
    Jul 2010
    Location
    Australia
    Posts
    341

    Default Re: LEDs waste 75% as heat

    What you see as light ... what you feel as radiated heat - it's all the same thing. Just different frequencies.

  18. #18

    Default Re: LEDs waste 75% as heat

    Quote Originally Posted by bbb74 View Post
    What you see as light ... what you feel as radiated heat - it's all the same thing. Just different frequencies.
    Yeah, but I feel that is oversimplifying things. That's like saying that a candle and a plasma torch are the same thing.

    Light and heat may fall under the electromagnetic umbrella, but they are definitely different as far as how humans perceive them. That is, you can see heat and you can feel light, but you're much more likely to perceive them the other way around.

    Light IS a form of energy. Various sources of both will emit the other, but let's be clear, heat is NOT light energy.

  19. #19
    Flashaholic
    Join Date
    Jul 2010
    Location
    Australia
    Posts
    341

    Default Re: LEDs waste 75% as heat

    Quote Originally Posted by onetrickpony View Post
    That's like saying that a candle and a plasma torch are the same thing.
    Getting OT here... but that's not what I meant, although I think we are agreeing with each other. A plasma torch will radiate infrared heat which you can feel, but it also has a cutting flame of hot plasma, which yes is obviously a very different kettle of fish. The plasma does the cutting, but it also loses some energy as radiant (IR) heat which you can feel. Applying the torch to your hand results in a different method of energy transfer (and would be a bit painful).

    Quote Originally Posted by onetrickpony View Post
    Light IS a form of energy. Various sources of both will emit the other, but let's be clear, heat is NOT light energy.
    I agree but can I reword it as "Electromagnetic radiation within certain ranges of frequencies can be detected by the human body as either 'light' or 'heat' with varying sensitivities".

    Not sure why I'm making this post now...

  20. #20
    Flashaholic*
    Join Date
    Jul 2007
    Location
    Central Europe
    Posts
    1,578

    Default Re: LEDs waste 75% as heat

    Quote Originally Posted by Oznog View Post
    20%-25% is about right for LEDs. Maybe a bit better now.

    There are theoretical limits which would keep LEDs far from being able to 100% efficiency no matter how perfect they are, at least with AlInGap and InGaN technology.
    Theoretical maximum for white light is somewhere around 300 lm/W. So a Cree XM-L driven at 0.35 A will have about 50% efficiency. Of course it goes down with increasing current.

  21. #21

    Default Re: LEDs waste 75% as heat

    it is "ideal",so it is hard to achieve. we cann't avoid the power to convert into heat.

  22. #22

    Default Re: LEDs waste 75% as heat

    Lotta confusion going on in here with numbers and what is and isn't heat.

    Theoretical maximum for white light is somewhere around 300 lm/W. So a Cree XM-L driven at 0.35 A will have about 50% efficiency. Of course it goes down with increasing current.
    Incorrect statement - if it achieves 150 lm/W and the theoretical max is 300 lm/W, then it achieves 50% of theoretical maximum efficiency, not an efficiency of 50% (watts out / watts in). I wish I could find a good table of wavelength vs lumens per watt. Lumens are defined by the response of the eye compared to 1 W of light energy. In daytime vision 555 nm is ~600 lumens per watt IIRC, and it is lower for all the other frequencies, tapering off towards infrared and UV obviously.

    I agree but can I reword it as "Electromagnetic radiation within certain ranges of frequencies can be detected by the human body as either 'light' or 'heat' with varying sensitivities".
    Close, but still confusing, possibly misleading. The only sense organ in the body that directly detects electromagnetic frequencies is the eye; 'heat' as felt by the skin is an indirect sense from the heating of the skin itself. I.e. a thermometer doesn't sense electromagnetic frequencies, but if you shine a 1 W laser on it, it's still going to go up.
    Various frequencies of the electromagnetic spectrum are absorbed by the body at differing rates, so heating varies. Everything from UV down to a few GHz will be absorbed at least partially by the skin, causing local heating, felt as warmth. UV and above is ionizing and bad and starts to just zing right through you, breaking DNA along the way and causing cancer, but not inducing much heating. Stuff below a few GHz penetrates the body to varying depths, causing internal heating, but that's not felt as there aren't heat-sensing organs inside your body, so by the time you do feel it, damage via heating can be done. And lower frequencies have wavelengths larger than you and pass through without a care in the world.


    Yes, longwave IR cameras are traditionally called "heat vision", but they just see the IR emissions from the warm object. Kind of like how incan lamps emit light, you emit IR light also, just in the 310 K spectrum. IR just falls into a sort of niche where we can't see it, we can feel the heating effects due to it being absorbed by the skin very quickly, and it's too high a frequency for us to do direct emissions via an antenna (getting there - THz transmitters aren't the stuff of fiction anymore).

    I would post a certain picture of Jackie Chan right now, but I know the moderators on here wouldn't be too fond of it.

  23. #23
    Flashaholic
    Join Date
    Mar 2010
    Location
    Moncton, NB Canada
    Posts
    402

    Default Re: LEDs waste 75% as heat

    Quote Originally Posted by CKOD View Post
    Lotta confusion going on in here with numbers and what is and isn't heat.

    Incorrect statement - if it achieves 150 lm/W and the theoretical max is 300 lm/W, then it achieves 50% of theoretical maximum efficiency, not an efficiency of 50% (watts out / watts in). I wish I could find a good table of wavelength vs lumens per watt. Lumens are defined by the response of the eye compared to 1 W of light energy. In daytime vision 555 nm is ~600 lumens per watt IIRC, and it is lower for all the other frequencies, tapering off towards infrared and UV obviously.
    Not sure where you are coming from, but if the Luminous Efficacy of Radiation (LER) value for the XM-L at 350 mA is 300 lm/W and the LED delivers 150 lm per electrical watt, then the electrical efficiency of the LED is 50% (watts out to watts in).

    Stephen Lebans

  24. #24

    Default Re: LEDs waste 75% as heat

    Quote Originally Posted by MikeAusC View Post
    The point of this experiment was to answer the frequently-asked question "I'm feeding x watts to my LED, how much heat will I have to remove."
    excellent experiment MikeAusC. It's nice to have a rough guideline for this sort of thing. Do you have any idea how similar the results would be with other LED's? Would results correspond to the relative efficiencies of the LED tested vs the XM-L T6?

  25. #25

    Default Re: LEDs waste 75% as heat

    Quote Originally Posted by slebans View Post
    Not sure where you are coming from, but if the Luminous Efficacy of Radiation (LER) value for the XM-L at 350 mA is 300 lm/W and the LED delivers 150 lm per electrical watt, then the electrical efficiency of the LED is 50% (watts out to watts in).

    Stephen Lebans
    Ahh, OK - I thought you were referencing a theoretical ideal LED vs the actual LED, not the lm/W value for its emission spectrum vs its actual output. That makes more sense and is correct then, though there would be some error introduced by the color temperature, but that's not as significant.

  26. #26
    Flashaholic*
    Join Date
    Jul 2007
    Location
    Central Europe
    Posts
    1,578

    Default Re: LEDs waste 75% as heat

    Quote Originally Posted by CKOD View Post
    Incorrect statement - if it achieves 150 lm/W and the theoretical max is 300 lm/W, then it achieves 50% of theoretical maximum efficiency, not an efficiency of 50% (watts out / watts in). I wish I could find a good table of wavelength vs lumens per watt. Lumens are defined by the response of the eye compared to 1 W of light energy. In daytime vision 555 nm is ~600 lumens per watt IIRC, and it is lower for all the other frequencies, tapering off towards infrared and UV obviously.
    I'm afraid that I'm not the one who is confused here.

    300 lm/W is roughly the theoretical maximum for white light (100% of the energy converted to white light). So a white LED achieving 150 lm/W emits 50% of its energy as white light and 50% goes to waste heat.
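
    Putting numbers on the distinction being argued here (illustrative values only):

    Code:
    # Two different "lm/W" figures: luminous efficacy (per electrical watt,
    # what datasheets quote) vs. LER (per optical watt, set by the spectrum).
    efficacy_lm_per_electrical_W = 150.0   # illustrative datasheet-style figure
    ler_lm_per_optical_W = 300.0           # rough maximum for this spectrum
    efficiency = efficacy_lm_per_electrical_W / ler_lm_per_optical_W
    print(f"wall-plug efficiency ~{efficiency:.0%}")   # ~50%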

  27. #27
    Flashaholic
    Join Date
    Mar 2010
    Location
    Moncton, NB Canada
    Posts
    402

    Default Re: LEDs waste 75% as heat

    Quote Originally Posted by jirik_cz View Post
    I'm afraid that I'm not the one who is confused here.

    300 lm/W is roughly the theoretical maximum for white light (100% of the energy converted to white light). So a white LED achieving 150 lm/W emits 50% of its energy as white light and 50% goes to waste heat.
    It is not accurate to state that "300 lm/W is the theoretical maximum for white light". For a detailed explanation, go to the DOE website and read the current LED Roadmap report. Or for a specific (but somewhat dated) reference see:
    Color Rendering and Luminous Efficacy of White LED Spectra

    Stephen Lebans

  28. #28
    Flashaholic*
    Join Date
    Jul 2007
    Location
    Central Europe
    Posts
    1,578

    Default Re: LEDs waste 75% as heat

    Stephen, do you mean this NIST document? (Your link doesn't work.)

    Most of the current white LEDs are actually blue LEDs with a YAG phosphor. According to this document, the theoretical maximum for an LED with YAG phosphor, 6800 K CCT and CRI 81 is 294 lm/W.

  29. #29
    *Flashaholic*
    Join Date
    Nov 2003
    Location
    Flushing, NY
    Posts
    5,886

    Default Re: LEDs waste 75% as heat

    That link is dead. Here's a direct link to the .pdf you're referring to:

    http://lib.semi.ac.cn:8080/tsh/dzzy/...30/5530-88.pdf

  30. #30
    Flashaholic
    Join Date
    Mar 2010
    Location
    Moncton, NB Canada
    Posts
    402

    Default Re: LEDs waste 75% as heat

    Quote Originally Posted by jtr1962 View Post
    That link is dead. Here's a direct link to the .pdf you're referring to:

    http://lib.semi.ac.cn:8080/tsh/dzzy/...30/5530-88.pdf
    The link was fine, but it seems to be the "Add Link" feature within the posting interface that did not correctly parse the URL I entered:
    http://lib.semi.ac.cn:8080/tsh/dzzy/...30/5530-88.pdf

    I am still relatively new to this interface so perhaps I am doing something wrong. In the future I will test the links within the Preview window to ensure they are parsed correctly.

    Thank you for posting the correction.

    Stephen Lebans
    Last edited by slebans; 04-20-2011 at 04:37 PM. Reason: missing "for"
