LED flashlight heat dissipation: Is a hot head a good thing? Your opinion?

HighlanderNorth

Flashlight Enthusiast
Joined
Sep 15, 2011
Messages
1,593
Location
Mid Atlantic USA
There seem to be two schools of thought when it comes to figuring out whether a given LED flashlight is doing a good job of keeping its internals cool and dissipating heat. I'll call them theory #1 and theory #2. Theory #1 seems to be a popular one: if a light is getting and staying HOT, that proves the flashlight's design is good and that it is moving heat away from the internals like it should, and the heat itself is proof that it is working properly. I agree with this about 20%.

Theory #2 is the opposite: if a light is getting and staying HOT, then it isn't doing its job of pulling heat away from the internals and dissipating it like it should. I agree with this theory 80%. It's not so cut-and-dried either way.

Most lights are made of aluminum, and I learned early on at my first job in a machine shop that aluminum does a GREAT job of dissipating heat. It does a MUCH better job of it than steel, titanium, magnesium, etc. Heat flows through and out of it quickly, which is probably one of the biggest reasons it is used for LED flashlights. But aluminum alone isn't always enough: you need a good heat sink, and you also need as much aluminum surface area as possible. That's why manufacturers machine cooling fins into their higher-lumen lights, to increase surface area so that more heat can dissipate out and away from the light.
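To put rough numbers on the surface-area point, here's a quick back-of-the-envelope sketch in Python. The convection coefficient and the two surface areas are invented placeholders, not measurements of any particular light:

```python
# Rough sketch: how surface area affects steady-state convective heat loss.
# Numbers are illustrative assumptions, not measurements of any real light.

H_CONV = 10.0          # W/(m^2*K), rough natural-convection coefficient in still air
T_AMBIENT = 25.0       # deg C

def convective_loss_watts(surface_area_m2, body_temp_c):
    """Newton's law of cooling: Q = h * A * (T_body - T_ambient)."""
    return H_CONV * surface_area_m2 * (body_temp_c - T_AMBIENT)

# Hypothetical smooth head vs. the same head with cooling fins (~2.5x the area)
smooth_area = 0.004    # m^2 (about 40 cm^2)
finned_area = 0.010    # m^2 (about 100 cm^2)

for label, area in [("smooth", smooth_area), ("finned", finned_area)]:
    print(f"{label}: {convective_loss_watts(area, 50.0):.1f} W shed at 50 deg C")
# More area sheds the same heat at a lower body temperature,
# which is exactly why fins help.
```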

Here's a parallel analogy. Let's say you built up a '69 Chevelle with a big-block, high-horsepower V-8, but you installed an aluminum radiator from a 600cc motorcycle instead of the large aftermarket high-performance car radiator you should have installed. That little radiator IS made of aluminum, and it will dissipate SOME heat, but it's not large enough and doesn't have enough surface area to handle the large amount of heat, so the engine will overheat.

Well, according to theory #1 above, it's a good thing that it's so hot, because that shows it's keeping heat away from the internals (the engine in this case). That's not true here. It's NOT dissipating heat fast enough, and the high heat is proof of it.

I used that analogy to explain the same basic problem with some of the smaller but high-lumen lights on the market now, like the ZL SC600, or the Jetbeam PA-10 on turbo with a 14500 Li-ion battery. These lights are small, have small-diameter heads, and don't have cooling fins, so there is a lack of surface area to adequately extract and dissipate enough heat to keep the internals cool when running at the highest setting. But take another, larger light that runs at 650-750 lumens with a larger head and cooling fins, and it will do a better job of keeping the internals cool. The larger light with the cooling fins WON'T get as hot as the ZL SC600 or the JB PA-10. But since the larger lights aren't getting as hot, does that mean they aren't doing as good a job at dissipating heat? No... They are doing a better job, which is shown by the fact that they are NOT as hot as the smaller lights.

I said I agreed with theory #1 20%. To some degree, if you can feel heat in the head of the light, that does mean it's pulling heat away from the internals, but only up to a point. If it gets really hot and keeps getting hotter, then it's NOT doing a good job, as it isn't dissipating that heat properly.

I just compared my ET G25C2 and SWM T20CS on turbo recently, as I was trying to run down the voltage of the two 18650s I was using in the two lights so I could safely store those batteries. I turned both on turbo and left them on my nightstand. The T20CS got hot quickly, whereas the G25C2 was slower. After 25 minutes the SWM was probably 140 degrees F and still getting hotter minute by minute, whereas the ET was only about 95 degrees F and wasn't getting any hotter. I removed the battery tubes and felt deep inside to see which one was hotter, and the SWM was clearly the hotter one; the ET was only warm. They both have similar-size heads, but the ET has larger cooling fins and a thicker body (more aluminum).

What's your opinion on this subject?
 

Colonel Sanders

Flashlight Enthusiast
Joined
Aug 17, 2010
Messages
1,022
Location
ROLL TIDE!
Where you will really start to understand this is when you put an IR gun to two lights and discover that the one that felt the coolest is actually by far the hottest.

Take two identical high-powered small lights, except one is bare aluminum and the other is something else like titanium or Cerakoted. Run them on the same level, one in one hand and one in the other. You will find that the aluminum light feels hotter, but the IR gun will say otherwise. The aluminum does a good job of transferring the heat to your hand... and that's why you feel it.
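For anyone curious why the hand is such a poor thermometer, here's a rough sketch of the standard contact-temperature model. The material properties are textbook ballpark values and the 60 C body / 33 C skin temperatures are assumptions, so treat it as an illustration only:

```python
import math

# Very rough "how hot does it feel" sketch using thermal effusivity,
# e = sqrt(k * rho * c). When two semi-infinite bodies touch, the contact
# temperature is weighted by effusivity:
#   T_contact = (e1*T1 + e2*T2) / (e1 + e2)
# Material numbers are textbook ballpark values; the 60 C body and 33 C skin
# temperatures are assumptions for illustration.

def effusivity(k, rho, c):
    return math.sqrt(k * rho * c)

skin = effusivity(0.37, 1100, 3500)       # roughly 1200 J/(m^2*K*s^0.5)
materials = {
    "aluminum": effusivity(167.0, 2700, 900),
    "titanium": effusivity(7.0, 4430, 530),
}

T_BODY, T_SKIN = 60.0, 33.0
for name, e in materials.items():
    t_contact = (e * T_BODY + skin * T_SKIN) / (e + skin)
    print(f"{name}: contact temperature ~{t_contact:.1f} C")
# Both metals are at 60 C, but the aluminum surface holds the skin closer to
# 60 C because its effusivity swamps the skin's -- so it feels hotter.
```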
 

Colonel Sanders

Flashlight Enthusiast
Joined
Aug 17, 2010
Messages
1,022
Location
ROLL TIDE!
...Then, wrap the aluminum light in electrical tape and try again. Wow...sure is (well, feels) cool! Yeah, that electrical tape is a poor conductor of heat compared to the aluminum. Put the IR gun to it and see...it's HOT!
 

saabluster

Flashlight Enthusiast
Joined
Oct 31, 2006
Messages
3,736
Location
Garland Tx
If the outside is not hot then the heat is inside, and that is bad. Surface area is critical, but so is the spreading of heat through the flashlight's body. This requires fine-tuning of the wall thicknesses, which taper as you move away from the main thermal transfer point connecting the inner path to the outside shell. I will post thermal images when I have some more time, if I can remember to do so. Essentially you want (in a traditional flashlight design) the heat to be as homogeneous as possible, with no real "hot spots".
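One crude way to picture that thermal path is as a chain of thermal resistances in series. This is only an illustrative sketch; every resistance and power value below is invented:

```python
# Sketch of the "thermal path" as a chain of thermal resistances (K/W).
# Every number below is an invented placeholder, just to show how the pieces add up:
#   T_led = T_ambient + P_heat * (R_junction_to_star + R_star_to_pill + ... + R_body_to_air)

P_HEAT = 5.0        # W of waste heat from the LED (assumed)
T_AMBIENT = 25.0    # deg C

resistances = {
    "LED junction -> MCPCB star": 3.0,
    "star -> pill (solder / thermal paste)": 1.0,
    "pill -> body threads": 2.0,
    "body -> ambient air (convection + radiation)": 8.0,
}

t = T_AMBIENT
for stage, r_th in reversed(list(resistances.items())):
    t += P_HEAT * r_th
    print(f"{stage}: ~{t:.0f} C on the hot side")
# A single bad joint (say the star floating on air instead of the pill) adds a
# huge resistance early in the chain: the LED cooks while the outside stays cool.
```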
 

eebowler

Flashlight Enthusiast
Joined
Dec 18, 2003
Messages
1,735
Location
Trinidad and Tobago.
I just compared my ET G25C2 and SWM T20CS on turbo recently, as I was trying to run down the voltage of the two 18650s I was using in the two lights so I could safely store those batteries. I turned both on turbo and left them on my nightstand. The T20CS got hot quickly, whereas the G25C2 was slower. After 25 minutes the SWM was probably 140 degrees F and still getting hotter minute by minute, whereas the ET was only about 95 degrees F and wasn't getting any hotter. I removed the battery tubes and felt deep inside to see which one was hotter, and the SWM was clearly the hotter one; the ET was only warm. They both have similar-size heads, but the ET has larger cooling fins and a thicker body (more aluminum).

It's not a simple matter of more mass and the presence or absence of fins. If you had circulating air in the room, yes, the fins would help, but in stagnant air there would likely be little difference. I don't think (not a professional here) that the small difference in mass between the lights would account for a temperature difference of about 45°F. I'm thinking that the drive levels of the LEDs would be the biggest factor. Note: the Eagletac doesn't stay on turbo for long, so that's another factor there.
 

dy5

Newly Enlightened
Joined
Oct 29, 2002
Messages
74
Location
College Park, MD
Interesting question that got me doing some reading. I'm not sure this all relates directly to the OP's question, but maybe it will stimulate more discussion.
- the rate of heat dissipation is not constant. The hotter the flashlight gets, the faster it loses heat. Radiative loss increases with the fourth power of temperature, i.e. very fast, and conduction to the air depends on the temperature difference between the flashlight and the air. This means that ...
- the flashlight will not continue to get hotter and hotter: beyond a certain temperature, the rate of loss equals the rate of production. The more effectively the heat is transferred to the air, the cooler the flashlight, including the LED, will stay (which seems pretty obvious ...)
- in use, the flashlight is moving around and it will stay much cooler than when just sitting on a table. The air moving around the light carries the heat away (convective loss). As Colonel Sanders said, the difference between feeling hot and actually being hot tells you how effectively the metal transfers heat to your hand. I think this also means that holding a warm light should help keep the LED cooler.
My guess is that if the heat from the LED is not transferred to the body of the flashlight, then a cool body indicates that there is a problem. If the heat from the LED is effectively transferred to the body, then a cooler temperature indicates lots of heat dissipation to the air, which is good. Exactly how hot the flashlight gets reflects the balance between how much heat the LED produces and how effectively the body of the light transfers that heat to the surroundings.
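To make that concrete, here's a toy steady-state calculation along those lines. The power levels, surface area, emissivity and convection coefficient are all assumptions chosen just for illustration:

```python
# Toy steady-state model: the body stops heating once heat shed
# (convection + radiation) equals heat generated. All inputs
# (power, area, emissivity, h) are assumptions for illustration only.

SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W/(m^2*K^4)
H_CONV = 10.0          # W/(m^2*K), rough still-air convection coefficient
AREA = 0.008           # m^2 of exposed surface (assumed)
EMISSIVITY = 0.85      # anodized aluminum, approximate
T_AMB = 298.0          # K (25 C)

def heat_shed(t_body_k):
    conv = H_CONV * AREA * (t_body_k - T_AMB)
    rad = EMISSIVITY * SIGMA * AREA * (t_body_k**4 - T_AMB**4)
    return conv + rad

def equilibrium_temp(p_heat_w):
    """Bisect for the body temperature where shedding matches generation."""
    lo, hi = T_AMB, T_AMB + 300.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if heat_shed(mid) < p_heat_w:
            lo = mid
        else:
            hi = mid
    return mid - 273.15

for watts in (2.0, 5.0, 10.0):
    print(f"{watts:.0f} W of waste heat -> settles near {equilibrium_temp(watts):.0f} C")
```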
(Sorry if this all sounds 'teacher-ly' but I can't help myself - I spend most of my time with students (a joy ...) rather than flashlight experts.)
 

HighlanderNorth

Flashlight Enthusiast
Joined
Sep 15, 2011
Messages
1,593
Location
Mid Atlantic USA
If the outside is not hot then the heat is inside, and that is bad. Surface area is critical, but so is the spreading of heat through the flashlight's body. This requires fine-tuning of the wall thicknesses, which taper as you move away from the main thermal transfer point connecting the inner path to the outside shell. I will post thermal images when I have some more time, if I can remember to do so. Essentially you want (in a traditional flashlight design) the heat to be as homogeneous as possible, with no real "hot spots".



Here is one thing I don't fully understand. I thought that an IR heat gun primarily measured external heat. Or at least, even if it is capable of measuring internal heat, high external heat would skew its ability to measure internal temps. In other words, if you point an IR gun at a flashlight or an engine that is hot on the outside, the gun's reading will be driven by the high external temperature regardless of the internal temps. How can an IR device measure through the hot exterior to get an accurate reading of the internal temps? I've watched shows where they used these IR devices to measure the human body's heat loss while wearing certain clothing in a cold environment, and the IR device only measured the radiant heat being lost externally through the clothing. It wasn't measuring internal temps, so I'm not sure how it could accurately read through the high or low external temps of a flashlight to get at the internal temps. The ones I've seen on TV documentaries seem to just pick up whatever heat is being dissipated externally.


I figured that heat dissipation from an LED flashlight wasn't just a product of cooling fins or surface area, but could also be affected by the design of the body. I posted a thread about my comparison of the ET G25C2 and SWM T20CS about 10 days ago or so. I mentioned there that one of the reasons I thought the G25C2 seemed to deal with heat better is that its body is thicker and more substantial in the battery tube area as well as the head, whereas the T20CS has a very thin battery tube. I 'assume' that having more (thicker) aluminum, which is a great conductor of heat, helps spread the heat out instead of concentrating it in a smaller area while it's being dissipated. I'm not sure I'm doing a good job of putting the idea in my head into words here, but I can't think of a better way to describe it right now.
 

HighlanderNorth

Flashlight Enthusiast
Joined
Sep 15, 2011
Messages
1,593
Location
Mid Atlantic USA
Interesting question that got me doing some reading. I'm not sure this all relates directly to the OP's question, but maybe it will stimulate more discussion.
- the rate of heat dissipation is not constant. The hotter the flashlight gets, the faster it loses heat. Radiative loss increases with the fourth power of temperature, i.e. very fast, and conduction to the air depends on the temperature difference between the flashlight and the air. This means that ...
- the flashlight will not continue to get hotter and hotter: beyond a certain temperature, the rate of loss equals the rate of production. The more effectively the heat is transferred to the air, the cooler the flashlight, including the LED, will stay (which seems pretty obvious ...)
- in use, the flashlight is moving around and it will stay much cooler than when just sitting on a table. The air moving around the light carries the heat away (convective loss). As Colonel Sanders said, the difference between feeling hot and actually being hot tells you how effectively the metal transfers heat to your hand. I think this also means that holding a warm light should help keep the LED cooler.
My guess is that if the heat from the LED is not transferred to the body of the flashlight, then a cool body indicates that there is a problem. If the heat from the LED is effectively transferred to the body, then a cooler temperature indicates lots of heat dissipation to the air, which is good. Exactly how hot the flashlight gets reflects the balance between how much heat the LED produces and how effectively the body of the light transfers that heat to the surroundings.
(Sorry if this all sounds 'teacher-ly' but I can't help myself - I spend most of my time with students (a joy ...) rather than flashlight experts.)



That mostly makes sense and relates directly to this subject. You touched on something I noticed and mentioned in the OP: when I compared my ET G25C2 to my SWM T20CS (both similar size, same battery, similar power), the ET G25C2 only got up to around 90-100 degrees F and then stopped getting hotter after about 15-20 minutes, but the T20CS seemed to keep getting hotter for the full 25 minutes they ran. It was significantly hotter than the G25C2. I'm guessing at the temps, but the T20CS was right at the point where you could hold its head in your hand for about 7-10 seconds before the heat started to feel a bit uncomfortable, which I'm guessing is around 130-140 degrees F, whereas the ET was what I'd call warm, certainly not hot. So if what you are saying is true, and the ET wasn't getting hotter, then it should be doing a good job, whereas the SWM was still getting hotter, meaning it wasn't doing an adequate job. The idea that if it's staying cooler, it's doing a better job of conducting and dissipating heat (theory #2) seemed like simple common sense to me at first, but if you think about theory #1, it also makes sense up to a certain point, or a certain amount of heat. So it is kind of a quandary as to where that 'certain point' is.
 

Colonel Sanders

Flashlight Enthusiast
Joined
Aug 17, 2010
Messages
1,022
Location
ROLL TIDE!
"Here is one thing I dont fully understand. I thought that an IR heat gun simply measured primarily external heat."

Hang on... no one ever said the IR gun was measuring internal temperature. What I meant was it will give you the actual external temperature rather than the false temp reading that your hand gives you. The signal that the nerves in your hand send to your brain reflects heat transfer rather than actual temperature. So, you can have one object (A) that is much hotter than another object (B) but that feels cooler to your hand due to poor heat transfer. However, if you zap it with an IR gun then the truth will be known: (A) is hotter than (B). The IR gun does not measure heat transferred to it (since there is none transferred to it). It tells you the surface temperature.

FWIW, the IR gun I use is an Extech bought at Lowe's. It's a very good unit, IMO. If you don't have one, I highly recommend it. I use mine all the time for many reasons.
 

AnAppleSnail

Flashlight Enthusiast
Joined
Aug 21, 2009
Messages
4,200
Location
South Hill, VA
[An IR thermometer] will give you the actual external temperature rather than the false temp reading that your hand gives you. The signal that the nerves in your hand send to your brain reflects heat transfer rather than actual temperature.
An IR thermometer does not measure surface temperature. In fact, it may be impossible to directly measure temperature without contact. An IR thermometer measures emitted radiation. Try this experiment:

Paint half of an aluminum bar black, leave the other shiny. Apply heat uniformly. Your IR thermometer will probably record a rather higher temperature for the black half. This is because black paint has higher emissivity than aluminum does. There are some IR thermometers with calibration tables to account for these differences, but you must consider emissivity.

It's thermodynamics. First, define your system. I take mine to be: LED and flashlight. The rate of removing heat from the LED cannot be greater than the rate of rejecting heat to the environment for very long. This imbalance leads to heating of the flashlight body, and is corrected most effectively by cooling the body better. In air, a reasonably-sized flashlight will become definitely warm if it is removing a lot of heat from the LED. After that it's mostly Newton's Law, where twice the temperature difference (Ambient to light) gives twice the heat rejection.
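Here's a small numerical illustration of the emissivity point, using a simple grey-body model. Real IR guns work over a limited wavelength band, and the emissivity values are approximate handbook figures, so this is a sketch rather than a calibration:

```python
# Toy model of why an IR thermometer under-reads bare aluminum: the gun sees
# radiance = eps*sigma*T_true^4 + (1-eps)*sigma*T_background^4 (reflection),
# then back-solves for temperature assuming a fixed emissivity (0.95 here).
# Emissivity values are textbook approximations; the 80 C bar is an assumption.

SIGMA = 5.67e-8
EPS_ASSUMED = 0.95     # what the gun thinks it is looking at
T_TRUE = 80.0 + 273.15 # K, actual surface temperature
T_BG = 25.0 + 273.15   # K, reflected room background

def gun_reading_c(eps_actual):
    radiance = eps_actual * SIGMA * T_TRUE**4 + (1 - eps_actual) * SIGMA * T_BG**4
    t_read4 = (radiance - (1 - EPS_ASSUMED) * SIGMA * T_BG**4) / (EPS_ASSUMED * SIGMA)
    return t_read4 ** 0.25 - 273.15

for surface, eps in [("flat black paint", 0.95), ("black anodize", 0.85),
                     ("polished aluminum", 0.05)]:
    print(f"{surface}: gun shows ~{gun_reading_c(eps):.0f} C (true 80 C)")
# The shiny half of the bar reads tens of degrees low unless the gun's
# emissivity setting is corrected for it.
```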
 

Hondo

Flashlight Enthusiast
Joined
Oct 26, 2005
Messages
1,544
Location
SE Michigan
I think maybe the point of what Colonel Sanders is trying to say is that if you soak a block of aluminum and a block of wood in an oven at 150 deg. F, which one would you want to hold in your hand? You could probably tolerate the wood, but not the aluminum. If they were both painted the same color, the IR thermometer would probably give pretty close to the same reading on both, but you would perceive the aluminum as being hotter.

As to the original question, I think what is of first importance is getting the heat from the LED into the body of the light. If the LED base is stranded on a plastic part, or hanging in the air, it is going to take a VERY long time for the heat to reach the aluminum body. Therefore, what I look at first when evaluating effectiveness of the thermal path is HOW FAST does the light body get hot?

Now, if it reaches an alarming temperature, I may conclude that it is best not to run it for long periods, or not without convective cooling (i.e. good for a bike light). If it heats up quickly and stabilizes at a moderate temperature, it has a good thermal path and is safe to run at that level for extended periods. If it very slowly heats to a fairly high temperature, or worse, the LED is cooking in its own heat.
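One way to put rough numbers on "how fast does the body get hot" is the lumped-capacitance warm-up curve. The mass, area, convection coefficient and final temperature below are invented round numbers, purely to show the shape of the curve:

```python
import math

# Lumped "how fast does the body warm up" sketch. If the heat actually
# reaches the aluminum body, the body temperature rises roughly as
#   T(t) = T_final - (T_final - T_ambient) * exp(-t / tau),  tau = m*c / (h*A)
# Mass, area and h below are invented round numbers, not any specific light.

MASS = 0.10          # kg of aluminum body (assumed)
C_AL = 900.0         # J/(kg*K), specific heat of aluminum
H_CONV = 10.0        # W/(m^2*K), still air
AREA = 0.008         # m^2 exposed surface (assumed)
T_AMB, T_FINAL = 25.0, 65.0   # deg C, assumed steady state for the drive level

tau = MASS * C_AL / (H_CONV * AREA)   # seconds
for minutes in (1, 5, 15, 30):
    t = minutes * 60
    temp = T_FINAL - (T_FINAL - T_AMB) * math.exp(-t / tau)
    print(f"after {minutes:2d} min: ~{temp:.0f} C  (tau ~ {tau/60:.0f} min)")
# A good thermal path makes the *body* follow this curve within minutes.
# If the body is still cool long after turn-on, the heat is stuck at the LED.
```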
 

shelm

Flashlight Enthusiast
Joined
Dec 8, 2011
Messages
2,047
Very simple, OP: let's assume you have two identical torches (e.g. the Eagtac D25LC2) and leave them tailstanding on Turbo mode. Touching the two samples after 5 minutes, sample 1 is hot and sample 2 is not. This means that sample 1 is functioning as it should and sample 2 has a "broken thermal path", e.g. the LED star is not connected properly to the copper heat sink.

In another example with bigger mass, e.g. the Klarus P2A, the torch doesn't feel hot because the heat it produces goes into a larger heat sink.

You cannot compare different torches directly, because we don't know their exact heat generation rate at the LED, which depends on the momentary LED operating conditions (voltage, amperage, temperature, emitter model, emitter specimen), and because we can't see the construction parts relevant to heat conduction.

As this short post demonstrates, a hot torch can signal optimal functioning (D25LC2), and so can a warm torch (P2A). So if one compares the P2A with the D25LC2, there is no conclusion to be drawn with respect to the OP ("Is a hot head a good thing?"). The D25LC2 is designed to get hot (because of its copper pill and thin aluminum body), and the P2A is designed to stay cool (because of its massive, heavy, thick aluminum construction).

Generally speaking, though: a hot head is a good thing, period.
 

^Gurthang

Flashlight Enthusiast
Joined
Jul 2, 2009
Messages
1,071
Location
Maine, deep in the Darkness of the North
+1 on Shelm's answer. A small EDC light [like the D25] isn't made to run on "ultra-mega-turbo" for long periods, so it will get hot; in fact it can get too hot to hold [literally]. The P2A and its brethren are made for running on high for long periods, so they're designed to operate at a sustainable temperature and not damage the emitter or driver.
 

fyrstormer

Banned
Joined
Jul 24, 2009
Messages
6,617
Location
Maryland, Near DC, USA
It's not a simple matter of more mass and the presence or absence of fins. If you had circulating air in the room, yes, the fins would help, but in stagnant air there would likely be little difference. I don't think (not a professional here) that the small difference in mass between the lights would account for a temperature difference of about 45°F. I'm thinking that the drive levels of the LEDs would be the biggest factor. Note: the Eagletac doesn't stay on turbo for long, so that's another factor there.
Fins work in stagnant air too. The reason is that a hot flashlight will generate convection, which makes the air move without needing a breeze or a fan. Once the air is moving, the fins' extra surface area dissipates heat more effectively.

EDIT: Also, when using a flashlight, the user almost never holds the flashlight perfectly still, so the flow of air around the moving flashlight helps cool it, and fins are useful in this circumstance as well.
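Roughly speaking, moving air mostly raises the convection coefficient. A crude sketch with ballpark (not measured) coefficients:

```python
# Rough comparison of still-air vs. moving-air cooling, ignoring radiation.
# The convection coefficients are ballpark textbook ranges, not measurements.
# Steady state: P = h * A * dT  ->  dT = P / (h * A)

P_HEAT = 5.0     # W of waste heat (assumed)
AREA = 0.010     # m^2 with fins (assumed)

scenarios = {
    "still air, natural convection": 8.0,    # W/(m^2*K), rough
    "light waved around by hand": 25.0,      # W/(m^2*K), rough forced convection
    "bike-light airflow": 60.0,              # W/(m^2*K), rough
}

for label, h in scenarios.items():
    dT = P_HEAT / (h * AREA)
    print(f"{label}: body runs ~{dT:.0f} C above ambient")
```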
 

Bl@ckR0ck

Newly Enlightened
Joined
Jul 3, 2012
Messages
26
The head temp alone isn't enough information to know whether it's "good" or not for the flashlight. A light could run with an extremely hot head yet transfer enough heat to maintain an acceptable operating temp for the LED, batteries and other pieces. It could go the other way too: a light could have poor heat transfer, which would cause operating temps to rise continually until something broke. It doesn't really matter where the heat dissipates, just whether or not the light can move enough of that heat away from the temperature-sensitive components.

I think the real question is whether your light can transfer enough heat to operate without failure under your required conditions. To figure that out you'd need more info on the light and the components used. Everything is going to have a temp limit, and many things start degrading and working less efficiently when hot, too.

I'd say the LED itself could be the limiting factor or first point of failure when a light overheats. You'd have to figure out how hot the LED and control unit can get and still operate, then you'd simply need to measure the temp of that piece while the light was on and running at the desired output. An IR gun is probably going to do a good job. It's going to give you the surface temp of wherever you're pointing it, but the temperature gradient from the inside of the light by the LED to the outer surface probably isn't going to be that big. So if you read 160° on the hottest part of the light, it's a good bet the LED at the center is at least 160°, if not maybe 5-10° hotter. The questions then are: how hot is your light running, and can it maintain that temp at the given output over time, or will it just continue to heat until it reaches the temp limits of the LED and fails?
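To turn an outside reading into a junction estimate, the usual back-of-the-envelope is T_junction ≈ T_surface + waste heat × internal thermal resistance. The resistances and wattage here are assumptions, not datasheet numbers for any specific emitter or light:

```python
# Back-of-the-envelope estimate of LED junction temperature from an outside
# reading, T_junction ~ T_surface + P_heat * R_internal. The thermal
# resistance and power figures below are assumptions, not datasheet values
# for any particular emitter or light.

def junction_temp_c(surface_temp_c, waste_heat_w, r_internal_c_per_w):
    return surface_temp_c + waste_heat_w * r_internal_c_per_w

SURFACE_C = 71.0      # ~160 F read on the hottest part of the head
WASTE_HEAT_W = 6.0    # assumed waste heat from the emitter at this drive level

# junction->star->pill->body lumped together; good path vs. poor path
for label, r_int in [("good thermal path (~4 C/W)", 4.0),
                     ("poor thermal path (~15 C/W)", 15.0)]:
    print(f"{label}: junction ~{junction_temp_c(SURFACE_C, WASTE_HEAT_W, r_int):.0f} C")
# With a good path the junction sits a few tens of degrees above the shell;
# with a poor one it can be far hotter than any outside reading suggests.
```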
 

Fireclaw18

Flashlight Enthusiast
Joined
Mar 16, 2011
Messages
2,408
Another thing that apparently helps heat dissipation is the external color of the flashlight body. Dark colors, especially black, supposedly do a much better job of radiating heat to the atmosphere than light colors or bare metal.
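For a rough sense of scale, here's a radiation-only comparison. The emissivities are approximate handbook values, and the surface area and temperatures are assumed:

```python
# Rough radiative-only comparison for the anodize-vs-bare question:
# P_rad = eps * sigma * A * (T_body^4 - T_ambient^4). Emissivities are
# approximate handbook values; the area and temperatures are assumptions.

SIGMA = 5.67e-8
AREA = 0.006              # m^2 (assumed small single-AA light)
T_BODY = 80.0 + 273.15    # K
T_AMB = 25.0 + 273.15     # K

for finish, eps in [("black Type II anodize", 0.85), ("polished bare aluminum", 0.05)]:
    p_rad = eps * SIGMA * AREA * (T_BODY**4 - T_AMB**4)
    print(f"{finish}: ~{p_rad:.2f} W radiated")
# Radiation isn't the whole story (convection doesn't care about the finish),
# but stripping the anodize does cost most of the radiative share.
```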

I have a Sipik 58 budget light that I modded. It started out as a small single-AA-size light with a zoomable aspheric lens. I replaced the emitter with a neutral XM-L T6 and the driver with a 2.8-amp custom-programmed Nanjg 105C, running exclusively on an AW IMR 14500. I removed the rather scratched-looking Type II black anodize with Greased Lightning and then buffed the bare aluminum.

Result: The light looks great and is really bright. But it gets really, really hot if left running for any period of time, especially when not in the hand. It accidentally turned on in my pocket a few days ago and I didn't notice for maybe 10 minutes. When I pulled it out, the entire light was too hot to touch (both inside and out).
Too bad I didn't know about the benefits of the black anodize for heat dissipation before I stripped it off. Oops!:oops: Bare metal might be fine for conducting heat to your hand, but apparently it isn't very good at radiating it to the atmosphere.

After I finally managed to unscrew the light and pull out the battery (which was also quite hot), I then let it cool down, which took another 10-15 minutes or so. Amazingly, both the light and the battery appear to be undamaged and both still work fine.
 

Colonel Sanders

Flashlight Enthusiast
Joined
Aug 17, 2010
Messages
1,022
Location
ROLL TIDE!
I'm sure you're technically spot on about what an IR gun reads. However, my point is that how hot something feels is not necessarily representative of how hot it really is. Bare aluminum will feel somewhat hotter than most other materials and MUCH hotter than some even though the actual temperature is lower. One might get the idea that the aluminum light is running hotter and they'd be wrong. In my experience, an IR gun (by whatever means it measures) is a good tool to use to get a good idea of what's really going on.

I'll have to try fooling the IR gun with different colors. That's interesting. Thanks. :thumbsup: As much as I've used one, I've never gotten the idea that this is the case, but then, I haven't been looking for it. It'll be very easy to test, though.

An IR thermometer does not measure surface temperature. In fact, it may be impossible to directly measure temperature without contact. An IR thermometer measures emitted radiation. Try this experiment:

Paint half of an aluminum bar black, leave the other shiny. Apply heat uniformly. Your IR thermometer will probably record a rather higher temperature for the black half. This is because black paint has higher emissivity than aluminum does. There are some IR thermometers with calibration tables to account for these differences, but you must consider emissivity.

It's thermodynamics. First, define your system. I take mine to be: LED and flashlight. The rate of removing heat from the LED cannot be greater than the rate of rejecting heat to the environment for very long. This imbalance leads to heating of the flashlight body, and is corrected most effectively by cooling the body better. In air, a reasonably-sized flashlight will become definitely warm if it is removing a lot of heat from the LED. After that it's mostly Newton's Law, where twice the temperature difference (Ambient to light) gives twice the heat rejection.
 

Colonel Sanders

Flashlight Enthusiast
Joined
Aug 17, 2010
Messages
1,022
Location
ROLL TIDE!
"Another thing that apparently helps heat dissipation is the external color of the flashlight body. Dark colors, especially black supposedly do a much better job at radiating the heat to atmosphere than light colors or bare metal."

"I removed the rather scratched looking Type II black anodize with Greased Lightning and then buffed the bare aluminum.

Result: The light looks great and is really bright. But it gets really, really hot if left running for any period of time, especially when not in the hand."


You have just illustrated exactly what I've been talking about. The light is actually shedding the heat (rather than holding it in) much quicker now that it is bare aluminum. The result is that it's now doing a BETTER job of getting rid of the heat, and that's why it feels hotter sooner. And when you hold it in your hand, it transmits the heat quickly to your hand, making you think "WOW! This thing is HOT!"... though as a whole the light is cooler than it would have been when anodized.
 