To cool, or not to cool, during Runtime?

What do you think?


  • Total voters: 44

EngrPaul

Flashlight Enthusiast
Joined
Sep 28, 2006
Messages
3,678
Location
PA
I recently read suggestions on how to cool a flashlight during a runtime test.

When I see a runtime graph, I'd like to see how the light performs in use. My opinion is that the test setup should be as-is, without any additional cooling methods.

If the emitter tends to get dim due to being overdriven, or because it's poorly heat sinked, I'd like to know about it. The same could be assumed regarding the driver and the cells.

Anyone else feel this way? :tinfoil:

P.S. I voted wrong and can't fix it. :green:
 
Last edited:

TOTC

Newly Enlightened
Joined
Nov 12, 2004
Messages
161
I voted "cool it" since I rarely set a flashlight down when I'm using it in real situations for extended times. Simply holding the light offers a degree of added heatsinking.

Until someone makes a hand out of ballistics gel that can hold the light for runtime graphs, I'm fine with people using other cooling methods.


But seriously, it already takes an investment of time and money from the people that put these together for us. Asking them to do it in a manner that might permanently damage their lights seems to be pushing it just a little...
 

Archangel

Flashlight Enthusiast
Joined
May 29, 2005
Messages
1,182
Location
PA, USA
I always have a light fan blowing on it to simulate my hand and whatever breeze might happen to be blowing were I to actually be using it. I don't feel the purpose of a run-time test should be to weed out poorly-designed torches.
 

alphazeta

Enlightened
Joined
Mar 22, 2007
Messages
292
Location
NYC
I voted let it ride, because I primarily use my lights indoors & tend to leave them candlestanding in rooms for a few hours each time.

*EDIT* However, for runtime graphs I think some type of passive cooling would be wise (e.g., perhaps something like resting the light on a half pipe of copper). Active cooling such as a fan might skew the test too much (but, of course, its usage is definitely understandable).

Perhaps a brief synopsis on the test environment should accompany each runtime chart?
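
For example, even a short list of conditions would go a long way. The fields and values below are purely illustrative, not a proposed standard:

```python
# Hypothetical example of the test-condition notes that could accompany a runtime chart.
# Every field name and value here is illustrative only.
test_conditions = {
    "light": "example light, medium mode",
    "cell": "1 x protected 18650, freshly charged",
    "ambient_temp_C": 22,
    "orientation": "candlestanding (head up) on a wooden desk",
    "cooling": "none (passive, still air)",
    "sampling_interval_s": 60,
}

if __name__ == "__main__":
    for key, value in test_conditions.items():
        print(f"{key}: {value}")
```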
 
Last edited:

amanichen

Enlightened
Joined
Apr 23, 2006
Messages
335
Location
Virginia
Until someone makes a hand out of ballistics gel that can hold the light for runtime graphs
Ballistics gel has similar density and elasticity to human flesh.

It does not however have similar thermal conductivity, nor does it have a circulatory system which carries away additional heat.
 

jtice

Flashaholic
Joined
May 21, 2003
Messages
6,331
Location
West Virginia
When I did my runtimes, I would place a 120mm computer fan a few inches away.
Lights can get VERY hot if just left sitting like that.
Plus, in real use, if you hold the light in your hand, your hand will wick away heat.

~John
 

TOTC

Newly Enlightened
Joined
Nov 12, 2004
Messages
161
Ballistics gel has similar density and elasticity to human flesh.

It does not however have similar thermal conductivity, nor does it have a circulatory system which carries away additional heat.
I see that my weak attempt at a joke was taken a bit too literally.
 

McGizmo

Flashaholic
Joined
May 1, 2002
Messages
17,292
Location
Maui
If the reason for runtime is to try to get a feel for what the runtime would be in real use, cooling the light will likely reduce the runtime length and this may be more indicative of runtime in real use. If you want a relative comparison with another light then it is more important to be consistent with your ambient conditions. I see value in relative comparisons but wouldn't count on a measured constant-on runtime as an indicator of what I could expect with typical intermittent use. Some may recall some tests I did running the same light at the same temp in air and water and the significant difference this had on runtime! The operating temp of the battery plays a major role in runtime.

If your runtime test is primarily focused on evaluating light output over the run then head temp is significant if the light is regulated to the point that Vin can vary with little output effect.

Operating temperature will affect the outcome of these tests. You can alter the temp with forced cooling but unless you know what the resulting operating temp is, you can't really quantify or qualify your test, it would seem to me. :shrug:

If you want to give the light a break while testing it, certainly cooling it is friendly! :) If you want to subject it to a little abuse and see how it takes it then don't cool it.
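
As a sketch of how that operating temperature could be recorded alongside output so the test can actually be quantified, something like the following would do. It is purely illustrative: read_lux() and read_head_temp() are placeholders for whatever light meter and temperature probe are actually on hand, not any specific hardware's API.

```python
import csv
import time

def read_lux():
    """Placeholder: return the current output reading from whatever light meter is used."""
    raise NotImplementedError

def read_head_temp():
    """Placeholder: return the flashlight head temperature in degrees C."""
    raise NotImplementedError

def log_runtime(path, interval_s=30, cutoff_fraction=0.5):
    """Log relative output and head temperature until output drops below the cutoff."""
    start = time.time()
    initial = read_lux()
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_min", "relative_output", "head_temp_C"])
        while True:
            lux = read_lux()
            writer.writerow([round((time.time() - start) / 60, 2),
                             round(lux / initial, 3),
                             read_head_temp()])
            f.flush()
            if lux < cutoff_fraction * initial:
                break
            time.sleep(interval_s)
```

With the head temperature in the same log, a forced-cooling run can at least state what operating temperature it actually produced.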
 

EngrPaul

Flashlight Enthusiast
Joined
Sep 28, 2006
Messages
3,678
Location
PA
Thanks for the comments. I see the reasons for doing it both ways.

I believe conduction into the blood stream is overrated. By the time the flashlight is hot enough to transmit significant heat into your hand, you will probably want to let go.

I guess it all depends on what you want runtime to mean. Are you testing battery life from occasional use, or are you evaluating the performance of the light?

For instance, I think it's important that we know the Maglite brand drop-in has a significant reduction in output after a few minutes of operation. Had the drop-in been cooled artificially, we would not see that the flashlight has a severe design issue in real use. Personally, I think this is important.

:popcorn:
 

2xTrinity

Flashlight Enthusiast
Joined
Dec 10, 2006
Messages
2,386
Location
California
Thanks for the comments. I see the reasons for doing it both ways.

I believe conduction into the blood stream is overrated. By the time the flashlight is hot enough to transmit significant heat into your hand, you will probably want to let go.
I disagree, based on the following case -- the L0D-CE on 10440. If I drain a cell with the light sitting unattended, it reaches a high temperature fast because the light is so small that the heat doesn't normally have anywhere to go. In cases where I've picked up that light after it's been set down, it's been fairly uncomfortable to handle. However, if I run the light in my hand the entire time, it gets warm, but never uncomfortably so -- it never reaches as high a temperature because of the cooling effect of holding the light in-hand. That is an extreme case, as there are no other lights I know of that consume over 3 watts yet at the same time are so small, but it does demonstrate that conduction into the hands can make a very significant difference.

For instance, I think it's important that we know the Maglite brand drop-in has a significant reduction in output after a few minutes of operation. Had the drop-in been cooled artificially, we would not see that the flashlight has a severe design issue in real use. Personally, I think this is important.

:popcorn:
Cooling the outer shell of the light would do absolutely nothing -- the problem with the Maglite is that there is no thermal pathway from the bulb holder to the outer skin of the light. I don't even think submerging the light in ice water would make any difference. However, I believe that just about all these high-end flashlights should have thermal feedback, not just weird cases like the MagLED, in case they are turned on in a hot vehicle, or jacket pocket or something -- specifically to prevent damage to the emitter, or to the batteries.

IMHO the best way to test runtime would be to run the light in, say, 20-minute intervals, then allow the light to cool (or even force-cool it) in between -- that is a closer simulation of real-world use IMO. In the case of the MagLED, the light will actually have much, much longer runtime if it is run for extended intervals -- as initially, output drops due to heating, then after that, power draw is intentionally lowered. Measuring the runtime in intervals might better reflect that.
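
As a rough sketch of what that interval-style protocol might look like if scripted -- purely illustrative: read_lux is assumed to be a function returning the current output reading, and actually switching the light on and off is left to the person running the test:

```python
import time

ON_MINUTES = 20    # run interval suggested above
REST_MINUTES = 10  # cool-down between intervals; this value is arbitrary here

def run_interval_test(read_lux, cycles=6, sample_s=60):
    """Walk the tester through on/rest cycles, collecting (on-time, relative output) samples."""
    samples = []        # (cumulative minutes of "on" time, output relative to first reading)
    on_minutes = 0.0
    initial = None
    for cycle in range(1, cycles + 1):
        input(f"Cycle {cycle}: turn the light ON and press Enter")
        start = time.time()
        while (time.time() - start) < ON_MINUTES * 60:
            lux = read_lux()
            if initial is None:
                initial = lux
            samples.append((on_minutes + (time.time() - start) / 60, lux / initial))
            time.sleep(sample_s)
        on_minutes += ON_MINUTES
        input(f"Turn the light OFF, let it cool for {REST_MINUTES} minutes, then press Enter")
    return samples
```

Plotting output against cumulative on-time would then show the step-down behavior across intervals rather than a single uninterrupted run.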
 

jtice

Flashaholic
Joined
May 21, 2003
Messages
6,331
Location
West Virginia
I believe conduction into the blood stream is overrated. By the time the flashlight is hot enough to transmit significant heat into your hand, you will probably want to let go.

I would have to disagree here also.
Remember, we aren't really talking about picking up a light that has been running a while on a table top.
We are talking about holding the light the entire time.

Although you can do that also,
I have picked up a light after it has sat and run a while, and it was quite hot.
The tests I have done have shown that a considerable amount of heat can be taken away from the light in a short amount of time.

I have even held some lights up toward the head on purpose to keep them running cooler,
such as my Surefire L4, man, that thing is a hand warmer in the winter. (yes, I've done it)

So, what is more accurate/realistic for a runtime?
That is still debatable. I don't think just setting the light on a table with no cooling at all would be all that accurate, nor good for some lights.
Though too much cooling could certainly throw the runtime way off.

I even experimented with laying about a 1/4" thick layer of wet paper towels on the lights, with no fan.
That actually worked fairly well, and acted a bit more like the human hand carrying away the heat.
But the towel would eventually totally saturate with heat.

~John
 

PhantomPhoton

Flashlight Enthusiast
Joined
Jan 15, 2007
Messages
3,116
Location
NV
Honestly in the tradition of CPF I say do both! :twothumbs
Do it in the name of Science!

...But I know that this isn't always possible or practical so my vote goes for let it ride.

The best representation of real-world use would likely be in your hand, but again this just isn't an easily accomplished task, especially when you're testing an 18650-run, high-efficiency LED on low power. :laughing:

I definitely would like whatever the testing method is to be stated. This is a very interesting point you bring up. If the light was tested while its head was encased in ice... well obviously someone's trying to inflate their numbers a bit.
 

Sub_Umbra

Flashlight Enthusiast
Joined
Mar 6, 2004
Messages
4,748
Location
la bonne vie en Amérique
Interesting discussion.

I think that real-world light usage in the wild is so varied that unless the reviewer has a certain user demographic in mind, he should probably just do whatever he feels would be most appropriate for a given light and its intended users. He should always explicitly disclose in the review exactly what he did or did not do to cool the light. That way the reader may still attempt to interpret the reviewer's findings in a way that may give him some idea of how the same light may perform under his own set of circumstances.

The same model light may be used by cavers in the cold depths or by photographers in the jungle. No one methodology will adequately serve users in all situations. I think full disclosure is a reasonable compromise.
 

NoFair

Flashlight Enthusiast
Joined
Dec 22, 2004
Messages
1,556
Location
Norway
I voted cool it.

Unless you mostly leave your lights in candle-mode I think cooling the light gives more realistic runtimes than not.

I'm not thinking of leaving it on a block of ice, but of using a fan, glass of water, wet cloth or similar to prevent the light from getting very hot.

I've never had a light get really hot in my hand, but bright led lights can get very hot if left alone for an entire run.

Unless they step down when they get hot, like HDS lights do, some lights could probably damage themselves or the batteries if left uncooled for an entire run.

Sverre
 

Bertrik

Newly Enlightened
Joined
Aug 3, 2005
Messages
102
Location
Netherlands
I agree with full disclosure of the testing method, but I voted to let it ride (passive cooling only) unless there's a compelling reason to do otherwise.

IMO, it is important to have a uniform method of testing to be able to compare runtime test results from different people for different flashlights. If active cooling or duty cycle (x minutes on, y minutes off) becomes part of the testing method, some kind of standard needs to be defined and I think it may be tricky to reach consensus on the exact method. When a standard is eventually defined it may be too complicated for some people to perform (e.g. due to lack of materials).

The simplest method then, IMO, is just to let it ride. Although not representative of real-world use, it should at least give a limit for worst-case use.
 

The_LED_Museum

*Retired*
Joined
Aug 12, 2000
Messages
19,414
Location
Federal Way WA. USA
I voted "Let it ride", as active cooling might skew the test results; thus queering the test.
If an emitter becomes damaged by thermal means, that simply means the light under test was improperly designed; it should not have failed in that manner. :green:
 

KingGlamis

Banned
Joined
Jun 10, 2007
Messages
745
Location
Mesa, AZ
I wonder how my lights are affected here in AZ? 110 degrees in the day right now and 85 as the low at night. I left an LED light outside yesterday in the hot weather, and when I brought it inside and tried to use it, it was having issues. Once I let it cool down it was fine. Hmmm...
 