# Luxeon 5-Watt UNDERdriven On 6V...

#### MR Bulk

##### Flashaholic
...in a well-heatsunk big metal flashlight, like say a D-cell Mag -- what would the temps be like? Wayne? Don? Peter? Anybody?

#### dat2zip

##### Flashlight Enthusiast
I can say that my Terra Destroyer levels off around 110F, with a peak reading of about 115F. Now that I have an IR temp probe, I can confirm that this is around the temperature I consider near heatsink level: almost unbearable to grab.

I'm not driving the full 5W; I'm at about 6.8V and 660mA, which is 4.49W for the LED. At 85% efficiency the converter adds another 0.67W, if I did my math correctly, so the total, not including heat from the batteries, is about 5.16W. The 123 batteries complain quite a bit and get warm or hot as well, adding to the heat equation.
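Wayne's arithmetic can be sketched in a few lines. One small caveat: treating the converter loss as 15% of the output power (as the ~0.67W figure does) slightly undercounts it, since input power is output divided by efficiency:

```python
# Sketch of Wayne's power arithmetic (values copied from his post).
led_v, led_i = 6.8, 0.660        # volts and amps at the LED
eff = 0.85                       # stated converter efficiency

p_led = led_v * led_i            # ~4.49 W dissipated in the LED
# Wayne's shortcut: loss = 15% of the LED power
loss_approx = p_led * (1 - eff)  # ~0.67 W
# Strictly, input = output / efficiency, so the loss is a bit larger
loss_exact = p_led / eff - p_led # ~0.79 W

total = p_led + loss_approx      # ~5.16 W, Wayne's figure
```

Either way, the batteries' own internal heating is on top of this.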

You can perform a simple heat experiment to determine the thermal resistance of any metal object, which will help determine what the flashlight body temperature will be.

Wire up a Luxeon emitter thermally glued to the inside of the battery tube (for the MAG D style, the open space around the bulb works well).

Bring out the wires and hook the LED to a suitable power supply.

Now apply power until you get 1W. Volts * current = watts, so for a typical Luxeon at around 3.3V you want 0.303A. This means the LED is dissipating 1W. Let the tube stabilize to the new temperature (5-15 minutes).

Now use the following equation to ballpark what the flashlight you are testing has in thermal resistance.

(Flashlight_body_temperature - Ambient) / Watts = Rfl

Where Ambient is the room temperature,
Flashlight_body_temperature is the temperature of the flashlight, and
Rfl is the thermal resistance in F/Watt (convert this to C/Watt if needed; most equations use degrees C/Watt).

How to use the number.

Simply multiply this Rfl by the total Watts the LED and all other components are consuming inside the flashlight.

Let's say, for example, I measure 110F on my flashlight. From my calculations above I'm pumping in 5.16W. The SureFire M2 with aluminum head insert and electronics then has a thermal resistance of (110F - 77F) / 5.16W = 33F / 5.16W, or 6.39 F/Watt.

Now assume I put in a 1W LED (1W total including converter): the flashlight body would rise 6.39 degrees Fahrenheit above room temperature.

Degrees_F/W * Watts = Temperature in F rise above ambient.
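Wayne's whole procedure condenses to a couple of lines; the function names here are just illustrative:

```python
def thermal_resistance_f_per_w(body_temp_f, ambient_f, watts):
    """Thermal resistance of the flashlight body, in degrees F per watt."""
    return (body_temp_f - ambient_f) / watts

def temp_rise_f(r_fl, watts):
    """Predicted body temperature rise above ambient, in degrees F."""
    return r_fl * watts

# Wayne's example: 110F body, 77F ambient, 5.16W total dissipation
r_fl = thermal_resistance_f_per_w(110.0, 77.0, 5.16)  # ~6.39 F/W
rise_at_1w = temp_rise_f(r_fl, 1.0)                   # ~6.39 F above ambient
```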

Does this help?

WayneY

#### Doug S

##### Flashlight Enthusiast
Wayne offers a good test procedure above. I have a couple of additional comments. For a massive flashlight like a D-cell Mag, it will take longer to reach steady-state temperature than the 5-15 minutes suggested if tested at a power of 1W. Just check the temp every so often until you feel the continued rise is small enough to be "good enough". You don't have to use a Luxeon; a resistor will be just as good. You can reach steady-state temperature faster by removing the batteries [less mass to heat up]. You can get somewhat greater resolution by testing at a higher power level. Just remember to divide your temperature rise by the input watts to get your thermal resistance in degrees/watt.

#### McGizmo

##### Flashaholic
Wayne and Doug,

Thanks for these useful pieces of information! This is one of those threads where it's a bummer that it will slide off the chart and this info will be hard to find later on.

Doug, how do you suggest imparting the resistor heat to the battery tube? Will thermal grease suffice? Won't some of the heat dissipate into the air in this case, as opposed to a closed flashlight where the heat will, one way or another, need to pass through the body of the light? You'd probably want to seal off the light in as close to similar fashion as when the light will be in functional form?

- Don

#### Doug S

##### Flashlight Enthusiast
Originally posted by McGizmo:
> Wayne and Doug,
>
> Thanks for these useful pieces of information! This is one of those threads where it's a bummer that it will slide off the chart and this info will be hard to find later on.
>
> Doug, how do you suggest imparting the resistor heat to the battery tube? Will thermal grease suffice? Won't some of the heat dissipate into the air in this case, as opposed to a closed flashlight where the heat will, one way or another, need to pass through the body of the light? You'd probably want to seal off the light in as close to similar fashion as when the light will be in functional form?
>
> - Don

As long as you seal the light as you suggest [to prevent any heat transfer by air exchange], there is no need to make any provision to thermally couple the heat source to the flashlight. This is the very subject that we flogged in another thread recently. As long as the heat is generated inside, it will get out.

#### MR Bulk

##### Flashaholic
Hi guys, and Thanks! I am not a number cruncher nor do I "do" formulae that well (although I can figure out in about 2 seconds what the fine will be for traveling exactly how many miles over the speed limit you were going, heh heh).

But seriously, all I know is that one time, I ran Elektrolumens' Blaster (3C cells overdriving an LS) for like 64 hours and it was fine. The outside got slightly, mildly, actually comfortably warm, and as the hours wore on it got cooler (as the LS dimmed, of course).

The other day I ran a 3AA direct drive LS Legend for about 20 minutes, and the outside of the barrel registered 118 F right at the heatsink-to-body junction according to my IR thermometer. However, it was actually not too warm to hold for extended periods of time, and in fact when I did that the temperature actually dropped to 114 F.

Buh thass' all ah know's...so going by that (or by them four-mew-lahs), do you think a 2D MagLite would be able to drive a 5-watt Luxeon at 6V safely/comfortably?

#### Gransee

##### Flashlight Enthusiast
Btw, be sure to use your IR thermometer on the die itself. Measuring the case temp is fine, but that is not the most critical measurement.

There will be quite a few Luxeon flashlights manufactured by xyz companies that seem to be running quite cool and efficient because the case is cool to the touch. But inside, the LED is being destroyed.

Putting a luxeon in a plastic flashlight housing is not generally a good idea...

Since it requires surgery to the LED to get a straight shot at the die, the next best thing is the casing of the LED. On the LSx series, we use a special hollow bezel (without optics) that allows us to run the LED inside the housing being tested and still get a good shot at the LED with the pyrometer.

To answer your question specifically, Mr. Bulk, I recommend you measure the temp of the LED if at all possible. Flashlight housing temp can be deceptive.

I would guess that the Mag light has sufficient surface area for that power level. The problem area would be ensuring sufficient heat flow between the LED and the housing.

Personally, I flunked math, and even so, the variables needed to account for the interface between the die and each progressive layer of material out to the outside surface are a bit complex. The chance of misrepresenting one of the interfaces is high, and that would make the results deceptive.

To get a good picture of how the die is doing in your maglight, remove the lexan lens and shoot the top of the LED with your IR thermometer. Move the sensor around to find the maximum reading.

This is the best method I recommend.

Peter Gransee

#### Doug S

##### Flashlight Enthusiast
Don/McGizmo: My original answer to your question leaves something to be desired. I oversimplified the model too much.

Originally posted by Doug S:
> Originally posted by McGizmo:
> > Doug, how do you suggest imparting the resistor heat to the battery tube? Will thermal grease suffice? Won't some of the heat dissipate into the air in this case, as opposed to a closed flashlight where the heat will, one way or another, need to pass through the body of the light?
>
> As long as you seal the light as you suggest [to prevent any heat transfer by air exchange], there is no need to make any provision to thermally couple the heat source to the flashlight. This is the very subject that we flogged in another thread recently. As long as the heat is generated inside, it will get out.

My original answer assumed that heat conduction within the metal is so good that temperature gradients along the length of the flashlight body can be ignored. This isn't a very good assumption, as those of you playing with these metal flashlights know: the head of the flashlight feels warmer than the opposite end. If you perform the type of testing that Wayne and I are suggesting, it makes sense to measure the flashlight temperature in the immediate area of the heat source, and to thermally bond the heat source to the inside of the flashlight body in a way that at least somewhat approximates the proposed final arrangement for transferring the heat to the body. Your suggestion of using a thermal compound is a good one.

#### McGizmo

##### Flashaholic
Doug S.,

I intentionally set myself up for that one. ;-) Of course it may not be easy to totally seal the light with two wire leads coming out so I suppose that a good thermal path from resistor to housing would encourage the transfer of heat to housing direct. The better the thermal path, the quicker steady state housing temp is reached from a cold start, correct? If I'm wrong on this as well, I think it's time for me to find other pursuits......

Peter G,

I have an IR thermometer as well as a thermocouple accessory for my DMM. I have found that invariably I get higher temp readings with the thermocouple when I'm measuring the temp on the aluminium housings. If you shoot the LED itself with the IR, do you think there is any significant error from reflected IR on the lens, or even IR emanating from the LED?

dummy in the back of the class

#### Doug S

##### Flashlight Enthusiast
I strongly agree with one thing Peter says above and *very strongly* disagree with another.

Originally posted by Gransee:
> Personally, I flunked math, and even so, the variables needed to account for the interface between the die and each progressive layer of material out to the outside surface are a bit complex. The chance of misrepresenting one of the interfaces is high, and that would make the results deceptive.

I strongly agree with this [not the math part, Peter; only you know that for sure]. From much of the discussion on the board here, it is apparent that there is likely much misunderstanding/misrepresentation of the various thermal interfaces between the die and ambient. On the positive side, we have been having an ongoing discussion the past few weeks on how to obtain and use data for meaningful thermal modeling of our mods. I hope that these discussions will lead to some useful results.

Originally posted by Gransee:
> To get a good picture of how the die is doing in your maglight, remove the lexan lens and shoot the top of the LED with your IR thermometer. Move the sensor around to find the maximum reading.
>
> This is the best method I recommend.

Strongly disagree. Two problems here. The viewing aspect ratio of most inexpensive IR thermometers [the type the typical CPF member would likely have] is not narrow enough to look at only the LED. More importantly, even if the IR thermometer were looking at the top of the LED only, what it is looking at is the surface temperature of the plastic housing of the LED. The materials of the housing are likely poor thermal conductors, and thus, at the power levels we are typically concerned with, the measured temperature of the LED housing will be much lower than the die temperature.

#### Gransee

##### Flashlight Enthusiast
Originally posted by McGizmo:
> Peter G,
>
> I have an IR thermometer as well as a thermocouple accessory for my DMM. I have found that invariably I get higher temp readings with the thermocouple when I'm measuring the temp on the aluminium housings. If you shoot the LED itself with the IR, do you think there is any significant error from reflected IR on the lens, or even IR emanating from the LED?
>
> dummy in the back of the class

I interpret the IR data in two ways: relative and absolute. Absolute, to make sure I am not too close to the 135C limit. In the LS3, I measured 85C max pointing the sensor directly at the top of the LED dome.

And relative. Using the same LED, I try it with different heat sink configurations and compare the readings.

I think everyone has already realized that readings from the pyrometer (IR thermometer) are not as accurate as a direct measurement. Much of this has to do with the type of surface you are reading. Bare aluminum does not provide a very good reading, for example. I use a piece of milky colored scotch tape to increase the accuracy.


#### Doug S

##### Flashlight Enthusiast
Originally posted by McGizmo:
> Doug S.,
>
> I intentionally set myself up for that one. ;-) Of course it may not be easy to totally seal the light with two wire leads coming out, so I suppose that a good thermal path from resistor to housing would encourage the transfer of heat to the housing directly.

Right. Adequate sealing should not be hard. A couple of cotton balls or a dirty sock should do it.

Originally posted by McGizmo:
> The better the thermal path, the quicker steady state housing temp is reached from a cold start, correct? If I'm wrong on this as well, I think it's time for me to find other pursuits......

No! Please don't go. Actually, as I do the math, you are theoretically correct, but the differences are *very* small for the actual scenarios we are discussing.

Regarding your experience with IR vs. TC measurements, see my post above commenting on Peter's post.

#### Gransee

##### Flashlight Enthusiast
Originally posted by Doug S:
> Strongly disagree. Two problems here. The viewing aspect ratio of most inexpensive IR thermometers is not narrow enough to look at only the LED. More importantly, even if the IR thermometer were looking at the top of the LED only, what it is looking at is the surface temperature of the plastic housing of the LED.

It would be narrow enough if you held it at point blank. Also, any other hot items in the field will not be as hot as the LED, so the LED will dominate the reading.

To be sure, and as I alluded to earlier, removing the plastic dome from the top of the LED would provide a more accurate reading. But this will surely contaminate the die, so it is hardly the "best" way to measure the die temp. Another method is to measure the temperature of the LED foot (the metal surface on the back of the LED housing), but this is impractical because a good heatsink would use all of that foot. So essentially you would be back to measuring some part of your heatsink.

Finally, placing a piece of tape on the dome would skew the readings as well since the tape would trap more light and therefore add heat.

So, although not ideal (in a perfect world), I still feel that measuring the dome with a pyrometer is the "best method". At least until something "better" comes along.

What I am going to do to increase my confidence in this method is to sacrifice a luxeon. I will let you know what the difference was between dome on and dome off.

Peter Gransee

#### McGizmo

##### Flashaholic
Doug,

Thanks, I like being theoretically correct even if in reality I suffer from cranial-rectal insertion.

For a one off, it probably isn't worth the effort but if one really wanted to know what was happening, would a good, real world test be the following?

Mill or drill a cavity for a thermocouple in the aluminum heat sink pad of the host, and glue the emitter to the sink pad with the thermocouple contained in this "laminar" junction. If Arctic Silver epoxy were used, for instance, it seems to me that you would be getting as close to the source as possible. Comments?

Peter G,

Thanks for reminding me about the tape! I have been measuring off dull HA surfaces primarily, but after Roger read the instructions that came with my IR unit, where they also recommend masking tape, we found that this did provide a higher reading; still not as high as the thermocouple, but certainly within a few degrees. My IR unit has a laser pointing device, so you really feel like a Mr. Science wannabe when you use it.

- Don

PS. I don't have any socks but I can always pull the cotton out of my ears. EDIT:[ I guess Peter and I were composing at the same time; thanks for your best case, real world solution. Good luck removing the lens w/o pulling the die leads! (or whatever they are called)]

#### Gransee

##### Flashlight Enthusiast
OK, here they are. Luxeon sacrificed was a 1W batwing.

Dome on. 29C.

Dome off. 31C.

No tape was used on the dome in either measurement. A DVM was used in parallel with the power supply, so the voltage was the same to two digits after the decimal. A watch was used so that both readings were taken after 10 seconds of operation, meaning they were both similarly heated. Luxeon current was 130mA. The die was allowed to cool down between readings. Room temp was 25C, and the dome measured 26C with the LED off for 5 minutes.

Of course, to increase the accuracy of the experiment, I should take readings at different power levels for each Luxeon, sacrifice 10 Luxeons, average the temperature differences, try it with and without tape, use a thermocouple on the die, etc.

Good science says the experiment must be done at least 10 times with similar results and survive independent scrutiny (among other things). I encourage other CPFers to verify these results.

Without claiming my experiment is the last word in die measurement, I can say that I feel the accuracy of the method is useful in ensuring that the LEDs in Arc flashlights are not being driven at dangerous levels.

Peter Gransee

#### Doug S

##### Flashlight Enthusiast
Originally posted by Gransee:
> Originally posted by Doug S:
> > Strongly disagree. Two problems here. The viewing aspect ratio of most inexpensive IR thermometers is not narrow enough to look at only the LED. More importantly, even if the IR thermometer were looking at the top of the LED only, what it is looking at is the surface temperature of the plastic housing of the LED.
>
> It would be narrow enough if you held it at point blank. Also, any other hot items in the field will not be as hot as the LED, so the LED will dominate the reading.

I'm not sure I agree with this either, but I won't belabor it here.

Originally posted by Gransee:
> To be sure, and as I alluded to earlier, removing the plastic dome from the top of the LED would provide a more accurate reading ... So, although not ideal (in a perfect world), I still feel that measuring the dome with a pyrometer is the "best method". At least until something "better" comes along.
>
> What I am going to do to increase my confidence in this method is to sacrifice a luxeon. I will let you know what the difference was between dome on and dome off.
>
> Peter Gransee

You are going to look at the die only and not the slug it's mounted on? Dimensions of the die are on the order of 1mm. I would be surprised if your equipment can do this. If your equipment is good enough to see only the die plus slug, you could use the approximation that the measured temperature is the slug temperature [this approximation works since the slug area is large relative to the die area in the view of the instrument] and then use the die-to-slug thermal resistance to calculate the die temperature [15 C/W in the case of the 1W emitters].
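Doug's approximation can be sketched like this; the 15 C/W die-to-slug figure is from his post, while the measured slug temperature below is purely illustrative:

```python
R_DIE_TO_SLUG_C_PER_W = 15.0  # Doug's figure for the 1W emitters

def die_temp_c(slug_temp_c, power_w, r=R_DIE_TO_SLUG_C_PER_W):
    """Estimate die temp as slug temp plus thermal resistance times power."""
    return slug_temp_c + r * power_w

# Illustrative: a slug measured at 60C while the LED dissipates 1W
estimate = die_temp_c(60.0, 1.0)  # 75.0 C
```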

If you really must use IR measurement methods, a better way to "calibrate" your technique would be to take a "Star" configuration device and measure the backside under the emitter, calculate the die temp using the Luxeon datasheet thermal resistance, and then compare to your IR measurement of the front of housing. Measurements should be made after thermal steady state has been reached.

For other discussion of thermal measurement methods, see the link below:

http://www.candlepowerforums.com/cgi-bin/ultimatebb.cgi?ubb=get_topic;f=14;t=000376#000019

#### McGizmo

##### Flashaholic
Peter,

Thanks for all the trouble, and a couple of degrees C is close enough! What, you encourage us to trash Luxeons as well??? I DON'T THINK SO.

- Don

#### dat2zip

##### Flashlight Enthusiast
My initial calculations and formulas for determining the heat rise or case temperature are a very simplified method.

As you have been discussing there are other issues with this.

One thing I do want to point out is that stacking or layering materials will result in the source getting hotter.

For example, a 1W resistor thermally bonded to the inside of the MAG body running at 1W may have a temperature rise of the metal tube of, say, 15C (example only). If the resistor is perfectly bonded to the metal tube, the resistor will be at the same 15C rise above ambient.

Now, let's take the resistor and thermally bond it to a small stack of aluminum bricks. Each aluminum brick has a thermal resistance, which we won't worry too much about except to note that there is some. We bond the brick to the inside of the MAG barrel and repeat the test.

What will happen???

Well, as Doug has hit me on the head about before, the battery tube will again rise 15C above ambient for the same 1W of power. But the resistor is not bonded to the battery barrel; it is on the other side of the brick. The resistor in this case will be at a much higher temperature.

Each thermal resistive boundary of the brick will increase the temperature by its thermal resistance. So, let's say the brick has three thermal resistive layers, each with a 20C/W coefficient. In this case we add the three up in series, plus the barrel's thermal resistance, to get the resistor's rise.

That would be 20C/W * 3 + 15C/W = 75C/W, or a 75C rise of the resistor above ambient at 1W.
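The series addition Wayne describes can be written out directly (using his illustrative numbers, not measurements):

```python
# Thermal resistances in series add, like electrical resistors in series.
brick_layers_c_per_w = [20.0, 20.0, 20.0]  # three resistive layers in the brick
barrel_c_per_w = 15.0                      # tube-to-ambient thermal resistance
power_w = 1.0

total_c_per_w = sum(brick_layers_c_per_w) + barrel_c_per_w  # 75 C/W
resistor_rise_c = total_c_per_w * power_w                   # 75 C above ambient
```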

In summary, the battery barrel will have a fixed temperature rise for a given power. A hand touching the barrel will give no indication of whether the LED is glowing red and melting down, or running cool at nearly the same temperature as the barrel.

Hope this helps.

WayneY

PS. Peter, your pics make me feel right at home. Looks like my workbench.

#### Doug S

##### Flashlight Enthusiast
Originally posted by McGizmo:
> Doug,
>
> For a one off, it probably isn't worth the effort, but if one really wanted to know what was happening, would a good, real world test be the following?
>
> Mill or drill a cavity for a thermocouple in the aluminum heat sink pad of the host, and glue the emitter to the sink pad with the thermocouple contained in this "laminar" junction. If Arctic Silver epoxy were used, for instance, it seems to me that you would be getting as close to the source as possible. Comments?

This would be an excellent test. To get actual die temperature you would still need to add in the effect of the thermal resistance from your point of measurement to the die. In the case of the 1W devices this would be 17 C/W, and for the 5W devices 11 C/W.

#### Doug S

##### Flashlight Enthusiast
Originally posted by Gransee:
> Without claiming my experiment is the last word in die measurement, I can say that I feel the accuracy of the method is useful in ensuring that the LEDs in Arc flashlights are not being driven at dangerous levels.

Peter, Peter, I hardly know where to start. I will be brief. I fear that you are giving yourself a sense of security that is not justified by your experiment. Even if we accept your data as accurate [and there are good reasons not to], you are showing a 50% greater temperature rise over ambient between your two measurements. Assuming an ambient of 25C, your 85C measurement in your LS3 now becomes 115C. The problem with your data is that you are trying to draw conclusions about a 2C difference in temperature when your instrument only has a resolution of 1C. As you know, IR measurements are also affected by the emissivity of the surfaces being measured, and in your experiment you measured different surfaces not known to have similar emissivities. A further complication is the IR instrument pictured in your photos. As I recall, it has a specified aspect ratio of 6:1; this, however, breaks down at very close measurements. It is hard to know exactly what your instrument is looking at.
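Doug's scaling argument is simple arithmetic: the dome-off rise over ambient was 50% larger than the dome-on rise, and applying the same ratio to Peter's earlier 85C dome-on reading gives the 115C figure:

```python
# Peter's measured values: 25C ambient, 29C dome on, 31C dome off.
ambient_c = 25.0
dome_on_c, dome_off_c = 29.0, 31.0

# Rise over ambient: 4C with the dome on, 6C with it off -> 1.5x
ratio = (dome_off_c - ambient_c) / (dome_on_c - ambient_c)

# Scale the earlier LS3 dome-on reading by the same ratio
ls3_dome_on_c = 85.0
ls3_scaled_c = ambient_c + (ls3_dome_on_c - ambient_c) * ratio  # 115 C
```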