Ideas for Evaluating the Effectiveness of Heat Sinking for LEDs and Drivers

EngrPaul (Flashlight Enthusiast)
Joined Sep 28, 2006 · Messages 3,678 · Location PA
How does one evaluate how well heat sinking is working for an LED, and perhaps its driver?

I know of two obvious subjective methods that require no equipment:

(A) Take the flashlight apart, and evaluate how it was put together.
(B) Put the flashlight on high and watch for a tint change.

Are there any quantitative methods of determining how well heat is handled? These require some level of equipment.

Here are some ideas:

(1) IR temperature monitoring of components.
(2) Measuring efficiency change over warm-up using a constant voltage source (lumen output vs. input power over time); see the sketch at the end of this post.
(3) Accurate measurement of tint changes (spectrograph, etc.).

Any ideas or examples of such methods?

TIA
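
For idea (2), here is roughly the kind of logging I have in mind, as a minimal Python sketch. read_lux and read_power are placeholders for whatever light meter and supply readout (V x I) you actually have; they are not a real instrument API.

import time

def log_warmup(read_lux, read_power, interval_s=10, n_samples=60):
    # Log relative luminous efficiency (lux per watt) during warm-up.
    samples = []
    t0 = time.time()
    for _ in range(n_samples):
        samples.append((time.time() - t0, read_lux() / read_power()))
        time.sleep(interval_s)
    # Normalize to the first (cold) reading; the fall-off over warm-up
    # is a rough figure of merit for how well the heat sinking works.
    cold = samples[0][1]
    for t, eff in samples:
        print(f"{t:6.0f} s  {100 * eff / cold:5.1f}% of cold efficiency")

# Demo with canned numbers standing in for real meter readings:
fake_lux = iter([1000, 985, 972, 965, 961, 960])
log_warmup(lambda: next(fake_lux), lambda: 3.5, interval_s=0, n_samples=6)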
 
I purchased a CREE P4 star to play with. The thermal resistance of the CREE is listed as 8 °C per watt from the junction to the back of the emitter. The thermal resistance of the MCPCB was not listed.

I wanted to connect it to a heat sink and use it as a light in my basement. After some searching, I found that the CREE forward-voltage drop due to heating is 2 mV/°C.

Using a constant current source of 1 A, I measured the voltage across the LED as soon as I turned on the supply and again about 5 minutes later. I got a 140 mV change, from which I calculated a junction temperature rise of 70 °C.

I attached the star to a 5" x 5" x 0.125" aluminum plate and repeated the measurements: a 20 mV change, for a 10 °C rise.

I know I am missing some heating at the very beginning of turn-on, and I may not get really accurate results, but it is good enough for what I am doing.

At 2 watts I calculate a 30 °C rise, and my basement never gets above 25 °C. With a junction temperature of about 55 °C, the LED should last about 150,000 hours (17 years of continuous use) before the output drops by 30% and I notice a difference.

BTW, I have found another document that states the CREE voltage drop is 3 mV/°C, so the heating may be less than my original calculations suggest.
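
Here is the arithmetic as a small Python sketch, run with both of the dV/dT figures I have seen quoted; there is nothing in it beyond the numbers already in this post:

def junction_rise(delta_vf_mv, coeff_mv_per_c):
    # Junction temperature rise (deg C) from the forward-voltage drop
    # (mV) measured between turn-on and steady state at constant current.
    return delta_vf_mv / coeff_mv_per_c

for coeff in (2.0, 3.0):
    bare = junction_rise(140, coeff)   # bare star, 140 mV drop
    plate = junction_rise(20, coeff)   # on the 5" x 5" x 0.125" plate
    print(f"at {coeff:.0f} mV/C: bare star {bare:.0f} C rise, "
          f"with plate {plate:.0f} C rise")

At 2 mV/°C that reproduces the 70 °C and 10 °C rises above; at 3 mV/°C the same voltage drops work out to about 47 °C and 7 °C.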
 
Hi,

I use the junction voltage change too in order to approximate the temperature rise, so I can see how well a heatsink AND the thermal compound between the heatsink and LED are working. I use 2 mV per °C also, and it seems to work OK. It's especially good for comparing two different heat sinks, or for other comparisons.
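For example (made-up numbers): if heatsink A settles 60 mV below the cold reading and heatsink B settles 36 mV below it, then at 2 mV per °C that is a 30 °C junction rise for A versus 18 °C for B, so B is clearly doing the better job.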
 
You may want to take a page from the computer world. May I recommend reading over the test methodology at www.overclockers.com? They have a review suite that (for LED purposes) is probably overkill, but it is proven and quite effective.

They have a die simulator for both large and small contact chips, similar in size to a star, though your heat load would be much lower than what their simulator is designed for.

I've written one of the authors there a couple of times and he's been very responsive. I've also since suggested he take a look at LED applications for his heat pipe work; there's a market for heat pipes that is pretty much untapped in the LED world.
 