what IS a 1-watt LED emitter ?

RustyShackleford1

Newly Enlightened
Joined
Jan 29, 2015
Messages
45
I'm a bit confused about LED power designations. I believe the Cree XP emitters (e.g. the XPG2 line) are considered "one watt" LEDs, or at least the "typical" specs list something like 350ma @ 2.8v, and in the data sheet, "relative luminous flux" is listed relative to 100% being at 350ma. But it also shows a maximum forward current of 1500ma. So why is 350ma chosen as the nominal value ?

I guess what I really want to know is, is there any reason I can't drive one of these LEDs with up to 1500ma, without any penalty in longevity, as long as I deal with thermal management adequately, IOW, design so that the thermal resistance from junction to ambient is low enough to keep the junction within spec (for example, as per the graph on page 8 of http://www.cree.com/~/media/Files/C...-Modules/XLamp/Data-and-Binning/XLampXPG2.pdf) ?
 

calipsoii

Flashlight Enthusiast
Joined
Apr 21, 2010
Messages
1,412
It's an old term from when 5mm LEDs were prevalent and 90mA was considered a high drive current. I want to say Luxeon was the first to use it, mainly as a marketing device to distinguish their new emitters from the familiar low-power 5mm's of the day.
 

Steve K

Flashlight Enthusiast
Joined
Jun 10, 2002
Messages
2,786
Location
Peoria, IL
I'm a bit confused about LED power designations. I believe the Cree XP emitters (e.g. the XPG2 line) are considered "one watt" LEDs, or at least the "typical" specs list something like 350ma @ 2.8v, and in the data sheet, "relative luminous flux" is listed relative to 100% being at 350ma. But it also shows a maximum forward current of 1500ma. So why is 350ma chosen as the nominal value ?

Cree's datasheet doesn't say anything about being a 1 watt device.
Back in the old days, 350mA was a common current to define the LED's characteristics, and it seems that some habits die hard. Cree does define some characteristics at additional currents too, so there's not really any implication that it is only good up to 350mA.

I guess what I really want to know is, is there any reason I can't drive one of these LEDs with up to 1500ma, without any penalty in longevity, as long as I deal with thermal management adequately, IOW, design so that the thermal resistance from junction to ambient is low enough to keep the junction within spec (for example, as per the graph on page 8 of http://www.cree.com/~/media/Files/C...-Modules/XLamp/Data-and-Binning/XLampXPG2.pdf) ?

Yep... keep that junction temperature in the comfortable zone, and you'll get the advertised lifetime from the LED.

There is a rule of thumb that says that lower temperature leads to higher reliability, but that has more to do with impurities drifting through the crystal lattice and such. It's still a good idea, though. :)
To some degree, it can even affect the reliability of solder joints, since you get less expansion and contraction as things heat up and cool off.
The Cree datasheet shows 150C as the max temperature, but it also mentions that the bins were selected at 85C. I think 85C would be a good max temperature to design for. If the heatsink can be touched by people, I certainly wouldn't want it to be close to 85C.
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
I think 85C would be a good max temperature to design for. If the heatsink can be touched by people, I certainly wouldn't want it to be close to 85C.

I'm sure Steve knows this, but just to be clear for everyone, a junction temperature of 85C is a good design goal. The heatsink surface temperature would be well below that. Conversely, if the heatsink is at 85C, the junction is considerably hotter!
 

RustyShackleford1

Newly Enlightened
Joined
Jan 29, 2015
Messages
45
... there's not really any implication that it is only good up to 350mA.

... keep that junction temperature in the comfortable zone, and you'll get the advertised lifetime from the LED.
Ok, thanks for clarifying that, folks.

As far as thermal design, I believe I've read that thermal resistance is about 5 degrees per watt (for the Cree XP on a good MCPCB). So if I wanted to go balls-to-the-wall and run at 1500ma, I'd want to keep the MCPCB to 60C or so (if I'm optimistically aiming for the 85C junction temperature). Seems like the thermal resistance from MCPCB to ambient is a lot harder to calculate, so it'd be swell to just measure the MCPCB temperature directly. I wonder what's the best way to do that ? The IR guns don't really seem precise enough to get a reliable reading on such a small target.
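To put rough numbers on that budget, here's a minimal sketch. The specific figures (Vf ~ 3.1 V at 1.5 A, ~5 C/W junction-to-board, ~75% of input becoming heat) are assumptions pulled from the discussion, not datasheet-verified values:

```python
# Rough junction-temperature budget for a hard-driven XP-class emitter.
# Assumed numbers (check your own datasheet/MCPCB specs): Vf ~ 3.1 V at
# 1.5 A, combined junction-to-MCPCB thermal resistance ~ 5 C/W, and a
# conservative 75% of electrical input turning into heat.

I = 1.5                 # drive current, A
VF = 3.1                # forward voltage at that current, V (assumption)
RTH_J_TO_BOARD = 5.0    # C/W, junction to MCPCB surface (assumption)
HEAT_FRACTION = 0.75    # conservative: 25% of input leaves as light
TJ_MAX = 85.0           # design-target junction temperature, C

p_electrical = I * VF
p_heat = HEAT_FRACTION * p_electrical
delta_t = RTH_J_TO_BOARD * p_heat
board_temp_limit = TJ_MAX - delta_t

print(f"electrical power: {p_electrical:.2f} W")
print(f"heat to dissipate: {p_heat:.2f} W")
print(f"junction rise over board: {delta_t:.1f} C")
print(f"keep the MCPCB below about {board_temp_limit:.0f} C")
```

With these assumptions the board needs to stay below roughly the high 60s C, in the same ballpark as the 60C figure above.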
 

petrochemicals

Newly Enlightened
Joined
May 28, 2013
Messages
78
Location
U.K.
In efficiency terms, 1W seems to be the most standard emitter power rating; after that the efficiency drops considerably. I can't vouch for COB LEDs as it gets a little complicated. I suppose 1W is still considered as low as you can go while having reasonable output and a meaningful efficiency figure. 85 degrees centigrade is a standard temperature to take readings at (25 was considered unrealistic). Trouble is that it doesn't give you efficiency at the higher drive currents, which I suspect most LEDs are driven at these days. I suppose there are industry standards for most things; moisture content seems to be a large factor, having read data about baking opened LEDs to remove moisture. Humidity of the air or complete vacuum for the measurements, distance from emitter, spectrum measured, etc.
 

Steve K

Flashlight Enthusiast
Joined
Jun 10, 2002
Messages
2,786
Location
Peoria, IL
I'm sure Steve knows this, but just to be clear for everyone, a junction temperature of 85C is a good design goal. The heatsink surface temperature would be well below that. Conversely, if the heatsink is at 85C, the junction is considerably hotter!

A good point.. I was just worried about getting too far into a separate subject and losing the main subject of discussion.

It does bring up the whole discussion of heatsinks and the potentially loose relationship between heatsink temperature and junction temperature. With a small heatsink and low thermal resistance between the LED and heatsink, the heatsink could be very close to the junction temperature. In this case, you wouldn't want to touch the heatsink if the junction temperature was 150C.

At the other extreme, the heatsink might be very large, but there might be a large thermal resistance between the LED and the heatsink. In this case, the junction temperature could be 150C while the heatsink stays close to the temperature of the air. The cool heatsink could fool you into thinking that the junction temperature was also very low.

A good practice is to measure the temperature of the LED close to where it connects to the circuit board or heatsink or whatever it is mounted to. A thermal imager might be useful too, but I haven't used one and can't speak with any authority.

As a rule of thumb.. I like to have the heatsink be cool enough to touch without it being really uncomfortable to hold for a length of time. Hmmm... I guess that is literally a rule of thumb. :)

and let me add the disclaimer that this advice is targeted at the hobbyist, as I'm sure that DIWdiver knows this already.

to go back to the discussion about proper thermal design, the datasheet is a bit vague about what junction temperature is best. They do point the user towards their document titled Cree XLamp Long-Term Lumen Maintenance....
http://www.cree.com/~/media/Files/C...Application Notes/XLamp_lumen_maintenance.pdf
This document goes into great detail about what contributes to loss of light output and what sort of degradation occurs in each part of the LED. Pretty neat stuff (for the tech nerds among us).

Looking through the datasheet, I'm struggling to find where it specifies the L80 life, or any other specific discussion of lifetime. The graph for thermal design on page 8 mentions "optimal life and optical characteristics", but that seems to be the only mention of lifetime.

The datasheet also links to their Thermal Management document....
http://www.cree.com/~/media/Files/C... Application Notes/XLampThermalManagement.pdf
Page 3 references an LM-80 summary for XLamp LEDs, and has a link to a TM-21 calculator that lets the user project lifetime from the LM-80 data under different conditions.
It's nice that they provide this info, but it seems like they should at least list some typical number for the LM-80 lifetime in the datasheet and then direct the user to the Thermal Management document for detailed info.
(disclaimer: I'm writing this early in the morning and may have missed some important detail)
 

Steve K

Flashlight Enthusiast
Joined
Jun 10, 2002
Messages
2,786
Location
Peoria, IL
Ok, thanks for clarifying that, folks.

As far as thermal design, I believe I've read that thermal resistance is about 5 degrees per watt (for the Cree XP on a good MCPCB). So if I wanted to go balls-to-the-wall and run at 1500ma, I'd want to keep the MCPCB to 60C or so (if I'm optimistically aiming for the 85C junction temperature). Seems like the thermal resistance from MCPCB to ambient is a lot harder to calculate, so it'd be swell to just measure the MCPCB temperature directly. I wonder what's the best way to do that ? The IR guns don't really seem precise enough to get a reliable reading on such a small target.

the MCPCB is really just intended to be the interface between the LED and the heatsink. I've used MCPCBs from Bergquist, and their datasheets do provide a value for thermal resistance.
http://www.bergquistcompany.com/pdfs/LED Config Sht_Rev 4.pdf
This part has a thermal resistance a bit below 5 deg C/watt.
You'll want to look into heatsinks and read their datasheet or product info. They are designed to be the place where the heat is dumped into the air.
 

RustyShackleford1

Newly Enlightened
Joined
Jan 29, 2015
Messages
45
Wow, thanks for the great discussion and links, guys ...

I'm actually doing two projects with these emitters. In one, I'll max the current out (so about 2.5 watts), but the heat sink will be huge (the stars bonded to aluminum roofing flashing, only one emitter per 8-10" or so of length, and several inches wide). The other is more problematic: well under a watt, but no heatsink at all, other than the star itself (20mm Bergquist).

I'm actually probably going to be using an Osram device for these projects, as Cree does not have a 3v emitter between 2200 and 2700 CCT, and light tone is key for me; 2400 is the sweet spot and Osram has: http://www.mouser.com/ds/2/311/LCW CR7P.EC-318280.pdf. Max current is 800ma, but I can live with that.

The good news: they seem to suggest 135C is an acceptable long-term junction temperature. The bad news: the thermal resistance junction-to-solder-point is a bit higher, perhaps 11C/W.

I guess the slightly less than 5C/W number Bergquist quotes is for solder point to heatsink.

I like the rule-of-thumb that the heatsink shouldn't be painful to touch; I use it myself :) I'm not sure exactly what temperature that is, though (where metal becomes painful to the touch); I imagine it's well below 100C (thinking of accidentally touching something metal in a sauna, which is around 70-80C, I believe).

It's also interesting to note that (at least according to Cree) a significant fraction of the electrical energy is converted to light (duh !) so one can very conservatively assume that the heat to be dissipated is 75% of V*I, likely a bit less (a direct function of the lumens-per-watt efficiency, I'd imagine).
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
A couple of things. First, aluminum flashing is too thin to make a good heatsink. Once you get half an inch or so from the MCPCB, you'll find there's little heat getting there. Think of it like water flowing through a pipe. Take a big pipe that can carry lots of water, then squash it flat. It will carry a lot less water.

I haven't read their information carefully, but it doesn't seem plausible that you can operate an Osram at 135C junction temp and expect a lifetime anywhere close to what you can get from a Cree (or an Osram!) running at 85C. It may be reasonable from a reliability standpoint, but not from a lifetime standpoint.

I can hold a mid-sized aluminum flashlight that's about 60C, but it's like a hot potato. You don't grab it hard or long at that temp. My wife would cry out in pain and drop it on her foot if I handed such a thing to her. Everybody's different. A sauna may be hotter than that, but notice there are no metal objects in it for you to touch!

You're correct that the 25% number is conservative. Well, maybe. Modern high-quality LEDs reaching 150+ lm/W have radiometric efficiency more like 40%. Some exceed 50% under certain conditions. But some LEDs (especially COBs) are still under 100 lm/W, and in some cases 25% is optimistic. The exact number depends on details of the spectral content, but somewhere in the 325-350 lm/W range is 100% radiometric efficiency for white light, IIRC.

To be a bit pedantic, Lumens per Watt is properly termed efficacy. Efficiency should always be a dimensionless number, the ratio of power out to power in. Because lumens are weighted by the sensitivity of the human eye to various wavelengths, they do not correlate directly to radiometric power without consideration of the spectral content. This is rather inconvenient to thermal calculations, because efficiency is what we want, but efficacy is what we all talk about and have better tools for estimating. However, it is possible to convert from one to the other using anything from a rough estimate to a careful calculation based on spectral content.
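The efficacy-to-efficiency conversion above can be sketched numerically. The ~330 lm/W figure for "100% radiometric efficiency" white light is a rough, spectrum-dependent assumption taken from the range quoted in this thread, not an exact constant:

```python
# Convert luminous efficacy (lm per electrical W) to an approximate
# radiometric efficiency, then estimate the heat load. The 330 lm/W
# "100% efficient white light" figure is a rough assumption; the true
# value depends on spectral content.

MAX_LUMENS_PER_RADIOMETRIC_WATT = 330.0  # assumption, spectrum-dependent

def efficiency_from_efficacy(efficacy_lm_per_w):
    """Dimensionless radiant-power-out / electrical-power-in."""
    return efficacy_lm_per_w / MAX_LUMENS_PER_RADIOMETRIC_WATT

def heat_watts(electrical_watts, efficacy_lm_per_w):
    """Electrical input minus the part that leaves as light."""
    return electrical_watts * (1.0 - efficiency_from_efficacy(efficacy_lm_per_w))

# A 150 lm/W emitter dissipates roughly 55% of its input as heat:
eff = efficiency_from_efficacy(150.0)
heat = heat_watts(3.0, 150.0)
print(f"efficiency ~ {eff:.0%}, heat ~ {heat:.2f} W of a 3 W input")
```

For the thermal budget this is the number that matters: a modern 150 lm/W emitter sheds noticeably less heat per input watt than the conservative 75% assumption.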
 

RustyShackleford1

Newly Enlightened
Joined
Jan 29, 2015
Messages
45
Thanks for all the info, DIW ... lotta useful stuff there. Efficacy vs. efficiency is especially interesting.

If you have a chance to look at p.4 of the Osram datasheet from Mouser that I linked, I wonder if the difference between "real" thermal resistance (from junction to solder point) and the lower "electrical" thermal resistance, is factoring in efficacy ?
 

Illum

Flashaholic
Joined
Apr 29, 2006
Messages
13,053
Location
Central Florida, USA
A 1-watt LED emitter by today's standards is typically a high-powered LED that has a forward voltage of about 3-3.3V at 350mA and blows at 1000mA :)
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
Okay, I looked at the datasheet. Indeed it does look like the "electrical" thermal resistance takes into account the efficiency. In fact, they say right there "with efficiency = 35%", and the electrical thermal resistance is exactly 65% of the "real" thermal resistance. It would be more clear to say "real" resistance is Kelvins per Watt of heat generated, while "electrical" resistance is Kelvins per Watt of electrical input. That's actually a fantastic thing for anyone trying to do heatsink calculations. Kudos to Osram!
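The equivalence of the two conventions is easy to show with numbers. The sketch below assumes the 35% efficiency and ~11 K/W "real" resistance quoted in this thread:

```python
# Two bookkeeping conventions for the same physics:
#   "real"       Rth: Kelvins per Watt of HEAT generated
#   "electrical" Rth: Kelvins per Watt of ELECTRICAL input
# With 35% of the input leaving as light, they differ by a factor of 0.65.
# Numbers below are assumptions taken from this discussion.

RTH_REAL = 11.0           # K/W of heat, junction to solder point (assumption)
OPTICAL_EFFICIENCY = 0.35

rth_electrical = RTH_REAL * (1.0 - OPTICAL_EFFICIENCY)   # "electrical" Rth

# Same temperature rise either way, e.g. for 2.4 W electrical input:
p_elec = 2.4
p_heat = p_elec * (1.0 - OPTICAL_EFFICIENCY)
rise_via_real = RTH_REAL * p_heat
rise_via_electrical = rth_electrical * p_elec
assert abs(rise_via_real - rise_via_electrical) < 1e-9

print(f"electrical Rth = {rth_electrical:.2f} K/W, rise = {rise_via_real:.2f} K")
```

The "electrical" number is handy precisely because drive current and voltage are what the designer knows directly.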

I also see that 135C is the "maximum junction temperature". Based on the fact that they also specify an "absolute maximum junction temperature" of 160C, and that this is verified by testing 30 units to have at least 1000 hour life (L70B50), I would surmise that they are promising 1000 hour life at 135C based on testing at 160C. L70B50 is the time it takes for 50% of units to fall below 70% of initial output.

A lifetime of 1000 hours is great for most flashlights, but sucks for most lamps, especially LED lamps, IMHO.
 

RustyShackleford1

Newly Enlightened
Joined
Jan 29, 2015
Messages
45
A couple of things. First, aluminum flashing is too thin to make a good heatsink. Once you get half an inch or so from the MCPCB, you'll find there's little heat getting there.
You don't happen to have an equation for computing overall thermal resistance of a disk-shaped heatsink (say, with a point source of heat at the center) ? I tried to figure out the integral, but realized it's not the electrical-analog situation where you might have one electrode at the center and one circular electrode around the edge. Instead, if you think of the thing as a bunch of concentric annular rings (in the limit, for the integral), each ring has a resistance to ambient on its top and bottom edges, in parallel with the resistance of the portion of the disk surrounding it.
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
You don't happen to have an equation for computing overall thermal resistance of a disk-shaped heatsink (say, with a point source of heat at the center) ? I tried to figure out the integral, but realized it's not the electrical-analog situation where you might have one electrode at the center and one circular electrode around the edge. Instead, if you think of the thing as a bunch of concentric annular rings (in the limit, for the integral), each ring has a resistance to ambient on its top and bottom edges, in parallel with the resistance of the portion of the disk surrounding it.

Actually, it is the same calculation as the electrical resistance, and the integral of concentric rings would be the way to calculate it. That method doesn't work right under the LED, or within one thickness or so, but beyond that it's pretty accurate.

But you can get decent enough results for an estimate by considering just a few finite concentric rings. Start with an inner diameter approximately the size of your LED (or the star it's on), and a width of maybe 1/8". Approximate that by a strip 1/8" wide and as long as pi*D, where D is the average diameter of the ring. The resistance across the width of that strip is simple to calculate. Do just one and I think you'll be amazed how high the number is, because of how thin the material is.
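The single-ring estimate above is a one-liner. The dimensions here (0.4 mm aluminum flashing, 1" average ring diameter, k = 205 W/(m·K)) are illustrative assumptions, not from any datasheet:

```python
# Thermal resistance across one thin annular ring of flashing, using the
# "unrolled strip" approximation described above: a strip of width w,
# length pi*D, and thickness t. Assumed numbers: 0.4 mm aluminum sheet,
# k = 205 W/(m*K), ring average diameter 1", ring width 1/8".
import math

K_ALUMINUM = 205.0      # W/(m*K)
THICKNESS = 0.0004      # m (~0.4 mm flashing; assumption)
WIDTH = 0.003175        # m (1/8")
AVG_DIAMETER = 0.0254   # m (1")

strip_length = math.pi * AVG_DIAMETER          # unrolled strip length, m
cross_section = THICKNESS * strip_length       # conduction area, m^2
r_ring = WIDTH / (K_ALUMINUM * cross_section)  # K/W across this one ring

print(f'one 1/8"-wide ring: ~{r_ring:.2f} K/W')
```

Roughly half a kelvin per watt for a single narrow ring of good aluminum, and every further ring adds more in series, which is the point being made: thin sheet racks up radial resistance fast.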
 

RustyShackleford1

Newly Enlightened
Joined
Jan 29, 2015
Messages
45
Actually, it is the same calculation as the electrical resistance, and the integral of concentric rings would be the way to calculate it.
I understand the concept of an electrical analog to heat transfer (with voltage differential corresponding to temperature differential, and electrical current corresponding to heat flow).

But I believe you're mistaken that the calculation is the same. In the electrical situation - point electrode at the center and ring-shaped electrode around the outside of the disc - current in one concentric ring can only flow into the surrounding concentric ring (not into the air above and below the disc). In the heatsink, heat in one ring can flow two ways: into the surrounding concentric ring (as in the electrical analog) but also into the air above and below the ring (after all, heat flow into the air, by the various modes of convection, conduction, and radiation*, is THE only way heat ultimately flows into the ambient environment).

I'm not doubting your intuition - based, presumably, on far more experience than mine - that a very thin heatsink (like roof flashing) will not be effective. But I'd like to do an actual calculation, to see how bad it is, and how thick the metal would need to be for better results.

* Radiant transfer can, of course, flow into a vacuum - no air need be present - as in the sun's heating of the solar system.
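The ring-ladder model described above (radial conduction in series, convection to air in parallel at each ring) is straightforward to evaluate numerically. Every number below is an assumption for illustration: 0.4 mm aluminum, a combined convection-plus-radiation coefficient of 10 W/(m²·K), and a disk from a 6 mm star edge out to 75 mm radius:

```python
# Numerical version of the annular-ring model: each thin ring conducts
# heat radially AND sheds some to the air from its top and bottom faces.
# Working inward from the rim, each ring's convection path is in parallel
# with everything outside it. All numbers are illustrative assumptions.
import math

K = 205.0      # W/(m*K), aluminum
T = 0.0004     # m, sheet thickness
H = 10.0       # W/(m^2*K), rough free convection + radiation coefficient
R_IN = 0.006   # m, inner radius (heat enters at this circle)
R_OUT = 0.075  # m, outer radius of the disk
N = 500        # number of rings in the discretization

dr = (R_OUT - R_IN) / N
r_outward = None  # resistance looking outward from the rim: nothing there
for i in reversed(range(N)):
    r_mid = R_IN + (i + 0.5) * dr
    # radial conduction across this ring (unrolled-strip approximation):
    r_cond = dr / (K * T * 2 * math.pi * r_mid)
    # convection from both faces of this ring:
    area = 2 * (2 * math.pi * r_mid * dr)
    r_conv = 1.0 / (H * area)
    # this ring's convection in parallel with the path through outer rings:
    if r_outward is None:
        shunt = r_conv
    else:
        shunt = (r_conv * r_outward) / (r_conv + r_outward)
    r_outward = r_cond + shunt

print(f"sheet-to-ambient resistance: ~{r_outward:.1f} K/W")
```

This is just a discretized annular-fin calculation; with these assumed numbers it lands around 6 K/W, and making the sheet thicker or larger shows directly how much (or how little) it helps.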
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
Ah, you are right. I'm used to doing that estimate for a diving flashlight, in which there is negligible transfer anywhere except the outer ring.

But I still think that if you do the calculation for just one ring, which would have small (though perhaps not trivial) dissipation, it would be very enlightening.
 

RustyShackleford1

Newly Enlightened
Joined
Jan 29, 2015
Messages
45
I guess I'll just bond my stars to some random piece of metal that's quite a bit thicker (than the flashing I was planning on using), probably using something like the Bergquist BP100 thermal double-sided tape. For the one project where I'm driving the LEDs hard, I want LEDs every 8-10" spaced out over 5-6ft, so a strip of metal from the scrapyard should work.

However, I was shocked, when I looked up the thermal conductivity of various metals, at how varied and counter-intuitive the values are. For example, stainless steel is dreadful, and carbon steel isn't that great. Unsurprisingly, copper is the best (but brass is bad), and aluminum is pretty decent:

https://neutrium.net/heat_transfer/thermal-conductivity-of-metals-and-alloys/

Gold and silver are great though, if you just happen to have some of that lying around :)
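Since the radial spreading resistance of a thin sheet scales as 1/k, the metal comparison reduces to a conductivity ratio. The values below are approximate room-temperature figures (alloys vary):

```python
# How much the metal choice matters: spreading resistance in a thin sheet
# scales as 1/k, so relative to copper each metal is simply k_Cu/k worse.
# Approximate room-temperature conductivities in W/(m*K); alloys vary.
CONDUCTIVITY = {
    "copper": 400.0,
    "aluminum": 205.0,
    "brass": 110.0,
    "carbon steel": 45.0,
    "stainless steel": 15.0,
}

for metal, k in sorted(CONDUCTIVITY.items(), key=lambda kv: -kv[1]):
    ratio = CONDUCTIVITY["copper"] / k
    print(f"{metal:15s} k = {k:5.0f}  -> {ratio:4.1f}x copper's resistance")
```

Stainless comes out over 25 times worse than copper, which is why it makes such a dreadful heat spreader despite being "metal".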
 

Steve K

Flashlight Enthusiast
Joined
Jun 10, 2002
Messages
2,786
Location
Peoria, IL
.....

Gold and silver are great though, if you just happen to have some of that lying around :)

just in case you have a bunch laying around, don't forget the excellent thermal conductivity of diamond!

http://chemistry.about.com/od/geochemistry/f/Is-Diamond-A-Conductor.htm
"Diamond conducts heat well as a result of the strong covalent bonds between carbon atoms in a diamond crystal. Thermal conductivity of natural diamond is around 22 W/(cm·K), which makes diamond five times better at conducting heat than copper".
 
