100W CHIP LED NOOB QUESTION

evanpnz

Newly Enlightened
Joined
Oct 31, 2014
Messages
1
I've been looking at LEDs, and I guess most of them run a 3W emitter at 1.8W so they last. Would a 100W chip run well on an 80W driver, and would it hurt lumens? Can I turn a 100W driver down to run at 90%, and if so, how? If the chip runs at a max current of 3.5A, would it be better to get a driver with a max current of 3A so the LEDs last longer?

A good rule of thumb with any electronic component (LEDs being no exception) is to never exceed 80% of any maximum rating. If you want it to last forever, such as in a vehicle or similar application, never exceed 60% of any rating.

The rating most people underestimate is temperature. Heat builds up quickly, especially in a sealed enclosure. If you are using a sealed enclosure, try to go with a metal one, and clamp the wall of the enclosure between the LED base and the heatsink, with the heatsink outside the enclosure. Take care to remove paint from all sandwiched surfaces, and use thermal paste or thermal adhesive between every surface of the sandwich.

Get a cheap multimeter with a thermocouple probe and test the temperature of your creation, as close to the LED as possible, running for the maximum time you would ever conceivably use it, on a warm day. That's the way to make bulletproof designs.
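As a quick illustration of that derating rule, here's a minimal Python sketch. The 3.5A maximum comes from the original question; the 0.8 and 0.6 factors are just the rules of thumb above, not datasheet values for any particular chip.

```python
# Illustrative derating calculation using the rules of thumb above.
# 3.5 A is the maximum from the original question; 0.8 and 0.6 are the
# 80%/60% rule-of-thumb factors, not values from any real datasheet.

MAX_CURRENT_A = 3.5  # datasheet absolute maximum drive current

general_derated = 0.8 * MAX_CURRENT_A    # typical designs
long_life_derated = 0.6 * MAX_CURRENT_A  # vehicle / "last forever" designs

print(f"80% rule: drive at no more than {general_derated:.2f} A")
print(f"60% rule: drive at no more than {long_life_derated:.2f} A")
# 80% rule: drive at no more than 2.80 A
# 60% rule: drive at no more than 2.10 A
```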
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
I've been looking at LEDs, and I guess most of them run a 3W emitter at 1.8W so they last. Would a 100W chip run well on an 80W driver, and would it hurt lumens? Can I turn a 100W driver down to run at 90%, and if so, how? If the chip runs at a max current of 3.5A, would it be better to get a driver with a max current of 3A so the LEDs last longer?

The trick is to run things cooler. The real enemy here is temperature. A rule of thumb that's been used in electronics for decades is that for every 10 degrees C increase in temperature, you cut the lifetime in half. That may not be exactly correct for LEDs, but the rule is something like that.
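To put rough numbers on that rule of thumb, here's a small Python sketch. The 50,000-hour baseline and the 85 °C reference temperature are made-up illustrative figures, not data for any particular LED.

```python
# Rough lifetime estimate from the "halve the life per 10 °C" rule of thumb.
# base_life_hours and ref_temp_c are illustrative assumptions only.

def estimated_life(temp_c, ref_temp_c=85.0, base_life_hours=50_000.0):
    """Lifetime halves for every 10 °C above the reference temperature
    (and doubles for every 10 °C below it), per the rule of thumb."""
    return base_life_hours * 2.0 ** ((ref_temp_c - temp_c) / 10.0)

for t in (75, 85, 95, 105):
    print(f"{t} C -> ~{estimated_life(t):,.0f} h")
# 75 C -> ~100,000 h
# 85 C -> ~50,000 h
# 95 C -> ~25,000 h
# 105 C -> ~12,500 h
```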

There are several ways to run your parts cooler.

Running them at lower power is probably the most obvious. For a given design, cutting the power by 20% also cuts the temperature rise by 20%. Temperature rise is the difference between 'die' temperature (temperature of the active element deep inside the LED, IC, transistor, or whatever part you are thinking about) and the 'ambient' temperature (temperature of the air, water, or whatever is the ultimate place the heat eventually goes). Die temperature is what determines the life of the part.
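To make that concrete, die temperature can be sketched as ambient plus power times the total die-to-ambient thermal resistance. The 0.8 °C/W resistance and 35 °C ambient below are assumed for illustration; real values come from your datasheets and measurements.

```python
# Die temperature = ambient + temperature rise, where (to first order)
# rise = power * total thermal resistance from die to ambient.
# Simplification: treats all electrical power as heat. The 0.8 C/W
# resistance and 35 C ambient are illustrative assumptions.

R_THETA_TOTAL = 0.8  # C/W, die-to-ambient thermal resistance (assumed)
AMBIENT_C = 35.0     # C, warm-day ambient (assumed)

def die_temp(power_w):
    return AMBIENT_C + power_w * R_THETA_TOTAL

for p in (100, 80):
    t = die_temp(p)
    print(f"{p} W: die ~{t:.0f} C, rise {t - AMBIENT_C:.0f} C")
# 100 W: die ~115 C, rise 80 C
#  80 W: die ~99 C, rise 64 C
# Cutting power 20% cuts the *rise* by 20% (80 C -> 64 C),
# not the die temperature itself.
```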

Better heatsinks and better thermal paths reduce the temperature rise. Sloppy design, poor materials, and bad construction increase the temperature rise. Innovative designs, good materials, and proper construction reduce the temperature rise.
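One way to see the effect of materials and construction is to sum the series thermal resistances along the heat path from die to air. Every value below is an assumed round number for illustration, not a measurement of any real build.

```python
# Total thermal resistance is roughly the sum of each series element in
# the heat path. All values are assumed round numbers for illustration.

good_build = {
    "die-to-case": 0.3,       # C/W, from a hypothetical datasheet
    "thermal paste": 0.1,     # thin, even paste layer
    "enclosure wall": 0.2,    # bare metal, paint removed
    "heatsink-to-air": 0.5,   # generously sized finned heatsink
}
sloppy_build = {
    "die-to-case": 0.3,
    "dry mounting gap": 0.8,  # no paste, painted surfaces
    "enclosure wall": 0.4,
    "heatsink-to-air": 1.5,   # undersized heatsink
}

for name, path in (("good", good_build), ("sloppy", sloppy_build)):
    r_total = sum(path.values())
    print(f"{name}: {r_total:.1f} C/W -> {80 * r_total:.0f} C rise at 80 W")
# good: 1.1 C/W -> 88 C rise at 80 W
# sloppy: 3.0 C/W -> 240 C rise at 80 W (the LED will not survive this)
```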

All of these factors must be considered when trying to predict the lifetime of any design, including LED lighting.

LED lighting covers a vast array of features and specifications. To suggest that "most of them run a 3W at 1.8W" oversimplifies the situation. While that would increase the lifetime of the LED, you'll find builds on CPF running a 3W LED at 5W with no expectation of great life, and you'll find everything in between. You have to know what all the design goals are before you can evaluate whether a specific design meets them.
 

SemiMan

Banned
Joined
Jan 13, 2005
Messages
3,899
In most fixtures you may be surprised to find that reducing the power by 20% reduces the thermal rise by less than 20%, largely because natural convection and radiation both work better at higher temperature differences, so the effective thermal resistance creeps up as power comes down.
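Here's a hedged sketch of that effect, assuming a purely natural-convection heatsink, where the heat transfer coefficient scales roughly as the temperature difference to the 0.25 power, so the rise scales as power to the 0.8. Both the exponent and the 100 °C baseline rise are illustrative assumptions.

```python
# Under natural convection the heat transfer coefficient improves with
# temperature difference (roughly h proportional to dT^0.25), so
# dT scales as P^(1/1.25) = P^0.8 rather than linearly with power.
# The 100 C baseline rise at 100 W is an assumed illustrative figure.

BASELINE_RISE_C = 100.0   # C rise at full power (assumed)
BASELINE_POWER_W = 100.0

def rise(power_w, exponent=0.8):
    return BASELINE_RISE_C * (power_w / BASELINE_POWER_W) ** exponent

print(f"100 W -> {rise(100):.0f} C rise")
print(f" 80 W -> {rise(80):.0f} C rise "
      f"({100 * (1 - rise(80) / rise(100)):.0f}% less, not 20%)")
# 100 W -> 100 C rise
#  80 W -> 84 C rise (16% less, not 20%)
```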

Thermalman
 