I understand that many of us here overdrive our LEDs beyond the maximum rated current listed in the CREE (or other manufacturers') datasheets, but has anyone ever contacted CREE to find out the reasons for not driving an LED beyond the rated current? What I'm really asking is: is it the current/voltage (or the wattage/amount of energy passing through the LED) that damages the LED, or is it the junction temperature that they are concerned about?
We're all familiar with the tests done on an XML, for example, mounted directly to a giant block of copper and actively cooled while being driven upwards of 6A without many adverse effects. Meanwhile, an XPG driven at 1.5A without proper heatsinking will see some color shift over an extended period of time. This leads me to believe that heat is the main problem.
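For what it's worth, the usual back-of-the-envelope reasoning for the heat argument goes like this: junction temperature is roughly the solder-point temperature plus dissipated power times the junction-to-solder-point thermal resistance. A minimal sketch, with purely illustrative numbers (the forward voltage, thermal resistance, and solder-point temperature below are assumptions, not datasheet values):

```python
# Rough junction-temperature estimate: T_j = T_solder + R_th(j-sp) * P.
# All numbers here are illustrative assumptions, not from any datasheet.

def junction_temp(i_f, v_f, r_th_js, t_solder):
    """Estimate junction temperature in degrees C.

    i_f      -- forward current, A
    v_f      -- forward voltage at that current, V
    r_th_js  -- junction-to-solder-point thermal resistance, degC/W
    t_solder -- solder-point temperature, degC
    """
    power = i_f * v_f              # electrical power dissipated, W
    return t_solder + r_th_js * power

# Example: ~3 A at ~3.3 V, 2.5 degC/W thermal resistance,
# solder point held at 50 degC.
print(junction_temp(3.0, 3.3, 2.5, 50.0))  # 74.75
```

This is why the same drive current can be harmless on a big actively cooled copper block (low effective t_solder) and damaging on a poor heatsink: the current term is identical, but the resulting junction temperature is very different.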
On the other hand, turning up the voltage on an LED will quickly damage it and cause color temperature shifts and other problems.
I'm really just trying to get a better understanding of how CREE or other manufacturers choose the "maximum current rating" for a given LED and why they settle on that particular number (the ratings all seem to curiously land on very round amperages -- 350mA, 700mA, 1.0A, 1.5A, 3A, etc.). Does anyone have any first-hand information from the manufacturers on this? Opinions and speculation are also welcome, though, like anything else, they'll be taken with a grain of salt.
Thank you all in advance!