Heat considerations in design

Genzod

Banned
Joined
Apr 25, 2017
Messages
392
I'm not a builder, but I'd like to understand thermal limitations as a design constraint in something tiny like a lightweight aluminum-body AA headlamp.

I do understand that around 500 lm in a AA lamp builds up a lot of heat, and that if there is thermal regulation, it ramps the output down.

I know you get a lot of heat when you lower the color temperature, and max outputs tend to be substantially lower than in neutral and cool white. When you combine 93-95 high CRI with a warm (low color temperature) tint, do those have a synergistic effect on heat build-up? It seems to me that the more you introduce or enhance longer wavelengths of visible light, the more heat you get compared to the same boost at shorter wavelengths. Is that correct?

Any insights in this regard would be appreciated.
 
Last edited:

archimedes

Flashaholic
Joined
Nov 12, 2010
Messages
15,780
Location
CONUS, top left
Ok, I'll give this a try :thinking:

Since we actually have real experts here on CPF, please bear in mind that I am neither a physicist nor an engineer ... and the following is an extremely (and perhaps excessively) simplified explanation.

When you push current through an LED, it generates both heat and light. The more light generated for a given amount of power, the more efficient the emitter is considered to be, since light is the desired output.
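
To make that concrete: the heat left behind is just the input power minus whatever leaves as light. A one-line sketch in Python (both numbers are placeholders I picked for illustration, not measured values):

Code:
# Energy balance for an LED: whatever electrical power does not leave
# as light stays behind as heat.  Both numbers are assumed placeholders.
power_in_w = 3.0      # electrical power driven into the emitter (assumed)
wall_plug_eff = 0.35  # fraction of input power leaving as light (assumed)
heat_w = power_in_w * (1.0 - wall_plug_eff)
print(f"{heat_w:.2f} W of the {power_in_w:.1f} W input becomes heat")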

Limiting our discussion to white-light phosphor-converted LEDs, the emitters themselves produce blue (or nearby) wavelengths and are coated with a phosphor material which absorbs the blue and re-emits a broad range of wavelengths through a Stokes shift effect.

This reduces luminous efficiency in several ways ... the phosphor absorbs and blocks some of the blue light, it releases some additional heat (as energy must be conserved in the shift of the blue light to less energetic wavelengths), and some of the light from the phosphor is emitted in the wrong direction ("backwards").
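
To put a rough number on that Stokes-shift heat: a photon's energy scales inversely with its wavelength, so the fraction of energy lost in the down-conversion is simply 1 - (pump wavelength / emitted wavelength). A quick sketch (the 450 nm pump and the emission wavelengths are illustrative assumptions, not specs for any particular emitter):

Code:
# Stokes-loss estimate: fraction of a pump photon's energy released as
# heat when a phosphor re-emits it at a longer wavelength.
# E_photon = h*c / wavelength, so the energy ratio is just
# lambda_pump / lambda_emit.

def stokes_loss_fraction(pump_nm, emit_nm):
    """Fraction of photon energy released as heat in the phosphor."""
    return 1.0 - pump_nm / emit_nm

# Illustrative numbers: 450 nm blue pump, re-emitted further down the spectrum.
for emit in (550, 600, 650):
    loss = stokes_loss_fraction(450, emit)
    print(f"450 nm -> {emit} nm: {loss:.0%} of that photon's energy becomes heat")

Note how the loss grows toward the red end, which is part of why pushing more long-wavelength content costs extra heat.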

Again, vastly oversimplifying things, but higher CRI and/or warmer tint generally requires thicker phosphor layers. This, of course, reduces efficiency. Or, to state it another way, these require more power to create a similar output.

And more power running through an LED means more heat generated.
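
As a rough illustration of that point (the efficacies and wall-plug efficiencies below are numbers I made up for the comparison, not datasheet values), here is what it can mean at the 500 lm target from the opening post:

Code:
# Rough comparison: electrical power and waste heat needed to produce the
# same output from a "cool white" vs a "high-CRI warm white" emitter.
# All numbers below are illustrative assumptions, not datasheet values.

def power_and_heat(lumens, efficacy_lm_per_w, wall_plug_eff):
    """Electrical power drawn and heat left in the lamp for a lumen target.

    efficacy_lm_per_w : luminous efficacy, lumens per electrical watt
    wall_plug_eff     : fraction of electrical power leaving as light
    """
    p_electrical = lumens / efficacy_lm_per_w
    heat = p_electrical * (1.0 - wall_plug_eff)
    return p_electrical, heat

target = 500  # lm, the AA-headlamp figure from the opening post

for name, efficacy, wpe in [("cool white",          150, 0.40),
                            ("high-CRI warm white",  95, 0.30)]:
    p, q = power_and_heat(target, efficacy, wpe)
    print(f"{name:>20}: {p:.1f} W in, ~{q:.1f} W of heat to shed")

On those made-up numbers, the warm high-CRI emitter draws roughly 60% more power and leaves roughly 80% more heat in the light for the same output.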

I am uncertain as to the relative contribution of each of those various factors to the total heat load, but I would guess that the majority (by far) is due to the greater current being driven through the high-CRI emitter, and not the heat added by the Stokes shift, etc.

Does that help answer your question?


Aaaand, I'm sure others will be by very soon, to correct and expand upon my jabberings above, lol :)
 
Last edited:
Joined
Feb 1, 2008
Messages
889
Location
US
Keep in mind that all the heat is generated in an area the size of a pinhead, and it takes time for that heat to reach the headlamp's surface ... and the die producing all that optical output offers only a tiny area for collecting the heat.
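
A hypothetical sketch of that heat path (every thermal resistance and the body mass below are guesses for illustration; real values depend on the emitter datasheet and the host's construction):

Code:
# Hypothetical thermal model of a small AA headlamp: heat flows from the
# tiny LED die through a chain of thermal resistances to the body surface,
# and the aluminum body's heat capacity sets how fast it warms up.
# All numbers are illustrative guesses, not measurements.

heat_w = 2.5            # W of heat generated at the die (assumed)
r_junction_board = 3.0  # K/W, die -> MCPCB solder point (assumed)
r_board_body = 4.0      # K/W, MCPCB -> aluminum body (assumed)
r_body_air = 25.0       # K/W, body surface -> still air (assumed)

# Steady-state temperature rises above ambient along the chain
t_body = heat_w * r_body_air
t_junction = heat_w * (r_junction_board + r_board_body + r_body_air)
print(f"body rises ~{t_body:.0f} K, junction ~{t_junction:.0f} K above ambient")

# How quickly the body heats up: time constant = mass * specific heat * R
mass_kg = 0.030         # 30 g aluminum body (assumed)
c_aluminum = 900.0      # J/(kg*K), specific heat of aluminum
tau_s = mass_kg * c_aluminum * r_body_air
print(f"thermal time constant ~{tau_s/60:.0f} minutes")

On those guesses the body takes on the order of ten minutes to approach its final temperature, which is why a light can feel fine at turn-on and still need to step down later.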

Your mission is to ensure the best possible operating conditions for that LED, so it performs to the manufacturer's spec, including tint.

Letting temperature change the wavelength is not a good idea, because many factors are involved that are difficult or impossible to monitor, such as material aging, the effect of the electric current on the semiconductor, and so on.

These things are not so important in some headlamps, but they can affect the final result in pro photography and in specialized instruments like a liquid spectroscope.
 
Last edited:

Genzod

Banned
Joined
Apr 25, 2017
Messages
392

Your explanation expands on the physics of a higher-CRI, lower-color-temperature emitter, which I find new, interesting, and helpful; I'm learning it for the first time. Clearly the energy for the shift comes from the electrical energy of the battery, and it seems that demanding more CRI at lower color temperatures requires more energy at a given level of emitter efficiency. I'm still wondering, though, about a synergistic effect: "When synergistic parts work together, they accomplish more than they could alone." I'm thinking along the lines that the improved efficiency of an emitter isn't enough in such cases, and that perhaps more thermally conductive metal and greater case surface area are required to draw away the heat before the emitter can realize its potential.
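
For what it's worth, a crude way to check that intuition, assuming simple natural convection (Q = h * A * dT) with a typical still-air coefficient (all numbers purely illustrative):

Code:
# Crude check of how much case surface area it takes to shed a given heat
# load by natural convection: Q = h * A * dT.  The h value and temperature
# limit below are rough assumptions for still air and a handheld light.

h = 10.0   # W/(m^2*K), typical natural-convection coefficient (assumed)
dt = 30.0  # K allowed rise of the case above ambient (assumed)

for heat_w in (1.0, 2.5, 5.0):
    area_cm2 = heat_w / (h * dt) * 1e4   # convert m^2 to cm^2
    print(f"{heat_w:.1f} W of heat needs ~{area_cm2:.0f} cm^2 of case area")

A small AA headlamp might only have a few tens of cm^2 of shell, which would explain why a couple of watts of heat forces the output to ramp down.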
 