Can LEDs be designed to produce more light with less heat?

HighlanderNorth

Flashlight Enthusiast
Joined
Sep 15, 2011
Messages
1,593
Location
Mid Atlantic USA
There wasn't enough room in the title to ask my question in a way that makes it more specific. Obviously, whenever humans have invented a new source of light, one of the problems has been making it energy efficient, because much of the energy is converted into heat instead of light, which is wasteful, and if the light source becomes too hot it can damage components. But my concern here is more about the ability to dramatically increase run time (AC or DC) and potential brightness while using less energy.

What I am wondering is whether it will eventually be possible to produce LEDs that effectively trade that excess heat for increased light output, or will that require a whole new light source other than LEDs?
 

mattheww50

Flashlight Enthusiast
Joined
Jun 24, 2003
Messages
1,048
Location
SW Pennsylvania
LEDs are like many semiconductor products: they more or less follow Moore's law, which means we can expect continual improvements in LED efficiency for the foreseeable future. There are now numerous relatively high-power LEDs that exceed 100 lumens per watt, which was considered the 'Holy Grail' just a few years ago. It is very hard to guess where the efficiency will end, but even at 100+ lumens per watt we aren't anywhere near 100% efficiency, so there is still a lot of room for improvement. The big names like Cree are tweaking their products and processes on essentially a daily basis, continually improving the efficiency of LEDs. The result is ever more attractive binning. When the XM-L was introduced, T6 was about the highest-efficiency bin you could obtain. By the time XM-L production ended, they were producing U3s, which provide about 14% more lumens per watt than a T6. The XP-L V6 bin delivers considerably more lumens per watt than the XM-L U3 bin.
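
To put "100+ lumens per watt" in perspective, here is a rough back-of-the-envelope sketch in Python. The ~300 lm per optical watt figure for a cool-white spectrum is an assumed round number for illustration, not a Cree spec:

```python
# Rough wall-plug efficiency estimate for a white LED.
# ~300 lm per optical watt is an assumed typical luminous efficacy of
# radiation for a cool-white spectrum, not a figure from any datasheet.

LER_COOL_WHITE = 300.0  # lumens per optical watt of emitted light

def fraction_as_light(luminous_efficacy_lm_per_w, ler=LER_COOL_WHITE):
    """Fraction of electrical input leaving the LED as light;
    the rest is dissipated in the package as heat."""
    return luminous_efficacy_lm_per_w / ler

for lm_per_w in (100, 150, 200):
    light = fraction_as_light(lm_per_w)
    print(f"{lm_per_w} lm/W -> ~{light:.0%} light, ~{1 - light:.0%} heat")
```

At 100 lm/W roughly two thirds of the input power still leaves the emitter as heat, which is why there is so much headroom left.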
 

eightbitpotion

Newly Enlightened
Joined
Jul 24, 2010
Messages
31
Like a few others on here, I work in IT, and what I'm seeing in the LED world is almost exactly what we saw in processors in the mid-to-late 90s. If that plays out the same way, then within a few years we should easily see current-day lumens at quadruple the run time with very little heat. I've been in this community for quite a few years, and I remember when 120 lm was hot as hell, burned for like 45 minutes, and was considered unbelievable; I don't expect the trend to slow down while the consumer market is expanding into home and automotive lighting. Perhaps I'm trying to compare apples to oranges, though.
 

parametrek

Enlightened
Joined
Apr 3, 2013
Messages
578
Actually, you can calculate where efficiency will end. According to https://en.wikipedia.org/wiki/Luminous_efficacy the maximum for a phosphor-based white LED is 260-300 lumens per watt. It is possible to do better with multiple hues of emitters, but even then the absolute maximum for any light source is about 683 lumens per watt, and that is for monochromatic 555 nm green with essentially zero CRI.
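
For anyone curious how those ceilings come about: luminous efficacy of radiation is just the spectrum weighted by the eye's photopic response and scaled by 683 lm/W. Here is a toy Python sketch using a common Gaussian approximation of V(λ); the spectrum shape and widths are made-up round numbers, purely illustrative:

```python
# Toy luminous-efficacy-of-radiation calculation for a phosphor-white-like
# spectrum. V(lambda) is a common Gaussian approximation of the CIE photopic
# response; the spectrum is a made-up blue pump plus a broad phosphor hump.
import numpy as np

wl = np.linspace(380e-9, 780e-9, 2000)  # visible wavelengths, metres

def photopic_v(wavelength_m):
    """Gaussian approximation of the photopic luminosity function V(lambda)."""
    um = wavelength_m * 1e6
    return 1.019 * np.exp(-285.4 * (um - 0.559) ** 2)

def gaussian(x, centre, sigma):
    return np.exp(-0.5 * ((x - centre) / sigma) ** 2)

# Crude cool-white model: narrow 450 nm blue pump + broad phosphor hump at 580 nm
spectrum = 0.5 * gaussian(wl, 450e-9, 12e-9) + 1.0 * gaussian(wl, 580e-9, 70e-9)

# Luminous efficacy of radiation: eye-weighted power / total optical power
ler = 683.0 * np.sum(spectrum * photopic_v(wl)) / np.sum(spectrum)
print(f"Toy white spectrum: ~{ler:.0f} lm per optical watt")
# Comes out around 320 lm/W for this crude spectrum; adding the red content
# needed for decent CRI pulls it down toward the 260-300 lm/W range quoted
# above, while monochromatic 555 nm green would score the full ~683 lm/W.
```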

Now what I don't quite understand is how Cree has claimed to make white LEDs that perform at better than 300 lumens per watt: http://www.cree.com/News-and-Events/Cree-News/Press-Releases/2014/March/300LPW-LED-barrier But that was two years ago and you still can't buy LEDs that efficient. (There are some theories that because LEDs get more efficient at lower powers, you can instead make a giant LED, run it at 1% output and have it produce useful light at amazing efficiency.)
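
On that low-power trick: the usual textbook description of droop is the ABC recombination model, where efficiency falls as carrier density (drive current) rises. A toy sketch with generic order-of-magnitude coefficients (my own assumptions, not Cree's numbers):

```python
# Toy ABC recombination model behind LED efficiency droop:
# IQE = B*n^2 / (A*n + B*n^2 + C*n^3), with n the carrier density.
# A, B, C are generic order-of-magnitude values often quoted for InGaN
# emitters -- illustrative guesses, not numbers for any real die.

A = 1e7    # defect (Shockley-Read-Hall) recombination, 1/s
B = 1e-11  # radiative recombination, cm^3/s
C = 1e-30  # Auger recombination, cm^6/s

def internal_quantum_efficiency(n):
    """Fraction of injected carriers that recombine radiatively."""
    radiative = B * n**2
    return radiative / (A * n + radiative + C * n**3)

# Efficiency peaks at modest carrier density and droops as drive increases:
for n in (1e18, 3e18, 1e19, 3e19, 1e20):
    print(f"n = {n:.0e} cm^-3 -> IQE ~ {internal_quantum_efficiency(n):.0%}")
```

That peak at low-to-moderate drive is the whole basis of the "huge LED run at a few percent of rated output" idea.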

Moore's Law doesn't quite apply to LEDs because nothing is shrinking. However, there is a law just for LEDs (https://en.wikipedia.org/wiki/Haitz's_law): in a nutshell, every decade the cost per lumen falls 90% while the light output per LED increases 20-fold. This doesn't say anything directly about efficiency, though, and more light with less heat is exactly what efficiency is about.
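
As a sketch of what those two rates compound to over a few decades (starting values invented purely for illustration):

```python
# Haitz's law sketch: per decade the cost per lumen falls about 10x and the
# light output per LED package rises about 20x. Starting values are invented
# purely for illustration.

cost_per_kilolumen = 10.0    # dollars per 1000 lm (assumed starting point)
lumens_per_package = 100.0   # lumens from a single package (assumed)

for decade in range(4):
    print(f"decade {decade}: ~${cost_per_kilolumen:.2f}/klm, "
          f"~{lumens_per_package:,.0f} lm per package")
    cost_per_kilolumen /= 10.0    # cost per lumen falls 90% per decade
    lumens_per_package *= 20.0    # flux per package grows 20x per decade
```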

LEDs have been progressing at a breakneck pace. We are already very close to the theoretical physical limits.
 

CuriousOne

Enlightened
Joined
Oct 14, 2012
Messages
813
300 lm/W LEDs have a CRI of less than 60, and LEDs with CRI > 90 are only around 60 lm/W. I'm speaking of ones that you can buy right now, not some vaporware announcements.
 

HighlanderNorth

Flashlight Enthusiast
Joined
Sep 15, 2011
Messages
1,593
Location
Mid Atlantic USA
300 lm/W LEDs have a CRI of less than 60, and LEDs with CRI > 90 are only around 60 lm/W. I'm speaking of ones that you can buy right now, not some vaporware announcements.

Why are poor-color LEDs literally five times as efficient as high-CRI LEDs? Is it just not possible to produce a heat-efficient LED that renders colors well AND doesn't look blueish in the spill beam and yellow-greenish in the hot spot?
 

CuriousOne

Enlightened
Joined
Oct 14, 2012
Messages
813
If I remember correctly, it took about 50 years to develop high-CRI phosphors for fluorescent lamps. It's the same with LEDs: current phosphors provide either low CRI and high efficiency, or high CRI and low efficiency.
 

parametrek

Enlightened
Joined
Apr 3, 2013
Messages
578
Why are poor-color LEDs literally five times as efficient as high-CRI LEDs? Is it just not possible to produce a heat-efficient LED that renders colors well AND doesn't look blueish in the spill beam and yellow-greenish in the hot spot?

Correct, it is impossible with current technology. Modern white LEDs are a blue LED combined with phosphors that produce the other colors. The phosphors convert light from one wavelength (color) to another, but they can only go from shorter to longer wavelengths, and they throw away energy in the process. The bigger the step, the more energy is thrown away. Blue to green isn't too wasteful.

So cold white (mostly blue) will be more efficient. Warm white is less efficient because it has more red in it, and converting blue to red throws away roughly a third of each photon's energy just from the Stokes shift, with further losses on top of that in the phosphor itself. High-CRI emitters are typically based on royal blue (a shorter wavelength than normal blue), and converting royal blue to red is even less efficient.
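
The Stokes-shift part of that loss is simple photon-energy bookkeeping: E = hc/λ, so the best case fraction of energy kept per converted photon is λ_pump/λ_emitted. A quick Python sketch with typical round-number wavelengths (my assumptions, not data for any specific phosphor):

```python
# Stokes-shift bookkeeping: photon energy is E = h*c/lambda, so when a
# phosphor converts a pump photon to a longer wavelength, the best case
# fraction of energy kept is lambda_pump / lambda_emitted. Wavelengths are
# typical round numbers, not data for any specific phosphor.

def stokes_energy_kept(pump_nm, emitted_nm):
    """Upper bound on the per-photon energy retained by down-conversion."""
    return pump_nm / emitted_nm

conversions = [
    ("blue 450 nm  -> green 530 nm", 450.0, 530.0),
    ("blue 450 nm  -> red 630 nm", 450.0, 630.0),
    ("royal 445 nm -> deep red 660 nm", 445.0, 660.0),
]
for label, pump, emitted in conversions:
    kept = stokes_energy_kept(pump, emitted)
    print(f"{label}: keep <= {kept:.0%}, lose >= {1 - kept:.0%}")
```

On top of that Stokes floor you lose a bit more to the phosphor's imperfect quantum efficiency, and the eye's low sensitivity to deep red costs lumens even when the energy does get through.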
 