Do lumens equal heat?

BeastFlashlight

Flashlight Enthusiast
Joined
Mar 3, 2013
Messages
1,276
Location
Boston
Is it a mathematical constant that 1 lumen will always equal 'X' amount of heat? Obviously as technology advances lumen counts rise, but can technology improve upon the 'heat per lumen' factor or is that something that will always remain constant?

Here's what I'm driving at: it won't really matter 20 years from now if one LED emitter can dish out 10,000 lumens all by itself, because who cares if you still need a massive tank of a metal host to dissipate all that heat!? What advantage would that be? But can technology make it so that 10,000 lumens 20 years from now only produces the heat that 500 lumens produces today?
 

GordoJones88

Flashlight Enthusiast
Joined
Nov 26, 2011
Messages
1,157
Location
Tennessee
No. Heat is wasted energy.

A 60 watt light bulb is 10% efficient, with 90% of the energy being wasted as heat. An incandescent bulb works by heating up a filament until it glows brightly, so it's obvious there is a lot of heat loss.

An LED is 90% efficient with only 10% of the energy being lost as heat.

I don't know if they can make a circuit 100% efficient, or what would happen if they did.
 

BeastFlashlight

Flashlight Enthusiast
Joined
Mar 3, 2013
Messages
1,276
Location
Boston
Oh OK, so a lumen is the 'good' part of the energy/battery conversion, and the heat is the unwanted 'bad' part of the energy conversion. So wow, LED has already passed the 90% marker of efficiency!! So there's already a greater than 90 to 10 ratio of lumens to heat. It makes you think that we must be slowly closing in on our limits. It's not like our LEDs are at 50%; if we've passed 90%, we gotta figure that our largest curve of improvement is definitely behind us, especially since we know that hitting 100% is a fairy tale that can never happen.
 

inetdog

Enlightened
Joined
Mar 4, 2013
Messages
442
Is it a mathematical constant that 1 lumen will always equal 'X' amount of heat? Obviously as technology advances lumen counts rise, but can technology improve upon the 'heat per lumen' factor or is that something that will always remain constant?

Authoritative figures are remarkably hard to come by, especially since there are two ways of measuring the performance of incandescents and LEDs. One is the ratio of optical watts (actual energy in the form of light within the visible spectrum) to input electrical watts. In this area, good incandescents with a reasonable life come in somewhere around 10%.
An LED should be at 35% or more, but more than 60%, when you count the losses in the driver circuit as well as the LED itself, is not seen in consumer products.
The other measure is how many lumens (a measure of the amount of light leaving the bulb, weighted toward the parts of the visible spectrum in which the eye is most sensitive) are produced per watt of electrical input.

In this area, the best bare LED chip in the lab has gotten to ~230 lumens per watt. But packaged as a bulb replacement and including losses in the driver circuitry, an LED might only produce 60-70 lumens per watt.
A 60 watt 120 volt incandescent will be around 14 lumens per watt and a halogen bulb of the same size would be about 19 lumens per watt. A T8 fluorescent will be up to 80 lumens per watt.
There is a long way to go in LED efficiency, but they have an advantage that it is easier to get the light from the LED to end up where you want it than it is for an incandescent.
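To turn a lumens-per-watt spec into a rough light-versus-heat split, here is a minimal Python sketch. It assumes roughly 300 lm per optical watt for typical white light, which is only an approximation; the true figure depends on the spectrum, and the hard ceiling is 683 lm/W, reached only for pure 555 nm green.

# Rough light/heat split from a lumens-per-electrical-watt figure.
# ASSUMPTION: ~300 lm per optical watt for typical white light.

LM_PER_OPTICAL_WATT_WHITE = 300.0

def light_fraction(lm_per_electrical_watt):
    # Approximate fraction of input power that leaves as visible light.
    return lm_per_electrical_watt / LM_PER_OPTICAL_WATT_WHITE

for name, lpw in [("packaged LED bulb", 65), ("lab LED chip", 230)]:
    f = light_fraction(lpw)
    print(f"{name}: {lpw} lm/W -> ~{f:.0%} light, ~{1 - f:.0%} heat")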

There are currently LEDs in the lab that will convert electrical energy into MORE optical watts than the electrical watts put in. But they do it emitting low infrared, and you have to supply heat to the LED from its surroundings as well as energy from the electric current.
It is sort of similar to the heat pump where more heat is moved to the room to be warmed than would be produced by sending the electricity through a resistance heater.

Right now these produce light that isn't useful (except to infrared cameras), and only at extremely low power and high temperature.
 

TEEJ

Flashaholic
Joined
Jan 12, 2012
Messages
7,490
Location
NJ
If a light were 100% efficient, there would be ZERO HEAT.

ALL the energy put IN would be light coming OUT.

:D

When I look at an LED flashlight with an infrared camera, for example, I can see that the lens (where the light is coming out) is NOT the hottest part; the head housing the electronics is the hottest part.
 

Yoda4561

Flashlight Enthusiast
Joined
Jan 22, 2007
Messages
1,265
Location
Florida, U.S.A.
In real world consumer products right now, we're hitting around 100-130 lumens per watt with top of the line fixtures designed with efficiency as a priority. That means we're only around 34% light/66% heat. The advances in the lab stuff at over 200 lumens per watt (and even a couple 200 l/w production models at low power) mean that in a few years time we'll see that reversed to 66% light 34% heat, and it'll only get better from there. (edit: drivers are usually around 90% efficient, I guess those could get a little better but I'm not sure how they'll do it cheaply enough for consumer lighting)


There will be limits related to heat, but mostly in the realm of high power compact searchlight type illumination, where they'll be trying to drive a tiny LED as hard as they can for maximum throw and output. In which case, even at 100% efficiency, when you put out that much energy SOMETHING will absorb part of it, and at some point you'll get enough heat that way to cause problems. IIRC you can actually experience this with a 400+ lumen bare power LED by placing your finger close to, but not on, the LED itself.
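To put numbers on the heat side of that, here is a small Python sketch; the lumen target, efficacy figures, and light fractions below are assumed examples, not specs for any real emitter.

# How many watts of heat a host must shed for a given lumen target.
# ASSUMPTIONS: efficacy and light-fraction figures are illustrative only.

def heat_watts(target_lumens, lm_per_electrical_watt, light_fraction):
    # Electrical power needed for the target output, times the heat fraction.
    electrical_watts = target_lumens / lm_per_electrical_watt
    return electrical_watts * (1.0 - light_fraction)

print(heat_watts(10000, 130, 0.35))   # today-ish figures -> ~50 W of heat
print(heat_watts(10000, 260, 0.65))   # hypothetical future -> ~13 W of heat

Even with double today's efficacy and efficiency, a 10,000 lumen emitter would still need to shed on the order of a dozen watts continuously.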
 

BeastFlashlight

Flashlight Enthusiast
Joined
Mar 3, 2013
Messages
1,276
Location
Boston
In real world consumer products right now, we're hitting around 100-130 lumens per watt with top of the line fixtures designed with efficiency as a priority. That means we're only around 34% light/66% heat. The advances in the lab stuff at over 200 lumens per watt (and even a couple 200 l/w production models at low power) mean that in a few years time we'll see that reversed to 66% light 34% heat, and it'll only get better from there.


I'm no guru with this stuff, but I would like to be able to peek in on the efficiency progress from time to time. Is there any website or anything where I can go for a quick & easy look at an emitter's light to heat percentage? I looked up the XM-L2 but I'm not seeing a spec for light to heat ratio. I'd basically like to poke back into this forum from time to time, find out what the latest & greatest emitter is, then see how far we've improved on light vs heat. CAN'T WAIT till we advance to the point of reversing it to a 66 to 34 ratio.
 

JCD

Enlightened
Joined
Apr 12, 2010
Messages
892
To further confuse matters, given a red LED and a green LED with identical radiant flux, the green LED will have superior luminous efficacy. The luminous efficacy depends not only on how efficient the emitter is at producing light, but also on which wavelengths of light the emitter produces.
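A quick illustration in Python, assuming idealized monochromatic emitters at 530 nm and 630 nm (real LEDs have broader spectra):

# Luminous flux = 683 lm/W * V(lambda) * radiant watts, where V(lambda) is
# the CIE photopic sensitivity curve. The V values are standard table points.

PEAK_LM_PER_W = 683.0
V = {530: 0.862, 555: 1.000, 630: 0.265}  # green, peak green, red (nm)

def lumens(radiant_watts, wavelength_nm):
    return PEAK_LM_PER_W * V[wavelength_nm] * radiant_watts

print(lumens(1.0, 530))  # 1 optical watt of 530 nm green -> ~589 lm
print(lumens(1.0, 630))  # 1 optical watt of 630 nm red   -> ~181 lm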
 

rmteo

Flashlight Enthusiast
Joined
Feb 11, 2009
Messages
1,071
Location
Colorado, USA
LED Circuit: with efficiency of up to 95%


What does 95% specifically refer to and mean?
That is the conversion efficiency of the driver circuit. It means that what comes out of the driver (and into the LED) is 95% of what is supplied by the power source (i.e. the battery); the other 5% is lost as heat.
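In plain numbers, a quick sketch assuming a 10 W draw from the battery and the quoted 95% figure:

# Driver loss for an assumed 10 W draw from the battery at 95% efficiency.
battery_watts = 10.0
driver_efficiency = 0.95                        # the "up to 95%" figure quoted above
led_watts = battery_watts * driver_efficiency   # 9.5 W reaches the LED
driver_heat = battery_watts - led_watts         # 0.5 W lost in the driver
print(led_watts, driver_heat)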
 

paul.allen

Newly Enlightened
Joined
Feb 27, 2013
Messages
62
Okay, well as long as we got some smart people in here.

Eagletac lists on its website:

LED Circuit: with efficiency of up to 95%


What does 95% specifically refer to and mean?


It is also an ideal figure, hence the "up to" part. If you look at the data sheet for the drivers used in these circuits, you will see plots that show efficiency at various input voltages and loads; it also depends on the caps and inductors you choose. It's a balance between cost vs size vs efficiency. Again, "up to" is the key phrase.
 

MikeAusC

Enlightened
Joined
Jul 8, 2010
Messages
995
Location
Sydney, Australia
Modern LEDs put out around 100 Lumens per electrical watt input.

They also put out between 0.7 and 0.8 watts of heat per watt of electrical input, based on actual measurements.
 

seecol

Newly Enlightened
Joined
Mar 20, 2013
Messages
3
Watts is actual power usage. "Lumens" is how much light is provided from a specified amount of power. The typical light bulb converts some of the electricity driven through it into light, and the rest into heat. The more efficient the light bulb, the less power per lumen will be required.

That is why newer bulbs can be purchased that are 13 watts and advertised as equivalent to 60 watt incandescent bulbs - the light output is roughly equivalent (lumens), but the power usage is substantially different.
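A quick check of that equivalence, assuming roughly 14 lm/W for the incandescent and 65 lm/W for the 13 watt replacement (ballpark figures, not the specs of any particular bulb):

# Lumens = watts * efficacy. The efficacy figures are assumed ballparks.
def lumens(watts, lm_per_watt):
    return watts * lm_per_watt

print(lumens(60, 14))   # 60 W incandescent -> ~840 lm
print(lumens(13, 65))   # 13 W replacement  -> ~845 lm, roughly the same light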
 

JCD

Enlightened
Joined
Apr 12, 2010
Messages
892
Watts is actual power usage. "Lumens" is how much light is provided from a specified amount of power.

Not quite. Radiant flux is how much light is provided by an emitter. The unit for radiant flux is also the watt.

Luminous flux, for which the unit is the lumen, is similar to radiant flux, but is weighted for the varying sensitivity to different wavelengths that we humans have. Two lights with the same luminous flux can have substantially different radiant flux.
 

Curious_character

Flashlight Enthusiast
Joined
Nov 10, 2006
Messages
1,211
This looks like a great place to pose a question.

The Luminus Devices SST-50 data sheet claims, on the first page, "over 100 lumens per watt at 1.75A". Elsewhere in the data sheet, the forward voltage at 1.75A is typically 3.2 volts, so the power input at that current is 5.6 watts. In the binning information, it gives flux values of 300 - 530 lumens depending on bin, and whether minimum or maximum. The range of maximum flux values is 325 - 530 lumens, which calculates to 58 - 95 lumens/watt. The SST-90 data sheet has a similar disparity, with the binning data indicating a range of 65 - 98 lumens/watt maximum and the same claim of "over 100 lumens per watt". Even assuming that the higher bin devices actually exist, what kind of new math is being used to get the "over 100" value?
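For reference, the arithmetic behind those bin figures, using the data-sheet numbers quoted above:

# lm/W from the SST-50 data-sheet numbers quoted above.
forward_voltage = 3.2      # volts, typical at 1.75 A
current = 1.75             # amps
power = forward_voltage * current   # 5.6 W
for flux in (325, 530):             # maximum flux, lowest and highest bins
    print(flux / power)             # ~58 and ~95 lm/W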

On the topic of drivers, it's been a long time since I've designed a switching regulator, but at the time the major contributors to loss were the necessary inductor and diode. At the low voltage of an LED, even a Schottky diode drop is significant, so I think modern regulators use a power FET for that function. As for the inductor, you bump up against the laws of physics, and once you reach a certain point the only way to reduce the loss is to increase the size. If you're at the point that the FET loss is significant, you'll have to increase its size as well, which means a larger and more expensive integrated circuit. As paul.allen says, efficiency and size can be traded, and flashlights put a pretty severe limit on the size. 90% is really very good, considering the constraints, and I doubt that many flashlight drivers actually achieve that.
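A rough illustration of why the diode drop matters at these voltages; the current, forward voltage, and on-resistance below are assumed example values, not measurements from any particular driver:

# Conduction loss: Schottky diode vs. synchronous MOSFET, assumed values.
current = 3.0            # amps through the output stage (assumed)
schottky_vf = 0.4        # volts of forward drop (assumed)
fet_rds_on = 0.010       # ohms of on-resistance (assumed)

diode_loss = schottky_vf * current       # 1.2 W (ignoring duty cycle)
fet_loss = current ** 2 * fet_rds_on     # 0.09 W
print(diode_loss, fet_loss)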

c_c
 

exSun

Newly Enlightened
Joined
Apr 10, 2013
Messages
4
Location
CO
On the topic of drivers, it's been a long time since I've designed a switching regulator, but at the time the major contributors to loss were the necessary inductor and diode. At the low voltage of an LED, even a Schottky diode drop is significant, so I think modern regulators use a power FET for that function. As for the inductor, you bump up against the laws of physics, and once you reach a certain point the only way to reduce the loss is to increase the size. If you're at the point that the FET loss is significant, you'll have to increase its size as well, which means a larger and more expensive integrated circuit.
c_c

MOSFETS are used in place of diodes for low voltage circuits. And physically small, low inductance inductors can be used by raising the switching frequency. I'm new to the forum, but perhaps some of the reviews mention switching frequency. I would guess that over time, the frequencies have increased.
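The inductor-versus-frequency trade can be made concrete with the standard first-order buck sizing relation; the input voltage, LED voltage, and ripple current below are assumed examples, not figures from any specific driver:

# First-order buck inductor sizing: L = Vout * (1 - D) / (f * delta_I).
# All numbers below are assumed examples, not from a specific driver.

def buck_inductance(v_in, v_out, freq_hz, ripple_amps):
    duty = v_out / v_in                      # ideal duty cycle
    return v_out * (1 - duty) / (freq_hz * ripple_amps)

v_in, v_out, ripple = 8.4, 3.2, 0.5          # 2-cell Li-ion into one LED, 0.5 A ripple
print(buck_inductance(v_in, v_out, 500e3, ripple))   # ~7.9 uH at 500 kHz
print(buck_inductance(v_in, v_out, 2e6, ripple))     # ~2.0 uH at 2 MHz

Quadrupling the switching frequency cuts the required inductance to a quarter, which is why higher-frequency drivers can use physically smaller parts, at the cost of more switching loss.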
 