How do you determine wattage of a LED light?

joe1512

Enlightened
Joined
Jan 7, 2010
Messages
755
I've been here a bit, but I don't have a good grasp on this.

Say you have an itp A1 with a CR123. On its maximum, it outputs around 200 lumens.

How much wattage is this? My guess was about 3 or 4 watts, but I honestly don't know.

My thought is that a power LED's brightness depends on how much current runs through it. Many lights are current-controlled. So, my itp A1 probably runs about 1 Amp through it to get the 200 lumens.

The big question is: How much voltage does it take to push 1A through the LED?

I know a primary CR123 produces around 3 volts, and an RCR123 can vary from about 2.8 to 4.2 V. The current regulator should compensate for the varying input voltage (within its input specs) so that the LED current stays regulated. That means it converts the battery voltage up or down to whatever the LED needs, drawing more or less current from the battery accordingly.


Do most current LEDs have about the same resistance? For example, an MC-E is fully driven at 2.5 amps, a P7 at 2.8. Do they vary widely?

How much voltage would one need to drive the MC-E at 2.5 amps? Do the power LEDs have a certain resistance or do you put them in series with a resistor like the 5mm ones?

Once you know that, wattage is simply current times voltage.



For the itp A6 Polestar, I could say: 6xAA Eneloop = 7.2 V input at 2000 mAh = 14.4 watt-hours.

Given a runtime of a bit more than 1 hour, I could say that the Wattage is probably around 13 Watts. Assuming 2.5 amps driven, that means around 5.2 Volts are used to drive it.
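Laying that arithmetic out as a quick script (the runtime, capacity, and drive current are the same rough assumptions as above, not measured values):

```python
# Rough back-of-envelope estimate for the itp A6 Polestar numbers above.
# All figures are nominal/assumed: 6 x AA Eneloop at 1.2 V and 2000 mAh.
cells = 6
cell_voltage = 1.2          # V, nominal Eneloop voltage under load
capacity_ah = 2.0           # Ah (2000 mAh)
runtime_h = 1.1             # h, "a bit more than 1 hour"

pack_voltage = cells * cell_voltage          # 7.2 V
pack_energy_wh = pack_voltage * capacity_ah  # 14.4 Wh

avg_power_w = pack_energy_wh / runtime_h     # ~13 W average draw from the pack
drive_current_a = 2.5                        # assumed LED drive current
implied_voltage = avg_power_w / drive_current_a  # ~5.2 V, ignoring driver losses

print(f"Pack energy:   {pack_energy_wh:.1f} Wh")
print(f"Average power: {avg_power_w:.1f} W")
print(f"Implied voltage at {drive_current_a} A: {implied_voltage:.1f} V")
```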


Anyone have any insight into understanding this a bit better? Thanks!
 
LED wattage is no longer a good indicator of brightness. There are 3+ generations of LED types and multiple bins of each type, with the latest generations up to 3-5 times as bright when pushed to their limits. The more efficient LEDs made today can put out more light with a lot less power. The only real use for knowing an LED's wattage is estimating battery consumption (capacity).
Watts = volts x amps. There is the wattage consumed by the light as a whole and the wattage delivered to the LED itself; they differ because of the circuitry between the battery and the LED that manages the power to the proper level. Running off a primary 123 cell, the light would have to have a boost circuit that raises the voltage up to the level required to drive the LED, and the boost circuit eats up some power since it is not 100% efficient. You should be able to find a chart of drive voltage versus current for most LEDs.
And no, you cannot think of an LED as a resistor. It doesn't quite act that way: its behavior varies across the output range according to its efficiency at each output level. In other words, most LEDs get less efficient the more power you put into them, so more is lost as heat rather than light, making them non-linear as far as the resistive load they present to a circuit.
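To make the boost-circuit point concrete, here is a minimal sketch of battery-side versus LED-side wattage with a boost driver in between. The Vf, drive current, and efficiency values are just assumed round figures, not measurements:

```python
# Minimal sketch: power drawn from the battery vs. power delivered to the LED
# when a boost driver sits in between. All values below are assumptions.
led_vf = 3.2        # V, assumed forward voltage at the drive current
led_current = 1.0   # A, assumed regulated LED current
efficiency = 0.85   # assumed boost-converter efficiency

led_power = led_vf * led_current            # watts actually delivered to the LED
battery_power = led_power / efficiency      # watts drawn from the cell

battery_voltage = 3.0                       # V, nominal primary CR123
battery_current = battery_power / battery_voltage  # current the cell must supply

print(f"LED power:     {led_power:.2f} W")
print(f"Battery power: {battery_power:.2f} W")
print(f"Battery draw:  {battery_current:.2f} A from a {battery_voltage} V cell")
```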
 
The big question is: How much voltage does it take to push 1A through the LED?

The Vf of the LED will vary. For each LED you'll have to check the datasheet for the I-vs-Vf graph; then, knowing what current it's being driven at, you can find the average Vf. Note that it's only an average, though; your specific LED may have a higher or lower Vf.
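Something like this, as a rough sketch; the (current, Vf) points below are made-up placeholders standing in for whatever your emitter's datasheet graph shows:

```python
# Reading an approximate Vf off a datasheet I-vs-Vf curve by linear interpolation.
# The (current in A, typical Vf in V) points are hypothetical; substitute the
# values from your emitter's datasheet.
curve = [
    (0.35, 3.0),
    (0.70, 3.2),
    (1.00, 3.3),
    (1.50, 3.5),
]

def vf_at(current, points):
    """Linearly interpolate the typical Vf at a given drive current."""
    points = sorted(points)
    for (i1, v1), (i2, v2) in zip(points, points[1:]):
        if i1 <= current <= i2:
            frac = (current - i1) / (i2 - i1)
            return v1 + frac * (v2 - v1)
    raise ValueError("current outside the tabulated range")

print(f"Estimated Vf at 1.0 A: {vf_at(1.0, curve):.2f} V")
```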

Also, note that in your A6 Polestar calculations you've neglected the fact that the driver won't be 100% efficient, and you'll have some power losses due to resistance in the wires and so on. Once you take those into account, I think you'll find that the voltage comes out closer to 3.5 V.
 
Thanks for the replies.

* I am aware that wattage is a terrible indicator of brightness. This really is just a curiosity thing. I was wondering what kind of boost circuit we are talking about in general, i.e. a significant voltage boost or not.

* Yes, I skipped driver efficiency, wiring resistance and other factors. I would think that the actual voltage needed would be higher than my simple calculation rather than lower, due to these factors.


I'll definitely read that PDF and learn some cool stuff. Thanks again!
 
* Yes, I skipped driver efficiency, wiring resistance and other factors. I would think that the actual voltage needed would be higher than my simple calculation rather than lower, due to these factors.

I went and checked the MC-E datasheet just to be sure, and it says that to send 2.8 A through the LED, the Vf is around 3.5 V. Even with natural variation, I highly doubt it'll be more than about 3.8 V.

Let's take your previous calcs:

Battery pack: 14.4 Wh
Assuming 2.5 A driven

Now, if we assume that the driver is 80% efficient, that means for every 1 W the LED uses, the driver must take in 1.25 W. For the purposes of this exercise, if it outputs Vf*I watts, it will have to take in Vf*I*1.25 watts to account for inefficiencies. Therefore, the driver itself consumes Vf*I*1.25 - Vf*I = Vf*I*0.25 watts.

Back to the total input, however. From your rough guess of the power consumption being 13 W, if the driver takes in that amount of power:

13 = Vf*I*1.25

We assume 2.5 A driven, so we get:

13 = Vf*2.5*1.25

Vf = 4.16 V.

On a side note, this means that we overestimated the driver's efficiency, since Vf is still ridiculously high for an MC-E. The end result is still the same, though: factoring in driver inefficiencies doesn't increase the voltage fed into the LED, it reduces its calculated Vf.
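Here's the same algebra run both ways as a quick script; the 13 W and 2.5 A come from your rough estimate, and the efficiencies are just assumed values:

```python
# Working the driver-efficiency algebra both ways. Input power and drive current
# are the rough estimates from above; the efficiencies are assumptions.
input_power = 13.0   # W, rough total draw from the battery pack
led_current = 2.5    # A, assumed drive current

# Forward direction: assume an efficiency, solve for the implied Vf.
for efficiency in (0.80, 0.67):
    vf = input_power * efficiency / led_current
    print(f"efficiency {efficiency:.0%} -> implied Vf = {vf:.2f} V")

# Reverse direction: take a realistic MC-E Vf and see what efficiency it implies.
realistic_vf = 3.5   # V, roughly the datasheet figure quoted above
implied_efficiency = realistic_vf * led_current / input_power
print(f"Vf {realistic_vf} V -> implied driver efficiency = {implied_efficiency:.0%}")
```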

If you have some spare money and a good enough power supply, get an MC-E, put the dice in parallel and apply 5 V to it. I almost guarantee that it'll go :poof:

Ahh, I noticed this bit:
* I am aware that wattage is a terrible indicator of brightness. This really is just a curiosity thing. I was wondering what kind of boost circuit we are talking about in general, i.e. a significant voltage boost or not.

Note that the MC-E is four dice, so you essentially have four separate LEDs that can be joined in parallel or series. In the calcs in your first post, you were assuming the dice were in parallel, requiring ~3.5 V at 2.5 A (actually, in this configuration the max is 2.8 A) to be at maximum brightness. But down here you mention a boost converter, which, given your power source, indicates that the LED is wired in series, thus requiring ~14 V at 0.7 A to be at maximum brightness. It might also be 2s2p, but that'd be kinda worrying. Same wattage required, but completely different numbers.
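A quick sketch of the three wiring options, using assumed round figures of 3.5 V and 0.7 A per die (the real values depend on the specific emitter and bin):

```python
# Roughly the same ~10 W into an MC-E, three ways of wiring its four dice.
# Per-die Vf and current are assumed round numbers, not datasheet values.
vf_per_die = 3.5     # V, assumed forward voltage of one die at full drive
i_per_die = 0.7      # A, assumed max current through one die

configs = {
    "4 in parallel (4p)": (vf_per_die,     4 * i_per_die),  # ~3.5 V @ 2.8 A
    "2s2p":               (2 * vf_per_die, 2 * i_per_die),  # ~7.0 V @ 1.4 A
    "4 in series (4s)":   (4 * vf_per_die, 1 * i_per_die),  # ~14 V  @ 0.7 A
}

for name, (v, i) in configs.items():
    print(f"{name:20s} {v:5.1f} V  {i:4.1f} A  {v * i:5.1f} W")
```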
 
The simple answer to your question "How do you determine the wattage of a LED light?" is to look at the data sheet.

This will not be exact. The only exact way would be to test the specific emitter.

Most LEDs are in the 3.6 V range (and it is a range). So you just need to know if your light is driven to max spec. If so, you have your answer: V*A = W.

With emitters, small changes in input voltage result in large changes in current draw. That means an emitter with a Vf of 3.6 V may draw 700 mA at that Vf but 2 A at 4 V (made-up figures), and it may burn itself up at that current draw.
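As a toy illustration only, here's a rough exponential fitted through those made-up figures, just to show how steeply the current climbs for small voltage changes:

```python
import math

# Toy illustration of why LED current is so sensitive to voltage: an LED follows
# a roughly exponential I-V curve. The reference point and the 0.38 V "slope"
# factor are arbitrary values chosen to match the made-up figures above.
def led_current(vf, i_ref=0.7, v_ref=3.6, slope=0.38):
    """Approximate current (A) at forward voltage vf, scaled so v_ref gives i_ref."""
    return i_ref * math.exp((vf - v_ref) / slope)

for vf in (3.5, 3.6, 3.7, 3.8, 4.0):
    print(f"Vf = {vf:.1f} V -> roughly {led_current(vf):.2f} A")
```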
 