LED Power Considerations

sparkysko (Enlightened; joined Apr 2, 2007; 228 messages):
Looking at the spec sheets of LEDs, it's looking a little confusing, as the maximum drive is given in amps, not watts. If you look at the Seoul spec sheets, they list brightness based on amps in, not overall watts. So is the brightness entirely determined by the amps, without consideration for the voltage (assuming the voltage is within spec)?

Say I have an LED that can accept 4v to 5v, and I feed it 1A. Will the 4v one and the 5v one be at the same brightness? It'd be intuitive to think that the 5v one would be brighter because it has a higher wattage, but the spec sheets don't reflect this, so WTF is going on here?

Another example: say I have the same LED and I give it 1 watt @ 4v x 250ma, and an identical one I give 1 watt @ 5v x 200ma. Would they be identical in brightness because the wattage is the same, or would the 4v x 250ma LED be brighter because the amperage is higher?
 
The voltage across the LED will vary as the current changes. Maybe 2.5V at low current, then 3.6-3.8V as you get up to the higher current ranges. You can't "put 5V across an LED" without a dropping resistor, and when you *do* add a dropping resistor in series you'll get maybe 1.3V across the resistor and 3.7V across the LED.
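To make the dropping-resistor arithmetic above concrete, here's a quick sketch. The 5.0 V supply, 3.7 V forward drop, and 350 mA target are made-up example numbers, not from any particular datasheet:

```python
# Sizing a series (dropping) resistor for an LED.
# Assumed example values: 5.0 V supply, 3.7 V LED forward drop, 350 mA target.
V_supply = 5.0      # supply voltage (V)
V_led = 3.7         # LED forward voltage at the target current (V)
I_target = 0.350    # desired LED current (A)

V_resistor = V_supply - V_led          # voltage the resistor must drop (1.3 V)
R = V_resistor / I_target              # Ohm's law: R = V / I
P_resistor = V_resistor * I_target     # power the resistor burns as heat

print(f"R = {R:.2f} ohm, dissipating {P_resistor:.2f} W")
```

So with those numbers you'd pick the nearest standard value (3.9 ohm or so) rated for around half a watt.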

Does that make any sense?
 
The way I like to think about it is: the current depends on the voltage (for LEDs in particular).

Example: if I feed an LED 3.7v it will draw 750ma. If I give it 4.2v it will draw 1.5 amps (if the power supply or battery is capable of delivering that much current). These numbers are made up, just an example. LEDs are non-linear in this sense; most other components will vary their current in direct proportion to the voltage, but for LEDs, a small change in voltage causes a huge change in current.
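That "small voltage change, huge current change" behavior comes from the diode being exponential. A rough sketch using the ideal Shockley diode equation; the saturation current and ideality factor here are made-up illustrative values, not from any real LED:

```python
import math

# Ideal Shockley diode equation: I = I_S * (exp(V / (n * V_T)) - 1)
I_S = 1e-18   # saturation current (A), hypothetical
N = 2.0       # ideality factor, hypothetical
V_T = 0.026   # thermal voltage at room temperature (V)

def led_current(v):
    """Forward current through the (idealized) LED at voltage v."""
    return I_S * (math.exp(v / (N * V_T)) - 1)

i1 = led_current(2.00)
i2 = led_current(2.10)  # only 0.1 V (5%) more
print(i2 / i1)          # current is several times higher for a tiny voltage bump
```

The exact numbers depend on the model constants, but the shape of the curve is why a fraction of a volt can be the difference between dim and burned out.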
 
So if I use one of those constant current drivers, as long as the current is correct, let's say 350ma, then I don't have to care much about the voltage?
 
Zen said:
So if I use one of those constant current drivers, as long as the current is correct, let's say 350ma, then I don't have to care much about the voltage?
You seem to be under the impression that current and voltage are independent of one another. That is not true -- the current is a function of the voltage, meaning if you supply a certain voltage, a particular current will pass through the LED as a result.

In the case of a simple resistor (R is fixed), this follows a relationship called Ohm's law:
I = V / R
If you increase the voltage (V), the current (I) increases in the same proportion.
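A trivial sketch of that linear relationship, with an arbitrary example resistance:

```python
# Ohm's law for a fixed resistor: current scales linearly with voltage.
R = 10.0  # ohms, arbitrary example value

def resistor_current(v):
    return v / R  # I = V / R

print(resistor_current(3.0), resistor_current(6.0))  # 0.3 A, then 0.6 A
```

Double the voltage, exactly double the current. An LED doesn't behave anything like this, which is the whole point of the next paragraph.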

In the case of the LED, though, the effective resistance (R) changes dramatically based on heating and other factors. This means that a small increase in voltage or temperature will lead to a huge increase in current. That's why LEDs are difficult to control by regulating the voltage, and why hooking up an LED directly to a power supply could cause it to burn out.

The way the constant current driver works is that it constantly measures what the current is -- it then "tweaks" the voltage output as needed so that the current stays the same. This way it will keep consistent output even as a battery runs down or the LED heats up.
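A toy sketch of that measure-and-tweak loop. The LED model, starting voltage, and feedback gain below are all made up for illustration; a real driver does this in hardware, continuously:

```python
import math

def led_current(v):
    # Hypothetical exponential LED: small change in v -> big change in current.
    return 1e-18 * math.exp(v / 0.052)

TARGET = 0.350  # 350 mA setpoint
v = 1.8         # start the output voltage low and work up

for _ in range(500):
    i = led_current(v)           # "measure" the current
    v += 0.01 * (TARGET - i)     # nudge the voltage toward the setpoint

print(f"settled at {v:.2f} V, {led_current(v) * 1000:.0f} mA")
```

The loop ends up at whatever forward voltage this particular LED happens to need for 350 mA, which is exactly why you pick the current and let the driver find the voltage.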

In other words, the only thing you are interested in controlling is the current. Once you know the current, you can find out how much power the LED is using by measuring the voltage drop across it -- use a voltmeter and connect one lead to the positive, the other to the negative, while the LED is on. Multiply that voltage by the current to find the power.
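For example, with made-up readings (a 3.6 V drop measured across the LED while a 350 mA driver is running):

```python
# Power = measured forward-voltage drop * driver current.
# Both numbers here are hypothetical example readings.
I_drive = 0.350    # constant-current driver output (A)
V_measured = 3.6   # voltmeter reading across the LED (V)

P = V_measured * I_drive
print(f"LED power: {P:.2f} W")  # 3.6 V * 0.35 A = 1.26 W
```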
 
Zen, you are right.

The driver (buck, boost, or buck/boost) will try to provide the current selected for as long as possible based on the energy made available by the source.
 
Thank you 2x Trinity and Calina.

I guess the part that confused me is that an LED doesn't have a fixed resistance the way regular incandescent light bulbs (roughly) do. In a simple fixed-resistor case, as long as the correct voltage is supplied, the circuit will work fine. No need to worry about what current is passing through the fixed resistor.

So a figure on the LED manufacturer's spec sheet like 3.7v / 350ma means either I supply precisely 3.7 volts to the LED and it will draw 350ma of current, or I supply 350ma to the LED without worrying about the exact voltage. Am I correct?

I am new to the LED world. Any feedback is appreciated.
 
Don't get fixated on watts.

Most of the white LEDs run on about 4 volts, so the only really important thing to know is:
how bright is it at a specific current?
 