What combination of voltage and current can I use?

pinoy

Newly Enlightened
Joined: Oct 9, 2010
Messages: 17
Let's say I have a 3 watt LED I don't know anything about. Can I apply any combination of voltage and current to it as long as it adds up to 3 watts of power? Can I power it with 30 volts and 0.1 A just as easily as with 0.1 volts and 30 A? Both configurations add up to 3 watts of power.
 

XP-G datasheet, look at page 4.

If you give an LED constant current, it will have a given forward voltage (with some variation due to manufacturing). Or if you give an LED constant voltage, it will experience a particular current. But LEDs are very sensitive to small voltage changes, and those variances I mentioned mean that what would work fine with one LED would destroy another one of the same type.
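A quick numerical sketch of that sensitivity, using the standard Shockley diode equation. The two constants below are made-up illustrative values chosen to give plausible numbers around 3 V, not figures from any real LED datasheet:

```python
import math

# Illustrative diode-model constants, NOT from any real LED datasheet:
I_S = 3.3e-14   # saturation current in amps (picked to give ~350 mA at 3.0 V)
N_VT = 0.1      # emission coefficient times thermal voltage, in volts

def led_current(v_forward):
    """Approximate LED current at a given forward voltage (Shockley model)."""
    return I_S * (math.exp(v_forward / N_VT) - 1.0)

for v in (3.0, 3.1, 3.2):
    print(f"{v:.1f} V -> {led_current(v) * 1000:8.1f} mA")
```

With these numbers, each extra 0.1 V multiplies the current by roughly e ≈ 2.7, which is why two "identical" LEDs on the same fixed voltage can end up drawing very different currents.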
 
The best place to start with an LED would be 3 V (I think). The LED will then draw however much current it is designed to at that voltage. If you measure it, you can work out how many watts it's using at 3 V.

Then try 3.1V and measure the current again.

You can then start to get an idea of the current drawn at a particular voltage.

Anyone smarter than me feel free to correct me if I'm wrong
 

Nope, because the junction doesn't conduct unless there is enough voltage across it to turn it on. The semiconductor parts in there that put out the light have very specific voltages they operate at.

If you power it from 30 V with the current limited to 0.1-1.0 A, then because you're fully controlling the current, the LED will still only reach the voltage it has at that current. An example of that is a 30 V bench power supply set to fully limit the max current to the LED: the actual voltage at the LED will not be 30.

To control the current, the supply is really adjusting the voltage until the current hits the set level, so (again) the voltage read across the LED will still be what it would be at that current.

If you slammed it with 30 V and did not limit the current, you would get only the magic smoke. If you hit it with 1 V, you'll get nothing, because the junction hasn't turned on.

They do make LEDs in arrays, where the voltage of the whole string of LEDs together is much higher, but each LED in the array still works at a specific voltage (and current).

It is best TO control the current to run the LED. While that means adjusting the voltage, it is still preferable to regulate the current rather than the voltage, although it is completely possible to control the voltage by design and thereby be controlling the current.
 
In addition to what others have said, the most important thing to know is that you cannot freely choose a combination of voltage and current with anything, LEDs or otherwise. If you choose a voltage, then the device being powered will choose the current for itself. On the other hand, if you choose (or limit) the current, then the device being powered will choose the voltage for itself.

To understand this, consider that for every possible device, increasing the voltage applied to it will tend to increase the current flowing through it. (There are some special devices that react to the increase in current and change their resistance, but this does not invalidate the general rule.)

For instance, if you apply 1 volt to a 1 ohm resistor then 1 amp will flow. If you apply 2 volts to the same resistor then 2 amps will flow. And 3 volts, 3 amps. On the other hand if you limit the current to 0.1 amps, then the voltage must be 0.1 volts.
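In code, the resistor example works out like this (plain Ohm's law, nothing LED-specific):

```python
# Ohm's law for the 1 ohm resistor example: I = V / R, and V = I * R.
R = 1.0  # ohms

for v in (1.0, 2.0, 3.0):
    print(f"apply {v:.0f} V -> {v / R:.0f} A flows")

# Going the other way: limit the current, and the voltage follows.
i_limit = 0.1  # amps
print(f"limit to {i_limit} A -> {i_limit * R} V appears across the resistor")
```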

An LED does not behave in a proportional way like a resistor, but it still has a given voltage for any current flowing through it. The way to protect an LED is to limit the current to a safe value given in the data sheet. If the LED is an ordinary one, a safe current is usually 10 or 20 mA. If it is a power LED the safe current might be 300 mA with simple heat sinking, or higher if there is good heat sinking. You must always have heat sinking on a power LED or you will fry the LED.

Therefore to run an LED you must always have a current limited supply. If you have unlimited current you can fry the LED in the blink of an eye.
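The simplest current-limited supply is a series resistor. A sketch with assumed example numbers (5 V supply, 3.2 V nominal Vf, 300 mA target current; none of these come from a specific datasheet):

```python
# Sketch of the simplest current limiter: a series resistor.
# All three values below are assumed example numbers, not datasheet figures.
V_SUPPLY = 5.0    # supply voltage in volts (assumed)
V_FORWARD = 3.2   # nominal LED forward voltage at the target current (assumed)
I_TARGET = 0.3    # amps; a safe power-LED current with simple heat sinking

# The resistor drops the difference between supply and LED voltage.
r_series = (V_SUPPLY - V_FORWARD) / I_TARGET
p_resistor = I_TARGET ** 2 * r_series  # power the resistor must dissipate

print(f"Series resistor: {r_series:.1f} ohms")
print(f"Resistor dissipation: {p_resistor:.2f} W")
```

Note the resistor only *approximates* a current limit: if Vf drifts down as the LED warms, the current creeps up a little, which is why dedicated constant-current drivers are preferred for power LEDs.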
 
One more thing. Don't get hung up on the term "3 watt LED." Wattage is really not a very good way to refer to an LED's specifications. Instead, look at the data sheet and don't exceed its limits for current and junction temperature.

Just for instance, a Cree XR-E is often referred to as a 3 watt LED. For a white LED, a typical Vf might be 3.6 V, so at 1,000 mA that works out to 3.6 V × 1.0 A = 3.6 watts. Not only will that vary from model to model, it will vary from LED to LED, and it will even vary for the exact same LED depending on junction temperature. Vf will drift as the LED gets hotter, and as it ages.
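As a sanity check on that arithmetic, with a hypothetical lower hot-die Vf thrown in to show how the "wattage" label moves even at the same current (the 3.4 V figure is a made-up example, not a datasheet value):

```python
def power_w(vf_volts, i_ma):
    """Electrical power in watts from forward voltage (V) and current (mA)."""
    return vf_volts * i_ma / 1000.0

print(power_w(3.6, 1000))  # the nominal XR-E-style numbers quoted above: 3.6 W
print(power_w(3.4, 1000))  # same 1000 mA, but a lower Vf on a hot die (assumed)
```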
 
Let's say I have a 3 watt LED I don't know anything about.

This just means that if you apply a current/voltage combination that results in more than 3 watts, the LED will have a short life.

It's just a hunk of semiconductor - there's nothing inside to limit power to 3 watts.

Feed it with 800 mA and it'll draw around 3 watts.

Feed it with 3.8 volts and . . . . well, it depends on the chip temperature, the particular LED, etc. DON'T feed LEDs with a constant voltage.
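One way to see why: Vf falls as the die heats up (a few mV per °C is a common ballpark; both constants below are illustrative assumptions, not datasheet values), so at a fixed applied voltage the current climbs with temperature:

```python
import math

N_VT = 0.1       # diode slope constant in volts, illustrative only
TEMPCO = -0.003  # assumed Vf shift per degC in volts (ballpark, not a datasheet value)

def current_multiplier(delta_temp_c):
    """How much the current grows, at a FIXED applied voltage, as the die warms."""
    return math.exp(-TEMPCO * delta_temp_c / N_VT)

for dt in (0, 20, 40):
    print(f"die +{dt:2d} degC -> about {current_multiplier(dt):.1f}x the cold current")
```

More current means more heat, which lowers Vf further and raises the current again; with a stiff constant-voltage supply that loop can run away until the LED fails.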

That's why most LED lights use a driver designed specifically for LEDs.
 
