JoakimFlorence
Newly Enlightened
- Joined: Jun 4, 2016
- Messages: 137
Yes, a resistor could theoretically be used to limit the current to power an LED. I'm going to talk about the efficiency of this approach.
------[////]------(O)------
The power lost in the resistor is proportional to the voltage drop across it. So what will that drop be? The voltage drop across a resistor is the product of its resistance (in ohms) and the current flowing through it. How many ohms of resistance we need depends on the voltage across the resistor, and that voltage is the supply voltage minus the operational voltage drop taken up by the LED.
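As a quick sketch of that calculation in Python (the 6 V supply, 3 V forward voltage, and 20 mA target current are just example numbers picked for illustration):

    # Size a series resistor for an LED (example numbers, not a real design).
    V_SUPPLY = 6.0   # supply voltage, volts (assumed for illustration)
    V_LED = 3.0      # LED forward voltage, volts
    I_LED = 0.020    # target LED current, amps (20 mA)

    v_resistor = V_SUPPLY - V_LED   # voltage the resistor must drop
    r_ohms = v_resistor / I_LED     # Ohm's law: R = V / I
    print(f"Need about {r_ohms:.0f} ohms, dropping {v_resistor:.1f} V")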
In other words, the LED behaves somewhat like the resistor in this respect. If the rated voltage of the LED is 3 V and the circuit is being supplied with 6 V, then the resistance required to limit the current to the level the LED needs means that the LED and the resistor each consume an equal amount of power. Half the power in the circuit will be lost as heat in the resistor.
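The power split falls straight out of P = I x V applied to each component (same illustrative numbers as above):

    # Power split between the resistor and the LED (example numbers again).
    V_SUPPLY, V_LED, I_LED = 6.0, 3.0, 0.020
    p_led = I_LED * V_LED                     # power delivered to the LED
    p_resistor = I_LED * (V_SUPPLY - V_LED)   # power burned in the resistor
    print(f"LED: {p_led * 1000:.0f} mW, resistor: {p_resistor * 1000:.0f} mW")
    print(f"Efficiency: {p_led / (p_led + p_resistor):.0%}")   # 50%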
So theoretically, for maximum efficiency you would want to reduce the voltage supplied to the circuit as much as possible. Suppose the LED is 3.2 V and we supply the circuit with 3.4 V. What's the problem with that?
The problem is that the smaller the resistance, the less the circuit is able to deal with voltage fluctuations. With a resistance that small, a tiny increase in supply voltage leads to a very big increase in the current flowing through the circuit. In fact, that's the whole reason we're using a resistor in the first place. Theoretically, if we had an absolutely stable supply at exactly the voltage the LED was rated for, we wouldn't need a resistor; the LED's own internal resistance would be sufficient to limit the current. But that's not how things work in practice. Virtually any power supply fed from a transformer will have voltage spikes significantly above its nominal output voltage. LEDs don't get burned out by average power levels; even a tiny fraction of a second of excessive current can cause a burnout.
So there is a bit of a trade-off: the larger the voltage drop across the resistor relative to the LED, the more voltage fluctuation the circuit will be able to handle without risk of burning out the LED.
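To put rough numbers on that trade-off, here's a quick comparison of a 0.2 V spike in the two scenarios above. This is a simplification that holds the LED's forward voltage fixed; a real LED's forward voltage rises only slightly with current, so the low-headroom case is actually even touchier than this shows:

    # Effect of a 0.2 V supply spike with small vs. large voltage headroom.
    # Simplified model: LED forward voltage held constant at 3.2 V.
    V_LED, I_TARGET, SPIKE = 3.2, 0.020, 0.2

    for v_supply in (3.4, 6.0):
        r = (v_supply - V_LED) / I_TARGET          # resistor sized for 20 mA
        i_spike = (v_supply + SPIKE - V_LED) / r   # current during the spike
        print(f"{v_supply} V supply: spike pushes current to "
              f"{i_spike * 1000:.0f} mA ({i_spike / I_TARGET:.0%} of target)")

With only 0.2 V of headroom the spike doubles the current, while with 2.8 V of headroom the same spike raises it by only about 7%.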
Normally a simple resistor is not used for these applications, except for very low-power little 5 mm indicator LEDs (where efficiency is not that critical since the power consumption is so small). This is just a perspective on the approach.