A problem with understanding current.

MikePL (Newly Enlightened, joined Aug 28, 2007, 52 messages)
I have been experimenting with LEDs for a while, but there's one thing I don't understand. Does current (milliamperes) always change with voltage?

Ohm's law is R = V / I

So in order to calculate the resistance, I need to divide the remaining voltage (input voltage minus the LED's voltage drop) by the current (I).

But does that mean the current always increases with voltage?

For example, I want to drive an LED off 12V that requires 20mA of current and has a voltage drop of 3.5V:

R = 8.5 / 0.02 = 425 Ω

And in the second example I want to drive an LED off 12V that requires 700mA and also has a 3.5V drop:

R = 8.5 / 0.7 ≈ 12 Ω
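As a quick check on the arithmetic in both examples, here is a minimal Python sketch (the 12V supply, 3.5V drop, and 20mA/700mA figures are just the numbers from this post, not a general design rule):

# Series-resistor calculation for the two examples above.
# Figures (12 V supply, 3.5 V forward drop) come from this thread.

def series_resistor(v_supply, v_led, i_led):
    """Return the series resistance in ohms: R = (Vsupply - Vled) / I."""
    return (v_supply - v_led) / i_led

print(series_resistor(12.0, 3.5, 0.020))  # 425.0 ohms (20 mA example)
print(series_resistor(12.0, 3.5, 0.700))  # ~12.1 ohms (700 mA example)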


When I measure the voltage across the LED with the 425 Ω resistor (I actually used 440 Ω), I get a smaller voltage than when I measure it with the 12 Ω resistor.

So how am I supposed to control voltage and current? When I change the resistor, I change not only the current but also the voltage.

Or maybe I am deeply confused here, and I shouldn't worry about the voltage I give to the LED and should only focus on the current?
 
The supply voltage is fixed - 12V

The LED voltage is fixed - 3.5V

Therefore the voltage across the resistor is fixed at 8.5V. As you change the resistor value, the current must change according to

I = 8.5 / R

In reality, with a real LED, the LED voltage will change slightly as the current changes, depending on the LED's V/I curve, resulting in a small error in the actual current compared with your calculated current. Normally, though, you arrange things so that this voltage change is small compared with the voltage across the resistor, which keeps the error small. You also want any supply-voltage variation to be small compared with the voltage across the resistor.
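To put a rough number on that error, here is a small sketch (the 3.3V and 3.7V forward voltages are made-up illustrative values, not from any datasheet) showing how much the current moves when the LED's actual drop wanders around the nominal 3.5V:

# How much the current shifts if the LED's forward voltage differs from
# the nominal 3.5 V used to pick the resistor. The 3.3 V and 3.7 V values
# are illustrative assumptions, not datasheet figures.

V_SUPPLY = 12.0
R = 425.0  # chosen for 20 mA assuming a 3.5 V drop

for vf in (3.3, 3.5, 3.7):
    i_ma = (V_SUPPLY - vf) / R * 1000
    print(f"Vf = {vf} V -> I = {i_ma:.1f} mA")

# Prints roughly 20.5 mA, 20.0 mA, 19.5 mA: a 0.2 V wander in Vf only moves
# the current by about 2-3%, because the resistor drops most of the voltage.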
 
Just focus on current, since LEDs are 'current-driven' devices. :)
 