I have been experimenting with LEDs for a while, but there's one thing I don't understand: does current (milliamperes) always change with voltage?
Ohm's law is R = V / I.
So to calculate the resistance I need to divide the remaining voltage (supply voltage minus the LED's voltage drop) by the current (I).
But does it mean that current always increases with voltage?
For example, I want to drive an LED off 12 V that requires 20 mA of current and has a voltage drop of 3.5 V:
R = 8.5 / 0.02 = 425 Ω
In the second example I want to drive an LED off 12 V that requires 700 mA and also has a 3.5 V drop:
R = 8.5 / 0.7 ≈ 12 Ω
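To double-check my arithmetic, here is the same calculation as a small Python sketch (the function name is just mine, and the 12 V supply, 3.5 V drop, and currents are the numbers from my examples):

```python
def series_resistor(v_supply, v_led, i_led):
    """R = (V_supply - V_led) / I, per Ohm's law."""
    return (v_supply - v_led) / i_led

# Example 1: 20 mA LED
print(series_resistor(12, 3.5, 0.020))  # -> 425.0 ohms

# Example 2: 700 mA LED
print(series_resistor(12, 3.5, 0.700))  # -> ~12.14 ohms
```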
When I measure the voltage with the 425 Ω resistor (440 Ω in practice), I get a smaller voltage than I do with the 12 Ω resistor.
So how am I supposed to control voltage and current? When I change the resistor, I change not only the current but also the voltage.
Or maybe I'm deeply confused here, and I shouldn't worry about the voltage I give to the LED and should only focus on the current?