acourvil
Enlightened
I've seen a number of threads related to the use of resistors to obtain the appropriate current for a given LED, and to balance current when using parallel circuits. I know there is power lost by doing this; could someone tell me how to calculate what is lost?
For example, take a sample circuit using a 4.5V source and multiple 5mm LEDs, each with a Vf of ~3.3V. If I use a 75 ohm resistor, I get approximately the desired current of 20mA in the circuit; what am I losing to the resistor?
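Here's a quick sketch of the arithmetic as I understand it, assuming an ideal source and a constant Vf (and ignoring that with these exact numbers the current comes out a bit under 20mA). Is this the right way to figure the resistor loss?

# rough calculation, assuming ideal battery and fixed LED forward voltage
v_supply = 4.5   # battery voltage (V)
v_f      = 3.3   # LED forward voltage (V)
r        = 75.0  # series resistor (ohms)

v_r   = v_supply - v_f   # voltage dropped across the resistor
i     = v_r / r          # series current through LED and resistor
p_r   = v_r * i          # power dissipated (lost) in the resistor; same as i**2 * r
p_led = v_f * i          # power delivered to the LED

print(f"I = {i*1000:.1f} mA")
print(f"P_resistor = {p_r*1000:.1f} mW, P_LED = {p_led*1000:.1f} mW")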