Simple Circuit

PC6uru

If someone could help me I would greatly appreciate it. I am trying to understand a little of how resistors work with LEDs. I would like to connect two of these LEDs:
http://www.besthongkong.com/product_info.php?cPath=19_46&products_id=492

to a 3.7 V 18650. I want them to be bright, and they are rated at 100 mA max at 3.3 V. So I would take 3.7 - 3.3 to get 0.4, then take 0.4 / 0.100 to get the 4 ohm resistor I should use, but since I am connecting two in parallel I should use a 2 ohm resistor? Is that right? Any good guide explaining how this works? Thanks.
 
OK, you want to connect two of these LEDs to an 18650.
The 18650 has a voltage of 3.0 to 4.2 V.
The LEDs have a forward voltage range of 3.0 to 3.8 V, but there's no guarantee that your two LEDs have the same forward voltage.
Under no circumstances should you exceed the 100 mA LED current, so you have to go worst case, that is, max battery voltage and min LED forward voltage: 4.2 - 3.0 = 1.2 V.
At 100 mA this gives 1.2 / 0.1 = 12R.
Because the two LEDs can have different forward voltages, you need one resistor per LED, each one 12R.
The resistors will dissipate 1.2 x 0.1 = 0.12 W each. A resistor's power handling capability drops as the ambient temperature increases, but I feel that standard 0.25 W resistors are fine.
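
A minimal sketch of the worst-case arithmetic above, in Python. The voltages and the 100 mA limit are the figures from this post; the variable names are just for illustration:

```python
# Worst-case resistor sizing for one LED on an 18650 cell.
# Figures from the thread: battery 3.0-4.2 V, LED Vf 3.0-3.8 V, 100 mA max.

V_BAT_MAX = 4.2   # fully charged 18650, volts
VF_LED_MIN = 3.0  # lowest forward voltage the LED might have, volts
I_MAX = 0.100     # absolute maximum LED current, amps

# Worst case: highest battery voltage, lowest LED forward voltage.
v_drop = V_BAT_MAX - VF_LED_MIN   # 1.2 V across the resistor
r_min = v_drop / I_MAX            # 12 ohms
p_resistor = v_drop * I_MAX       # 0.12 W dissipated in the resistor

print(f"Resistor: {r_min:.0f} ohms, dissipating {p_resistor:.2f} W")
# -> Resistor: 12 ohms, dissipating 0.12 W (a 0.25 W part has plenty of margin)
```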

If you feel the design is too conservative (yes, it is), then
buy a set of resistors from 12R down to 1R and experimentally find which value feeds an LED around 100 mA from a fully-charged 18650. Then you deduct the resistance of your ampere meter from the optimum resistor value; this will give you the actual resistor value that you want to use.
 
12R (the R is ohms, right?)

I don't understand this last step: if I find the resistor that feeds it 100 mA, what is this step doing?
("Then you deduct the resistance of your ampere meter from the optimum resistor value; this will give you the actual resistor value that you want to use.")
 
Hi there,

You can get away with a 10 ohm 1/4 watt resistor for each LED if you want
a conservative design. A little lower, like 8 ohms, will overdrive the LEDs
a little while the cell voltage is high just after a charge.
One resistor per LED.
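
As a rough check of the 10 ohm vs. 8 ohm suggestion, here is a small sketch assuming a typical forward voltage of 3.3 V (the OP's figure); the real current depends on your LEDs' actual Vf:

```python
# Quick check of the 10-ohm vs 8-ohm suggestion above.
# Assumes a typical LED forward voltage of 3.3 V; the actual current
# will shift if your LED's Vf is higher or lower.

VF_TYP = 3.3  # volts, assumed typical forward voltage

for r in (10, 8):
    for v_bat in (4.2, 3.7, 3.0):
        i_ma = max(v_bat - VF_TYP, 0) / r * 1000
        print(f"R = {r:2d} ohm, battery {v_bat} V -> ~{i_ma:.0f} mA")
# With 8 ohms and a freshly charged cell (4.2 V) the LED sees ~112 mA,
# slightly over the 100 mA rating, as the post above warns.
```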
 
PC6uru said:
12R (the R is ohms, right?)

I don't understand this last step: if I find the resistor that feeds it 100 mA, what is this step doing?
("Then you deduct the resistance of your ampere meter from the optimum resistor value; this will give you the actual resistor value that you want to use.")

Yes, R = ohms.

The resistor value obtained through the worst-case calculation (12R) will probably produce a current below the 100 mA target. This is because the real world is usually not as bad as the worst case. With this resistor value your LEDs are safe, but performance (brightness) could be better.
So you could reduce the calculated resistor a bit, but to be confident that you're not going too far, you will have to monitor the LED current with an ampere meter while you experimentally try lower resistor values. When you find the resistor that brings the current closest to your desired 100 mA, call it R100.
Now the meter that you have looped into the circuit is the equivalent of a resistor, though with a very small value. This meter is in series with the R100 resistor, so your total effective resistance is Rmeter + R100. The final circuit (without the meter) needs a resistor value of Rmeter + R100. Oops, earlier I said "deduct"; it has to be "add".

To know the resistance of an ampere meter, check its specification. It typically depends on the range that you select. Cheap meters tend to have a higher resistance, but as long as you know it, never mind.
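
A small sketch of that correction; the R100 and meter-resistance numbers below are made-up examples, not measurements:

```python
# Correcting for the ammeter's own resistance, per the explanation above.
# R100 and R_METER are example numbers only; use your own measured/spec values.

R100 = 6.8     # ohms: resistor that gave ~100 mA *with the meter in circuit*
R_METER = 1.5  # ohms: the meter's resistance on the mA range (from its spec sheet)

# The meter acted like an extra series resistor while you were measuring,
# so the final circuit (meter removed) needs the sum to draw the same current.
r_final = R100 + R_METER
print(f"Use about {r_final:.1f} ohms in the final circuit")
```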
 
Ah, OK, the resistance of the meter. OK, I get it. Thanks for your time and effort and the detail of your posts, martin. Take care.
:bow: :thumbsup:


Thanks also to the other repliers.
 