How to Stop a Converted Unregulated 9-LED Flashlight using 3 x AAA to 18650 from Burning Out

dlong

So... I know this is posted somewhere on the web, but my searching has failed me. I want to convert one of these:
[photo: the 9-LED, 3 x AAA flashlight in question]

to use 18650. Now, the flashlight is unregulated, so it's direct driven. I'm fairly sure that dropping in an 18650 will eventually (maybe even shortly) burn out the LEDs one by one... How do I stop that from happening? What's the simplest (exact?) component needed? I'm thinking a resistor? How many ohms? I know stuff, just not in-depth! o_O

Bonus: Remember those 3-pack Costco TechLite lights (250 and 300 lumen versions)? Those are regulated, right? So I can just drop in an 18650 and it should work without burning out the LEDs?

Note: You can ignore the mechanical aspects of battery size fitment in the battery compartment.
 
LEDs burn out because of voltage, not because of current, so you'll be feeding 4.2 volts to something that was designed for 4.5 volts.
Not that I would advise doing something like this, but it should work.
 
LEDs burn out because of voltage, not because of current, so you'll be feeding 4.2 volts to something that was designed for 4.5 volts.
Not that I would advise doing something like this, but it should work.
So, is there something else that is causing these simple direct-driven 9-LED flashlights to burn out when using 18650 batteries?
 
LEDs burn out because of voltage, not because of current, so you'll be feeding 4.2 volts to something that was designed for 4.5 volts.
Not that I would advise doing something like this, but it should work.
I think you might have that backwards; LEDs absolutely burn out from excess current.
 
and how exactly are you feeding it extra current...?
By not limiting it? If you wire any LED directly to a variable DC power supply set to 1.5 volts, will it not blow unless you do something to limit the current?
 
By not limiting it? If you wire any LED directly to a variable DC power supply set to 1.5 volts, will it not blow unless you do something to limit the current?
Okay, so for our collective edification I've built this:

[photo: two LEDs in series running directly off an 18650]


The LEDs are rated from 1.8 volts to 2.4 volts, with a forward ("opening") voltage of about 1750 mV if I recall correctly, so connecting them in series should give a range of somewhere around 3.5 volts to 4.8 volts. The 18650 pictured above starts at 4.2 volts (fully charged).

Measuring the angry pixies running through the circuit, I get 3 mA, so we are well within the operating conditions.

So I'm pretty confident one of us is very wrong.
 
More voltage makes the device draw more current. They are directly related; you cannot have one without the other. Current control varies the voltage by small amounts to keep the current steady. LEDs will change their resistance a bit due to heat, which is why a driver keeps the current steady.
3 x AAA is not 4.5 V. You'd be lucky to see 3.3 V under load. So a single 18650 will overdrive the LEDs, but usually not by much. I've always found those types of lights will fit an 18500, though, not an 18650. No difference electrically, though. And it should not burn out the LEDs. I think those are just crummy lights dying.
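
For anyone who wants to see why a few hundred extra millivolts matter so much, here is a rough Python sketch of an idealized exponential LED model. The saturation current and slope constants are assumed, illustrative values (not measurements of any real 5 mm LED, which would also have some internal series resistance flattening the curve), but the steepness is the point: voltage and current are tied together, and a small voltage increase drives a large current increase.

import math

# Assumed, illustrative parameters, tuned so this model draws roughly 20 mA near 3.2 V.
I_S = 6e-21    # saturation current, amps (assumed)
N_VT = 0.075   # ideality factor times thermal voltage, volts (assumed)

def led_current(v_forward):
    """Idealized LED current at a given forward voltage."""
    return I_S * (math.exp(v_forward / N_VT) - 1.0)

# Roughly 1 mA at 3.0 V, 20 mA at 3.2 V, then hundreds of mA and beyond for this model.
for v in (3.0, 3.2, 3.4, 3.6):
    print(f"{v:.1f} V -> {led_current(v) * 1000:.0f} mA")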
 
I found another article: https://www.ledsupply.com/blog/how-does-a-5mm-led-work

So, based on the two articles, my assumptions for the LED in the Harbor Freight flashlight:
5 mm white LED (Vf 3.2 V, Imax 20 mA) and an 18650 at 4.2 V (V.input, max voltage)

R.limit = (V.input - Vf) / I.max
(4.2v -3.2v)/0.02 = 50 ohms?

Resistor size calculation:
P(watt) = I.max^2 x R.limit
(.02^2) * 50 = 0.02W

or

* LED Wattage: 3.2V * 0.02A = 0.064 Watts
* Total Wattage: 4.2V * 0.02A = 0.084 Watts
* Wattage dissipated by resistor: 0.084 - 0.064 = 0.02 Watts

So I guess a 1/8 Watt, 50 ohm resistor should be good? Since I dropped out of my EE degree and went a different route (CS, but that's a story for another forum :D), would any actual EE major want to check my understanding of the articles and, more importantly, the formulas and math?
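
If it helps to sanity-check the arithmetic, here is the same calculation as a small Python sketch. The Vf = 3.2 V and 20 mA figures are the assumptions from the post above, not measured values for the actual Harbor Freight LEDs:

# Series resistor for a single LED, using the assumed figures above.
V_IN = 4.2      # fully charged 18650, volts
VF = 3.2        # assumed LED forward voltage, volts
I_MAX = 0.020   # assumed LED current, amps

r_limit = (V_IN - VF) / I_MAX        # ohms dropped across the resistor
p_resistor = I_MAX ** 2 * r_limit    # watts dissipated in the resistor
p_led = VF * I_MAX                   # watts dissipated in the LED
p_total = V_IN * I_MAX               # total watts drawn from the cell

print(f"R = {r_limit:.0f} ohm")                               # 50 ohm
print(f"P(resistor) = {p_resistor:.3f} W")                    # 0.020 W
print(f"P(LED) = {p_led:.3f} W, P(total) = {p_total:.3f} W")  # 0.064 W / 0.084 W

Those numbers match the hand calculation, so a 1/8 W part has plenty of margin for a single LED at these assumed values.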
 
I'm not saying you're wrong, since I'm the one asking the question! But here's something I just found:


Does it make sense to you? I'm still trying to work out the article.

I'm sorry, my snarky comment was directed at aaargh, not you :)

Yeah, so this article explains what I tried to write above. In his example he has 1.7 volts of forward voltage (I wrote "opening voltage", a direct translation from my native language, but forward voltage is the proper term), a 5 V supply, and he wants 20 mA to run through the circuit. So obviously, without limiting the 5 V down toward 1.7 V he would fry the circuit, so he uses a 165 ohm resistor to drop the excess.

In my little contraption above I used two LEDs in series, both of which have around 1.75 volts of forward voltage but can take up to 2.4 volts. In series, that means I have 3.5 to 4.8 volts of forward voltage, hence I didn't need to use a resistor.

HOWEVER, if I were to connect only one LED to the 18650, then I would have 1.75 V of Vf against 4.2 V of supply, so I'd need a resistor of around 122 ohms to get the needed 20 mA.

In my case it's, let's say, 4.2 volts of supply, I have two LEDs in series where the operating voltage is 2.5 volts max each, and I want 20 mA of current.
Putting that into the equation, (4.2 - 5) / 0.02 comes out negative, so call it 0: powering them at their max I don't need a resistor.
If I wanted to drive them at the minimum that makes the LEDs come on, right at Vf, then it would be (4.2 - 3.5) / 0.02 and I'd need a 35 ohm resistor, which is fairly small, but if you want to be pedantic, you can use one. (Also, anywhere below that voltage the LEDs don't come on.)

But anyhow, as long as you're matching the voltage you don't need to worry about the current. What you do want to take into consideration is that 18650s are basically small bombs, so I wouldn't run them in any circuit that is even a bit iffy.
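
To put the same formula in one place, here is a tiny Python sketch checking the three cases mentioned above (the forward voltages and the 20 mA target are the assumed figures from this post, not datasheet values):

def series_resistor(v_supply, vf_total, i_target):
    """Ohm's law across the limiting resistor: R = (Vsupply - Vf) / I."""
    return (v_supply - vf_total) / i_target

# The article's example: 5 V supply, one 1.7 V LED, 20 mA target.
print(series_resistor(5.0, 1.7, 0.020))    # 165 ohm

# One ~1.75 Vf LED straight off a full 18650 at 4.2 V, 20 mA target.
print(series_resistor(4.2, 1.75, 0.020))   # ~122.5 ohm

# Two of those LEDs in series (3.5 V combined Vf): very little left to drop.
print(series_resistor(4.2, 3.5, 0.020))    # 35 ohm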
 
I found another article: https://www.ledsupply.com/blog/how-does-a-5mm-led-work

So, based on the two articles, my assumptions for the LED in the Harbor Freight flashlight:
5 mm white LED (Vf 3.2 V, Imax 20 mA) and an 18650 at 4.2 V (V.input, max voltage)

R.limit = (V.input - Vf) / I.max
(4.2v -3.2v)/0.02 = 50 ohms?

Resistor size calculation:
P(watt) = I.max^2 x R.limit
(.02^2) * 50 = 0.02W

or

* LED Wattage: 3.2V * 0.02A = 0.064 Watts
* Total Wattage: 4.2V * 0.02A = 0.084 Watts
* Wattage dissipated by resistor: 0.084 - 0.064 = 0.02 Watts

So I guess a 1/8 Watt, 50 ohm resistor should be good? Since I dropped out of my EE degree and went a different route (CS, but that's a story for another forum :D), would any actual EE major want to check my understanding of the articles and, more importantly, the formulas and math?

That would be for one LED, don't you have 9...?
Because then it would be (4.2 - 3.2) / 0.18 ≈ 5.6
 
I'm sorry, my snarky comment was directed at aaargh, not you :)

Yeah, so this article explains what I tried to write above. In his example he has 1.7 volts of forward voltage (I wrote "opening voltage", a direct translation from my native language, but forward voltage is the proper term), a 5 V supply, and he wants 20 mA to run through the circuit. So obviously, without limiting the 5 V down toward 1.7 V he would fry the circuit, so he uses a 165 ohm resistor to drop the excess.

In my little contraption above I used two LEDs in series, both of which have around 1.75 volts of forward voltage but can take up to 2.4 volts. In series, that means I have 3.5 to 4.8 volts of forward voltage, hence I didn't need to use a resistor.

HOWEVER, if I were to connect only one LED to the 18650, then I would have 1.75 V of Vf against 4.2 V of supply, so I'd need a resistor of around 122 ohms to get the needed 20 mA.

In my case it's, let's say, 4.2 volts of supply, I have two LEDs in series where the operating voltage is 2.5 volts max each, and I want 20 mA of current.
Putting that into the equation, (4.2 - 5) / 0.02 comes out negative, so call it 0: powering them at their max I don't need a resistor.
If I wanted to drive them at the minimum that makes the LEDs come on, right at Vf, then it would be (4.2 - 3.5) / 0.02 and I'd need a 35 ohm resistor, which is fairly small, but if you want to be pedantic, you can use one. (Also, anywhere below that voltage the LEDs don't come on.)

But anyhow, as long as you're matching the voltage you don't need to worry about the current. What you do want to take into consideration is that 18650s are basically small bombs, so I wouldn't run them in any circuit that is even a bit iffy.
Thanks for the lesson, it was informative and clearly I was mistaken… I am not sure what warranted the snark because I was genuinely asking if the LED would blow in my example. I did not intend to make a statement of irrefutable fact.
 
Thanks for the lesson, it was informative and clearly I was mistaken… I am not sure what warranted the snark because I was genuinely asking if the LED would blow in my example. I did not intend to make a statement of irrefutable fact.

The internet doesn't carry over the non-verbal cues I'd normally pick up on, and I was frustrated with other stuff unrelated to this conversation.
My bad, we're cool.
 
More voltage makes the device draw more current. They are directly related; you cannot have one without the other. Current control varies the voltage by small amounts to keep the current steady. LEDs will change their resistance a bit due to heat, which is why a driver keeps the current steady.
3 x AAA is not 4.5 V. You'd be lucky to see 3.3 V under load. So a single 18650 will overdrive the LEDs, but usually not by much. I've always found those types of lights will fit an 18500, though, not an 18650. No difference electrically, though. And it should not burn out the LEDs. I think those are just crummy lights dying.
Fresh lithium primaries are about 1.7 V each unloaded, so 5.1 V on a load of about 20 mA. Fresh alkalines typically hit 1.6 V unloaded, and since the cheap LEDs are most certainly not loading them, unless the cells are half dead when put in, 4.5 V is the best case.
What is burning the LEDs in this case is the fact that they are cheap LEDs; they will burn from alkalines too. What the OP could do is swap them for good-quality ones. But otherwise, what you are all thinking of is current limiting. It can be done with a diode, capacitor, or resistor; I don't know the value or placement in this particular case, but Google should make it easy, and there are even calculators for it.
 
What is burning the LEDs in this case is the fact that they are cheap LEDs; they will burn from alkalines too. What the OP could do is swap them for good-quality ones. But otherwise, what you are all thinking of is current limiting. It can be done with a diode, capacitor, or resistor; I don't know the value or placement in this particular case, but Google should make it easy, and there are even calculators for it.
Nope, what burns LEDs is, as previously stated, overcurrent. How much you pay for the LEDs plays very little role compared to how much current goes through the LED relative to the rating for the LED. And in fact, better quality LEDs may be more efficient, with a lower forward voltage, and burn out sooner when run on an unregulated voltage source.

As far as limiting the current goes, you can use a resistor (to size it properly, you need to know the LED's I/V curve and the input voltage), or a transistor-based constant current circuit (there are some integrated 2-terminal parts available that do this). Capacitors won't do anything useful at steady-state DC, and diodes, while they will drop the voltage, aren't going to give you the same ability to set drive current as the resistor or CC driver approach.
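
As a rough illustration of what "you need to know the LED's I/V curve" means in practice, here is a hedged Python sketch. The exponential LED model uses the same assumed, illustrative constants as the earlier snippet (not data for any real part); given that model and a series resistor, it solves for the point where the resistor's load line meets the LED curve, which is the current you would actually get:

import math

# Assumed, illustrative LED model (same caveat as before: not a real part's datasheet).
I_S = 6e-21    # saturation current, amps (assumed)
N_VT = 0.075   # ideality factor times thermal voltage, volts (assumed)

def led_current(vf):
    return I_S * (math.exp(vf / N_VT) - 1.0)

def operating_point(v_supply, r_series, tol=1e-6):
    """Bisect on the LED voltage until the resistor and the LED agree on the current."""
    lo, hi = 0.0, v_supply
    while hi - lo > tol:
        vf = (lo + hi) / 2
        if led_current(vf) > (v_supply - vf) / r_series:
            hi = vf   # LED would pass more current than the resistor allows: Vf must be lower
        else:
            lo = vf
    vf = (lo + hi) / 2
    return vf, (v_supply - vf) / r_series

vf, i = operating_point(4.2, 50.0)
print(f"Vf = {vf:.2f} V, I = {i * 1000:.0f} mA")   # lands near 3.2 V and 20 mA with this model

A 2-terminal constant-current regulator would instead hold the current at its set point over a range of input voltages, which is why it is the more robust option when the battery voltage varies.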
 
Also, the shower-head 5 mm LED lights aren't wired to regulate the current to each LED, so if one has lower resistance it burns out first. Then the extra current and voltage take out the rest of the LEDs, one or more at a time.

I'm not saying all 5 mm LED lights are bad, but the budget shower-head lights will typically fail from regular use even without a lithium battery.
 
Also, the shower-head 5 mm LED lights aren't wired to regulate the current to each LED, so if one has lower resistance it burns out first. Then the extra current and voltage take out the rest of the LEDs, one or more at a time.

I'm not saying all 5 mm LED lights are bad, but the budget shower-head lights will typically fail from regular use even without a lithium battery.
I was saying that diodes, caps, and resistors can all be used for current limiting, but the one that should be used here is a resistor, I believe one for each LED. If this light is like the ones I had, the LEDs are direct-driven in parallel through the outer case, so there are loads of bad paths and only mediocre connections; I doubt it could pass many amps, honestly.

The LEDs will only use as much current as they possibly can, the excess becomes heat, and heat is bad. Yes, current limiting will solve the issue. But like others have been saying, the LEDs won't just draw current until death; they will draw the maximum they can pass, which will run them at roughly 100% (give or take 15%) of their rating, and that will make them run hot. Current limiting will fix this.
 
That would be for one LED, don't you have 9...?
Because then it would be (4.2 - 3.2) / 0.18 ≈ 5.6
You are correct, it does have 9 LEDs, which would make it roughly 5.6 ohms with about 0.18 W dissipated by the resistor. So the required resistor would be a 1/4 W, 5.6 ohm part?

I did find this LED resistor calculator: https://www.hobby-hour.com/electronics/ledcalc.php

I guess I don't totally understand all the relationships, because when I plug the above values into the calculator, it recommends a 1/2 W resistor.

All I have on hand are 1/4 W. Will the resistor overheat and burn out if I just turn on the flashlight and leave it on .... ?
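
For that last question, here is a quick sanity check in the same spirit as before (still using the assumed 3.2 V forward voltage and 20 mA per LED; a common rule of thumb is to run a resistor at no more than about half of its rated power):

V_IN = 4.2            # fully charged 18650, volts
VF = 3.2              # assumed LED forward voltage, volts
I_TOTAL = 9 * 0.020   # nine LEDs in parallel at an assumed 20 mA each, amps

r = (V_IN - VF) / I_TOTAL     # ~5.6 ohm
p = (V_IN - VF) * I_TOTAL     # ~0.18 W dissipated in the resistor

print(f"R = {r:.1f} ohm, P = {p:.2f} W")

On paper 0.18 W is under a 1/4 W rating, but it leaves little margin and the part will run hot in continuous use, which is presumably why the calculator suggests 1/2 W; the real dissipation will also be higher if the LEDs draw more than the assumed 20 mA each.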
 