Low current circuits - do they exist?

Black Rose
Flashlight Enthusiast · Joined Mar 8, 2008 · Messages: 4,626 · Location: Ottawa, ON, Canada
Not sure if this belongs here or in the LED sub-forum.

On DX and other such sites, you can buy circuit boards that output 350 mA and up.

I'm playing around with building a light source that uses 5mm Nichia LEDs and only needs 20mA of current.
I've searched the web and the only circuit I have come across in that range costs something like $80 :eek:

The obvious route is to use resistors to limit the current, but you end up wasting battery power and generating a bit of heat.

Is there not a more efficient and cost effective way to power these LEDs?
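For a sense of how much a plain resistor actually wastes, here is a back-of-envelope sketch; the 4.5 V supply and 3.0 V forward drop are assumed round numbers, not measurements:

```python
# Back-of-envelope: power wasted in a current-limiting resistor
# driving a 20 mA 5 mm LED. Voltages are assumed round numbers.
v_batt = 4.5   # three alkaline cells, nominal
v_f = 3.0      # assumed forward voltage of a white 5 mm LED
i = 0.020      # 20 mA target current

r = (v_batt - v_f) / i           # resistor value needed
p_resistor = (v_batt - v_f) * i  # power burned in the resistor
p_led = v_f * i                  # power delivered to the LED
efficiency = p_led / (p_led + p_resistor)

print(f"R = {r:.0f} ohm, resistor loss = {p_resistor*1000:.0f} mW, "
      f"efficiency = {efficiency:.0%}")
# R = 75 ohm, resistor loss = 30 mW, efficiency = 67%
```

So the absolute loss is only tens of milliwatts; whether that matters depends on how long the batteries have to last.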
 
As someone who has run 5mm LEDs 24/7, please consider driving them at 10 mA in lights that are turned on and off, and at 5 or 6 mA if they are to run 24/7.

They are nice and cheap BUT don't hold up for long periods of 24/7 use, maybe 3 months.
Good luck with the project and please post a picture when you have it running.

 
Efficient, perhaps... cost effective... depends on your voltage source. I would say if you are dropping less than 2 V, a linear regulator would probably be most efficient, and a simple resistor most cost effective. A buck circuit would be too expensive for such a low current and voltage.
 
You say nothing about the scope of your project. Driving 500 LEDs would justify a switchmode driver, but hardly driving 5 LEDs.

I replaced the LEDs (and resistors) in a 4.5V AAA flashlight, using Nichia GS B2Vs. Their Vf is 2.83V @ 10mA, 3.02V @ 20mA, and 3.20V @ 30mA. I used 68 ohm resistors (instead of the original 10 ohms; no wonder the old LEDs gave in), giving me 25mA/LED with fresh batteries. OK, so I get only 67% efficiency, but the batteries will still last a dozen hours or so with continuous use. If I were building a camping light, the extra hours I could squeeze out of the batteries might justify the extra cost, labour, and space needed for such a buck/boost circuit.
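Those figures check out: iterating Ohm's law against the quoted Vf points (fresh alkalines assumed at ~1.6 V/cell, i.e. 4.8 V for three cells, which is an assumption, not from the post) lands right around 25 mA:

```python
# Check the 68-ohm figure against the quoted Nichia GS Vf data
# (2.83 V @ 10 mA, 3.02 V @ 20 mA, 3.20 V @ 30 mA).
# Fresh alkalines assumed ~1.6 V/cell -> 4.8 V for three cells.
vf_points = [(0.010, 2.83), (0.020, 3.02), (0.030, 3.20)]

def vf(i):
    """Linearly interpolate Vf from the quoted datasheet points."""
    (i0, v0), (i1, v1) = vf_points[0], vf_points[-1]
    for (ia, va), (ib, vb) in zip(vf_points, vf_points[1:]):
        if ia <= i <= ib:
            i0, v0, i1, v1 = ia, va, ib, vb
            break
    return v0 + (v1 - v0) * (i - i0) / (i1 - i0)

v_batt, r = 4.8, 68.0
i = 0.020                     # initial guess
for _ in range(20):           # fixed-point iteration, converges quickly
    i = (v_batt - vf(i)) / r

print(f"I = {i*1000:.1f} mA per LED")   # ~24.9 mA, i.e. the post's ~25 mA
```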
Maybe Maxim has some low-power ICs suitable for your application, but they're not the cheapest, usually in the $7-10 range.
 
One of the things I am working on is to create a battery powered (AA lithium or CR123) "light bar" for our shed since there is no electricity in there and we can't run any to it.

My idea is to run 4 LEDs in a parallel array to give sufficient light at night so I don't have to work with a flashlight stuck in my mouth.
 
A resistor only works if Vin > Vdrop, so driving LEDs would take 3 cells at minimum...unless of course it is your intention to underdrive them.

I'm also curious how to go about this...so far I've stuck with the MAX756 and built my own circuit.

The MAX756 gives either 3.3 V or 5 V. 3.3 V is too low to effectively run an LED of any caliber; 5 V can be dropped with an appropriate resistor. 200 mA max, so no more than 10 LEDs can be used. It boosts from 1.5 V to 3 V inputs, and I've been using the circuit to run dry lithiums for a while now.

As far as low-voltage bucks go, no plans to try it.

As for switches, I have a schematic of a 1N4401-based day-off/night-on circuit...but that runs on a 9V battery and drains it 24/7. If someone has a similar circuit for 5V, let me know ;)
 
I would just use the proper resistor for the output you need. The waste is negligible for as long as you will be running it. At 25mA/LED you are using 100mA, and AAs should give you 20-30 hours of runtime easily. I am guessing you won't be using it more than 15 minutes a day, so we are talking about 3-6 months on batteries at least. I would go with 25mA/LED on fresh lithiums as your reference for resistor size, if you are going to use them. You could even make two switches: one for on/off and the other to bypass the resistor when light levels drop too low.
 
Bypassing the resistor would be an issue if I decide to use the circuit boards I have here for mounting 5mm LEDs.

The resistors are soldered onto the boards and form part of the completed circuit, so are ideal for a parallel array.

I have some switches coming from DX that are apparently ON/ON/OFF switches, so that is another option to consider.
 
So there's one resistor for each LED.
How about this simple setup: 3 AAs or whatever; a resistor on each LED to limit the current to less than 100mA each with fresh batteries; plus a common resistor at the switch to further reduce the current to 25mA/LED, bypassed by the switch when the batteries are getting low.
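For concreteness, here is one way the two resistors in that scheme could be sized. The 4.5 V supply, 3.0 V forward drop, and 4-LED count are all assumed values, and Vf is treated as constant, which is a simplification:

```python
# Sketch of the two-stage scheme: a per-LED resistor caps the current
# near 100 mA with fresh cells, and a shared resistor in the switch leg
# brings it down to ~25 mA/LED until the switch bypasses it.
# Vf, cell voltage, and LED count are assumed values.
v_batt, v_f, n_leds = 4.5, 3.0, 4

r_per_led = (v_batt - v_f) / 0.100           # per-LED 100 mA limit: 15 ohm
i_target = 0.025                             # per-LED current with R_common in
v_common = v_batt - v_f - i_target * r_per_led  # voltage left for R_common
r_common = v_common / (n_leds * i_target)    # shared resistor carries all LEDs

print(f"R_per_led = {r_per_led:.0f} ohm, R_common = {r_common:.2f} ohm")
# R_per_led = 15 ohm, R_common = 11.25 ohm
```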
 
Depending on how dim the light gets, a bypass may not be necessary: by the time it is too dim, the batteries would be mostly used up and need replacement anyway. It is just a way to get a little more out of them before replacing them.
 
Here are some suggestions.

Firstly, use more LEDs. Two LEDs at 10 mA each will give more light than one LED at 20 mA.

Use 4 D cells as a power source. They will give you 6 V when fresh and 4 V when dead.

Wire the LEDs in parallel and give each LED its own current limiting resistor. Assuming the LEDs drop about 3 V, the resistor value for 10 mA would be (6 - 3) / (0.01) = 300 ohms.

Assuming you used 10 LEDs, the total current draw would be 100 mA. At that load with intermittent use, you should get about 20,000 mAh out of alkaline D cells. That would give you about 200 hours of use before the batteries run out.
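The runtime arithmetic above, spelled out (the 20,000 mAh figure is the post's assumed alkaline D capacity under this light, intermittent load):

```python
# Runtime estimate for 10 LEDs at 10 mA each from alkaline D cells.
n_leds, i_per_led = 10, 0.010      # 10 LEDs at 10 mA each
capacity_mah = 20000               # assumed D-cell capacity at this load

i_total_ma = n_leds * i_per_led * 1000   # total draw in mA
runtime_h = capacity_mah / i_total_ma    # hours until the cells run out

print(f"{i_total_ma:.0f} mA total -> {runtime_h:.0f} hours")
# 100 mA total -> 200 hours
```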
 
Starting at 10mA, they would dim as the batteries deplete, down to maybe 5mA per LED, versus starting closer to 20mA and dropping to a half state of perhaps 8-12mA.
 
Use 4 D cells as a power source. They will give you 6 V when fresh and 4 V when dead.
This will be used outdoors where temperatures last week reached a balmy -31C / -23F - not alkaline territory.

The LEDs will be wired in parallel, and each LED will have its own current limiting resistor.
 
Another option that's a bit more efficient than a resistor alone is to use a diode to drop voltage (thus current), then a smaller resistor to fine-tune the current.
 
This will be used outdoors where temperatures last week reached a balmy -31C / -23F - not alkaline territory.
Oh...well lithiums like you said then...

Another option that's a bit more efficient than a resistor alone is to use a diode to drop voltage (thus current), then a smaller resistor to fine-tune the current.
:confused:

How is voltage dropped over a diode more efficient than voltage dropped over the equivalent resistor? A voltage drop is a voltage drop. There is no difference in efficiency between one or the other. :huh:
 

Well, one is burning the power as a load. The other restricts/clamps the flow by dropping the potential. That's why you see resistors rated at 10 ohms / 5 watts, while a diode would be 1 W / 22 V. It doesn't use up the current.

I really gotta brush up on the difference; it's been a few years since I was studying to be an EE.

OK, OK, I sat back and thought for a bit. A resistor will burn up power while dropping current and voltage. A diode only has a small potential drop and restricts current flow to one direction. What this does is sort of like using an LED with a high Vf: in the case of direct drive, a high Vf will lower the amount of current.

Resistor = burn up the excess power.
Diode = drops the voltage so less power actually flows from the battery.

Seriously someone correct me. It's been a while.
 
Well, one is burning the power as a load. The other restricts/clamps the flow by dropping the potential. That's why you see resistors rated at 10 ohms / 5 watts, while a diode would be 1 W / 22 V. It doesn't use up the current.
OK, look, that's all gobbledegook. It doesn't really mean anything that can be understood.

I really gotta brush up on the difference; it's been a few years since I was studying to be an EE.
There is no difference. It's all just the same.

Seriously someone correct me. It's been a while.
I'll try.

Firstly, you cannot use up the current. The current everywhere in a series circuit is the same.

When current flows through a device, there is a voltage difference between the terminals of the device. If you take the current flowing through the device and multiply that by the voltage difference across the device, that gives the power consumed by that device.

So if 1 amp is flowing through a diode, and the diode is dropping 0.6 V, then 0.6 W is being consumed by the diode and dissipated as heat.

Similarly, if 1 amp is flowing through a 0.6 ohm resistor, the resistor will be dropping 0.6 V and 0.6 W will be dissipated in the resistor as heat.

The diode and the equivalent resistor are the same as far as efficiency and heat generation are concerned.
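The point above can be checked numerically: for the same current and the same voltage drop, P = V × I gives identical dissipation either way. A minimal sketch, using the post's 1 A / 0.6 V example:

```python
# Same current, same voltage drop -> same dissipated power, whether the
# drop happens across a diode or a resistor. P = V * I either way.
i = 1.0                            # 1 A through the series element

v_diode = 0.6                      # typical silicon diode drop
p_diode = v_diode * i              # 0.6 W dissipated as heat

r = 0.6                            # resistor chosen to drop the same 0.6 V
v_resistor = i * r                 # Ohm's law: also 0.6 V
p_resistor = v_resistor * i        # also 0.6 W dissipated as heat

print(p_diode == p_resistor)       # True: no efficiency difference
```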
 

The more I know. :grin2:
 
Use a CRD? (Current Regulative Diode)
http://www.semitec.co.jp/english/products/pdf/CRD.pdf
The E-153 (25 V max) is about 12~18 mA.
The E-103 is about 8~12 mA.

A local LED parts shop carries the E-153 for about $0.80 apiece.
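Since a CRD regulates current directly, paralleling a couple is a rough way to build up a target LED current. A small sketch using midpoints of the ranges quoted above; the nominal values are assumptions, not datasheet figures:

```python
# Rough CRD selection: nominal regulated currents taken as midpoints
# of the quoted ranges (an assumption, not datasheet values).
crd = {"E-153": 0.015, "E-103": 0.010}   # nominal regulated current, amps

# Two E-103s in parallel would give roughly 20 mA for one 5 mm LED:
i_led = 2 * crd["E-103"]
print(f"{i_led*1000:.0f} mA")   # 20 mA
```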


Some car lights mod tests found on the net:
[image: CRD-test.jpg]
 
Last edited: