LED letter project

dolfan907

Newly Enlightened
Joined
Aug 9, 2010
Messages
5
I plan on spelling the name "Katie" out of red LED bulbs. I have figured that if I wire 21 series strings of 4 bulbs each in parallel with each other, then I will be able to spell the name. The bulbs that I plan on using are 5mm red LEDs.
The typical forward voltage of the bulbs is 2.0V, the max is 2.6V, and the forward current is 20mA. I have figured that at the minimum voltage I will need an 8V, 420mA power supply, and at the maximum voltage I will need a 10.4V power supply.
My main question is how do I decide which resistor to use? I saw a tutorial online that indicated a 180 ohm resistor for a 12V power supply, even though the voltage drop is 12V (http://www.brighthub.com/hubfolio/swagatam-majumdar/articles/65165.aspx ), so I don't quite understand how I should calculate the required resistance. In that tutorial his bulbs theoretically use up all of the voltage in each series string, so how does he come to say that he needs a 180 ohm resistor? The only information that a calculator will ask you for is power supply voltage, voltage drop, required current, and number of LEDs; there is no mention of the power supply's current, which, if I am not mistaken, seems like a relevant value.
So I need to know what ohms and wattage my resistors should be, whether I should wire one into each series string, and how you figured all of that information out, so that I won't need to post this same question for each of my future LED projects.
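For reference, here is a rough Python sketch of the arithmetic behind those figures (21 strings of 4 LEDs, 2.0V typical / 2.6V max forward voltage, 20mA per string, all from above):

# Rough supply sizing for the sign (figures from the post above)
leds_per_string = 4
num_strings = 21
vf_typ = 2.0        # typical forward voltage per LED (V)
vf_max = 2.6        # maximum forward voltage per LED (V)
i_string_ma = 20    # current per series string (mA)

v_string_typ = leds_per_string * vf_typ  # 8.0 V across one string (typical)
v_string_max = leds_per_string * vf_max  # 10.4 V across one string (worst case)
i_total_ma = num_strings * i_string_ma   # 420 mA total from the supply

print(v_string_typ, v_string_max, i_total_ma)  # 8.0 10.4 420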

*p.s. I read on a site that in order to get the maximum brightness out of an LED, it is suggested that you come near double the recommended voltage of the LED, which to me seems insane. Can anyone tell me whether or not that is true?
 

bshanahan14rulz

Flashlight Enthusiast
Joined
Jan 29, 2009
Messages
2,819
Location
Tennessee
Yes, that wizard is much easier to use.

If you are using automotive voltages, you might also consider putting the whole thing behind a 12V regulator.
 

Noctilucent

Newly Enlightened
Joined
Jul 16, 2010
Messages
33
The others have already pointed out the LED/Resistor calculator -and- the fairly flexible LED strip option (definitely the easiest way to go, but you are at the mercy of their spacing, of course).

But as to your...
*p.s. I read on a site that in order to get the maximum brightness out of an LED, it is suggested that you come near double the recommended voltage of the LED, which to me seems insane, can anyone tell me whether or not that is true
I'm sure it's true... for a fraction of a second, and then your LED is dead. Yes, that's insane. Check the datasheet for any LED... just a few tenths of a volt difference makes a -huge- difference in the current draw.

You can drive them a bit higher for flashing display purposes, but for continuous use, stick to the Vf(typ) (not Vf(max)). If you want brighter output, find a brighter LED or use more than 1 LED.
 

MikeAusC

Enlightened
Joined
Jul 8, 2010
Messages
995
Location
Sydney, Australia
I read on a site that in order to get the maximum brightness out of an LED, it is suggested that you come near double the recommended voltage of the LED, which to me seems insane, can anyone tell me whether or not that is true?

If you increase the voltage on a high power LED by 0.1 volt, the CURRENT can DOUBLE.

"Doubling the voltage on LED to increase brightness" has to be the stupidest advice ever.
 

dolfan907

Newly Enlightened
Joined
Aug 9, 2010
Messages
5
I appreciate everyone's help, but I'm not quite sure my question is being answered. Although I am fairly new to LEDs, I do understand a few basic principles of electricity, and from what I have read LEDs seem to be very current sensitive. I don't understand how I can prevent, say, all 400mA of current from running through a bulb and frying it. Does an LED only draw the current that it requires? Like, if I pass 400mA of current through an LED, will it take all of it? That's what I don't quite understand.
 

uk_caver

Flashlight Enthusiast
Joined
Feb 9, 2007
Messages
1,408
Location
Central UK
If you have a resistor in series with one or more LEDs, then for a given voltage, the current flowing will depend on the resistance.

Pretending for the moment that the LEDs don't have any resistance of their own, if you have a figure for the forward voltage (Vf) of the LED, you can work out the current flow.

For example, if:
a) you had three red LEDs, each with a Vf of 2V
b) the LEDs were connected in series with each other and with a 1k resistor
c) you applied a supply voltage of 12V across the entire chain of LEDs+resistor

Then the current flowing would be 6mA.
Each LED needs 2V across it before it begins to light up, and so the string of 3 requires 6V before any current at all will flow through them.
You could think of it as each LED 'consuming' 2V, leaving 6V across the resistor, which would mean (by Ohm's law) that 6mA would flow through the whole chain of components.
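If it helps, here is that same example written out as a little Python sketch (the numbers are just the example above, not your actual project):

# Worked example: 3 red LEDs (Vf = 2V each) + 1k resistor on a 12V supply
supply_v = 12.0
vf = 2.0
num_leds = 3
resistor_ohms = 1000.0

v_across_resistor = supply_v - num_leds * vf   # 12 - 6 = 6 V left for the resistor
current_a = v_across_resistor / resistor_ohms  # Ohm's law: 6 V / 1000 ohm = 0.006 A

print(current_a * 1000, "mA")  # ~6 mA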

Assuming the power supply is a constant voltage supply, the power supply's maximum current isn't relevant unless a load is connected which would 'want' to draw more than that current at the given supply voltage.
Whether your 12V supply had a maximum output current of 10mA or 100mA or 100A wouldn't make any difference to the current flowing, it would still be 6mA.

As long as that maximum current isn't exceeded, what controls the actual current is the voltage of the supply, and the characteristics of the load.
It's the same situation as with mains power - although a mains supply could potentially supply a lot of current, the current that actually flows when you plug in a lightbulb or an electric heater normally depends only on the voltage of the supply and the characteristics of the load.
It would only be if you applied an excessive load (like plugging in dozens of powerful electric heaters in one house, or short-circuiting the wiring) that the maximum available current would be an issue, and fuses/breakers would trip.

I can see why you might be confused by the article you linked to, where, talking about red LEDs, the guy says:
Each series contains a group of 4 LEDs. If we divide the input 12 volts with 4 we will find that each LED receives 3 volts enough to make them glow brightly. The resistors make sure that the current to the LEDs is limited so that they may last long.
Basically, that's a bogus calculation, since the 12V isn't divided by 4.
What happens is as I mentioned above - you work out approximately what the voltage drop of the LEDs should be, and then subtract the total voltage drop of the LED string from the supply voltage to work out the current flow with different series resistors.
The best that could be said is that he was doing a rough calculation to make sure that the LEDs would light at all, but even that is much better done by summing the voltage needed to light the LEDs and comparing that to the supply voltage, since that straight away gives you a likely rough figure for the voltage across the resistor you're going to be using.
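Here's that correct method as a rough Python sketch, assuming you know the supply voltage, the per-LED forward voltage, and the current you're aiming for (the function name is just for illustration):

def series_resistor(supply_v, vf_per_led, leds_in_string, target_current_a):
    """Subtract the total LED drop from the supply, then apply Ohm's law
    to the voltage left across the resistor."""
    v_left = supply_v - vf_per_led * leds_in_string
    if v_left <= 0:
        raise ValueError("not enough supply voltage to light this string")
    return v_left / target_current_a

# The article's setup: 4 red LEDs (2V each) on 12V, aiming for 20mA
print(series_resistor(12.0, 2.0, 4, 0.020))  # ~200 ohms - not '12 divided by 4'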

In reality, the voltage drop of an LED does change at different currents, and does vary a little between different LEDs even of the same type, but that won't make a huge difference to the current flow you calculate as long as there's a reasonable voltage 'headroom' between the total LED voltage and the supply voltage.
For example, if you had a string of 4 nominally-2V red LEDs plus a series resistor and a 12V supply, a variation of +/-0.1V per LED (+/-0.4V total) in the actual total LED voltage would leave the voltage across the resistor between 3.6V and 4.4V, rather than the nominal/expected 4V, which would give currents varying by +/-10% from the calculated figure.
On the other hand, if you had 5 LEDs in a similar setup, the voltage 'left' for the resistor would be between 1.5 and 2.5V rather than the nominal/expected 2V, which is a +/- 25% difference.
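That headroom comparison as a rough Python sketch (assuming a 12V supply, nominally 2V LEDs, +/-0.1V spread per LED, and a resistor sized for 20mA in the nominal case):

def current_range_ma(supply_v, nominal_vf, leds, vf_spread_per_led, target_a=0.020):
    """How much the string current moves when the actual LED drop varies."""
    nominal_drop = nominal_vf * leds
    resistor = (supply_v - nominal_drop) / target_a   # sized for the nominal case
    spread = vf_spread_per_led * leds
    low = (supply_v - (nominal_drop + spread)) / resistor
    high = (supply_v - (nominal_drop - spread)) / resistor
    return low * 1000, high * 1000

print(current_range_ma(12.0, 2.0, 4, 0.1))  # roughly (18.0, 22.0) -> about +/-10%
print(current_range_ma(12.0, 2.0, 5, 0.1))  # roughly (15.0, 25.0) -> about +/-25%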
 

dolfan907

Newly Enlightened
Joined
Aug 9, 2010
Messages
5
@UK_caver

Thank you for your in-depth answer to my question. While some of your explanation clarified some things for me, unfortunately a lot of it went over my head. This is a link to the type of LED I plan on using ( http://www.unique-leds.com/index.php?target=products&product_id=1616 ). I plan on creating 21 series strings of 4 bulbs each. Each string will be in parallel with the others, and the 21st string will only have 2 bulbs instead of 4; that is a total of 82 bulbs. Given the specs of the bulb, what power supply would you suggest I use, and what ohms/wattage resistor should I wire into each string? And do I only need to wire one resistor into each string?

Thanks
 

uk_caver

Flashlight Enthusiast
Joined
Feb 9, 2007
Messages
1,408
Location
Central UK
So you have LEDs with a nominal Vf of 2V at 20mA.

Let's also assume you're going to use a 12V power supply.
A 12V supply does make sense. 12V supplies are pretty common, and hence pretty cheap, and there's a decent voltage headroom over your ~8V series strings of LEDs - high enough that the current isn't going to vary greatly with small variations in Vf of the LEDs, but not so high that you're wasting a lot of power unnecessarily.

Your total current consumption is (as you calculated earlier) a bit over 400mA. You might get away with a 500mA supply, but personally, I'd go for a 1A supply. Especially if buying a cheaper supply, there's a lot to be said for getting one that isn't going to be working near its maximum output.

Assuming you're using a single series resistor per string of LEDs, and looking at just one string of 4 LEDs, that's 8V 'consumed' by the LEDs, leaving 4V to be 'seen' by the resistor. Let's call the resistance of that series resistor 'S' ohms.

From Ohm's law, the nominal current per string is therefore 4/S (in amps) or 4000/S (in milliamps).

If you were going for ~20mA current per string, that would require a ~200 ohm resistor per string. Best to use that value, or maybe one slightly more if a 200 ohm isn't available in the particular range of resistors you're going to use.

Doing a quick calculation of how much power the resistor will dissipate: in this case it would be 4V*20mA -> 80mW, so pretty much any shape/size of ~200 ohm resistor would do.

For the 2-LED string, the resistor would be dropping 12-(2*2) = 8 volts, and so to get ~20mA you'd need a resistance of ~400 ohms (or just use 2x ~200 ohm resistors in series).
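If it's useful, here's the whole thing in one rough Python sketch (assuming a 12V supply, 2V per LED, and ~20mA per string, as above; the helper function is just for illustration):

def string_resistor(supply_v, vf, leds, current_a):
    """Series resistor value (ohms) and its power dissipation (watts) for one string."""
    v_resistor = supply_v - vf * leds
    return v_resistor / current_a, v_resistor * current_a

print(string_resistor(12.0, 2.0, 4, 0.020))  # ~200 ohm, ~80 mW for each 4-LED string
print(string_resistor(12.0, 2.0, 2, 0.020))  # ~400 ohm, ~160 mW for the 2-LED string (still fine for a standard 1/4 W part)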

The maximum Vf of your LEDs at 20mA is given as 2.6V, but in practice, any randomly-chosen red LED is unlikely to have a Vf that high, and any randomly chosen 4 LEDs in series are probably going to have a total Vf reasonably close to 8V.
 

dolfan907

Newly Enlightened
Joined
Aug 9, 2010
Messages
5
Thanks a lot man! My girlfriend has been nagging me to make her something for a long time. Now that I've got all the info you gave me, I can put the wheels in motion and get this project started.
 