Hello again forum users, it has been a while since I have been on here. I have a quick and hopefully easy question for the LED techs out there. I am planning on making my own LED grow lamp for personal use. It is going to be a simple design using 3W blue and red LEDs, which I will supplement with fluorescent lighting until I can make a lamp with a larger variety of colors. Anyway, I digress. The issue I am running into is how I should go about wiring them with resistors. I want to do this in the most efficient way possible. Here are some specs to help out.
3W 440nm BLUE LED w/heatsink, Vf 3.6V, If 700mA. I will be using 67 of these.
3W 660nm RED LED w/heatsink, Vf 2.4V, If 700mA. I will be using 33 of these, for a total of 100 LEDs.
The simple LED array tool I used tells me to wire the blues as 22 strings of 3 blue, each with a 1.8-ohm 1W resistor, plus a remainder of 1 LED with a 12-ohm 6W resistor. The reds are wired as 6 strings of 5 red, each with a 1-ohm 1/2W resistor, plus a remainder string of 3 LEDs using a 7.5-ohm 4W resistor.
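To double-check what the array tool is doing, here is a quick Python sketch. I am assuming it uses the standard series-resistor formula, R = (V_supply - n * Vf) / If, with the resistor dissipating the dropped voltage times the string current:

```python
# Sanity check of the LED array tool's numbers, assuming the usual
# series-resistor formula for a string of identical LEDs.

V_SUPPLY = 12.0  # volts, from the PSU's 12V rail

def series_resistor(n_leds, vf, i_f=0.7):
    """Return (ideal resistor in ohms, resistor dissipation in watts)
    for a string of n_leds LEDs with forward voltage vf at current i_f."""
    v_drop = V_SUPPLY - n_leds * vf  # voltage the resistor must absorb
    r = v_drop / i_f                 # ohms
    p = v_drop * i_f                 # watts burned in the resistor
    return r, p

# 3 blues in series (3 * 3.6V = 10.8V of the 12V used up):
print(series_resistor(3, 3.6))  # ~1.71 ohm, ~0.84W -> 1.8 ohm / 1W matches
# 1 lone blue (only 3.6V used, 8.4V wasted in the resistor):
print(series_resistor(1, 3.6))  # 12.0 ohm, 5.88W -> 12 ohm / 6W matches
```

So the tool's blue-string values line up with that formula. Note how much power the lone-LED string burns in its resistor, which is why longer strings are the more efficient direction.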
My question is: is there any way I could string the blue and red LEDs together to make a more efficient system and use fewer resistors? Any help would be appreciated, thank you LOKE!
OOOPS!!! I forgot to add that I will be using the 12V rail of a 350W CPU power supply.
So while I was waiting for a reply I did some reading. In a couple of forum posts on this site and others I saw an argument. The claim was that if you use up all of the voltage (i.e. if I were to use 2 blue LEDs and 2 red LEDs, that would equal 12V), then I would not need a resistor. Other people said that would not be the case, and that just because of the way LEDs work I would at least want a 1-ohm 1W resistor on the string just to have it there. So the question is: do I need to place a resistor on a 12V string of LEDs on a 12V line, and if so, what would be the correct resistor?
Also, IF I did my math correctly, this is the most efficient way to string them: 16 strings of 2 blue + 2 red with a resistor (if needed?), 11 strings of 3 blue with a 1.8-ohm 1W resistor, and then 1 string of 2 blue + 1 red with a resistor (which resistor should I use?). Please correct me if I am wrong!!!
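Here is the Python sketch I used to check my own layout. Again, I am assuming the simple formula R = (V_supply - sum of Vf) / If; the layout below is just my proposal, not the tool's output:

```python
# Check the proposed mixed layout: do the LED counts work out, and how much
# voltage headroom does each string type leave for its resistor?

V_SUPPLY, I_F = 12.0, 0.7    # 12V rail, 700mA per string
VF_BLUE, VF_RED = 3.6, 2.4   # forward voltages from the datasheets

# Each entry is (number of blues, number of reds) in one string.
strings = (
    [(2, 2)] * 16 +  # 16 strings of 2 blue + 2 red
    [(3, 0)] * 11 +  # 11 strings of 3 blue
    [(2, 1)]         # 1 remainder string of 2 blue + 1 red
)

blues = sum(b for b, r in strings)
reds = sum(r for b, r in strings)
print(blues, reds)  # 67 33 -> the counts do add up

for b, r in [(2, 2), (3, 0), (2, 1)]:
    v_drop = V_SUPPLY - (b * VF_BLUE + r * VF_RED)  # headroom left for R
    print(b, r, round(v_drop, 2), round(v_drop / I_F, 2))
# (2,2): 0.00V headroom -> formula gives 0 ohm; this is exactly the
#        contested zero-headroom case from the argument above
# (3,0): 1.20V headroom -> ~1.71 ohm ideal (1.8 ohm as before)
# (2,1): 2.40V headroom -> ~3.43 ohm ideal, dissipating ~1.68W
```

So if my assumptions are right, the 67/33 totals check out, and the 2 blue + 1 red remainder string wants roughly 3.4 ohms at about 1.7W. The 2 blue + 2 red string is the zero-headroom case I asked about, where the formula alone gives no answer.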
Again any help is greatly appreciated!