100w led, insane amps

Spire4

Newly Enlightened
Joined
Jan 17, 2019
Messages
6
Hello everyone,

I recently started a project out of pure boredom. It's one of those 100 W budget DIY lights.
Everything went fine, until I completed it and did some measurements.
The scheme is super simple:
12 V (5 Ah) lead-acid battery (salvaged from a UPS) -> 10 A fuse -> 20 A switch -> "150 W" Chinese boost converter that steps 12 V up to 34 V -> LED COB, forward voltage 34 V, forward current 3 A
(there is also separate circuit powering fan to cool this thing down)

When powered, everything worked perfectly, until I checked the current.
Battery to booster is almost 15 A! (I also have questions about the fuse.)
Booster to LED is almost 4.5 A!

I tried several batteries; the more voltage one has, the more current this thing draws.
I tried two boosters with the same result.
Note that the booster's max input is 10 A (manufacturer's spec).

Why does this thing draw so much?
Sorry for the stupid questions, but I am by no means an electrician, just an enthusiast.

Thanks for any input
enthusiast
 

night.hoodie

Enlightened
Joined
Aug 6, 2014
Messages
717
Location
Lost City of Atlanta
I can't help, but there are members that can... sounds like something I bet PolarLi could answer, but you'd probably need to post some more info, maybe pictures. How bright is that thing? Heck of a first post. Welcome to CPF.
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
A boost regulator will by definition have higher input current than output current, and lower input voltage than output voltage. Its very job is to take low voltage, high current and transform it to high voltage, low current.

If you calculate the input power (voltage times current) it should be only slightly higher than the output power. Ideally the input and output power would be the same, but no regulator is 100% efficient. I would expect the device you describe to be 80-90% efficient within specified limits, less so if overloaded.
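The power-balance arithmetic here is easy to sketch numerically. The 85% efficiency figure below is an assumed, typical value for cheap boost modules, not a measured one, and the function name is mine:

```python
# Boost-converter power balance: P_in = P_out / efficiency, I_in = P_in / V_in.
# The 85% efficiency is an assumed, typical value for a cheap boost module.
def input_current(v_out, i_out, v_in, efficiency=0.85):
    """Input current a boost converter must draw to deliver v_out * i_out."""
    p_out = v_out * i_out
    p_in = p_out / efficiency
    return p_in / v_in

# The overloaded case reported above: 34 V at 4.5 A out, from a 12 V battery.
print(input_current(34, 4.5, 12))  # ~15 A -- matches the measured battery draw
# The intended operating point: 34 V at 3 A.
print(input_current(34, 3, 12))    # ~10 A
```

So the 15 A battery draw is not a fault by itself; it is just conservation of power with the output running well above the intended 3 A.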

I don't know exactly what device you have, but typically they have an adjustable current limit. This is for the output current. If it is adjusted too high, this is the behavior you would expect to see. If you were to set the output current to around 3A or less, it might behave more like you expect. To reduce the limit, turn the screw counter-clockwise. Keep in mind it is likely to have a range of 20-30 turns.

If this doesn't work or doesn't make sense to you, post the exact model you have and we'll see if we can be more helpful.
 

Spire4

Newly Enlightened
Joined
Jan 17, 2019
Messages
6
Hello,
I'm using this regulator: https://bit.ly/2QZjPG9 . It's the typical one used in all the YouTube DIY videos and other "guides".
The LED is https://bit.ly/2DnoYUN , 100 W, 10000 K; at almost 8 USD it's one of the more expensive ones.
Cables are 1.5 mm² copper, certified for building switchboards.

I have also desoldered the voltage trimmer from the boost regulator and put it in parallel with a potentiometer to make adjusting more comfortable (now ranging from 29 V to 33.9 V).
You are right about the boost regulator: the current is not adjustable. Does that mean the LED is just power-hungry and the regulator feeds it whatever it asks for?
I don't really understand it now, because I have never seen a build that had adjustable current. Perhaps it's enough to just make the video and let it burn?
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
While that wouldn't be my first choice for an LED driver, you should be able to get it to work reasonably well.

If the voltage is adjusted properly, you would expect about 3A in the LED (output) and about 10A draw from the 12V battery (input). Your LED would overheat quite rapidly unless it is on a pretty significant heatsink. The power supply would run pretty hot too, so a fan on it would be a good idea.

You'd expect the proper voltage to be in the range of 32-34V as the LED web site suggests, but it could be somewhat lower than that.

If you turn the voltage down a little more, you'd expect the input and output currents to drop fairly quickly.
 

Spire4

Newly Enlightened
Joined
Jan 17, 2019
Messages
6
That's a good assumption. I'll contact the seller to find out the real voltage range of the LED, as most of the Chinese sellers lie.
 

Spire4

Newly Enlightened
Joined
Jan 17, 2019
Messages
6
So I contacted the LED manufacturer; he asked for my specs and then sent this reply:

"Hi,
I guss the problem is the power supply.Our led chip is constant current led.Not constant voltage.It need to work with constant current led power supply.We suggest to use 3A 30-34V for 100W led chip.It can't use by 12V led driver.
We also sell the led driver for our led chip.Pls check the following link.The led driver "3A 20-34V 100W"can use for our 100W led chip.
https://www.aliexpress.com/item/1/32756673848.html"

I only understand that I picked the wrong regulator/LED combination. Will a simple regulator with both adjustable current and voltage do, or do I have to buy the one he suggests (30 USD)?
Or should I simply throw away the LED and buy a CV one?

Can anyone make heads or tails of this?
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
DON'T PANIC!!!!

You aren't in dire straits unless this has to happen in the next few days. If you are panicking, PM me and we can talk.

LEDs are by their very nature CC devices. Well, not really, but kind of.

If you take a single-die LED designed for 3A and put 3A through it, it will take somewhere around 2.9 to 3.3V. Let's say it's 3.3V, as this is typical of older generation parts probably found in your device. Stack 10 of them in series and you get a 3A, 33V LED array. If you bought 3 of these and put 3A through each of them, you might find that one is 33V, one is 34V, and one is 32V. Let's call these A, B, and C. The power dissipations would be 99W, 102W, and 96W, respectively.

Now what would happen if you put each of these across a 33V power supply? A would draw 3A, and dissipate 99W as before. B would draw less than 3A. How much less? It's impossible to say without some more data on the LED, but we can say that the change in current would be significantly greater than the change in voltage. Because both the voltage and current are reduced, the effect on power is compounded. You have approximately 3% reduction in voltage, but that might cause a 10% drop in current, and result in a 13% reduction in power to 89W.

Now what about C? We increased its voltage a little, and will get as a result a more significant increase in current. Again, 3% increase in voltage could cause 10% increase in current, resulting in the power increasing to 109W.

So the effect of using a voltage source instead of a current source caused the power range to widen from 96-102W to 89-109W.
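A toy model makes the A/B/C numbers above concrete. Treating each array as a knee voltage plus a dynamic resistance is a simplification, and the 3.3 Ω value is chosen purely so that a 1 V (3%) swing moves the current by about 0.3 A (10%), as in the example:

```python
# Toy LED-array model: knee voltage plus dynamic resistance.
# R_DYN = 3.3 ohm is an illustrative value picked so a 3% voltage change
# gives roughly a 10% current change, matching the discussion above.
R_DYN = 3.3

def led_current(v_supply, v_knee):
    """Current through the array when driven from a fixed voltage rail."""
    return max(0.0, (v_supply - v_knee) / R_DYN)

# Arrays A, B, C read 33 V, 34 V, 32 V when fed a constant 3 A.
# Back out each knee voltage, then drive all three from one fixed 33 V rail.
for name, v_at_3a in [("A", 33.0), ("B", 34.0), ("C", 32.0)]:
    v_knee = v_at_3a - 3.0 * R_DYN
    i = led_current(33.0, v_knee)
    print(name, round(i, 2), "A,", round(33.0 * i, 1), "W")
```

Under this (assumed) model the fixed 33 V rail pushes array B down to about 89 W and array C up to about 109 W, reproducing the spread described above.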

But it gets even worse.

As an LED heats up, its voltage drops. If you are on a constant current source, a drop in voltage causes a drop in power. This helps mitigate the increasing temperature. But if we were running a constant voltage source, the voltage can't drop. So what happens? The current goes up! And it goes up significantly more than the voltage would have wanted to go down. So instead of the power dropping as the LED heats up, it increases, exacerbating the problem instead of mitigating it.

In extreme cases, this can even lead to 'thermal runaway' where increasing temperature causes increasing power, which causes increasing temperature, etc until something fails. In practice, this rarely happens with LEDs and is most likely not of concern to the DIYer.

But the voltage/temperature effect does compound on top of the other one, making the situation worse than described above.
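The thermal feedback can be sketched with a crude fixed-point simulation. Every coefficient here is an assumption chosen for illustration: forward voltage dropping about 2 mV/°C per die (20 mV/°C for a ten-die stack), a 3.3 Ω dynamic resistance, a 23.1 V knee at 25 °C (consistent with 33 V at 3 A), and a 0.5 °C/W heatsink:

```python
# Crude steady-state simulation of LED thermal feedback under CC vs CV drive.
# All coefficients are illustrative assumptions, not measured values.
TC_V  = -0.020  # V per degC for the whole stack (~ -2 mV/degC/die, 10 dies)
R_DYN = 3.3     # ohm, dynamic resistance of the array
R_TH  = 0.5     # degC per watt, junction-to-ambient via the heatsink
T_AMB = 25.0    # degC, ambient temperature

def settle(mode, drive, v_knee_25=23.1, steps=200):
    """Iterate power/temperature to steady state; returns final power in W."""
    temp = T_AMB
    for _ in range(steps):
        v_knee = v_knee_25 + TC_V * (temp - T_AMB)
        if mode == "CV":                 # fixed voltage rail
            i = (drive - v_knee) / R_DYN
            p = drive * i
        else:                            # fixed ("CC") current
            i = drive
            p = (v_knee + i * R_DYN) * i
        temp = T_AMB + R_TH * p          # heatsink settles at this temperature
    return p

print("CC 3 A  :", round(settle("CC", 3.0), 1), "W")   # sags as the LED warms
print("CV 33 V :", round(settle("CV", 33.0), 1), "W")  # climbs instead
```

Both drives start at 99 W cold; under CC the power settles a few watts lower as the LED warms, while under CV it settles about ten watts higher, which is the compounding effect described above.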

These two effects, taken together, are why we say you should drive LEDs with CC, not CV. You get more consistent and predictable results. You can get better performance, better uniformity, and better reliability.

That's not to say you're an idiot for running LEDs on CV. It's been done many times and quite satisfactory results can be had if your expectations are in line with reality.
 

Spire4

Newly Enlightened
Joined
Jan 17, 2019
Messages
6
Thanks for the very informative answer!
From what I understand, the LED I have is built a little off-spec, and when driven at the stated max voltage (34 V), the current is just too high. If I drive it at the stated current (3 A), the voltage will probably land around 30-32 V and everything will sit tight.
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
Yes, that's what I meant when I said you can probably use what you have. Just set the voltage lower and it will draw less current. It will have more variation over temperature, so make sure to check the current when the LED is at full operating temperature.
 

Spire4

Newly Enlightened
Joined
Jan 17, 2019
Messages
6
Thanks again, I will confront the seller about this and see. :thanks:
 