LED Driving

DanMill

Newly Enlightened
Joined
Jul 30, 2004
Messages
3
Can someone explain to me the basics of LED driving? I always thought the best way would be to regulate the voltage and then from there use resistors to set the current, but lately I have been told that current regulation is all you really need. Is this true? I am slightly confused and just need some clarification. Thanks,

Danny
 

Steelwolf

Flashlight Enthusiast
Joined
Feb 6, 2001
Messages
1,208
Location
Perth, Western Australia
Hello Danny and welcome to the forums.

I'm sure there will be others more knowledgeable than I who will be able to explain better, but I will take a stab at it, at the risk of confusing you further. It may also be useful to look at on-line resources teaching the basics of electronics, including the one in the Electronics Forum of this website.

There is a lot of theory behind it all, but the short answer is that current regulation is all you need.

The concept of current regulation is that the regulation circuit is set to deliver a specific current. It will attempt to do this by adjusting the voltage until the correct current is achieved, since voltage = current * resistance. (This only works for voltage changes within reason, as sometimes the components are unable to deliver a high enough voltage.)

What you are probably thinking of is putting 3 cells in series for 4.5V and then using a resistor between the LED and the cells. This is not true voltage regulation. In true voltage regulation, the regulation circuit will vary the current to maintain the set voltage.
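The cells-plus-resistor approach described above comes down to the usual ballast formula R = (Vsupply - Vf) / I. A quick sketch in Python, using assumed nominal numbers (three cells at 4.5 V, an assumed LED Vf of 3.2 V, a 20 mA target), not a real design:

```python
# Ballast resistor sizing: limits the current at nominal conditions, but
# does not regulate it - if Vf or the supply drifts, the current drifts too.
# All values are illustrative assumptions.
v_supply = 4.5     # three cells in series, nominal
vf = 3.2           # assumed LED forward voltage at the target current
i_target = 0.020   # 20 mA target

r = (v_supply - vf) / i_target
print(f"ballast resistor: {r:.0f} ohms")  # nearest standard value: 68 ohms
```

Note the weakness this thread is circling around: if Vf sags by even 0.1 V as the LED warms up, the current through that fixed 65-ohm resistor rises to (4.5 - 3.1)/65, about 21.5 mA.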

Why is current regulation more desirable for LEDs? Because LEDs are current-controlled devices. What that means is that for a slight increase in voltage, the current flowing through the LED can change dramatically, all out of proportion to the well-known V=IR equation (an LED is not a simple resistor). So it is smarter to control the current flowing into the LED, because the resulting variation in voltage is extremely small.
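That steep voltage-to-current relationship can be illustrated with the idealized Shockley diode equation. The constants below (saturation current, effective thermal voltage) are made-up values fitted so the curve passes through roughly 20 mA at 3.5 V; they are not specs for any real LED:

```python
import math

I_S = 1.2e-31   # saturation current, amps (hypothetical, fitted)
N_VT = 0.052    # ideality factor times thermal voltage, volts (hypothetical)

def led_current(v):
    """Idealized diode current: I = I_S * (exp(V / (n*Vt)) - 1)."""
    return I_S * (math.exp(v / N_VT) - 1)

i1 = led_current(3.50)
i2 = led_current(3.60)   # only 100 mV higher
print(f"3.50 V -> {i1 * 1000:.1f} mA")
print(f"3.60 V -> {i2 * 1000:.1f} mA, about {i2 / i1:.1f}x the current")
```

In this toy model a roughly 3% bump in voltage multiplies the current nearly sevenfold, which is why the current, not the voltage setting, has to be the thing you control.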

Also, the resistance of an LED can drop with increased temperature. If your LED were in a voltage regulation circuit and its temperature went up, you could go into thermal runaway. The regulator would hold the set voltage even as the LED's resistance continued to drop, so the current delivered to the LED would keep climbing. But more current equals more power equals more heat equals lower resistance....
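That runaway loop can be caricatured in a few lines of iteration. Every constant here (thermal resistance, Vf temperature coefficient, the diode model) is an illustrative guess, not a datasheet value:

```python
import math

def simulate(mode, steps=5):
    """Crude fixed-point sketch of junction temperature vs. drive scheme.
    All constants are illustrative guesses, not datasheet values."""
    N_VT = 0.052      # diode n*kT/q, volts (hypothetical)
    TEMPCO = -0.002   # Vf shift per degree C (typical order of magnitude)
    R_TH = 200.0      # junction-to-ambient thermal resistance, C/W (guess)
    T_AMB = 25.0
    t = T_AMB
    currents = []
    for _ in range(steps):
        vf = 3.5 + TEMPCO * (t - T_AMB)    # Vf sags as the die heats up
        if mode == "constant-voltage":
            # the applied 3.5 V now sits further up the shifted diode curve
            i = 0.020 * math.exp((3.5 - vf) / N_VT)
            p = 3.5 * i                    # LED is held at the full 3.5 V
        else:
            i = 0.020                      # regulator pins the current
            p = vf * i                     # LED voltage relaxes downward
        t = T_AMB + R_TH * p               # new steady-state junction temp
        currents.append(i)
    return currents, t

cv, t_cv = simulate("constant-voltage")
cc, t_cc = simulate("constant-current")
print(f"constant voltage: current grew {cv[-1] / cv[0]:.1f}x, junction near {t_cv:.0f} C")
print(f"constant current: steady {cc[-1] * 1000:.0f} mA, junction near {t_cc:.0f} C")
```

In this toy model the constant-voltage branch roughly octuples the current within five rounds of feedback, while the constant-current branch settles near 39 C and stays there.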

That's enough for now, I think. Hope that helped.
Do search in the archives as well.
 

DanMill

Newly Enlightened
Joined
Jul 30, 2004
Messages
3
I think some of you misunderstood my question. I understand all that and have actually been a lurker on this site for about 2 years (I've got some good flashlights to prove it)... anyway, I built an LM317 constant-voltage driver the other day for a string of LEDs, but in doing further research I found that I should be making a constant-current driver instead. Explain this to me: if I have a constant-current driver and the voltage spikes from 9.8V to 12V, then wouldn't the increase in voltage damage the LEDs even if the current remained the same? This is what I am confused about.
Thanks,

Danny
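For what it's worth, the LM317 mentioned above can itself be wired as a constant-current source: the part regulates 1.25 V between its OUT and ADJ pins, so a single resistor from OUT to ADJ sets the current at I = 1.25 V / R. A quick sanity check in Python (the 20 mA target is just an example figure):

```python
V_REF = 1.25   # voltage the LM317 holds between its OUT and ADJ pins

def lm317_resistor(i_target):
    """Resistor (ohms) from OUT to ADJ for a target current in amps."""
    return V_REF / i_target

def lm317_current(r_ohms):
    """Current (amps) delivered with a given OUT-to-ADJ resistor."""
    return V_REF / r_ohms

r = lm317_resistor(0.020)                 # aiming for 20 mA
print(f"ideal resistor: {r:.1f} ohms")    # 62.5; nearest standard is 62
print(f"62 ohms gives {lm317_current(62) * 1000:.1f} mA")
```

The catch is headroom: the supply has to stay a few volts above the string's total Vf (the 1.25 V reference plus the LM317's dropout), and everything left over is dissipated in the regulator as heat.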
 

idleprocess

Flashaholic
Joined
Feb 29, 2004
Messages
7,197
Location
decamped
A current source should keep output constant even if its supply varies. If the supply voltage keeps within the current supply's specs, it should be smooth sailing on the output end.

Big spikes might show up on the output end for milliseconds as the current supply adjusts, but I doubt you'd notice it with a multimeter (unless the spike was big enough to fry something).
 

Moat

Enlightened
Joined
Sep 24, 2001
Messages
389
Location
Mid Mitten
Welcome to CPF, DanMill!

This topic probably belongs in the Electronics forum, where the real whizzes would surely clear the air - but I'll give my $.02...

The problem is that - unlike most component loads folks are used to - LEDs often require LESS voltage as they "heat up" to pass a target current, and the current is what does the damage (creates heat). LEDs are current-driven devices, and the manufacturers' specs state a very specific current limit, but only a loose range of drive voltages.

So, for example, you drive a single Nichia white LED at its max recommended current of 30mA by setting a supply voltage of, say, 3.65v - and let it run on the bench. Come back after 20 minutes, and you may find that same 3.65v is now OVERdriving the LED at 40mA - which produces excessive heat, which further drops the LED's voltage requirement (Vforward, or Vf), which further increases the current, more heat, etc... A chain reaction - thermal runaway, I think they call it around here - that can/will fry the LED. Supplying a constant current instead of a constant voltage avoids this problem by simply regulating the voltage downward when this happens.

AFAIK, it's really only a problem when pushing currents beyond the manufacturer's specs, or when heatsinking is inadequate. Many of the currently available single-LED flashlights use voltage-regulating circuits and are often overdriven as well, with little or no ill effects or runaway.

For LED strings in series, it is wiser to use constant-current regulation, as LEDs even from the same batch can vary a fair amount in Vf requirements - and those variations can compound the current-limit/thermal differences of the individual LEDs across the string - each having an effect on the others - leading to a greater possibility of thermal runaway somewhere along the string. A constant voltage may be applied to the entire string, but it's really not controlling the voltage drop "seen" by each LED individually. I think... (?)

A constant-current regulator avoids that issue entirely - the current through the first LED in the string is exactly the same as the current through them all.
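The series-string point can be illustrated with the same kind of toy exponential diode model: hold one "nominal" supply voltage across two strings whose Vf totals differ by only 0.3 V, and the resulting currents land far apart. All numbers here are made up for illustration:

```python
import math

N_VT = 0.052   # per-LED n*kT/q, volts (hypothetical)

def series_string_current(v_applied, vfs):
    """Current through a series string of toy exponential-model LEDs,
    each defined to pass 20 mA at its listed Vf."""
    # solve  V = sum(vf_k) + N * N_VT * ln(i / 0.020)  for i
    n = len(vfs)
    return 0.020 * math.exp((v_applied - sum(vfs)) / (n * N_VT))

low_vf_string = [3.15, 3.20, 3.25]    # Vf total: 9.60 V
high_vf_string = [3.25, 3.30, 3.35]   # Vf total: 9.90 V
v_nominal = 9.75                      # one "nominal" constant-voltage supply

for vfs in (low_vf_string, high_vf_string):
    i = series_string_current(v_nominal, vfs)
    print(f"string totaling {sum(vfs):.2f} V -> {i * 1000:.1f} mA")
```

In this sketch the low-Vf string is heavily overdriven and the high-Vf string is starved, from the same supply. A constant-current source would put exactly the set 20 mA through either string; the Vf spread would just change the voltage the regulator has to supply.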

Driving LEDs in parallel, though, is a different story - I think constant voltage is safer. If, for instance, a constant current is supplying 4 LEDs and one blows "open" for some reason (a common failure mode), the current supply will now deliver its 4 LEDs' worth of current to the remaining 3, possibly overdriving them to failure.

In that scenario, a constant voltage supplied in parallel would instead have no effect on the operation of the remaining 3 LEDs.
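The arithmetic behind that parallel failure mode is simple enough to write out (4 LEDs at 20 mA apiece is an arbitrary example figure):

```python
n_leds = 4
i_per_led = 0.020                 # 20 mA apiece (example figure)
i_source = n_leds * i_per_led     # constant-current source set to 80 mA

survivors = n_leds - 1            # one LED fails open
i_each_after = i_source / survivors
print(f"each survivor now carries {i_each_after * 1000:.1f} mA")
```

Each surviving LED now carries a third more current than it was set up for, with no warning.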

I'm not very experienced with these circuits; I'm just blabbering a bit of what I've picked up lurking here a while, too. I might be somewhat off the mark - someone please feel free to correct any errors, explain more concisely, and deduct my $.02 if need be!
 

Doug Owen

Flashlight Enthusiast
Joined
Jan 30, 2003
Messages
1,992
[ QUOTE ]
DanMill said:
Explain this to me, If I have a constant current driver and the voltage spikes from 9.8v to 12v then wouldnt the increase in voltage damage the LED's even if the current remained the same? This is what I am confused about.
Thanks,

Danny

[/ QUOTE ]

No. When the extra voltage 'spike' comes along, it all appears across the *regulator*, not the LEDs. Provided the regulator is tough enough to take it (easy enough to ensure in most cases), the LEDs never 'see' the abuse. They're fully protected.
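In numbers, this is just Kirchhoff's voltage law: whatever supply voltage the LED string doesn't drop lands across the regulator. Taking the 9.8 V and 12 V figures from the question, plus an assumed string Vf of 6.5 V at the set current:

```python
vf_string = 6.5   # assumed total Vf of the LED string at the set current

drops = {}
for v_supply in (9.8, 12.0):
    drops[v_supply] = v_supply - vf_string   # leftover lands on the regulator
    print(f"supply {v_supply:>4} V -> LEDs see {vf_string} V, "
          f"regulator absorbs {drops[v_supply]:.1f} V")
```

The LEDs sit at the same forward voltage either way; the entire 2.2 V spike is absorbed by the regulator, which just has to dissipate the extra heat.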

Doug Owen
 