Electronics question: why not put a big cap on a PWM light?

fyrstormer

Banned
Joined
Jul 24, 2009
Messages
6,617
Location
Maryland, Near DC, USA
My understanding of the downside of using PWM to control emitter brightness is: during the time the emitter is powered, it's running at the efficiency it would have at full power with no PWM, whereas if it were current-controlled instead of PWM it would be running at a more efficient, lower power level. So, while a light running at a 50% PWM duty cycle would be able to run twice as long as at full power, a current-controlled light calibrated to the same brightness would be able to run more than twice as long. In other words, the PWM light is less efficient than it could be if it were current-controlled instead. Is that accurate?
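
To put rough numbers on that reasoning, here's a back-of-the-envelope sketch in Python. The battery capacity and efficacy figures are made-up placeholders, not data for any particular emitter or driver; only the shape of the comparison matters.

```python
# Rough sketch of the runtime argument above. All numbers are hypothetical
# placeholders for illustration, not data for any real LED or driver.

CELL_CAPACITY_WH = 3.0       # assumed usable battery energy
P_FULL = 3.0                 # assumed LED power at full drive, watts
LUMENS_PER_WATT_FULL = 80.0  # assumed efficacy at full current
LUMENS_PER_WATT_HALF = 95.0  # assumed (higher) efficacy at half current

# PWM at 50% duty: the LED always runs at full current while it is on,
# so it keeps the full-current efficacy. Average power is simply halved.
pwm_avg_power = 0.5 * P_FULL
pwm_lumens = 0.5 * P_FULL * LUMENS_PER_WATT_FULL
pwm_runtime = CELL_CAPACITY_WH / pwm_avg_power

# Current control tuned to the SAME light output: the LED runs at a lower
# current where efficacy is better, so it needs less average power.
cc_power = pwm_lumens / LUMENS_PER_WATT_HALF
cc_runtime = CELL_CAPACITY_WH / cc_power

print(f"PWM 50%:      {pwm_lumens:.0f} lm, {pwm_runtime:.2f} h")
print(f"Current ctrl: {pwm_lumens:.0f} lm, {cc_runtime:.2f} h")
```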

If so, my question is: why not wire up a large capacitor (as large as physical space allows) in parallel with the emitter on PWM lights? Unless there's something I don't understand, that should smooth out the square-wave PWM pattern into something more like a sine wave, which among other things would reduce the peak voltage and current going through the emitter without reducing the average voltage and current. In other words, it should soften the visual flickering and run the emitter at a more efficient, lower power level, even if only slightly.
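
One crude way to check that intuition is to simulate it. The sketch below is a simple time-stepped model with arbitrary, assumed component values: the PWM source is treated as an ideal switch with a small series resistance, and the LED is approximated as a fixed forward drop plus a small dynamic resistance. It's only meant to show roughly how the waveform changes, not to model any real driver.

```python
# Crude transient sketch: PWM source driving an LED with a capacitor in
# parallel. All component values are arbitrary assumptions for illustration.

DT = 1e-7          # time step, seconds
T_END = 0.02       # simulate 20 ms
F_PWM = 200.0      # assumed low-frequency PWM, Hz
DUTY = 0.5
V_BAT = 3.7        # assumed supply voltage
R_SRC = 0.2        # assumed battery + FET series resistance, ohms
V_F = 2.9          # assumed LED knee voltage
R_LED = 0.5        # assumed LED dynamic resistance, ohms
C = 1000e-6        # the "big cap", farads

def led_current(v):
    """Piecewise-linear LED model: no current below the knee voltage."""
    return max(0.0, (v - V_F) / R_LED)

v_cap = 0.0
t = 0.0
peak_i, led_currents = 0.0, []
while t < T_END:
    pwm_on = (t * F_PWM) % 1.0 < DUTY
    # Current from the source into the cap/LED node (zero while the FET is off).
    i_src = (V_BAT - v_cap) / R_SRC if pwm_on else 0.0
    i_led = led_current(v_cap)
    v_cap += (i_src - i_led) * DT / C   # the capacitor integrates the difference
    peak_i = max(peak_i, i_src)
    led_currents.append(i_led)
    t += DT

avg = sum(led_currents) / len(led_currents)
print(f"average LED current ~{avg:.2f} A, ripple min/max "
      f"{min(led_currents):.2f}/{max(led_currents):.2f} A, "
      f"peak source current ~{peak_i:.1f} A")
```

With these made-up values the LED current is noticeably smoothed but still sags between pulses, and the peak current drawn through the switch when the capacitor is discharged is very large, which foreshadows the replies below.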

I tried this a while back with a light running on a fluPIC driver, and the emitter did seem to brighten slightly and flicker a little less, which is exactly what I would expect based on what I said in the last paragraph. But, ultimately I had to remove the capacitor because there wasn't enough room to fit it behind the reflector. So I'm curious, am I crazy or does this actually make sense to anyone but me?
 

Der Wichtel

Enlightened
Joined
Mar 12, 2007
Messages
829
Location
Germany
I think that would slow the switching of the FET, so you would get more losses and heating in the FET.

Greg

There won't be any effect on the FET's switching.

A capacitor can indeed smooth the PWM signal, but it can have a high ESR, which decreases the efficiency. You need to look for low-ESR capacitors.
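
For a sense of how much the ESR costs, the loss in the capacitor is its equivalent series resistance times the square of the RMS ripple current flowing through it (P = I^2 * R). A tiny sketch with made-up numbers:

```python
# ESR loss estimate: P = I_rms^2 * ESR. Values below are arbitrary examples.

i_ripple_rms = 1.0               # assumed RMS ripple current through the cap, amps
for esr in (0.5, 0.1, 0.01):     # ohms: ordinary electrolytic vs low-ESR vs ceramic (rough guesses)
    print(f"ESR {esr:5.2f} ohm -> {i_ripple_rms**2 * esr:.2f} W dissipated in the cap")
```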
 

ZebraLight

Enlightened
Joined
Apr 13, 2007
Messages
310
Location
Irving, Texas
In theory, yes. However, to smooth out the voltage over a 3W LED, you need a huge capacitor. The MOSFET switch won't be affected. All DC/DC converters (buck, boost, or buck-boost) use PWM to regulate the load voltage/current. You won't see flickering because most modern converters work in the 1 MHz range. Some drivers use MCU-generated (usually low-frequency) PWM to modulate the high-frequency DC/DC converter PWM to adjust brightness, and that's when you see flickering. Increasing the frequency of the MCU-generated PWM is one way to get rid of the flickering. Adjusting the PWM ratio of the DC/DC converters to get the desired output voltage/current is a much better way.
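
As a rough illustration of that last point: in an ideal buck converter the output is set by the duty ratio, Vout ≈ D × Vin, so brightness can be adjusted by changing the duty ratio at the converter's own (very high) switching frequency instead of gating the LED on and off at a visible rate. A minimal sketch with assumed numbers:

```python
# Ideal buck converter relation Vout ~= D * Vin (losses ignored).
# Supply voltage and target LED voltages below are assumptions for illustration.

V_IN = 4.2                      # assumed fully charged Li-ion cell, volts
F_SW = 1e6                      # assumed 1 MHz converter switching frequency

for v_out in (3.4, 3.1, 2.9):   # hypothetical LED voltages for high/med/low drive
    duty = v_out / V_IN
    on_time_ns = duty / F_SW * 1e9
    print(f"Vout {v_out:.1f} V -> duty {duty:.2f}, on-time {on_time_ns:.0f} ns per cycle")
```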


 

Alan B

Flashlight Enthusiast
Joined
Nov 19, 2007
Messages
1,963
Location
San Francisco Bay Area
There are two different types of circuits used to control LED brightness with PWM. The one referred to as PWM is on/off modulation of the current to the LED. The other is switching regulation, where PWM is used internally but an inductor smooths out the current, so the LED in fact never sees the PWM.

The current regulation does work well in terms of making the LED operate in a more efficient manner, but at very low currents things don't work well and the LED will suffer color shifts or partial operation. So to get to the really low levels the on/off PWM control is required.

Putting a capacitor across the LED makes pretty much no difference with the switch mode regulator.

Putting a capacitor across the LED with a PWM controller causes incredible current spikes since the FET sees the capacitor as a short until it is charged (there is no inductor to moderate the current). JimmyM tried this one day and I tried to tell him it would blow the FET, but the delay in the forums was too great and the FET was sacrificed before he read my advice.

Some lights have BOTH types of control - PWM for really low output and switchmode regulation for high output control.
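
As an illustration of that combined scheme, here's a hypothetical control routine; the threshold, currents, and function name are all assumptions, not anything from a real driver. Above a cutoff it sets a regulated current directly, and below the cutoff it holds a fixed low current and uses PWM duty cycle for the rest of the dimming range.

```python
# Hypothetical hybrid dimming scheme: constant-current regulation for the
# upper range, PWM of a fixed low current for the bottom of the range.
# Threshold and current values are illustrative assumptions only.

I_MAX = 1.0               # assumed full-drive LED current, amps
I_MIN_REGULATED = 0.05    # assumed lowest current the regulator handles cleanly

def drive_settings(brightness):
    """brightness in 0..1 -> (led_current_A, pwm_duty)."""
    target = brightness * I_MAX
    if target >= I_MIN_REGULATED:
        return target, 1.0               # pure current regulation, no blinking
    # Below the regulator's comfortable range: hold I_MIN_REGULATED and
    # chop it with PWM to bring the average down further.
    return I_MIN_REGULATED, target / I_MIN_REGULATED

for b in (1.0, 0.3, 0.05, 0.005):
    i, d = drive_settings(b)
    print(f"brightness {b:5.3f} -> {i*1000:4.0f} mA at {d*100:5.1f}% duty")
```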
 

fyrstormer

Banned
Joined
Jul 24, 2009
Messages
6,617
Location
Maryland, Near DC, USA
In theory, yes. However, to smooth out the voltage over a 3W LED, you need a huge capacitor. The MOSFET switch won't be affected. All DC/DC converters (buck, boost, or buck-boost) use PWM to regulate the load voltage/current. You won't see flickering because most modern converters work in the 1 MHz range. Some drivers use MCU-generated (usually low-frequency) PWM to modulate the high-frequency DC/DC converter PWM to adjust brightness, and that's when you see flickering. Increasing the frequency of the MCU-generated PWM is one way to get rid of the flickering. Adjusting the PWM ratio of the DC/DC converters to get the desired output voltage/current is a much better way.
Fair point; I was referring to lights that "blink" the emitter to control apparent brightness, as opposed to lights that feed the emitter with something resembling a steady voltage.

I don't really know what PWM frequencies the human eye can see, but it seems like, even at 3 W, if the emitter is only off for a few hundredths of a second per cycle, then a reasonably sized capacitor ought to be able to at least round off the corners of the PWM square wave, even if it can't keep the voltage from hitting zero before the driver output turns on again.
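
For scale, the capacitance needed to ride through an off-time without much droop can be estimated from C = I × Δt / ΔV. With assumed numbers for a roughly 3 W emitter (about 1 A, allowing a few hundred millivolts of droop), the answer lands in the thousands of microfarads, which is why the "huge capacitor" keeps coming up:

```python
# Hold-up estimate: C = I * dt / dV. Numbers are illustrative assumptions
# for a ~3 W emitter, not measurements.

I_LED = 1.0        # assumed LED current while it should stay lit, amps
DV = 0.3           # assumed allowable voltage droop, volts

for dt_ms in (0.5, 2.0, 10.0):          # assumed PWM off-times, milliseconds
    c_uf = I_LED * (dt_ms / 1000.0) / DV * 1e6
    print(f"off-time {dt_ms:4.1f} ms -> about {c_uf:,.0f} uF")
```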
 

fyrstormer

Banned
Joined
Jul 24, 2009
Messages
6,617
Location
Maryland, Near DC, USA
There are two different types of circuits used to control LED brightness with PWM. The one referred to as PWM is on/off modulation of the current to the LED. The other is switching regulation, where PWM is used internally but an inductor smooths out the current, so the LED in fact never sees the PWM.

The current regulation does work well in terms of making the LED operate in a more efficient manner, but at very low currents things don't work well and the LED will suffer color shifts or partial operation. So to get to the really low levels the on/off PWM control is required.

Putting a capacitor across the LED makes pretty much no difference with the switch mode regulator.

Putting a capacitor across the LED with a PWM controller causes incredible current spikes since the FET sees the capacitor as a short until it is charged (there is no inductor to moderate the current). JimmyM tried this one day and I tried to tell him it would blow the FET, but the delay in the forums was too great and the FET was sacrificed before he read my advice.

Some lights have BOTH types of control - PWM for really low output and switchmode regulation for high output control.
So basically, you would need either a capacitor and resistor in series, to reduce the momentary-short effect, or else a really big capacitor that could provide its own current limiting by never being completely discharged while the driver output is switched off.

Would an induction coil in series with the emitter be a better choice?
 

Der Wichtel

Enlightened
Joined
Mar 12, 2007
Messages
829
Location
Germany
What does ESR mean? I'd bet a few bucks I know what it is, but I don't recognize the TLA.

[Image: Ersatzschaltbild-Kondensator.png, the equivalent-circuit model of a capacitor showing its ESR]


That's the ESR. Or look here: http://en.wikipedia.org/wiki/Equivalent_series_resistance

With a high-power load such as an LED it's better not to use a capacitor.
A capacitor is commonly used in signal paths, for example to create an analog voltage from the PWM output of a microcontroller; that signal can then be fed to an op-amp to drive higher loads.

It's better to use an inductor plus a diode. A small capacitor on the output will filter out the remaining ripple.

http://en.wikipedia.org/wiki/File:Buck_operating.svg

However, the switching frequency needs to be very high to smooth everything out.
But overall the results should be better.
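
The ripple current in a buck stage like that falls directly with switching frequency and inductance: in continuous conduction, ΔI ≈ Vout × (1 - D) / (L × f_sw). A quick sketch with assumed values shows why a high switching frequency lets the inductor and output capacitor stay small:

```python
# Buck inductor ripple estimate for continuous conduction:
#   dI_L ~= Vout * (1 - D) / (L * f_sw)
# All values are assumptions chosen only to show the trend.

V_IN, V_OUT = 4.2, 3.1     # assumed input and LED voltages
L = 10e-6                  # assumed 10 uH inductor
D = V_OUT / V_IN           # ideal duty ratio

for f_sw in (50e3, 500e3, 2e6):     # switching frequencies, Hz
    ripple = V_OUT * (1 - D) / (L * f_sw)
    print(f"f_sw {f_sw/1e3:6.0f} kHz -> inductor ripple ~{ripple*1000:6.0f} mA p-p")
```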
 

Alan B

Flashlight Enthusiast
Joined
Nov 19, 2007
Messages
1,963
Location
San Francisco Bay Area
So basically, you would need either a capacitor and resistor in series, to reduce the momentary-short effect, or else a really big capacitor that could provide its own current limiting by never being completely discharged while the driver output is switched off.

Would an induction coil in series with the emitter be a better choice?

A large capacitor will cause higher current spikes, since it starts out discharged and it takes more energy to maintain a higher voltage across the LED/capacitor. The current will be limited only by the battery, FET, and capacitor resistance, which should be very low. The current spikes will waste a lot of energy and likely blow any reasonable FET.

Any resistance added to limit the current flow into a storage capacitor causes loss. So the efficiency is low, exactly what you don't want. Inductors provide energy storage without the dangerous current surges, and in so doing they achieve high efficiency. Capacitors may be used after the inductor, as in choke-input power supply filters. This avoids the current surges and the resulting FET stress and loss.

You are essentially trying to turn a PWM controller into a switch mode regulator. There are many articles and books on the subject of designing switch mode regulators: an energy storage element and a switch are used. Capacitors don't work well for this; they result in very low overall efficiency. Inductors are the preferred choice, and that is how all switchmode regulators work. Energy is stored in an inductor. This works well but is not simple and requires a few parts, and it is no longer a PWM circuit. It will operate the LED at a more efficient point (which was your original issue), but note that it adds losses of its own to the system, and it will NOT get to very low outputs with LEDs.

If you wish to have efficient LED operation, use a switchmode current regulator. But if you want to get to very low output levels, use a PWM regulator, which will maintain the LED color temperature better and cover a wider range of output control. The PWM circuit is also simpler and its circuit losses are lower; the LED efficiency is somewhat lower, but the controller efficiency is higher, so overall it is not too bad.

For maximum versatility, use a switchmode current regulator feeding a PWM controller. The switcher is used down to some level where the PWM takes over for the really low output levels.
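
One way to see why resistive charging of a storage capacitor is so lossy: when a discharged capacitor is charged from a fixed voltage through a resistance, the resistance dissipates an amount of energy equal to the ½CV² that ends up stored, no matter how large or small the resistance is, so the best case is about 50% efficiency. A quick sketch (the component values are arbitrary):

```python
# Energy bookkeeping for charging a discharged capacitor from a fixed
# voltage V through a resistor: the resistor always burns 1/2*C*V^2,
# the same amount that ends up stored, for ~50% efficiency at best.
# Values are arbitrary examples.

C = 1000e-6     # farads
V = 3.7         # volts

charge_moved = C * V                   # coulombs delivered by the source
energy_from_source = charge_moved * V  # the source supplies Q*V
energy_stored = 0.5 * C * V**2         # the capacitor ends up with 1/2*C*V^2
energy_lost = energy_from_source - energy_stored

print(f"source supplied {energy_from_source*1000:.1f} mJ, "
      f"stored {energy_stored*1000:.1f} mJ, "
      f"lost in resistance {energy_lost*1000:.1f} mJ")
```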
 