Strange LED behaviour!

Finally got around to completing my cree based under cabinet setup for my kitchen.
I am using a VARILED driver, with 16 crees in two channels of 8 in series, running at max current of 350mA per channel.
The variled has a dimming function which is useful since 16 crees at 350mA can be quite bright for night adapted vision!
However, today I noticed something very bizarre. When dimming the system down, at some point near the low end two or three LEDs in every channel are actually completely OFF, i.e. as if they were disconnected from the circuit. Yet the other LEDs in series with the dark ones are still lit, which means current is still flowing through the string and the wiring is intact.

There is quite a bit of variation in where these LEDs switch off, and the rest can still be dimmed down much further (one of them actually switches off as soon as I move below the 350mA maximum setting!).
Question is, is this behaviour solely due to binning and Vf differences between different LEDs/batches? (The Crees are all WD bins from DX, bought at different times.)
Or am I missing something else here?
 
I have seen this same exact behavior in Rebels, except I was using a voltage- and current-regulated power supply. I am not talking about dimming down to the milliamp range where Vf differences can cause an individual emitter to sink below the glow range, but rather dimming from, say, 350mA to 300mA, where one or more of the LEDs would switch off completely, even though connecting the same LED alone to a 300mA source would light it up without any problem. Unfortunately I have never been able to find a good explanation for this behavior.
 
It could be the forward voltage of the LEDs at play. The LEDs with higher forward voltage might be the first to dim and, at low enough currents (is this a constant-current or constant-voltage driver?), not light up at all. The tint of the LED shouldn't come into it, because the tint comes from the phosphor, which converts the light after power has already reached the blue LED die.
 
The VARILED is a constant-current driver with dimming capability.

Frenzee, it is exactly as you described: this LED switches off at what I would estimate to be around 300mA, which is really strange!
Glad (in a way!) that I'm not going crazy and that others have seen the same thing!
 
An update:
This strange behaviour has now become even more extreme!
One of the LEDs connected in series just switches off completely when I go even a small amount below 350mA.
The driver is powering two circuits, of 8 parallel LEDs on each.
One of the legs shuts down completely when dimming to minimum, which makes me think the driver doesn't supply exactly the same current on both legs.
That doesn't explain the strange behaviour of the single LEDs switching off, though...
 
...The driver is powering two circuits, of 8 parallel LEDs on each....

Just an observation: I assume you meant to say two parallel circuits of 8 series LEDs each, right? 8 parallel LEDs will be almost impossible to correctly and uniformly drive. Assuming the former, unless you have shunt resistors of significant value in both circuits, it would still be nearly impossible to split the current equally between the two.
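To put rough numbers on the shunt-resistor idea, here is a minimal sizing sketch in Python. The 0.5V worst-case Vf mismatch between strings and the 10% imbalance budget are assumptions chosen for illustration, not values measured from this build.

```python
# Rough sizing of ballast (shunt) resistors for two parallel LED strings fed
# by one constant-current driver. All figures are illustrative assumptions.
I_TOTAL = 0.700      # driver output, A (2 x 350 mA)
DELTA_VF = 0.5       # assumed worst-case total-Vf difference between strings, V
IMBALANCE = 0.10     # acceptable current imbalance as a fraction of I_TOTAL

# With a ballast resistor R in series with each string, a Vf mismatch of
# DELTA_VF shifts roughly DELTA_VF / (2 * R) of current from one string to
# the other. Solve for R so that shift stays within the imbalance budget.
delta_i_max = IMBALANCE * I_TOTAL            # allowed current shift, A
r_ballast = DELTA_VF / (2 * delta_i_max)     # ohms
p_ballast = (I_TOTAL / 2) ** 2 * r_ballast   # dissipation per resistor, W

print(f"ballast resistor : {r_ballast:.1f} ohm per string")
print(f"dissipation      : {p_ballast:.2f} W per resistor")
```

With those assumptions it comes out to roughly 3-4 ohms per string, dissipating a few hundred milliwatts each, which is why one regulated output per string is usually the cleaner solution.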

[Edit] Also, I have seen solder paste residue and baked rosin create an electrical path between the LED leads which is very difficult to detect or see. I suggest dipping your circuit in pure IPA and using a gentle, fine-bristle brush to try to get under the LED and clean out as much residue as possible.
 
Vf differences could cause a couple of LEDs to be brighter or dimmer than others only when they are driven in parallel off a single power source. In series, each emitter gets the same current, at whatever voltage it needs. If your emitters are in series and a couple shut off completely when the string is driven at less than 350mA, something is wrong with the installation; it is not Vf differences.

If two strings are connected to the same CC regulator with no balancing method (resistors, etc.), the driver will indeed supply different currents to each string. The string with a lower total Vf will get more current.
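As a rough illustration of that, here is a toy Python model of two unballasted 8-LED strings sharing one constant-current output. The diode parameters (knee voltage, scale current, n·Vt) and the 0.3V string-to-string Vf mismatch are made-up values that only loosely resemble a white power LED, not Cree data.

```python
# Toy model of two unballasted 8-LED strings on one constant-current driver.
import math

N_LEDS = 8          # LEDs per string
I_TOTAL = 0.700     # total driver output, A
V_KNEE = 2.7        # assumed "knee" voltage per LED, V
IS = 1e-6           # assumed scale current, A
NVT = 0.05          # assumed n * thermal voltage, V

def string_current(v_bus, vf_offset=0.0):
    """Current into one string at bus voltage v_bus; vf_offset (V) models a
    slightly higher total forward voltage for that string."""
    v_led = (v_bus - vf_offset) / N_LEDS
    return IS * math.exp((v_led - V_KNEE) / NVT)

def bus_voltage(vf_offset_2, lo=20.0, hi=32.0):
    """Bisect for the bus voltage at which both strings sum to I_TOTAL."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if string_current(mid) + string_current(mid, vf_offset_2) < I_TOTAL:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

v = bus_voltage(vf_offset_2=0.3)   # string 2 is 0.3 V higher in total Vf
print(f"string 1: {string_current(v) * 1000:.0f} mA")
print(f"string 2: {string_current(v, 0.3) * 1000:.0f} mA")
```

In this model a mere 0.3V difference in total string Vf splits the 700mA output roughly 475mA/225mA, which is why balancing resistors (or one regulated channel per string) matter.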

LEDs produce light even at very low currents (fractions of a mA). If you smoothly ramp down the supplied current (or voltage, for that matter, but CC is usually a better idea), they won't suddenly shut off below a certain point.

In a nutshell, I think (IMHO) your build has issue(s). Sorry. :(
 
I am using a VARILED driver, with 16 crees in two channels of 8 in series, running at max current of 350mA per channel.
The driver is powering two circuits, of 8 parallel LEDs on each.
How interesting. Assuming that should be "series", are the two strings of LEDs completely independently driven by a single driver? Have you checked all your contacts, that they're not touching the star heatsinks, for example?
 
I too have noticed this occurrence in some of my own lighting projects, but not to the extreme that you describe. I have a lot (nearly 100) of older 1W Luxeon Batwing emitters around the house. They are arranged on panels of 3 series LEDs, with many panels in parallel, driven from 12V through a PWM driver I built. When I dim them down nearly to the point of being off, I notice that some LEDs will completely shut off before others, while some are putting out more light than others, and all of this is happening within a series string, which means the same current has to be going through the off ones as through the ones that are still on. I have wondered whether this could be a way of binning or determining which LEDs in a lot are more efficient, but I haven't looked into it further. Here is a post on my home lighting setup.
 
Well if it's a series string it CAN'T be Vf differences. The only possible answer is leakage: current is being bypassed, lowering the voltage drop across the LED below its Vf.

Leakage is not commonly discussed. But I did notice that a discussion here on PWM vs. lowering the current showed that at very low currents the device actually becomes less efficient. Which is odd, since the junction is coolest there and should be more efficient. I haven't seen this myself, but then I haven't tried to dim series strings really low anyway. Maybe it occurs on some devices and not others? Perhaps age or overheating could intensify it?

So the way I see it, there is a leakage path where electrons go through the device while bypassing the junction's light-generating process, and that leakage is probably a fairly constant current, so it becomes significant at low device currents and can in fact take all the current away from the light-producing path. Even if it were mostly resistive, it would behave much like a constant-current leak, since the LED voltage doesn't vary much. If I put a 1K resistor in parallel with a 3.7V Zener diode, then for any total current below 3.7mA the diode current is zero, and for any total current above that the resistor carries a fixed 3.7mA while the diode takes the rest.
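Here is that 1K-resistor-plus-3.7V-zener analogy worked through as a short Python sketch (ideal zener assumed), just to make the "starved junction" effect concrete.

```python
# A 1 kOhm resistor in parallel with an ideal 3.7 V zener. Below 3.7 mA
# total, the resistor takes all the current and the zener carries nothing;
# above that, the resistor is pinned at 3.7 mA and the zener takes the rest.
# An internal leakage path in an LED would starve the junction the same way.
R_LEAK = 1000.0   # parallel resistance, ohms (the "leakage" path)
V_CLAMP = 3.7     # zener / LED forward voltage, V

def split(i_total):
    """Return (diode_current, resistor_current) for a given total current."""
    i_resistor_max = V_CLAMP / R_LEAK          # 3.7 mA
    if i_total <= i_resistor_max:
        return 0.0, i_total                    # diode fully starved
    return i_total - i_resistor_max, i_resistor_max

for i_ma in (1, 3, 3.7, 5, 20, 350):
    i_d, i_r = split(i_ma / 1000.0)
    print(f"{i_ma:6.1f} mA total -> diode {i_d*1000:6.1f} mA, "
          f"leak {i_r*1000:4.1f} mA")
```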

Really, if you've got the string dimmed down to 20mA or whatever and a device goes dark, the leakage theory is the only answer, because there's still 20mA of current going through. There is a current path inside the device that does not produce light.

That doesn't explain frenzee's observation- I think he's got a different problem there. 300mA of "leakage" would not make any sense.
 
Well if it's a series string it CAN'T be Vf differences. The only possible answer is leakage: current is being bypassed, lowering the voltage drop across the LED below its Vf.

Leakage is not commonly discussed. But I did notice that a discussion here on PWM vs. lowering the current showed that at very low currents the device actually becomes less efficient. Which is odd, since the junction is coolest there and should be more efficient. I haven't seen this myself, but then I haven't tried to dim series strings really low anyway. Maybe it occurs on some devices and not others? Perhaps age or overheating could intensify it?
I have an explanation for this effect:

The absolute minimum Vf necessary for an LED to emit at a given wavelength is determined by the following formula:

energy of a photon in eV = 1240 / wavelength in nm

In an LED, one electron (e) drops through a voltage (V), resulting in the emission of a photon.

For a blue or white LED emitting at 450nm, that voltage will be at least 1240/450 ≈ 2.76V.

If you apply 2.7V or lower to a blue or white LED, it will still draw some current and still emit some light, but it will do so at a longer wavelength than the dominant wavelength. This results in both color shifting and reduced efficiency.
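For reference, the same formula as a tiny Python helper; the listed wavelengths are just typical examples, not measured values for these emitters.

```python
# Minimum voltage an electron must drop to emit one photon at a given
# wavelength, using E[eV] = 1240 / wavelength[nm] (h*c rounded to 1240 eV*nm).
PLANCK_EV_NM = 1240.0

def min_vf(wavelength_nm):
    """Lower bound on LED forward voltage from photon energy alone."""
    return PLANCK_EV_NM / wavelength_nm

for nm in (450, 530, 620, 660):   # blue, green, red-orange, deep red
    print(f"{nm} nm -> at least {min_vf(nm):.2f} V")
```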

In order to get high efficiency and eliminate color shifting as much as possible, the best way to drive an LED would be constant-current dimming down to the efficiency peak, then PWM to get lower. Most constant-current drivers are actually high-frequency (~100kHz) PWM drivers that are filtered, so the PWM would have to be at a much lower frequency (~1kHz) to pass through the filter cleanly.
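A minimal sketch of that hybrid scheme, assuming a 350mA maximum, a 20mA efficiency-peak current, and 1kHz PWM; all three numbers are placeholders rather than datasheet values.

```python
# Hybrid dimming: analog constant-current down to an assumed efficiency-peak
# current, then fixed-current PWM below that.
I_MAX = 0.350      # full-brightness drive current, A (assumed)
I_PEAK = 0.020     # assumed efficiency-peak current, A
PWM_HZ = 1000      # low-frequency PWM that passes the driver's output filter

def dim(level):
    """Map a 0.0..1.0 brightness level to (drive_current_A, pwm_duty)."""
    level = max(0.0, min(1.0, level))
    target = level * I_MAX
    if target >= I_PEAK:
        return target, 1.0                 # analog CC region, no PWM
    return I_PEAK, target / I_PEAK         # PWM region at fixed I_PEAK

for lvl in (1.0, 0.5, 0.1, 0.05, 0.01):
    i, duty = dim(lvl)
    print(f"level {lvl:4.2f}: {i*1000:5.1f} mA at {duty*100:5.1f}% duty")
```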



Really, if you've got the string dimmed down to 20mA or whatever and a device goes dark, the leakage theory is the only answer, because there's still 20mA of current going through. There is a current path inside the device that does not produce light.
20mA leakage current with the light completely off means there must be some fixed resistance in parallel with the diode, a partial short -- possibly an inconsistent one (hence the sudden changes).

For a Cree R2, 20mA actually coincides with the efficiency peak and should be producing about 9 lumens. Cree LEDs will still emit light with only a few microamps through them (e.g. current conducted across my skin), albeit very inefficiently.
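As a back-of-envelope check on the partial-short idea: if the string carries 20mA and one LED stays completely dark, the fault path must carry all of it at a voltage below the point where the die emits visibly. Assuming that voltage is roughly 2.5V (a guess, not a measurement):

```python
# Implied resistance of a fault path in parallel with a dark LED that is
# still passing the full string current. The 2.5 V figure is an assumption.
I_STRING = 0.020   # string current, A
V_DARK = 2.5       # assumed voltage across the dark LED, V

print(f"implied parallel resistance: {V_DARK / I_STRING:.0f} ohm")  # ~125 ohm
```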
 