Disappointing trend in LED manufacturing

Monocrom

Flashaholic
Joined
Aug 27, 2006
Messages
17,093
Location
NYC
It's very distracting to me also. If I'm sitting behind someone with low frequency PWM brake lights at a stop light, any lateral motion of my eyes creates lots of flicker. This practice needs to stop.
Watching car review videos is frustrating too.
Just ignore the constant flickering of the headlights. That only happens on camera. :rolleyes:
 

Dave_H

Enlightened
Joined
Nov 3, 2009
Messages
785
Location
Ottawa Ont. Canada
Let me state again. RESISTORS are a PERFECTLY ACCEPTABLE method of setting current in many applications and automotive is definitely one of those applications since we have a reasonably narrow range of voltages and Vf is relatively well controlled on the LED side now. For many applications, highly accurate LED output is not needed, only meeting some minimum under some conditions. With the exception of forward lighting, resistors are quite common in automotive including for tail lights because they are cheap, reliable, and they do the job.
Agreed; I have acquired a variety of automotive LED lamps for the purpose of operating from 12 VDC, not for automotive use, as I suspect most would not meet the required standards; that is a topic of other discussions on this board. Smaller lights, including marker, running/brake/signal, and small interior lights, commonly use resistor dropping.

Clearly, efficiency is not great on some. Groups of three amber LEDs in series only require about 6 V, so the rest is lost in the dropping resistor; similar with red. I have run some at reduced brightness from a 9 V battery. Some have one diode in series; others have two, or even a full-wave bridge so the lamp doesn't care about polarity. This adds 1-2 diode drops, eating further into the headroom.
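To put rough numbers on that loss, here is a minimal sketch (my own illustration, with assumed forward voltages, not figures from any datasheet): in a series loop the LEDs only make use of their own forward drop, and everything else, ballast resistor and protection diodes alike, is dissipated as heat.

```python
# Rough efficiency of a resistor-dropped LED string: only the voltage
# across the LEDs does useful work; the series resistor and any
# protection diodes burn off the rest as heat.
def string_efficiency(v_led_string, v_supply):
    """Fraction of supply power delivered to the LEDs themselves."""
    return v_led_string / v_supply

# Three amber LEDs at roughly 2 V each (assumed) on a 12 V supply:
print(string_efficiency(6.0, 12.0))   # -> 0.5
# Behind a full-wave bridge the LEDs still sit at ~6 V, but two ~0.7 V
# diode drops shrink the usable headroom for the resistor by ~1.4 V.
```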

One little light actually used an LM317L linear regulator wired for constant current. I could make out the resistor colour code through the amber plastic and calculated 25 mA, which was close to both the measured value and the spec. Anything that has to run from either 12 or 24 VDC can't do well with resistors or any other linear regulation (though there is one strange exception I could add).
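For anyone curious about the arithmetic: an LM317 wired for constant current holds its ~1.25 V reference across a program resistor, so I = Vref / R. A minimal sketch, where the 50 ohm value is back-calculated from the ~25 mA figure rather than read off the actual part:

```python
# LM317/LM317L constant-current configuration: the regulator holds
# ~1.25 V between OUT and ADJ, across the program resistor, so
# I_out ~= V_REF / R regardless of the supply voltage (within limits).
V_REF = 1.25  # volts, nominal LM317 reference

def lm317_current(r_program_ohms):
    """Output current in amps for a given program resistor."""
    return V_REF / r_program_ohms

print(lm317_current(50) * 1000)   # 50 ohms -> 25.0 mA
```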

Larger "auxiliary" LED lights would not do well with linear regulation; they use a variety of switching drivers and series/parallel arrangements.

Dave
 

JustAnOldFashionedLEDGuy

Newly Enlightened
Joined
Apr 13, 2020
Messages
170
Is this quote from Joakim accurate?

Remember, wattage is energy per second of time. If you were to concentrate the energy of one watt over one second into only a hundredth of a second, that would be 100 Watts, enough to burn out a 10 Watt LED. One millionth of a Watt imparted through the circuit through a static discharge is easily enough to burn out an LED. (Remember again, a Watt is not an actual unit of energy. One Watt over one entire second equals one Joule of energy. A Watt is just the rate of energy flow)

Some of it yes; from a practical standpoint, no. A true 10 W LED of any quality will take 10x its rating for 0.01 seconds. Remember, they are just diodes, and diodes can usually take fairly high pulse currents, 30-100x rated, for short periods. Of course, in automotive there will always be something to protect against load dump, so you don't see high voltages. For AC, you will similarly have a MOV to limit the maximum excursion. Most larger LEDs have ESD protection as well. Reverse-bias ESD hits are worse than forward ones.
 
Last edited:

JoakimFlorence

Newly Enlightened
Joined
Jun 4, 2016
Messages
120
I wish the automotive LED engineers would quit using very low-frequency PWM for tail lights. Personally, I would have loved for that to be outlawed from the beginning. It's very distracting to me.
It's very distracting to me also. If I'm sitting behind someone with low frequency PWM brake lights at a stop light, any lateral motion of my eyes creates lots of flicker. This practice needs to stop.
I think it can be assumed that any PWM (that involves flicker) for automotive LED lights (when the light is some color other than white) indicates that the LED is being fed by some sort of voltage converting power supply, not directly from the battery through a resistor.

For a lead-acid (standard automotive) battery, the voltage drop between full charge and 50% charge is less than 0.5%. Even when the battery gets down to only 20% charge, the voltage drop is still less than 4%.


Let me state again. RESISTORS are a PERFECTLY ACCEPTABLE method of setting current in many applications and automotive is definitely one of those applications since we have a reasonably narrow range of voltages and Vf is relatively well controlled on the LED side now. For many applications, highly accurate LED output is not needed, only meeting some minimum under some conditions. With the exception of forward lighting, resistors are quite common in automotive including for tail lights because they are cheap, reliable, and they do the job.
I don't know if what you are saying is true, but here is something else to consider, just from a basic electronics theoretical perspective: if the LED lights were being fed directly from the battery (through a small resistor), wouldn't the circuit connecting to the LED lights have to be isolated from the alternator, which continually charges the battery? The alternator is a generator (built around a coil of wire), and it is common for alternators to produce voltage spikes. (In fact, I have been told that a cheap or malfunctioning alternator can damage the battery over time if the voltage spikes are too big.)

This makes me very skeptical about what you say.
 
Last edited:

JoakimFlorence

Newly Enlightened
Joined
Jun 4, 2016
Messages
120
Is this quote from Joakim accurate?

Remember, wattage is energy per second of time. If you were to concentrate the energy of one watt over one second into only a hundredth of a second, that would be 100 Watts, enough to burn out a 10 Watt LED. One millionth of a Watt imparted through the circuit through a static discharge is easily enough to burn out an LED. (Remember again, a Watt is not an actual unit of energy. One Watt over one entire second equals one Joule of energy. A Watt is just the rate of energy flow)
Perhaps not literally, but I was just using an exaggerated example to make a point.

The point was that a surprisingly small amount of energy is enough to burn out an LED, such as a relatively small static-electricity shock. It is actually NOT the voltage that burns out an LED but the wattage (the amount of energy imparted to the LED within a very short interval of time, a fraction of a second). Theoretically, you could supply an LED with a million volts and that LED would run just fine, as long as the current (mA) never exceeded the LED's specifications.
(The wattage imparted to the LED is the portion of the voltage actually dropped across the LED, multiplied by the current.)

If you want to actually look closely at the numbers, a "typical" static shock is on the order of several millijoules of energy (a millijoule being 0.001 joule, the amount of energy delivered by 0.001 watts sustained over one second). Yet this is still enough to blow out an LED rated for, say, 3 watts of power.

This may seem counterintuitive to many people: you might think that if a component can handle a given amount of energy delivered over one second, it should be able to handle a fraction of that energy delivered over a fraction of a second. But physics works a little differently on small scales than on large scales, and the structures inside an LED are tiny. It would be like observing that a 50 watt light bulb uses 30,000 joules of energy over 10 minutes, then asking why it can't handle 2,000 watts over one second (only 2,000 joules) if it stays off for the rest of that 10-minute period. The simple fact is that the filament cannot dissipate that much heat fast enough. Microscopic electronic components can reach excessive temperatures very quickly from very small amounts of energy. On a small scale, a second is a long time.
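The bookkeeping in that bulb analogy, and in the static-shock case, can be sketched as follows. The numbers come from the paragraphs above, except the one-microsecond discharge time, which is my own assumption for illustration:

```python
# Energy = power x time: the same joules squeezed into a shorter
# window mean a much higher instantaneous power.
def joules(power_w, seconds):
    return power_w * seconds

def watts(energy_j, seconds):
    return energy_j / seconds

print(joules(50, 600))      # 50 W bulb for 10 min -> 30000 J
print(joules(2000, 1))      # 2000 W for 1 s       -> 2000 J
# A "several millijoule" static shock delivered in ~1 microsecond
# (the duration is an assumption) is on the order of kilowatts of
# peak power, far beyond a 3 W rating:
print(watts(0.005, 1e-6))   # 5 mJ in 1 us, roughly 5 kW
```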

I should clarify: when I said "One millionth of a Watt imparted through the circuit through a static discharge is easily enough to burn out an LED", what I actually meant was the equivalent energy of one millionth of a watt sustained over one second, which can mathematically work out to many watts over a tiny fraction of a second.
I thought that was obvious from the context of the rest of my statement, so I did not feel it was worth the effort of correcting.

This is probably unnecessary, but lastly I want to clarify something for anyone who thinks my claim about an LED surviving a million volts is outrageous. It is technically true, but I am not aware of any high-voltage (say, >20 kV) power supply that could meet the very tight current specification. Even an extremely low-powered high-voltage supply would be sure to blow out the LED: it might have a low average current, but at voltages like that there will usually be current spikes. Put differently, the supply's calculated average output might be only 0.1 watts, yet within that there will be short intervals of hundreds of watts. It's just the nature of high voltage; it tends to move around in very short, fast bursts.
 
Last edited:

LEDphile

Newly Enlightened
Joined
Mar 8, 2021
Messages
197
I think it can be assumed that any PWM (that involves flicker) for automotive LED lights (when the light is some color other than white) indicates that the LED is being fed by some sort of voltage converting power supply, not directly from the battery through a resistor.

For a lead-acid (standard automotive) battery, the voltage drop between full charge and 50% charge is less than 0.5%. Even when the battery gets down to only 20% charge, the voltage drop is still less than 4%.



I don't know if what you are saying is true, but here is something else to consider, just from a basic electronics theoretical perspective: if the LED lights were being fed directly from the battery (through a small resistor), wouldn't the circuit connecting to the LED lights have to be isolated from the alternator, which continually charges the battery? The alternator is a generator (built around a coil of wire), and it is common for alternators to produce voltage spikes. (In fact, I have been told that a cheap or malfunctioning alternator can damage the battery over time if the voltage spikes are too big.)

This makes me very skeptical about what you say.
Your "basic electronics theoretical perspective" appears to have missed "parallel circuits" - a circuit that is fed from an automotive battery is also fed by the alternator (which is in parallel with the battery).

Now, as to the use of a resistor for an automotive rear-lighting application: if we consider a string of 5 mid-power red LEDs (e.g. Lumileds SignalSure 75) driven by an automotive electrical system through a 47 ohm resistor, we get the following set of drive currents (linear models were used for simplicity; in reality, the current spread will be a bit smaller):
Voltage | Current
16 V (max design voltage) | 87 mA
14.5 V (nominal alternator voltage) | 68 mA
12 V (nominal battery voltage) | 35 mA
10 V (dead battery voltage) | 9 mA

This gives about a 2:1 change in output (output is approximately linear with drive current) between the "engine on" and "engine off" states, keeps the LEDs within their maximum ratings at the max design voltage, and still has some output with a dead battery. At this point, it is worth noting that an incandescent lamp will be about 1.9x as bright at 14.5V as at 12V, so a comparable change in output.
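The table above can be reproduced with a simple linear LED model. This is a sketch under assumed parameters: the per-LED model Vf = 1.86 V + I x 6 ohms is my own fit to the table, not Lumileds' published figures.

```python
# Current through N series LEDs plus a ballast resistor, using a
# linear LED model Vf = V0 + I*Rd per LED (V0, Rd are fitted guesses).
N_LEDS = 5
R_BALLAST = 47.0  # ohms
V0 = 1.86         # volts, per-LED knee voltage (assumed fit)
RD = 6.0          # ohms, per-LED dynamic resistance (assumed fit)

def drive_current(v_supply):
    """Solve v_supply = N*V0 + I*(R_BALLAST + N*RD) for I, in amps."""
    i = (v_supply - N_LEDS * V0) / (R_BALLAST + N_LEDS * RD)
    return max(i, 0.0)  # the string doesn't conduct below its knee

for v, label in [(16.0, "max design"), (14.5, "alternator"),
                 (12.0, "battery"), (10.0, "dead battery")]:
    # Prints 87, 68, 35, and 9 mA for the four table rows.
    print(f"{v:4.1f} V ({label}): {drive_current(v) * 1e3:.0f} mA")
```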

Could a constant-current regulator make things more consistent? Absolutely. Will a regulator increase cost, size, and points of failure? Also yes. Is a resistor good enough? Probably.
 

JoakimFlorence

Newly Enlightened
Joined
Jun 4, 2016
Messages
120
Your "basic electronics theoretical perspective" appears to have missed "parallel circuits" - a circuit that is fed from an automotive battery is also fed by the alternator (which is in parallel with the battery).
I have a feeling you don't understand what I was saying.

You understand that the battery cannot be connected to the alternator at the same time that the LEDs are connected to the battery?
 

JustAnOldFashionedLEDGuy

Newly Enlightened
Joined
Apr 13, 2020
Messages
170
I think it can be assumed that any PWM (that involves flicker) for automotive LED lights (when the light is some color other than white) indicates that the LED is being fed by some sort of voltage converting power supply, not directly from the battery through a resistor.

For a lead-acid (standard automotive) battery, the voltage drop between full charge and 50% charge is less than 0.5%. Even when the battery gets down to only 20% charge, the voltage drop is still less than 4%.



I don't know if what you are saying is true, but here is something else to consider, just from a basic electronics theoretical perspective: if the LED lights were being fed directly from the battery (through a small resistor), wouldn't the circuit connecting to the LED lights have to be isolated from the alternator, which continually charges the battery? The alternator is a generator (built around a coil of wire), and it is common for alternators to produce voltage spikes. (In fact, I have been told that a cheap or malfunctioning alternator can damage the battery over time if the voltage spikes are too big.)

This makes me very skeptical about what you say.

You are skeptical because you don't have the first idea what you are talking about, or even a hint of practical or professional experience. I apologize to the moderators if that is harsh, but it is quite obviously true.

YES, they are doing PWM right from the battery with a resistor. NO, they are not isolating from the output of the alternator, hence the larger voltage swing. What the heck do you think happened when we used incandescent bulbs?!
 

JustAnOldFashionedLEDGuy

Newly Enlightened
Joined
Apr 13, 2020
Messages
170
Perhaps not literally but I was just illustrating an example to try to prove a point.
.... blah blah

Again, you are talking about things you really have no clue about. Static electricity does not damage LEDs through over-current; it damages them through over-voltage. It literally punches through the diffusions without the LED ever conducting, as it is often a reverse-bias situation. LEDs are quite robust with respect to forward-biased static discharges, since the effective source resistance of a static discharge is quite high and limits the current. It is not unusual for high-power LEDs to have a parallel ESD diode conducting in the opposite direction.

LEDs are fairly large-die devices because they have to be, given the thermal issues that come with high forward power, and they can take quite large overcurrents (as can most diodes, really).
 

JustAnOldFashionedLEDGuy

Newly Enlightened
Joined
Apr 13, 2020
Messages
170
For a lead-acid (standard automotive) battery, the voltage drop between full charge and 50% charge is less than 0.5%. Even when the battery gets down to only 20% charge, the voltage drop is still less than 4%.

And again, you have no clue what you are talking about. Just stop already. Anyone can find an online SOC-versus-voltage chart and see that basic calculations are eluding you. Just stop already.
 