My question is how the LED drivers out there actually work.
I'm designing an LED color photo enlarger light source. It's going to need dimming capability. I figured on using a beefy DC power supply and some Darlington transistors hooked up to the PWM output of a microcontroller to get PWM dimming.
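Something like this is what I had in mind on the microcontroller side (a minimal sketch assuming an ATmega328P; the Darlington base would hang off the OC0A pin through a base resistor, and all the specifics here are just my guesses, not a finished design):

```c
#include <avr/io.h>

int main(void)
{
    /* OC0A (PD6 on an ATmega328P) drives the Darlington base */
    DDRD |= (1 << DDD6);

    /* Timer0 in Fast PWM mode, non-inverting output on OC0A */
    TCCR0A = (1 << COM0A1) | (1 << WGM01) | (1 << WGM00);

    /* clk/64 prescaler: ~976 Hz PWM from a 16 MHz clock */
    TCCR0B = (1 << CS01) | (1 << CS00);

    OCR0A = 128;  /* duty cycle 0-255; this is the brightness knob */

    for (;;) { }  /* PWM runs in hardware; nothing else to do */
}
```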
PWM is great for resistive loads, but what about LEDs? I understand that LEDs should ideally be driven by a current source, yet I know that common 'buck' flashlight drivers use PWM. I can see how PWM keeps an LED from thermally melting down: reduce the duty cycle and the LED won't thermally overload. Still, I'm sure you couldn't just PWM 500 V DC, no matter how low the duty cycle. There must be a limit to the voltage as well, or an ideal range. So, is there a certain voltage range (say 1-2x Vf) that is acceptable for straight PWM LED dimming, or do LED drivers stack PWM on top of some other current-control circuitry?
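To make my worry concrete, here's the back-of-envelope math I keep doing (assuming a plain series resistor, which may be entirely the wrong model for a buck driver; all the component values below are made-up examples):

```c
#include <stdio.h>

/* With a simple series resistor, PWM only scales the AVERAGE current.
 * The PEAK current during each pulse is fixed by V_supply, Vf, and R,
 * and the duty cycle has no say in it. */
int main(void)
{
    double v_supply = 5.0;   /* supply voltage (V)      -- assumed */
    double vf       = 3.4;   /* LED forward voltage (V) -- assumed */
    double r        = 4.7;   /* series resistor (ohms)  -- assumed */
    double duty     = 0.25;  /* PWM duty cycle */

    double i_peak = (v_supply - vf) / r;  /* current while the pulse is ON */
    double i_avg  = duty * i_peak;        /* what PWM actually controls */

    printf("peak %.0f mA, average %.0f mA\n", i_peak * 1e3, i_avg * 1e3);
    /* prints: peak 340 mA, average 85 mA
     * At 500 V the same 4.7 ohm resistor gives ~105 A peaks, so a low
     * duty cycle alone can't save the LED; something has to limit the
     * peak current too. That's the crux of my question. */
    return 0;
}
```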