vesture,
Thanks for the quick reply. So, judging by your response, LEDs transmit heat backward into the heatsink, whereas typical incandescents transmit it forward to the reflector and lens. Correct?
Are LEDs rated the same way as incandescent bulbs? A 6W bulb uses 6W of power at 6V pulling 1 amp, therefore a 6W LED rated for 3.8V would need about 1580mA. Or are they rated in equivalent light output like fluorescent lights are (where a "60W" bulb actually only draws 13W)?
Following W = V*A, and given that LEDs are only about 20% efficient, I would assume the excess wattage is dissipated as heat. Does this relationship hold true, or do variations in voltage and amperage lead to non-ideal operating parameters for the LED? I guess this is more what I should have asked instead of asking about linear relationships.
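To make sure I follow the math, here's how I'd run those numbers in Python (assuming P = V*I and treating the ~20% efficiency figure as a rough ballpark, not a datasheet value):

```python
# Rough LED power/heat arithmetic: P = V * I, and only ~20% of the
# input power leaves as light (a ballpark guess, not a spec).

def drive_current_a(power_w, forward_voltage_v):
    """Current needed to draw power_w at the given forward voltage."""
    return power_w / forward_voltage_v

def heat_w(power_w, efficiency=0.20):
    """Portion of the input power turned into heat inside the chip."""
    return power_w * (1.0 - efficiency)

p, vf = 6.0, 3.8
i = drive_current_a(p, vf)
print(f"{p:.0f}W at {vf}V -> {i * 1000:.0f} mA, ~{heat_w(p):.1f}W as heat")
# 6W at 3.8V -> 1579 mA, ~4.8W as heat
```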
With the buck driver, does one set the output voltage via a resistor or do you just purchase one with the proper output?
Conte,
I was thinking noobie in a different way. I'm new to flashlight and light tinkering and modification. Usually I'd just go to the store and buy a light in my price range, typically the two-pack of Rayovac 2D lights. The only LED light I have is a watch-cell-powered direct-drive keychain light I got at Wally World for a buck. I used to do testing and repair on computers, am a certified Audi mechanic, and am in school for mechanical engineering. I got started on LEDs because I received a free dive light with a bad bulb and a broken bezel. The next lower dive light in my series has an LED replacement, but mine doesn't, and I'd really like to have an LED light. Also, I'd eventually like to make LED tail lights for both of my vehicles ('01 CVPI and '80 Spitfire) for looks and safety. I do appreciate the warm welcome and hopefully will get enough knowledge and parts together to get some beamshots. :twothumbs
LEDs essentially generate their heat inside the chip, right against the heatsink. And if that chip gets above (something like) 160°C, it will quickly cause trouble for the LED. You've seen how quickly a 10-watt soldering iron gets hot - now imagine that same heat inside a 3-millimeter square. LEDs need more heatsinking than incandescent lights, not least because they fail at much lower temperatures. My headlights will first experience thermal failure when the plastic reflector melts or discolors (at around 200°C, an inch or more away from the hot filament).
LEDs are rated by their power. The forward voltage of most white LEDs is around 3.2V, and the ones we use these days are generally rated at 1 amp, so 3-watt LEDs are common, while a 5-watt one takes about 1.5 amps. Technically speaking, LEDs (and all diodes) have a 'forward voltage' for a given current: at a lower current they take a lower voltage, and at a higher current more voltage is required.
LEDs become less efficient at higher drive currents. You can look at a datasheet to find specifics - there's a minimum lighting current (10 mA or so), efficiency peaks somewhere above that, and then it falls. The rated current is often well above the most efficient current, and exceeding the rated current decreases efficiency quickly. Note that this also heats up the LED - and hot LEDs are dimmer than cold LEDs. Ideally, lights cut back the current supplied when the LED gets hot, preserving the LED and battery life while reducing output slightly. Just like with other lights, the next lumen is less noticeable and more expensive.
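A minimal sketch of that thermal step-down idea in Python, with made-up numbers (the 80/60°C trip points and 1000 mA ceiling are invented for illustration; real drivers do this in firmware or analog hardware):

```python
# Toy thermal step-down: cut the drive current when the emitter runs hot,
# restore it once it cools. Thresholds and the 1000 mA ceiling are invented.
def regulate(current_ma, temp_c, hot_c=80.0, cool_c=60.0):
    if temp_c > hot_c:
        return current_ma * 0.8                # derate to protect the LED
    if temp_c < cool_c:
        return min(current_ma * 1.25, 1000.0)  # ramp back toward full drive
    return current_ma                          # hold steady in between

print(regulate(1000, 95))  # hot reading: 1000 mA -> 800 mA
print(regulate(800, 50))   # cooled off:   800 mA -> 1000 mA
```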
As far as I know, constant-current drivers 'magically' adjust their output voltage to whatever it takes to push the set current through the LED - so you don't pick the voltage yourself.
Details on the current/heat/efficiency of getting more light from one LED: Take this with a grain of salt in case I'm horribly wrong.
If you put more current through an LED to get more power, you also drive it at a higher voltage - take the new Cree XP-G. At 350mA, it has a given luminous flux (132 lumens, I believe). Doubling the current to 700mA gives about 175% of the 350mA output, and at 1 amp you'll have about 250% of it. But you've not only increased the current by a factor of 2.85 - you've also increased the voltage. The voltage-to-current relationship is exponential: a small voltage increase will dramatically increase the current, while increasing the current raises the voltage less and less.
At 350mA, the XP-G takes about 3 volts to light. At 700mA it's more like 3.2V, and at 1000mA you're around 3.3V. This means that you have approximately the following wattages and outputs:
.35A * 3v = 1.05W, 132 lm, ~126 lm/Watt
.7A * 3.2v = 2.24W, 231 lm, ~103 lm/Watt
1A * 3.3v = 3.3W, 330 lm, 100 lm/Watt
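For anyone who wants to check the arithmetic, here's the same table in a few lines of Python. The Vf and lumen figures are the rough ones quoted above, not official Cree datasheet numbers:

```python
import math

# Recompute the rough XP-G figures: power = I * Vf, efficacy = lm / W.
points = [  # (current in A, forward voltage in V, luminous flux in lm)
    (0.35, 3.0, 132),
    (0.70, 3.2, 231),
    (1.00, 3.3, 330),
]
for amps, vf, lm in points:
    watts = amps * vf
    print(f"{amps:.2f}A * {vf:.1f}V = {watts:.2f}W, {lm} lm, {lm / watts:.0f} lm/Watt")

# The 'exponential I-V' point, seen from the voltage side: fitting
# Vf = a + b*ln(I) to the two end points reproduces the middle one.
b = (3.3 - 3.0) / (math.log(1.00) - math.log(0.35))
a = 3.3  # since ln(1.00) = 0
print(f"Predicted Vf at 0.70A: {a + b * math.log(0.70):.2f}V")  # ~3.20V
```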
Now we talk about heat. Increasing junction temperature decreases the luminous output of the LED almost linearly, so running an LED at higher power can actually decrease its output if there isn't enough heatsinking to keep it cool. I hope this helps, and is correct.
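If you want to play with that relationship, here's a minimal sketch assuming a made-up linear derating slope of -0.3% per °C - real emitters publish a 'relative flux vs. junction temperature' curve in the datasheet, so check that instead of trusting this number:

```python
# Toy linear thermal derating: flux drops roughly linearly with junction
# temperature. The -0.3%/degC slope is an invented ballpark, not a spec.
def derated_lumens(lm_at_25c, junction_temp_c, slope_per_c=-0.003):
    return lm_at_25c * (1 + slope_per_c * (junction_temp_c - 25))

for t in (25, 60, 100, 150):
    print(f"{t:>3} degC junction -> {derated_lumens(330, t):.0f} lm")
#  25 degC junction -> 330 lm
#  60 degC junction -> 295 lm
# 100 degC junction -> 256 lm
# 150 degC junction -> 206 lm
```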