Brightness Differences?

dfischer

Newly Enlightened · Joined: Jul 10, 2008 · Messages: 29
Tests of LED lights I've studied show more light intensity with one Li primary instead of 2x NiMH. Take a 2xAA light vs. the same light with a 14500, and the 14500 will give something like 30% more light. Why? Most tests are done with 2600 mAh AAs or better. I see the Li cells have a higher voltage, but isn't that what the regulator is for? Or does it only regulate the amp draw and work off whatever voltage it's presented with?

I thought LEDs needed something over 3 V to work? How could a 2xAA light with NiMH get the LED lit without a voltage boost?
 
It depends on the circuitry of the light in question. Some lights run Li-ions (which are NOT primary cells) in direct drive, others are buck only, and some are boost drivers that go into direct drive above 3.7 V.
 
You're only showing the limited range of the driver circuit's input voltage/current conversion by going to the higher-voltage, better current-delivering lithium battery over the 2 NiMH. Those flashlights might run better, or you may wind up burning out the driver circuits.

You didn't mention what these flashlights are or what battery they were originally intended to be driven with. It matters whether they were designed to run off 2xAA and are brighter with one 3.7 V Li-ion, or vice versa, designed to run off one 3.7 V Li-ion and dimmer off 2xAA.

You can't simply say that a set of batteries with high current capacity should have driven the LED light to the proper brightness level; you have to match the input requirements of the driver circuit that is converting voltage and current to meet the needs of the LED. You can overrun a boost driver (feed it a higher voltage than its expected input range) and possibly make the light brighter, but that hasn't proven anything about the current capabilities of the battery at all.

In direct drive, most LEDs don't even turn on until approximately a 3 V threshold, so the 2.4 to maybe 2.9 V from a pair of NiMH would barely run them or wouldn't turn them on at all. A driver that converts the lower voltage/higher current from the cells into a slightly higher voltage at a constant current to the LED makes it work fine. But if that driver was designed for a max input of 3.2 V and you take it over 3.7 V, you may eventually cause it to fail, and then find you can only run that flashlight on a 3.7 V battery, or it stops running altogether. The problem is there is no way to tell it's going to fail until it does.
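
If it helps to see the arithmetic, here's a rough power-balance sketch of what a boost driver has to do. The forward voltage, drive current, and efficiency below are assumed round numbers, not specs for any particular light:

```python
# Rough power-balance sketch for a boost driver (assumed numbers, not measured):
# an LED needing ~3.4 V at 1.0 A, and a small converter around 85% efficient.

LED_VF = 3.4        # volts the LED needs at the regulated current (assumption)
LED_CURRENT = 1.0   # amps the driver holds into the LED (assumption)
EFFICIENCY = 0.85   # typical cheap boost-converter efficiency (assumption)

def battery_current(v_in):
    """Current the driver must pull from the battery to hold the LED current."""
    p_led = LED_VF * LED_CURRENT   # power delivered to the LED
    p_in = p_led / EFFICIENCY      # power drawn from the battery
    return p_in / v_in             # lower input voltage -> higher input current

for v in (1.2, 2.4, 3.7):          # 1xNiMH, 2xNiMH, 14500 Li-ion (nominal)
    print(f"{v:.1f} V in -> about {battery_current(v):.1f} A from the battery")
```

That roughly 3.3 A draw at 1.2 V is why a single NiMH struggles, and why the driver's designed input range matters so much.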
 
Well OK, and certainly thanks. But it still doesn't make sense. Look at this multi-mode C3: http://www.candlepowerforums.com/vb/showthread.php?p=2207113

A light rated to take up to 4.2 V.

Give it 1.2 V and somehow it boosts enough to deliver 25 light units. The circuitry boosted it by over 100%, by at least 1.8 V.

Give it 2xAA, somewhere between 2.4 and 3 V, and it outputs more like 53 light units.

Give it 3.7 V and it outputs more still, something like 68 light units.

So, what's this thing doing? It ain't boosting by percentage; if it doubled the 3.7 V it would be way over. And it ain't boosting by a consistent number of volts, I don't think, or the 3.7 V cell would have had 5.5 V stuffed into the LED and poof!

So I'm left with some kind of curve, but why? If it could boost 1xAA by at least 1.8 V, it could have pushed the 2xAA to at least 4.2 V, and that's more than the 3.7 V. Or am I to think it can boost to over 4.2 V on the output? If so, there has to be a limit to what can be done, but the light states it can accept a 14500, so..
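
To lay out the two guesses I think I can rule out, with rough nominal voltages (my own back-of-the-envelope arithmetic, nothing measured):

```python
# The two boost models I think I can rule out, with rough nominal voltages.
inputs = [1.2, 2.4, 3.7]   # 1xAA NiMH, 2xAA NiMH, 14500 Li-ion

# Guess 1: a fixed percentage boost, sized so 1.2 V reaches ~3 V (x2.5)
print([round(v * 2.5, 1) for v in inputs])   # [3.0, 6.0, 9.2] -> the 14500 case is way over

# Guess 2: a fixed number of volts added, sized the same way (+1.8 V)
print([round(v + 1.8, 1) for v in inputs])   # [3.0, 4.2, 5.5] -> 5.5 V into the LED, poof
```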

I can only assume this light is both boost and buck (in a $20 light?), and I must next assume it somehow gates current delivery based on input voltage (why else is the 2xAA not delivering full light output?).

But is this a reasonable assumption? Or a better explanation?

My thanks to all that take time to consider,

dan
 
The driver in the Ultrafire C3 boosts from a low voltage up to whatever the LED needs, in this case probably around 3.3-3.5 V. The driver boosts the voltage as much as it can. When the battery's voltage is enough to drive the LED by itself at the needed current, the driver does nothing to boost the voltage and effectively goes into direct drive. Likewise, when the input voltage (and the battery itself) can drive the LED at a current higher than the driver is supposed to deliver, the driver goes into direct drive.

When you use a fully charged 4.2 V Li-ion battery in the Ultrafire C3 (and many other boost-driver lights), the driver goes into direct drive and the battery pretty much drives the LED at whatever voltage and current the LED needs, within the battery's ability, until the voltage drops to a level that lets the boost driver work again.

When the input voltage meets or exceeds the Vf of the LED at the regulated current, the driver goes into direct drive.
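
If it helps, here's that behavior as a little sketch; the 3.4 V forward voltage is just an assumed round number, not a spec for the C3's emitter:

```python
# Sketch of the boost-vs-direct-drive behaviour described above.
# The 3.4 V forward voltage is an assumption, not a measured C3 value.

LED_VF = 3.4   # assumed forward voltage at the regulated current

def driver_mode(v_in):
    """What the driver does at a given input voltage."""
    if v_in < LED_VF:
        return "boost: steps the input up to roughly the LED's Vf"
    return "direct drive: battery feeds the LED straight through"

for v in (1.2, 2.4, 3.0, 3.7, 4.2):
    print(f"{v:.1f} V in -> {driver_mode(v)}")
```

As a Li-ion sags from 4.2 V down past the Vf over a run, it crosses back into the boost region, which is the "until the voltage drops" part above.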

I'm not sure if I explained this correctly. I'm sorry if I confused you even more.
 
The driver is boosting to the same voltage in the first two cases. But the driver is able to draw less amperage in the first case, hence less brightness. The third case is direct drive, not buck.
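
To put rough numbers on that (assumed numbers, not measurements of this light): if the driver tops out around the same battery-side current either way, the power it can hand to the LED scales with the input voltage even though the boosted output voltage is the same.

```python
# Rough numbers (assumptions, not measurements): same boosted output voltage,
# but the available input power doubles when the input voltage doubles.

MAX_INPUT_CURRENT = 2.5   # amps the driver/cell can realistically supply (assumption)
EFFICIENCY = 0.85         # assumed converter efficiency
LED_VF = 3.4              # assumed LED forward voltage (the boosted output)

for v_in in (1.2, 2.4):
    p_led = v_in * MAX_INPUT_CURRENT * EFFICIENCY   # power reaching the LED
    i_led = p_led / LED_VF                          # resulting LED current
    print(f"{v_in:.1f} V in -> ~{p_led:.1f} W to the LED (~{i_led:.2f} A)")
```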
 
Short answer is: it doesn't have a really good regulator in it. With good rechargeable batteries it will hold a fairly constant current based on a voltage conversion, but that voltage conversion is different for different battery types. So it's not a great power transform function; it's a very low-budget driver. And alkalines still suck, big time.

Don't expect it to be linear in how it does the true "power" conversion: low voltage/high current, or higher voltage/higher current held for less time, or in lower modes high voltage/low current for a long time.
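
Running that same rough idea of a near-constant battery-side current against the output numbers quoted earlier in the thread (the constant-current assumption is mine, not anything measured):

```python
# Crude sanity check against the numbers quoted earlier (25 / 53 / 68 light units),
# assuming the driver pulls about the same battery current at any input voltage,
# so delivered power and brightness scale with input volts. My assumption only.

measured = {1.2: 25, 2.4: 53, 3.7: 68}   # volts in -> reported light units
base_v, base_out = 1.2, 25

for v, out in measured.items():
    predicted = base_out * v / base_v     # brightness proportional to input voltage
    print(f"{v:.1f} V: measured {out}, model says ~{predicted:.0f}")
```

The 2xAA point lands about where that crude model says; the 3.7 V point falls short of it, which fits the driver dropping into direct drive and the LED getting less efficient as the current climbs.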

The vendor will probably say they did this on purpose so you can have multi-level output based on your needs: you put in the level of battery power you want and it will hold a fairly constant output based on that.
 