Venturing into modding, what determines battery output?

trailhunter

Flashlight Enthusiast
Joined
Jul 2, 2014
Messages
1,095
I bought a couple of empty C8+ hosts as a stepping stone to my first mods. I'm on a quest to sort out emitters, drivers, and batteries. How do I decide between a high-performance battery (a VTC5A, for example) and a lower-performing one, and whether it can drive the emitter I go with?

As a first attempt, I want to mod a 3V XHP50.2 into a C8+, but I don't know how to drive it to its full potential.

Is there math to this when looking at driver/emitter specs?

Sent from my SM-G970U using Tapatalk
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
To drive that LED to its specified maximum you'd want to push 6A into it. If you are willing to sacrifice some (maybe most) of its very long lifetime, you can push it beyond that, probably well beyond, but I haven't seen much info on overdriving that particular device. It may be out there; I just haven't seen it.

On the other hand, at that power level, a host that small is not going to keep it cool very long. If you want to push it to even 6A, you'll need thermal step-down either in the driver or the operator (when the light gets hot, turn the brightness down).

As to the load on the battery, in your case a reasonable assumption makes it really easy. Since the LED voltage is around 3V, and the battery voltage is higher than that, but not by much, you can assume that whatever current you send to the LED is the same current you need to draw from the battery. We could go through a lot of complicated explanations and calculations, but at the end of the day we'd end up right about there.

So if you want to drive the LED at 6A, you need a battery that can source 6A. Nowadays, most quality 18650 cells can do that no problem.
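
If you want to see the "complicated calculation" anyway, here's a quick Python sketch with assumed round-number voltages (not measured values) showing why the shortcut works:

```python
# Rough comparison of exact vs. assumed battery current for a 3V LED
# on a single Li-ion cell. All values are assumed round numbers for
# illustration, not measurements.

V_BATT = 3.7   # nominal Li-ion cell voltage (V)
V_LED = 3.0    # ballpark forward voltage of a 3V XHP50.2 (V)
I_LED = 6.0    # target LED current (A)

# Linear or direct drive: battery current equals LED current.
i_batt_linear = I_LED

# Idealized buck converter: power in = power out / efficiency.
EFF = 0.90     # assumed converter efficiency
i_batt_buck = (V_LED * I_LED) / (V_BATT * EFF)

print(f"linear/DD: {i_batt_linear:.1f} A from the battery")  # 6.0 A
print(f"buck:      {i_batt_buck:.1f} A from the battery")    # ~5.4 A
```

Either way you land right around 6A, which is why "LED current = battery current" is a safe rule of thumb here.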

But that would be considered a moderately high drain, and you should consider what that drain does to the capacity of the cell. For any given cell, the higher the load you put on it, the lower the capacity. "High drain" cells are designed to minimize this effect. The VTC5A is a very high drain cell, and loading it at 6A you will probably get its rated capacity or better. A cell rated at only 8 or 10A, loaded at 6A, would likely give you well below its rated capacity. So if you paid a premium for a 3500 mAh cell but got one rated at 8A, you might not get much (or even any) better performance than you'd get from the VTC5A, rated at 2600 mAh and capable of 35A.
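
A back-of-envelope runtime comparison makes the tradeoff concrete. The delivered-capacity fractions below are hypothetical placeholders; real numbers come from discharge tests, not datasheets:

```python
# Runtime estimates at a 6A load. The delivered-capacity fractions
# are hypothetical placeholders, not test data.

def runtime_minutes(rated_mah, delivered_fraction, load_a):
    """Minutes of runtime given rated capacity and the fraction of it
    the cell actually delivers at this load."""
    return (rated_mah / 1000.0) * delivered_fraction / load_a * 60.0

# VTC5A-style very high drain cell: assume full rated capacity at 6A.
print(f"{runtime_minutes(2600, 1.00, 6.0):.0f} min")  # ~26 minutes

# 3500 mAh cell rated for 8A: assume only ~80% delivered at 6A.
print(f"{runtime_minutes(3500, 0.80, 6.0):.0f} min")  # ~28 minutes
```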

I'm sure that doesn't sound as clear to others as it does in my head, so please ask for clarification as necessary.
 

trailhunter

Flashlight Enthusiast
Joined
Jul 2, 2014
Messages
1,095
Thanks, this makes sense. Is it safe to use a high drain battery that can push 10 amps when your emitter only wants 3 amps?

I suppose I see a scenario where I install a 5-6 amp emitter on a 3 amp driver with a 10 amp battery. Would the emitter try to pull current through the driver beyond its rating and possibly cause damage?

Sent from my SM-G970U using Tapatalk
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
The driver pretty much controls the current. So if you have a 3A driver, a 35A battery (I think I saw that the VTC5A is rated for this), and a 10A LED, the current will be 3A and everyone will be happy.

But if you swapped in a battery rated at 2A, the driver would still try to pull 3A from it, and the battery would be overloaded. If the battery was protected, it would shut down. If not, it would suffer other bad effects, primarily a severe degradation of its lifetime, but possibly safety issues as well.

Now if you had a 2A battery, a 3A driver, and a 1A LED, the driver would still try to push 3A, and both the battery and the LED would be in serious danger.
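
In other words, the driver's setting is what you check against the other two ratings. A trivial Python sketch of that rule, using the example numbers above:

```python
# The driver sets the current; trouble starts when its setting
# exceeds what the battery or the LED can handle.

def check(batt_max_a, driver_a, led_max_a):
    problems = []
    if driver_a > batt_max_a:
        problems.append("battery overloaded (shuts down if protected)")
    if driver_a > led_max_a:
        problems.append("LED overdriven")
    return problems or ["everyone is happy"]

print(check(35, 3, 10))  # VTC5A, 3A driver, 10A LED -> happy
print(check(2, 3, 10))   # 2A battery -> overloaded
print(check(2, 3, 1))    # battery AND LED in danger
```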

To go a little further, in 'normal' operating conditions the battery determines the input voltage, the driver determines the current, and the LED determines the output voltage.

But you also have to consider:
1. The battery has a range of voltages depending on where it is in the discharge curve, and also somewhat on load. It also has a max allowable load current which, if exceeded, produces various bad results.
2. The driver has: input voltage range, max and min; output voltage max and min; current range (if adjustable). Note that max output voltage may be determined by input voltage.
3. The LED has: max current; current vs. voltage curve.

Each of these characteristics is pretty simple to understand in isolation, and that's certainly the place to start. It's seeing how they interact and fit together that distinguishes the hacker from the designer.

I haven't even mentioned thermal issues yet. There again, simple concepts combine to model reality.
 

trailhunter

Flashlight Enthusiast
Joined
Jul 2, 2014
Messages
1,095
This is really good info. I'm surprised I didn't get a notification about your response so I could follow up sooner!

I posed the question separately, but maybe I can ask here: when would I choose a linear vs. a FET driver? I'm thinking of a triple or quad 219B build and I'm at this crossroads.

Sent from my SM-G970U using Tapatalk
 

grayjay70

Newly Enlightened
Joined
Dec 12, 2018
Messages
31
I am still somewhat new to flashlight modding but have been working to understand these same issues. This explanation might be a bit of a simplified generalization; I would welcome critique/improvement from more knowledgeable members:

A FET driver in direct drive mode pretty much behaves like wiring the battery directly to the LED: the max current is limited only by the combined resistance of the LED, battery, spring, and wires. The overall efficiency of power delivery from battery to LED in direct drive is very high, with minimal power wasted as heat in the driver (though the LED may get over-driven and operate at lowered efficiency). Direct drive is good if you want relatively high current and/or you have the total system resistance matched well enough not to exceed the current the LED can handle or the heat the host can dissipate. FET drivers can also use PWM to lower the delivered current for lower light levels, though driver efficiency in the throttled PWM modes is generally not great. PWM can also be used to step down from direct drive in response to overheating, a timer (turbo mode timer), or the battery voltage dropping as charge is used up. In all modes (direct drive and FET PWM regulated), brightness will gradually diminish as the battery voltage drops.
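
To put rough numbers on the direct drive case, here's a sketch with a linearized LED model; every component value is an assumed illustration, not a measurement:

```python
# Direct drive: current settles where battery voltage minus resistive
# drops meets the LED forward voltage. All values assumed for
# illustration.

V_BATT = 4.2    # freshly charged cell (V)
R_BATT = 0.025  # cell internal resistance (ohm)
R_PATH = 0.030  # spring + wires + FET on-resistance (ohm)

# Linearized LED model: V_led(I) = V_F0 + I * R_LED
V_F0 = 2.8      # LED threshold voltage (V)
R_LED = 0.10    # LED dynamic resistance (ohm)

# Solve V_BATT - I*(R_BATT + R_PATH) = V_F0 + I*R_LED for I:
i_dd = (V_BATT - V_F0) / (R_BATT + R_PATH + R_LED)
print(f"direct-drive current ~ {i_dd:.1f} A")  # ~9 A with these numbers
# Note how sensitive this is: shaving 20 milliohms off the path
# raises the current by more than an amp.
```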

Linear drivers use AMC7135 chips to supply a fixed current to the LED at the driver's maximum setting. When the fully charged battery voltage is higher than the voltage needed to push the set current through the LED, the excess voltage is burned off as heat in the 7135s. This keeps the max current reaching the LED regulated to a fixed level to prevent the LED from overheating, and keeps the LED operating at a lower (and possibly more efficient) output level, at the cost of lower driver efficiency while the battery voltage is high. Once the battery discharges to where the available voltage just matches what's needed to push the set current through the LED, the excess drop across the 7135s disappears and the driver works at high efficiency, similar to direct drive, though it is no longer regulated (it dims with further voltage drop). Linear drivers can also use PWM to decrease brightness, but PWM again introduces additional inefficiency.
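
Here's the same idea in numbers; the voltages are assumed illustrative values:

```python
# Linear regulation in numbers: the driver burns off the excess
# voltage as heat, so efficiency is roughly V_led / V_batt.
# Voltages are assumed illustrative values.

V_LED = 3.1   # LED forward voltage at the set current (V)
I_SET = 2.8   # driver's regulated current (A)

for v_batt in (4.2, 3.7, 3.2):
    p_waste = max(v_batt - V_LED, 0.0) * I_SET
    eff = V_LED / max(v_batt, V_LED)
    print(f"Vbatt={v_batt:.1f}V: {p_waste:.2f} W heating the driver, "
          f"efficiency {eff:.0%}")
# At 4.2V about a quarter of the power heats the driver; near 3.1V it
# falls out of regulation and simply dims with the battery.
```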

More advanced drivers combine both types of operation, such as adding a single 7135 to an otherwise FET/DD driver so that low modes can run through that one 7135 instead of all regulation happening via PWM of the FET.
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
grayjay, you certainly have a solid grasp of DD/FET and linear drivers.

I would point out, though, that while the nearly ubiquitous '7135 chip has a fixed current that can only be turned on and off, it is possible to build linear drivers with adjustable current. I've built some with 0-25A output. This gives an 'analog' change in current/brightness and eliminates the 'flicker' that PWM causes. It's easy to fool the human eye into ignoring flicker, but much harder to fool digital cameras.

Also, PWM doesn't cause degraded efficiency. I know you didn't say it does, but I can see how people might think you did. PWM generally causes efficiency to stay the same instead of improving as current drops, as it can with analog dimming. It isn't always true, but in general PWM dimming keeps efficiency the same, while analog dimming improves it.

Lastly, purists may point out that we are misusing the word 'efficiency'. We are. Get over it. I did.
 

grayjay70

Newly Enlightened
Joined
Dec 12, 2018
Messages
31
DIW, thanks for the feedback! I am definitely still trying to sort this all out myself.
If I compare HKJ's test graphs of current/voltage/efficiency for the following basic PWM drivers:

FET/direct drive driver: https://lygte-info.dk/review/DriverTest BLF17DD UK.html
and
7135 linear driver: https://lygte-info.dk/review/DriverTest 17mm 7135x3 AK47A MCU Dimming 1000mAh 5Mode UK.html

Both of these drivers appear to rely entirely on PWM to regulate the lower power modes. In both cases, the overall efficiency calculated for the lower power PWM modes appears to be around 10-20% lower than the equivalent max power (non-PWM) mode from the same driver at the same input voltage. Do these cases not represent a reduction in driver efficiency due to the use of PWM regulation, or are there other effects at play that I am not considering that contribute to the apparent lower efficiency?

I would imagine that with a linear driver, it would work better to achieve the lower regulated current levels by having the driver selectively disconnect some of the available 7135 chips from the power supply path, rather than leaving all of them turned on and regulating them en masse via PWM?
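
Something like this, assuming the common 350 mA-per-chip rating (a hypothetical sketch, not from any actual driver firmware):

```python
# Hypothetical sketch of the idea above: enable whole 350 mA chips
# for the bulk of a target current and PWM only one chip for the
# remainder, instead of PWM-ing the whole bank.

CHIP_A = 0.35  # nominal AMC7135 regulated current (A), assumed

def chip_plan(target_a, chips_available=8):
    full_on = min(int(target_a // CHIP_A), chips_available)
    remainder = target_a - full_on * CHIP_A
    duty = remainder / CHIP_A if full_on < chips_available else 0.0
    return full_on, min(duty, 1.0)

full, duty = chip_plan(1.0)
print(f"{full} chips fully on, one chip PWM at {duty:.0%} duty")
# -> 2 chips fully on, one chip at 86% duty, so only ~30% of the
# output current is PWM-modulated.
```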

Do you have a thread explaining your linear 0-25A analog driver? I am curious how this is achieved! Are you using 70+ individual 7135s to regulate that much current?
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
I think there are some effects you aren't seeing.

With any DD, FET, or linear driver, all the current from the battery goes to the LED (well, except for the very small 'quiescent' current in the regulator, which can usually be ignored). All the voltage difference between the battery and the LED is dropped across some resistance (the FET, springs, wires, and the cell itself). So the 'efficiency' is purely determined by the ratio of the LED voltage to the battery voltage.

But what is the battery voltage? Is it the electrochemical voltage? Terminal voltage? Under load? No load? Fully charged? Average? Trying to calculate efficiency in the purest sense is kind of pointless.

As you increase current, the LED voltage goes up and the battery voltage goes down (by most measures), so your calculated efficiency would increase with increasing current. But is this actually the case? No, not in any meaningful way. In fact, quite the opposite is true. Battery capacity falls with increasing current, as does LED luminous efficacy. By any meaningful measure, the total light output from any given battery and LED will be inversely related to current. This might reverse at very low currents, but who cares?

As far as PWM vs analog, there are tradeoffs with both. Specifically regarding efficiency, using high current and turning it on and off rapidly (PWM) may give you higher calculated efficiency, but running at a steady, lower current will almost certainly give you more lumen-hours from your battery. With high drain batteries and modern LEDs this effect may be pretty small in the ranges where most people operate their lights, but when pushing hard with DD and FET drivers, it may still be significant.
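
A crude lumen-hours comparison shows the effect; the efficacy slopes below are assumed illustrative numbers, not datasheet values:

```python
# Lumen-hours: PWM at 50% duty of 6A vs. steady 3A. Efficacy slopes
# are assumed illustrative values reflecting that LEDs produce fewer
# lumens per amp when driven harder.

LM_PER_A_AT_6A = 300.0  # assumed lm/A at 6A
LM_PER_A_AT_3A = 340.0  # assumed lm/A at 3A
CAPACITY_AH = 2.6       # cell capacity, held constant for simplicity

runtime_h = CAPACITY_AH / 3.0  # both draw 3A average from the cell
lm_h_pwm = 0.5 * (6.0 * LM_PER_A_AT_6A) * runtime_h
lm_h_analog = (3.0 * LM_PER_A_AT_3A) * runtime_h

print(f"PWM dimming:    {lm_h_pwm:.0f} lumen-hours")    # ~780
print(f"analog dimming: {lm_h_analog:.0f} lumen-hours")  # ~884
# The steady current wins; battery capacity derating during the 6A
# pulses would widen the gap further.
```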
 

DIWdiver

Flashlight Enthusiast
Joined
Jan 27, 2010
Messages
2,725
Location
Connecticut, USA
Here's a link to the highest-current driver I've made. https://www.candlepowerforums.com/v...-Feeler-thread-0-25A-adjustable-linear-driver

There's a single FET controlling the current. You need a pretty substantial heatsink to get anything like maximum performance from it.
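
To give a feel for why (with assumed voltages, not measurements):

```python
# In a linear driver the pass FET dissipates the full voltage
# difference times the current. Voltages assumed for illustration.

V_BATT = 4.2   # fresh cell (V)
V_LED = 3.4    # LED forward voltage at high current (V)
I_OUT = 25.0   # driver at its maximum setting (A)

p_fet = (V_BATT - V_LED) * I_OUT
print(f"FET dissipation ~ {p_fet:.0f} W")  # ~20 W: a serious heatsink
```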

There used to be a thread showing schematics and discussing all kinds of technical aspects, but I can't find it. I wonder if it got lost in the crash. It's got me curious; I'll post more if I can find it.
 