As the Sandwich Shoppe's Blue Shark driver is a boost board, the input voltage must always be lower than the output voltage.
So I have a 3-cell 5500mAh LiPo pack that has a theoretical maximum voltage straight off the charger of 12.69V (4.23V x 3).
I have my Blue Shark modified to output 1200mA.
So running the four XP-G R5s at 1200mA gives a total forward voltage of 13.2V.
So at start-up the board will be boosting the voltage by 0.51V, with a theoretical lumen output of 1598.72 using 15.84 watts.
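The arithmetic above can be sketched like this (all values taken from the post):

```python
# Start-up numbers for the 3S pack driving four LEDs via the boost driver.
cells = 3
v_cell_full = 4.23            # V per cell, straight off the charger
v_batt = cells * v_cell_full  # 12.69 V total pack voltage
i_drive = 1.2                 # A, the modified drive current
v_load = 13.2                 # V, total forward voltage at 1.2 A
boost = v_load - v_batt       # 0.51 V of boost needed at start-up
p_out = v_load * i_drive      # 15.84 W delivered to the LEDs
print(f"Vbatt={v_batt:.2f} V, boost={boost:.2f} V, Pout={p_out:.2f} W")
```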
- Is this difference in boost voltage enough to keep the board in regulation and working as it should?
- And how do I calculate the efficiency percentage, and from that an estimated maximum run time?
lovecpf
As long as Vbatt < Vload, the Blue Shark will run in regulation. The closer Vbatt gets to Vload, the less hard the driver has to work. As long as Vbatt won't be greater than Vload, the issue of concern really is how low Vbatt can be relative to Vload and still run in regulation. The Blue Shark can only boost so much. At some point, you will exceed the switch current limit of the driver and it won't be able to deliver the drive current that you are demanding. Also, the lower Vbatt is, the lower the driver efficiency is. This translates to more waste heat generation and a greater demand for good thermal management.
With your setup, you are at just about the optimum -- Vbatt < Vload, but nearly equal. Thus, driver efficiency will be at its max and waste heat generation at its min. When your cells are spent (e.g., Vcell ~ 3V under load), your total Vbatt is still going to be a significant percentage of Vload. At your drive current level of 1.2A, you probably want to stay above a Vbatt:Vload ratio of 1:2, which is easily done with your 3-cell Li-ion pack. For you to get down to Vbatt ~6.6V, each cell would have to drain to 2.2V, which hopefully you won't allow to happen.
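The ratio check above can be sketched numerically. The 1:2 floor is the rough guideline from the reply, and the cell voltages are just illustrative sample points:

```python
# Check the Vbatt:Vload ratio at a few cell voltages for a 3S pack.
v_load = 13.2  # V, total forward voltage at 1.2 A
for v_cell in (4.2, 3.7, 3.0, 2.2):
    v_batt = 3 * v_cell
    ratio = round(v_batt / v_load, 2)
    in_regulation = ratio > 0.5  # rough 1:2 floor suggested above
    print(f"Vcell={v_cell:.1f} V -> Vbatt:Vload = {ratio:.2f}, ok={in_regulation}")
```

Even with spent cells at 3.0V under load, the ratio stays comfortably above the 1:2 floor; only draining to 2.2V per cell would put it right at the limit.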
To determine driver efficiency, I first run the light with batteries, measuring the tail current draw. Then I put the driver/LED subsystem on a bench supply and crank it up until the current draw matches the tail draw that I measured when running on batteries. The assumption is that when the currents match, then the input voltage from the bench supply is the same as the voltage under load for the batteries. From there, you can calculate Driver Power In. While running on the bench supply, I also simultaneously measure the drive current and the total LED Vf. That gives you driver power out. Efficiency then is simply (Driver Power Out)/(Driver Power In)*100.
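Once you have the bench measurements, the calculation is straightforward. The numbers below are hypothetical bench readings just to illustrate; only the formulas come from the method described above (run time here is a rough estimate from pack capacity divided by input current, ignoring how the draw changes as the cells sag):

```python
# Hypothetical bench readings -- substitute your own measurements.
v_in = 11.1     # V, bench supply matched to battery voltage under load
i_in = 1.55     # A, input draw matching the measured tail current
v_f = 13.2      # V, total LED forward voltage measured on the bench
i_led = 1.2     # A, measured drive current

p_in = v_in * i_in        # Driver Power In
p_out = v_f * i_led       # Driver Power Out
eff = p_out / p_in * 100  # efficiency percentage

capacity_ah = 5.5                # 5500 mAh pack
runtime_h = capacity_ah / i_in   # rough maximum run time at that draw
print(f"eff={eff:.1f}%, runtime ~ {runtime_h:.2f} h")
```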
Another approach to measure input power is to use the two-DMM method described here in the "Power consumption" section.
Or you can estimate driver efficiency by referencing this link.