Estimating (predicting) runtimes?

beav
Newly Enlightened · Joined: Dec 15, 2004 · Messages: 28 · Location: the hole
I'm not too savvy in all things electrical... Have a quick question.

Is there a relatively accurate way to estimate runtime of a battery powered LED light?

I presume that I'll need to know the starting voltage, the estimated current draw, and the mAh rating of the battery? Also the estimated efficiency of the circuit. Anything else I'll need?

Any help with formulas and theory is appreciated.
 
The toughest part of this is figuring out the loss due to the battery's internal resistance, but you can usually find that out.

What I do is figure out what the Wh (watt-hour) capacity of my battery will be at the load I intend to use. In other words, if my battery can output 1 A for an hour at an average voltage of 3.5 V, that's 3.5 Wh. A watt-hour is a unit of stored energy; think of it like how big the gas tank is, or how many gallons of gas you have.

Once you have Wh, it's a pretty simple calculation. You need to know the current draw at the battery and the average voltage of the battery to figure out watts at the battery. Say your light draws 500 mA in the example above: the battery has 3.5 Wh of total energy, divide by (0.5 A × 3.5 V), and you get 2.0 hrs, which logically makes sense because it's half the demand of the 1 A case. You will actually get MORE runtime than that, because the lower the current, the less energy is wasted heating the battery. So you have to find out what the Wh rating is at the current level you're actually drawing; energizer.com has great charts that typically include three levels of current.
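Here's that arithmetic as a quick Python sketch. The numbers are just the example values above (1 Ah at an average 3.5 V, loaded at 500 mA), not measurements of any particular cell:

```python
# Rough runtime estimate from battery energy and the load at the battery.
battery_capacity_ah = 1.0     # amp-hours the cell delivers at this load
battery_avg_voltage = 3.5     # average voltage over the discharge (V)
load_current_a = 0.5          # current drawn from the battery (A)

battery_energy_wh = battery_capacity_ah * battery_avg_voltage   # 3.5 Wh
load_power_w = load_current_a * battery_avg_voltage              # 1.75 W

runtime_hours = battery_energy_wh / load_power_w
print(f"Estimated runtime: {runtime_hours:.1f} h")   # 2.0 h
```

In practice you'd pull the Wh figure from the manufacturer's discharge chart at your actual current, since capacity changes with load.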

If you measure the current and voltage at the battery, you work the circuit efficiencies out of the equation. If you want to start at the emitter, which most of us do (e.g. I run my emitter at 700 mA), then you multiply the emitter voltage by the emitter current to get emitter power, and divide that by the efficiency of the circuit to get estimated battery power. Use a measured efficiency if you have one, or a typical sliding scale.
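Starting from the emitter, the same idea looks roughly like this. The 700 mA drive is from the example above; the 3.4 V forward voltage and 80% converter efficiency are just placeholder assumptions to make the sketch run:

```python
# Work back from emitter power to battery power through converter efficiency,
# then divide into the battery's energy for an estimated runtime.
emitter_current_a = 0.7       # LED driven at 700 mA (example above)
emitter_voltage_v = 3.4       # forward voltage at that current (assumed)
converter_efficiency = 0.80   # conservative guess; measure it if you can

battery_energy_wh = 3.5       # Wh the cell delivers at this load (from its chart)

emitter_power_w = emitter_current_a * emitter_voltage_v
battery_power_w = emitter_power_w / converter_efficiency

runtime_hours = battery_energy_wh / battery_power_w
print(f"Battery power: {battery_power_w:.2f} W, runtime: {runtime_hours:.2f} h")
```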

At higher power levels most circuits are more efficient, up to 95% but more likely 90%; at lower power levels (like 100 mA at the emitter) they are more often around 75%, and that's for closely matched power sources. The farther the battery voltage is from the emitter voltage, the lower the efficiency. For example, my 1.5 W driver is 70% efficient putting 0.5 W into the emitter from 1.5 V, but 93% efficient putting 1.4 W into the emitter from 3.0 V; from a 3.6 V Li-ion the efficiency drops back to about 80%, though it can then output 1.9 W.
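If you want to keep those data points handy, a small lookup works. These values are just the ones quoted above for that one 1.5 W driver; your converter will have its own curve:

```python
# Example efficiency points for one particular driver (values from this post).
# Keyed by (input voltage in V, power delivered to the emitter in W).
driver_efficiency = {
    (1.5, 0.5): 0.70,   # 1.5 V input, 0.5 W into the emitter
    (3.0, 1.4): 0.93,   # 3.0 V input, 1.4 W into the emitter
    (3.6, 1.9): 0.80,   # 3.6 V Li-ion input, 1.9 W into the emitter
}

# Conservative figure for rough calculations when you have no measurement.
default_efficiency = 0.80
```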

To be on the cautious side when doing rough calculations, I use 80% for my converter efficiency, and I make sure I know the actual battery discharge curve, because if you over-drive a battery you'll waste at least half the power heating the battery.

Example: an alkaline AAA may have a rating of about 1.8 Wh, but if you push it to 1/2 A it will be dead in 40 minutes, so you only get 24% of the battery's energy (0.43 Wh usable, delivered at about 0.65 W). A Li-ion AAA-size cell rated at 1.1 Wh, on the other hand, can run at 450 mA for about 35 minutes; since the average voltage is 3.55 V, you get 0.93 Wh of usable energy out of a 1.1 Wh battery, or 85% of the stored energy, and you get it out at 1.6 W, about 2.5x the power.
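Working those AAA numbers through the same formula (all values are the ones quoted above; the ~1.3 V alkaline average is implied by the 0.43 Wh and 0.65 W figures):

```python
# Alkaline AAA pushed to 0.5 A: dead in about 40 minutes.
alkaline_rated_wh = 1.8
alkaline_current_a = 0.5
alkaline_avg_voltage = 1.3          # implied by the usable Wh and W quoted above
alkaline_runtime_h = 40 / 60
alkaline_usable_wh = alkaline_current_a * alkaline_avg_voltage * alkaline_runtime_h
print(f"Alkaline: {alkaline_usable_wh:.2f} Wh usable "
      f"({alkaline_usable_wh / alkaline_rated_wh:.0%} of rating), "
      f"{alkaline_current_a * alkaline_avg_voltage:.2f} W")   # ~0.43 Wh, 24%, 0.65 W

# Li-ion AAA-size cell at 450 mA for about 35 minutes, averaging 3.55 V.
liion_rated_wh = 1.1
liion_current_a = 0.45
liion_avg_voltage = 3.55
liion_runtime_h = 35 / 60
liion_usable_wh = liion_current_a * liion_avg_voltage * liion_runtime_h
print(f"Li-ion: {liion_usable_wh:.2f} Wh usable "
      f"({liion_usable_wh / liion_rated_wh:.0%} of rating), "
      f"{liion_current_a * liion_avg_voltage:.2f} W")   # ~0.93 Wh, 85%, 1.6 W
```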

(Any wonder tiny cellphones all use Li-ion batteries, or that my nano requires them?)

-awr
 