- Use an ammeter, or a multimeter set to amps: remove the tailcap and complete the circuit through the meter, remembering that the meter adds its own resistance to the circuit.
- Check out a review of the light, or user threads about it; often someone has already tested it. At the very LEAST they test the runtime, which gives a fair idea of about how much current the light uses.
- Do your own runtime test, then estimate based on the approximate battery capacity and the runtime before output droops.
- Find a thread where the light is discussed and toss in the question; eventually someone with a meter might be able to answer it.
- Make assumptions based on what is IN the light itself and roughly how hard it is driven. Say it's a 3 W LED and it's driven pretty well: it's probably drawing about 0.75 A at the emitter. Then work out the battery voltage, subtract losses for the driver, and you get the amps being drawn from the battery itself.
Watts math is really simple: Volts x Amps = Watts.
And the inverse if you know the watts: watts divided by the battery voltage equals the battery amperage draw.
So how about an example:
assume we have a 2-cell light that runs at an actual 3 W through a driver.
3 W divided by the 2 x 1.2 V battery = 1.25 A; tack on 20% for the driver and it's about 1.5 A.
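The watts math above is simple enough to sketch in a few lines of code. This is just a rough estimator, not a measurement; the 1.2 V/cell and 20% driver-overhead figures are the same assumptions used in the example, so swap in your own numbers.

```python
# Rough sketch: estimate battery current draw from the light's output
# wattage and pack voltage, padding for driver losses.

def battery_draw_amps(watts, cells, volts_per_cell=1.2, driver_loss=0.20):
    """Approximate amps pulled from the battery, assuming a fixed driver overhead."""
    pack_voltage = cells * volts_per_cell
    ideal_amps = watts / pack_voltage      # W / V = A
    return ideal_amps * (1 + driver_loss)  # pad for the driver

# The 2-cell, 3 W example from above:
print(round(battery_draw_amps(3, 2), 2))  # -> 1.5
```

Same answer as the hand math: 1.25 A ideal, about 1.5 A with the driver overhead tacked on.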
- Read the spec sheet for the light. At minimum it might list lumens, or the wattage it (supposedly) runs at. From the lumens you can estimate roughly how hard the LED is driven; from the wattage, you can calculate the amps drawn from the battery. All approximate.
- And the #1 way to test what a light really uses: hook it up to a (lab) power supply, crank the supply up to the approximate voltage of the battery it will run on, and read the current draw at that voltage. With a readout power supply, meter resistance doesn't mess up the test. You can also test at various pseudo-battery voltages and get an idea how the light reacts to both a full and a low battery.