The difference between 2.8 and 2.5 amps on my LED was only 0.078 volts! That is to say, LEDs are very touchy: a tiny change in voltage means a big change in current.
And for this reason, the burden voltage of
any ammeter in series with an LED will make an enormous difference to the measured current.
I would love to be wrong, but I have never heard of any meter, digital or analog, which has a burden voltage of less than 0.1V on its current scale***.
Most of them are around the 0.15 - 0.25V mark at FSD (full-scale deflection).
Since LEDs are non-linear, this burden voltage makes a HUGE difference in the current reading - the current when the meter is not in circuit will be a lot higher than the reading, even if your meter is perfectly accurate.
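To get a feel for the size of the effect, here is a rough back-of-the-envelope sketch in Python. The 0.2V burden and 0.2 ohm internal resistance are assumptions for illustration only; the LED slope is taken from the 2.8/2.5 amp figures above.

```python
# Rough sketch of how much the reading can shift - all figures are illustrative.
r_led = 0.078 / 0.3   # ~0.26 ohm dynamic resistance of the LED near 2.5-2.8 A
r_batt = 0.2          # assumed battery/switch/wiring resistance (see step 4 below)
v_burden = 0.2        # assumed meter burden voltage, mid-range of the 0.15-0.25 V figure
delta_i = v_burden / (r_led + r_batt)
print(f"Pulling the meter out could raise the current by roughly {delta_i:.2f} A")  # ~0.43 A
```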
The only way to tackle this problem is by doing exactly what Al Combs has done:
1) Measure (or predict from published data) the Vf at your desired current.
2) Measure the voltage your batteries will deliver into a dummy load at your desired current. This can be done with perfect accuracy, provided your current meter stays in-circuit while you measure the output voltage with a high-resistance voltmeter (eg any DMM).
3) Calculate the difference between the voltages, divide this difference by your desired current, and there's your required resistance.
4) Subtract, from this calculated value, your estimate of the internal resistance already built into the batteries/connections/wiring/etc. A figure of 0.2 ohms is typical for a Maglite, I read on this forum.
5) If you now calculate zero, or less than zero, you are just doing direct drive. Otherwise, multiply your reduced resistance (in ohms) by the square of your desired current (in amps) to get the required wattage (in watts) for your dropper resistor. Small resistors can be paralleled to get the value or wattage right. This resistor is then spliced into the positive lead. The arithmetic is sketched just below this list.
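For anyone who wants to plug their own numbers in, here is a minimal sketch of steps 1-5 in Python. Every value below is a placeholder assumption, not a measurement from this thread; substitute your own readings.

```python
# Steps 1-5 as arithmetic - replace the placeholder values with your own measurements.
desired_current = 2.5    # A   - the drive current you want (step 1)
vf_at_current   = 3.7    # V   - LED Vf at that current (step 1) - placeholder
v_batt_loaded   = 4.3    # V   - battery voltage into a dummy load at that current (step 2) - placeholder
r_internal      = 0.2    # ohm - estimated resistance already in the batteries/switch/wiring (step 4)

r_total   = (v_batt_loaded - vf_at_current) / desired_current   # step 3
r_dropper = r_total - r_internal                                 # step 4

if r_dropper <= 0:
    print("Zero or negative: no resistor needed, you're on direct drive.")       # step 5
else:
    watts = r_dropper * desired_current ** 2                     # P = I^2 x R
    print(f"Dropper resistor: {r_dropper:.2f} ohm, rated at least {watts:.2f} W")
```

With those placeholder numbers the sketch works out to a 0.04 ohm, 0.25 W resistor - in other words nearly direct drive, which is exactly the case step 5 is there to catch.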
***The 0.1V champion? You might not believe me, but it's ye olde 500-ohms-per-volt Avo Model 7. Yes, really. But even this will give erroneous readings, since a tiny burden voltage still affects the reading hugely.