Hi, there are a few ways that an LED can be driven. The most common:
a) No driver, also called "direct drive"
- You just rely on the battery voltage and the system's own resistance being about right for the current you need
- Obviously, this means the battery voltage, LED Vf, and circuit resistance all have to sit in a fairly narrow target range
- You have to be tolerant of imperfect regulation: nothing holds the current steady as the battery drains
- Resistance in the light turns into heat
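To make the direct-drive idea concrete, here is a rough back-of-the-envelope sketch in Python. The function name and all of the numbers (battery voltage, LED Vf, circuit resistance) are illustrative assumptions, and it treats the LED Vf as fixed, which real LEDs do not, so take the results as ballpark only.

```python
# Rough direct-drive estimate: the only things limiting current are the
# battery voltage, the LED forward voltage, and the parasitic resistance.
def direct_drive_current(v_batt, v_f, r_circuit):
    """Estimate LED current (amps) assuming a fixed forward voltage."""
    return max(0.0, (v_batt - v_f) / r_circuit)

# Illustrative numbers: 3xAAA pack (4.5 V fresh), white LED Vf ~3.2 V,
# ~0.8 ohm total from cells, springs, and switch.
print(direct_drive_current(4.5, 3.2, 0.8))  # ~1.6 A on fresh cells
print(direct_drive_current(3.9, 3.2, 0.8))  # ~0.9 A once the cells sag
```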
b) Resistored light
- This is similar to direct drive, but a fixed resistor is placed in series with the LED
- If the battery voltage is within 1-2 volts of the LED forward voltage, this is a viable approach, although imperfect (a sizing sketch follows this item)
- Same limits as the "no driver" approach
- Resistance in the light turns into heat
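If it helps, here is a minimal resistor-sizing sketch for the resistored approach. The function name and the example numbers (4.2 V Li-ion cell, 3.2 V LED, 700 mA target) are my own illustrative assumptions, not from the post.

```python
# Size a series resistor for a target LED current, and see how much
# power that resistor has to burn off as heat.
def size_resistor(v_batt, v_f, i_target):
    """Return (resistance in ohms, resistor dissipation in watts)."""
    v_drop = v_batt - v_f
    return v_drop / i_target, v_drop * i_target

r, p = size_resistor(4.2, 3.2, 0.7)
print(f"R = {r:.2f} ohm, burning {p:.2f} W")   # ~1.43 ohm, ~0.7 W
# As the cell sags to 3.6 V the same resistor only passes ~0.28 A,
# which is the "imperfect regulation" mentioned above.
```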
c) AMC type (e.g., the AMC7135) / variable resistor (sometimes called "linear drivers")
- More or less, this is a variable resistor that maintains a constant current by automatically adjusting the resistance of the circuit
- Each chip can handle a fixed amount of current, typically 350 mA for the AMC7135
- Again, works well when the battery voltage is no more than 1-2 volts above the LED Vf
- Efficiency is roughly the LED Vf divided by the battery voltage, commonly around 60%; the rest turns into heat (see the sketch below)
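As a sanity check on the linear-driver numbers, here is a small sketch. It assumes the driver simply drops the excess voltage at the LED current, which is how these chips behave to a first approximation; the values are illustrative.

```python
# Linear (AMC7135-style) driver: the chip drops whatever voltage is left
# over, so efficiency is roughly Vf / Vbatt and the excess becomes heat.
def linear_driver(v_batt, v_f, i_led):
    """Return (efficiency, watts dissipated in the driver)."""
    return v_f / v_batt, (v_batt - v_f) * i_led

# Illustrative: two 350 mA chips (~0.7 A) on a 4.2 V cell, 3.2 V LED.
eff, heat = linear_driver(4.2, 3.2, 0.7)
print(f"efficiency ~{eff:.0%}, driver heat ~{heat:.2f} W")  # ~76%, 0.70 W
print(f"{linear_driver(6.0, 3.2, 0.7)[0]:.0%}")  # only ~53% on a 6 V input
```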
d) Inductor-based designs
- Take in power (battery voltage x battery current) from the battery, and deliver that power, minus conversion losses, to the LED
- The voltage is adjusted automatically by the driver to provide the desired current
- If the battery voltage is too low, you use a "boost" driver.
- If the battery voltage is too high, you use a "buck" driver
- There are also combination "buck-boost" versions that can do both, but they are less common
- Efficiency for well-designed models can hit 80-90%; lower quality versions are around 50-60% (see the sketch below)
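To show what "take in power, put out power" means in practice, here is a sketch that treats an inductor-based driver as a roughly constant-power converter. The 85% efficiency figure and the battery/LED numbers are illustrative assumptions.

```python
# Inductor-based (buck/boost) driver: battery current adjusts so that
# input power covers the LED power plus conversion losses.
def battery_current(v_f, i_led, v_batt, efficiency=0.85):
    """Current drawn from the battery for a given LED load."""
    return (v_f * i_led) / (v_batt * efficiency)

# Illustrative: a 3.2 V / 1 A LED (3.2 W at the emitter).
print(battery_current(3.2, 1.0, 6.0))  # buck from 2x CR123 (~6 V): ~0.63 A
print(battery_current(3.2, 1.0, 1.2))  # boost from one NiMH (1.2 V): ~3.1 A
```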
Coming back to the original question: if you use an inductor-based driver, you want the minimum resistance possible in your light for maximum efficiency, because the driver pulls whatever battery current it needs and the power wasted in that resistance grows with the square of the current (sketched below). If you use the other methods, it does not matter that much, since any extra resistance mostly just takes the place of voltage that the resistor or linear chip would have burned off anyway.
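Here is one last sketch of why that is, under the same illustrative assumptions as above: an inductor-based driver can pull several amps on the battery side, and the loss in springs, switch, and wiring scales with the square of that current.

```python
# I^2 * R loss in parasitic resistance (springs, switch, wiring).
def series_loss(i_batt, r_parasitic):
    """Watts wasted before the power ever reaches the driver."""
    return i_batt ** 2 * r_parasitic

print(series_loss(3.1, 0.1))   # boost driver on one NiMH: ~0.96 W lost
print(series_loss(0.63, 0.1))  # buck driver on 2x CR123:  ~0.04 W lost
```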