The answer is slightly more complicated, but let's run through an example to get close to your numbers.
- 1 LED, a Lumileds K2 or Cree P4 type
- Assume a nominal Vf of 3.3 volts at 700 mA for either (that is full brightness, not a dim level)
- At around 100 mA, more of a dim mode, the Vf would be closer to 3.0 volts.
Power to run the LED
= 3.3 volts x 0.7 amps = about 2.3 watts, or
= 3.0 volts x 0.1 amps = 0.3 watts
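As a sanity check, the two power figures can be computed directly (the Vf and current values are the assumed numbers from the bullets above, not measurements):

```python
def led_power(vf_volts, current_amps):
    """Electrical power dissipated by the LED: P = Vf * I."""
    return vf_volts * current_amps

bright = led_power(3.3, 0.700)  # full power at 700 mA
dim = led_power(3.0, 0.100)     # dim mode at 100 mA
print(f"bright: {bright:.2f} W, dim: {dim:.2f} W")  # bright: 2.31 W, dim: 0.30 W
```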
To run for 2 hours at full power, you will need a cell that can deliver about 2.3 watts for 2 hours:
2.3 watts x 2 hours = about 4.6 watt hours.
An RCR123 cell is nominally 3.7 volts at about 0.7 amp hours:
3.7 volts x 0.7 amp hours = about 2.6 watt hours.
So a single cell is a fine choice for the dim mode, but at full power it only buys you a bit over an hour; for the full 2 hours you would want a second cell or a lower drive level.
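The energy math can be checked in a couple of lines (the cell capacity and load figures are the assumed values above, and driver losses are ignored):

```python
def runtime_hours(cell_volts, cell_amp_hours, load_watts):
    """Runtime = cell energy (Wh) / load power (W); driver losses ignored."""
    return cell_volts * cell_amp_hours / load_watts

# RCR123: nominal 3.7 V, roughly 0.7 Ah
print(f"cell energy: {3.7 * 0.7:.1f} Wh")            # ~2.6 Wh
print(f"full power:  {runtime_hours(3.7, 0.7, 2.31):.1f} h")  # ~1.1 h
print(f"dim mode:    {runtime_hours(3.7, 0.7, 0.30):.1f} h")  # ~8.6 h
```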
The "gain" from using a driver is that it takes in "power" (volts x amps) from the battery and delivers it as "power" to the LED, effectively transforming any "excess voltage" into extra current. The "excess voltage" is the difference between the battery voltage and the LED Vf.
The loss from a driver is that it is not perfectly efficient; around 15% is routinely lost in many constant-current drivers. In addition, with single-cell setups the battery voltage will often swing from above the LED Vf to below it as the cell drains. This is a real pain, because it means the driver must be a "buck/boost" type to cope, and those are harder to design efficiently.
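The voltage-to-current trade can be made concrete. Here is a sketch with a hypothetical buck driver at 85% efficiency (the 15% loss figure above); whenever the battery voltage exceeds Vf, the LED current comes out higher than the battery current:

```python
def led_current(v_batt, i_batt, vf, efficiency=0.85):
    """Power in * efficiency = power out, so I_led = (V_batt * I_batt * eff) / Vf."""
    return v_batt * i_batt * efficiency / vf

# Example: two Li-ion cells (7.4 V) feeding a 3.3 V LED at 0.5 A battery draw
print(f"{led_current(7.4, 0.5, 3.3):.2f} A at the LED")  # ~0.95 A from 0.5 A in
```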
Interestingly, most resistored flashlights will still turn on when the cell is too low to run a driver. Why? Because the driver tries to run at full spec and cuts out, while the resistored LED simply drops to a lower current draw naturally.
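The resistored behavior is easy to see with Ohm's law: the LED current is just (V_batt - Vf) / R, so it tapers gracefully as the cell sags instead of cutting out. The resistor value and voltages below are illustrative, and this ignores the small drop in Vf at lower currents:

```python
def resistored_current(v_batt, vf, r_ohms):
    """Ohm's law across the ballast resistor; clamps at zero once V_batt < Vf."""
    return max(0.0, (v_batt - vf) / r_ohms)

# 1-ohm resistor, 3.3 V Vf: current falls off as the cell drains
for v in (4.2, 3.9, 3.6, 3.4):
    print(f"{v} V -> {resistored_current(v, 3.3, 1.0):.1f} A")
```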
How much is an extra 10% of run time worth to you? More than carrying a spare cell? If so, then use a driver (buck/boost).
I hope that helps.
Harry
PS - Feel free to post an example configuration and use the above info to run your own numbers. We will help you with the corrections / details.