The greater the resistance in series with the voltage source, the less current draw there will be, and therefore the longer the run time.
With a 6V "ideal" battery pair and 60 ohms in series with it to the LED, current draw is drastically reduced from what it was without it, regardless of any regulator chip in the pill driving the LED. The so-called "ideal" light emitting diode turns on at about 3V. 6V - 3V is 3V, so the most current you can get is 3V / 60 ohms, which is 50mA. 50mA of current across 60 ohms is of course 3V, and 3V times 0.050 amps is 0.15 watts. So the resistor is not just burning up the watts that the LED is not; it's restricting total current, and therefore power, to the LED. The actual LED voltage will not be 3V, it will be somewhere between 3.0 and 3.7; 3V was used as the worst case to maximize the voltage differential across the resistor. Point is, the resistor will pull the voltage below the regulator's dropout, where it goes into direct drive. So the LED will be running at low power, the regulator will be doing nothing (and hopefully not wasting any power beyond one diode drop, about 0.6V, either), and the resistor will not be roasting either.
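As a quick sanity check, here is that worst-case arithmetic in Python. All values are the example numbers from the text (6V ideal battery pair, 3V LED turn-on, 60 ohm resistor), not measurements:

```python
# Worst-case current and resistor dissipation for the series-resistor low mode.
V_BATT = 6.0   # volts, "ideal" battery pair
V_LED = 3.0    # volts, assumed worst-case LED forward drop
R = 60.0       # ohms, series resistor in the tailcap

i = (V_BATT - V_LED) / R   # Ohm's law: current through the whole string
p_resistor = i * i * R     # power burned in the resistor

print(f"current: {i * 1000:.0f} mA")          # 50 mA
print(f"resistor power: {p_resistor:.2f} W")  # 0.15 W
```

With a real LED drop closer to 3.3V the current comes out a bit lower, so 50mA really is the ceiling.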
If the LED was at, say, 3.3V at 50mA, that would only be 165mW or 0.165W for the LED. The batteries still see no more than 50mA of current draw. If the batteries had a 1000mAh rating, then in theory they could run up to 1000/50 = 20 hours at this level.
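The runtime estimate is just capacity over draw; here it is spelled out (1000mAh is the assumed rating from the text):

```python
# Runtime estimate: battery capacity divided by current draw.
capacity_mah = 1000.0  # assumed battery rating
draw_ma = 50.0         # resistor-limited current from above
led_v = 3.3            # more realistic LED forward voltage

p_led_mw = led_v * draw_ma        # LED power in milliwatts
hours = capacity_mah / draw_ma    # ideal runtime, ignoring capacity sag

print(f"LED power: {p_led_mw:.0f} mW")  # 165 mW
print(f"runtime: {hours:.0f} hours")    # 20 hours
```

Real cells deliver less than their rated capacity at any meaningful load, so 20 hours is an upper bound, not a promise.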
So no, it's not just wasting all of the energy that should have gone to the LED as heat. It's restricting current to everything and wasting only a little (compared to the original power level without it) as heat.
If you look at the ratio of resistor power to LED power, it's almost even: the resistor is about 150mW and the LED is about 165mW (these are not exact numbers). So to reduce the total power to the LED, you are now wasting about 48% of the total power (315mW) that the batteries put into the new total load. But if the LED was previously running at 800mA and 3.7V (almost 3W) and the driver was 85% efficient from those same "6V" batteries, total draw was probably around 3.5 watts from the batteries, so you are still saving a lot of power to run the light at much lower brightness for a longer time. There would be absolutely no point in making a lower-power mode with a resistor if the resistor simply wasted all of the power of the original drive level as heat so you could have less light.
These are all example numbers, not exact.
This ratio changes with the value of the resistor.
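Putting the low-mode and high-mode numbers side by side makes the savings obvious. Again, all of these are the example figures from the text (150mW resistor, 165mW LED, 800mA at 3.7V high mode, 85% driver efficiency), not measurements:

```python
# Low mode: series resistor plus LED.
p_resistor = 0.150                      # W, resistor dissipation
p_led_low = 0.165                       # W, LED power
p_total_low = p_resistor + p_led_low    # ~0.315 W total battery draw

wasted_fraction = p_resistor / p_total_low   # share of low-mode power lost as heat

# High mode: regulated direct drive from the same batteries.
p_led_high = 0.8 * 3.7           # ~2.96 W into the LED at 800 mA / 3.7 V
p_batt_high = p_led_high / 0.85  # ~3.5 W from the batteries at 85% efficiency

print(f"low-mode waste: {wasted_fraction:.0%}")            # ~48%
print(f"low-mode draw: {p_total_low * 1000:.0f} mW")       # 315 mW
print(f"high-mode draw: {p_batt_high:.2f} W")              # ~3.48 W
```

So even though nearly half the low-mode power goes into the resistor, the total battery draw drops by roughly a factor of ten versus high mode.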
So it's not a bad thing to have a low-power setting with just a series resistor in the tailcap. What I don't know is how these guys are coming up with specific lumen output numbers. Did they have them measured somewhere, or are they making assumptions based on the reduced current draw and dividing down the "rated" lumen output published for the original current draw?