Manufacturers tend to tell you the peak output and the total runtime, with the total runtime including any "moon mode". That doesn't mean the light produces the peak output (or necessarily anywhere near it) for the entire runtime. So simply multiplying the two numbers gives a meaningless result unless both lights have very well regulated output and no "moon mode". For a fair comparison of any other lights you'd have to integrate the light output over the runtime. LEDs become a lot more efficient at lower output levels (until they reach a very low level, where efficiency drops off abruptly), so a light that spends more of its runtime at a lower (but not too low) output level has an additional efficiency advantage, all else being equal.
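To see how much the simple multiplication can mislead, here's a rough sketch in Python. The two output curves are made-up illustrations (not measurements of any real light); it just does trapezoidal integration of output over time to get lumen-hours:

```python
def lumen_hours(samples):
    """Trapezoidal integration of (time_hours, lumens) sample points."""
    total = 0.0
    for (t0, l0), (t1, l1) in zip(samples, samples[1:]):
        total += (t1 - t0) * (l0 + l1) / 2.0
    return total

# Light A: well regulated, then abrupt shutoff (hypothetical data).
light_a = [(0.0, 100), (2.0, 100), (2.0, 0)]
# Light B: same peak, but output sags and then limps along in "moon mode".
light_b = [(0.0, 100), (1.0, 60), (3.0, 20), (10.0, 2), (10.0, 0)]

print(lumen_hours(light_a))  # 200.0 lumen-hours; peak x runtime also gives 200
print(lumen_hours(light_b))  # 237.0 lumen-hours; peak x runtime claims 1000
```

For the regulated light, peak output times runtime is exact; for the sagging light, the naive product overstates the real total by more than a factor of four.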
There are a few lights that do have very well regulated output and no "moon mode" -- that is, the output stays constant until the battery is drained, then the light abruptly shuts off. The Mini-Mag LED is one light I know of with this characteristic. If both of the lights you're comparing behave this way, then simply multiplying runtime by output is a reasonable comparison -- with the realization that if the regulators are equally efficient, the lower-output light should come out ahead (a larger output x runtime product for the same battery) because of the increased LED efficiency at low drive levels.
One additional factor to consider in this kind of comparison is battery energy. It isn't a big factor here, since both lights use lithium cells. A NiMH or lithium (primary or rechargeable) cell delivers a reasonably constant total energy over a fairly wide range of discharge currents. It will still deliver somewhat more energy at lower current, though, giving the lower-output light an edge. But if you're ever comparing lights where one or both use alkaline cells, the one drawing less current can have a big advantage, because an alkaline cell yields far more of its energy at low discharge currents.
c_c