Has anyone seen any analysis comparing the contribution of the LED die type & bin with that of the regulator & electronics to overall flashlight performance?
Such an analysis, if possible, is not necessary. If necessary, it's not possible (or at least not easy).
I don't say that facetiously. Take two lights, identical except for the LED. In order to analyze the effect of the LED, everything else about the light (driver, battery, etc.) must be well understood and predictable over the range of conditions created by the differing LED specs. If you have this, you can simply look up the performance in the LED datasheets. If you don't, good luck making a comparison.
That said, there are some rules of thumb that might be useful to you. They are different for each type of driver, and whether you are in 'regulated' mode or not.
DIRECT DRIVE, FET DRIVERS, and RESISTOR LIMITERS
These are all basically the same thing, except that FET drivers can use PWM to change the apparent brightness. The circuit is one series loop: a battery, an LED, and a bunch of resistances (the cell's internal resistance, the switch or FET, springs, and wiring). The battery voltage minus the LED voltage, divided by the sum of those resistances, equals the current. Unfortunately, it's hard to know the effective resistance of a battery or an LED, especially because both vary with state of charge, age, temperature, and current. The output of these lights drops continuously (though not linearly) from the moment you turn them on until you turn them off. The LED's Vf and dynamic resistance have a huge impact on the initial brightness.
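A quick sketch of that arithmetic (every voltage and resistance below is an illustrative guess, not a measurement of any particular light, cell, or emitter):

```python
# Direct drive / FET driver / resistor limiter: one series loop, so
# I = (V_batt - V_led) / (sum of series resistances).
# Modeling the LED as a fixed Vf plus a series resistance is a crude
# linearization; real Vf rises with current and falls with temperature.
def direct_drive_current(v_batt, v_led, *resistances):
    return (v_batt - v_led) / sum(resistances)

# Hypothetical fresh high-drain cell driving an LED hard:
i = direct_drive_current(4.2, 3.2,  # cell voltage, linearized LED Vf
                         0.030,     # cell internal resistance (ohms)
                         0.005,     # FET on-resistance
                         0.020,     # springs, PCB traces, wiring
                         0.030)     # LED dynamic resistance
print(f"initial current = {i:.1f} A")  # far more than most emitters are rated for
```

Note how the answer is dominated by a handful of tens-of-milliohms terms, none of which appear on any spec sheet — which is exactly why these lights are so unpredictable.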
Because the efficiency of such a system is rather difficult to define in a way that everyone would agree on, much less find useful, it's rarely discussed with these lights. Maximum output and maximum current are the most commonly discussed characteristics here.
With the latest generations of LiIon cells and LEDs, it's now fairly easy to build a light that will fry the LED within seconds or minutes of turning it on.
LINEAR DRIVERS
These include the common AMC7135-based drivers that are found everywhere. A linear driver has a current sensor, a variable resistor, and a control circuit; the '7135 integrates all of this into a single chip. The control circuit adjusts the resistor until the current is at the desired level. As the battery voltage drops, the resistance is reduced to keep the current the same. Of course the resistor has a minimum value, and once the battery voltage falls too low to supply the set current, the current drops below the setpoint and the driver behaves like a resistor limiter. This is the point where the driver 'drops out' of regulation.
Of course the 'variable resistor' is actually an FET. The difference between linear drivers and FET drivers is where the FET operates: in a linear driver it sits in the region where its effective resistance varies with gate voltage, while in a FET driver it is driven fully on, where further gate voltage has minimal impact on the resistance. (MOSFET datasheets confusingly call the fully-on mode the 'ohmic' or 'triode' region and the current-source mode 'saturation' — the reverse of BJT usage; I'm using the BJT-style names here.) The point where a linear driver drops out of regulation is the point where its FET goes fully on.
When regulating, these types have an efficiency equal to the output voltage divided by the input voltage. Interestingly, an LED with a higher Vf causes the regulator to operate at higher efficiency. Unfortunately, higher Vf generally corresponds to lower efficacy in the LED, so light output is not increased. Conversely, a lower Vf means lower driver efficiency and higher power dissipation in the driver, but because of the LED's higher efficacy, you may get a modest increase in output. Here, it's lumens/amp or lumens/milliamp that matters. While this can usually be calculated from the datasheet, it's rarely specified directly.
Also note that as the battery voltage drops, the efficiency of the regulator increases. I've seen efficiencies go over 99% as the regulator approaches and enters the dropout region. As I recall, the '7135 tops out around 94-95%. But with a battery initial voltage of 4.0V and an LED voltage of 3.0V, the initial efficiency is only 75%.
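The efficiency trend over a discharge is easy to see from the formula (the 3.0V forward voltage below is an assumed round number, not from any datasheet, and a real '7135 loses a few percent to its own overhead):

```python
# Ideal linear driver in regulation: efficiency = V_led / V_batt, because
# the excess voltage is dropped across the pass FET at the full LED current.
V_LED = 3.0  # assumed forward voltage, illustrative only
for v_batt in (4.2, 4.0, 3.7, 3.4, 3.1):
    print(f"V_batt = {v_batt:.1f} V -> efficiency = {V_LED / v_batt:.0%}")
# -> 71%, 75%, 81%, 88%, 97%
```

So a linear driver is at its worst on a fresh cell and at its best just before it drops out of regulation.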
SWITCHING DRIVERS
Switching drivers, by using magnetic devices (inductors, usually, but transformers can be used too) to store energy and translate voltages, can overcome the efficiency limitations of the linear driver. It's possible to top 95% over a broad range of conditions. Unfortunately, doing so requires complex circuitry, and it gets harder as the voltages decrease. At single-cell voltages, it's difficult to top 90% in a simple driver (modern ICs are making it possible, but only high-end drivers are likely to use them). Low-end drivers will typically be in the 80-90% range, while super-cheap ones are likely to be lower, especially at high currents.
Keep in mind that any given switcher will have a range of efficiencies. Low voltages and high currents push efficiencies down.
Here Vf has a different effect. It shouldn't change the light output, since that's determined by the LED current, which stays the same while the driver is in regulation. But it does change the power the driver must deliver, which in turn changes the power it draws, which changes the load on the battery. A 10% drop in Vf should result in roughly a 10% increase in battery life.
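Sketching that out with made-up numbers (the cell capacity, current, and efficiency below are assumptions, and real efficiency would shift a little with the operating point):

```python
# Switching driver: LED current is regulated, so output is fixed, but a
# lower Vf means less output power and therefore less power drawn from
# the battery. All numbers here are illustrative guesses.
def runtime_hours(capacity_wh, i_led, v_f, efficiency):
    p_in = (i_led * v_f) / efficiency  # battery power draw, watts
    return capacity_wh / p_in

CAP_WH = 12.0  # roughly a 3400 mAh LiIon at ~3.5 V average
base = runtime_hours(CAP_WH, i_led=3.0, v_f=3.2, efficiency=0.90)
low  = runtime_hours(CAP_WH, i_led=3.0, v_f=3.2 * 0.9, efficiency=0.90)
print(f"runtime: {base:.2f} h vs {low:.2f} h with 10% lower Vf")
# 10% lower Vf -> runtime scales by 1/0.9, i.e. about 11% longer
```

Same lumens either way — the low-Vf emitter just lets the driver draw less from the battery to produce them.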
I hope this helps. I know it's a lot to absorb; feel free to ask questions (even a lot of them) if need be.
D