Interesting the way the curve falls with current.
Is this a thermal effect? IOW, if the LED were sufficiently cooled, would it remain at or near the highest level?
I believe part of the effect is thermal, so good thermal management will reduce the diminishing-efficiency effect, but it won't eliminate it completely.
As with any semiconductor device, there are only so many generation and recombination sites to go around. Beyond a certain current the LED will begin to saturate: the probability of an electron "finding" an efficient pathway through the junction and generating a photon gets lower and lower, while the probability of an electron losing its energy to resistance (from impurities, or from the semiconductor crystal structure itself) goes up as the current goes up.
Heat increases lattice vibration in the crystal, which raises the effective resistance. However, heat or no heat, the LED will still saturate above a certain current.
On the low end, somewhat more interestingly, there is an effect where LED efficiency goes down at extremely LOW currents as well. What is happening here is that it takes at least about 2.8V to generate a blue 450nm photon (assuming a perfect LED; in a real device the necessary voltage may be higher). If you drive a blue (or white) LED at less than that voltage, there will still be some measurable current, and there will still be some photons made -- but at a longer-than-ideal wavelength. Thus both the emitter and the phosphor operate less efficiently, and overall device efficiency drops. This is also the threshold at which noticeable color shifting of the LED starts to take place.
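(For reference, the 2.8V figure comes straight from the photon energy: E = h*c/lambda ≈ 1240 eV·nm / 450 nm ≈ 2.76 eV, so each electron has to give up roughly 2.76 eV to make a 450nm photon, which sets a minimum forward voltage of about 2.8V for an ideal emitter.)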
A way around this, if you want extremely low output without the efficiency loss and color shift, is to use PWM: keep the LED at a normal, efficient drive current whenever it is on, and simply switch it on and off rapidly so the average output is low.
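For illustration only, here is a minimal Arduino-style sketch of the idea (the pin number and the 10% duty cycle are just assumptions, not anything from the posts above): while the LED is on it sees its full, efficient current, and it is off most of the time, so the eye averages it down to a dim output with no color shift.

// Hypothetical example: dim an LED via PWM instead of lowering its drive current.
// Pin 9 is just an assumption; use any PWM-capable pin on your board.
const int LED_PIN = 9;

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  // analogWrite(pin, value) takes a duty cycle from 0 (always off) to 255 (always on).
  // 26/255 is roughly a 10% duty cycle: the LED runs at its normal, efficient
  // current while on, but is on only ~10% of the time, so perceived output is low.
  analogWrite(LED_PIN, 26);
}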