The further the output voltage is from the input, the less efficient the converter becomes, and the higher the current drain, the lower the battery's effective mAh capacity. Also, it sounds like you're thinking of putting the diode between the converter and the LED? It should go between the battery and the converter: the converter is itself a polarised component, so a reversed battery would blow it unless it already has built-in reverse protection, at which point the diode becomes moot anyway.
Note that the diode will also dissipate a lot of power, especially at higher currents. Since you say that "all the single cell ones have to boost the voltage up anyway", I'll assume you're talking about AAs and lithium primaries (as opposed to Li-ions at up to 4.2 V, where you'd need to step the voltage down instead).
So let's take an AA, assume it's sitting at 1.3 V, and say we want to draw 500 mA from it. Without a diode, we simply draw the 500 mA, and the converter gets 1.3 V × 0.5 A = 0.65 W.
With a diode that has a Vf of 0.5 V, the converter is left with only 0.8 V. To get the same 0.65 W, it has to draw 0.65 / 0.8 = 0.8125 A. That current also flows through the diode, which is now dissipating 0.5 × 0.8125 ≈ 0.41 W as heat, so we're drawing a total of 1.3 × 0.8125 ≈ 1.06 W from the battery instead of 0.65 W as before.
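If you want to play with the numbers yourself, here's a quick sketch of the arithmetic above. It uses the same assumed values from the example (1.3 V cell, 500 mA load, 0.5 V diode drop) and an idealised model that ignores the converter's own losses:

```python
# Effect of a series diode on battery draw, using the example's assumed values.
# Idealised model: converter efficiency losses are ignored.

V_BATT = 1.3   # battery voltage (V)
I_LOAD = 0.5   # converter input current without diode (A)
VF = 0.5       # diode forward voltage drop (V)

p_converter = V_BATT * I_LOAD              # power the converter needs: 0.65 W
v_after_diode = V_BATT - VF                # 0.8 V left at the converter input
i_with_diode = p_converter / v_after_diode # 0.8125 A for the same 0.65 W
p_diode = VF * i_with_diode                # ~0.41 W wasted as heat in the diode
p_battery = V_BATT * i_with_diode          # ~1.06 W total drawn from the battery

print(f"Current with diode: {i_with_diode:.4f} A")
print(f"Diode dissipation:  {p_diode:.2f} W")
print(f"Battery draw:       {p_battery:.2f} W (vs {p_converter:.2f} W without)")
```

Swapping in a Schottky diode (Vf around 0.2-0.3 V) makes the penalty noticeably smaller, which is why they're the usual choice for this kind of reverse protection.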
I think that's a bit of a worst case scenario though.
Also, I hope my logic and maths are correct; I'm not fully awake yet.