The issue is that a typical current-regulator IC either bucks (drops) or boosts (raises) voltage; ICs that can do either still need a different circuit configuration for each mode.
A controller "could" do both, boosting the voltage first and THEN bucking it down to regulate current at the desired level, but most people do not want that: it raises cost, driver size, and heat, and it reduces efficiency. Edit: There are some high-cost lights that do both in order to allow a wider choice of multi-cell batteries, with so much capacity and so much current headroom that the inefficiency of running both a boost and a buck stage is a lesser concern, but these lights cost so much that few people choose them, and more often they are still just a buck driver with a wider allowable input-voltage range.
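To put a rough number on that efficiency penalty: cascading two converters multiplies their efficiencies. A minimal sketch, with the per-stage figures below being illustrative assumptions rather than measurements of any particular driver:

```python
# Illustrative only: assumed per-stage efficiencies, not measured values.
eta_boost = 0.90  # assumed efficiency of the boost stage
eta_buck = 0.90   # assumed efficiency of the buck stage

# Power passes through both stages, so the losses compound.
eta_total = eta_boost * eta_buck
print(f"Cascaded boost+buck efficiency: {eta_total:.0%}")  # ~81%
```

So two stages that are each a respectable ~90% efficient still waste nearly a fifth of the battery's energy overall, which is why single-stage buck or boost drivers dominate.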
You want to use the wrong battery type in a device. You could pick any device at random, say your TV remote control, ponder why it would be damaged by the wrong batteries, and devise some hack to make it accept them, or you could just buy the right batteries for it.
You can in fact find small lights with multiple modes, so you can pick a mode that doesn't supply excess current, or you can buy a light designed for the battery type you want to use. Today there are MANY choices of what to buy, so I don't see the point of your question. Why can't you use the wrong battery? Because the light is rightly optimized for a different one.
In other words, take a light with a boost circuit that powers an LED from a battery whose voltage is lower than the LED's forward voltage (Vf), as is the case for any light that runs a white, approximately 3.2 Vf LED from ~1.5 V. The most efficient boost circuits do nothing to limit current once the battery starts out at a voltage higher than the LED's Vf.
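To see why that runs away, here is a minimal sketch using a simple linear model: once the battery voltage exceeds the LED's Vf, an efficient boost stage effectively passes the input straight through, and current is limited only by whatever stray series resistance remains. All numbers below are illustrative assumptions, not measurements of a specific light:

```python
# Minimal sketch, linear model: LED current when an unregulated boost stage
# just passes the battery voltage through. Values are assumptions.
v_battery = 4.2   # freshly charged Li-ion cell (V): the "wrong" battery
v_forward = 3.2   # white LED forward voltage (V), per the figure above
r_series = 0.2    # assumed total series resistance (ohms): cell + wiring

# Ohm's law across the leftover resistance sets the current.
i_led = (v_battery - v_forward) / r_series
print(f"Uncontrolled LED current: {i_led:.1f} A")  # 5.0 A
```

In reality an LED's Vf rises somewhat as current climbs, so this linear estimate is rough, but the point stands: several amps through an emitter designed for a few hundred milliamps will quickly destroy it.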