They are definitions.... we as HUMANS define them as what they are. This is not a case where we say the world is flat because we don't know any better, or that the earth is the center of the universe. This is more like UP is UP and DOWN is DOWN. They are simply definitions and as such are not disputable. There is a conversion factor between watts and lumens at a given frequency of light. Again, this is a definition.... not really up for debate, though potentially open to refinement, since the CIE curve which defines the conversion has changed over time. To that end you will never get more than 683 lumens/watt ... ever... because that is the maximum the definition allows.
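If it helps to see the arithmetic, here is a rough Python sketch of that conversion. The 683 lm/W factor and the 555 nm peak are part of the definition; the Gaussian I use for V(lambda) is only a crude stand-in for the real tabulated CIE curve, so treat the numbers as illustrative only.

```python
import math

# Luminous flux for monochromatic light: Phi_v = 683 lm/W * V(lambda) * Phi_e.
# The 683 lm/W factor and the 555 nm peak come from the definition itself;
# the Gaussian below is only a crude stand-in for the tabulated CIE curve.

def v_lambda_approx(wavelength_nm: float) -> float:
    """Very rough approximation of the CIE photopic curve (peaks at 555 nm)."""
    return math.exp(-0.5 * ((wavelength_nm - 555.0) / 50.0) ** 2)

def lumens_from_watts(radiant_watts: float, wavelength_nm: float) -> float:
    """683 lm/W is the ceiling; every other wavelength gives you less."""
    return 683.0 * v_lambda_approx(wavelength_nm) * radiant_watts

print(lumens_from_watts(1.0, 555))  # ~683 lm -- the maximum the definition allows
print(lumens_from_watts(1.0, 650))  # the same watt of deep red light buys far fewer lumens
```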
There are some "arguments" that people use against this, but they are wrong:
1) "I put in 0.2 watts and, through some magic process (call it cold fusion), the equivalent of 10 watts of light comes out." The reality is that while you may be putting in 0.2 watts from some external source, some other mechanism (call it cold fusion) is generating another 9.8 (or 10) watts of energy. You are not magically getting more than the theoretical maximum lumens/watt; you are generating extra power through another mechanism.
2) Increase in perceived brightness. There has been a lot of research showing that pulsing LEDs with a specific pulse duration and repetition rate can make them be perceived as brighter than the total light actually emitted. This is absolutely true.... running continuously at power X may generate Y lumens, while pulsing at power X may be PERCEIVED as 2Y lumens. Again, it is PERCEIVED. The definition did not change. Is it "brighter"... as far as anyone looking at it is concerned, of course it is. But the definitions of watt and lumen did not change. (The sketch after this list puts rough numbers on both of these points.)
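To put rough numbers on both arguments, here is a small sketch. Every figure in it is invented purely to show the bookkeeping; nothing here is measured data.

```python
# Argument 1: the efficacy only looks "impossible" if you ignore the power
# coming from the other mechanism. All numbers here are made up for illustration.
external_watts = 0.2
other_mechanism_watts = 9.8              # whatever the "cold fusion" is adding
lumens_out = 5000.0                      # hypothetical measured output

naive = lumens_out / external_watts                               # 25000 lm/W -- looks like magic
honest = lumens_out / (external_watts + other_mechanism_watts)    # 500 lm/W -- under the 683 cap
print(naive, honest)

# Argument 2: pulsing changes perception, not the lumen bookkeeping.
# Measured luminous flux tracks the average radiant power over time.
led_efficacy_lm_per_w = 100.0            # hypothetical LED, well under 683 lm/W
peak_power_w = 1.0
duty_cycle = 0.5                         # on half the time

average_lumens = led_efficacy_lm_per_w * peak_power_w * duty_cycle
print(average_lumens)                    # 50 lm measured, however "bright" it looks to the eye
```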
Does that make it any clearer?