Our vision does indeed have a logarithmic response to brightness. Astronomers have known this for a long time - they rate stars by magnitude, a measure of relative brightness. Magnitudes are arranged in the manner the ancients liked - a first magnitude star is noticeably brighter than a second magnitude star, which is in turn brighter than a third magnitude star, and so on. The ancient Greeks weren't especially scientific about it - the really bright stars were first magnitude, the next tier down were second magnitude, etc.
The difference in brightness between stellar magnitudes is a factor of about 2.512 (the fifth root of 100) - so a first magnitude star is ~2.512x brighter than a star of the second magnitude. The magnitudes are spaced such that they are obvious to the human eye - a first magnitude star is CLEARLY brighter than a 2nd magnitude star. The factor of ~2.512x means that a difference of 5 magnitudes is a difference in brightness of a factor of 100x. Folks who are VERY experienced can estimate a magnitude to about 0.1 - so if you are really practiced you can see a ~10% difference in the brightness of a star. (You can't be anywhere near this precise without another nearby star of known magnitude to use as a reference.)
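If you want to play with the numbers yourself, here's a minimal sketch of the magnitude-to-brightness conversion (the function name `brightness_ratio` is just mine for illustration):

```python
def brightness_ratio(delta_m: float) -> float:
    """Brightness ratio corresponding to a magnitude difference delta_m.

    One magnitude step is the fifth root of 100 (~2.512), so a difference
    of delta_m magnitudes corresponds to a ratio of 100 ** (delta_m / 5).
    """
    return 100 ** (delta_m / 5)

print(brightness_ratio(1))    # ~2.512  - one magnitude step
print(brightness_ratio(5))    # 100.0   - five magnitudes = 100x in brightness
print(brightness_ratio(0.1))  # ~1.096  - ~10%, about the limit for a practiced eye
```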
Keep in mind that since I'm talking about stars, these are pure point sources of light - an area illuminated by a flashlight changes this discussion a bit (makes it worse, actually - your eyes work on DIFFERENCES, so a point source is the best case for something you can see), but the general principle applies - a LARGE linear difference in the relative brightness of a light source looks like a much smaller apparent difference to your eyes - 5 magnitudes of stellar brightness is a factor of 100x in brightness. The sun, btw, is magnitude -26 - in apparent brightness it's roughly 25 billion times brighter than a very bright (magnitude 0) star.
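That sun figure falls straight out of the same formula - here's the quick check, assuming a magnitude 0 star as the comparison:

```python
# Ratio between the sun (magnitude -26) and a very bright (magnitude 0) star:
ratio = 100 ** ((0 - (-26)) / 5)   # 10**10.4
print(f"{ratio:.3g}")              # ~2.51e+10, i.e. roughly 25 billion
```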
And that tells you why our eyes have a logarithmic response to light - they have to handle an absolutely HUGE range of brightness, from starlight to sunlight.