Would it be visibly noticeable if a flashlight went from 100% output to 90%?

beezaur

Flashlight Enthusiast
Joined
Apr 15, 2003
Messages
1,234
No.

Your eyes have a roughly logarithmic response to light, so you would not detect a 10% reduction in actual light as a 10% reduction in stimulation. The reduction in sensation would be maybe 2%. It's probably not detectable, even with 2 lights side by side.

Scott

PS: Just reread your post.

There's an optometrist or two floating around here; they're really the ones you need to answer this.

Just for the sake of argument, let's say that 1) your eyes have a precisely logarithmic response, and 2) you need a 10% change in sensation to tell a difference. Mathematically, let's say that means log(a)/log(b) = 0.9 for two brightnesses 'a' and 'b'. Taking 'b' as 100 in arbitrary units, that means 'a' (the dimmer of the two) is 100^0.9, or about 63% as bright as 'b'. Basically you would need to increase or decrease the actual brightness by roughly a third before you could readily tell.

Those aren't real numbers, but they show how the thing works. I don't think I am horribly far off.
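
In case anyone wants to poke at the numbers, here is a quick Python sketch of that back-of-the-envelope arithmetic. The base-10 log and the "100 arbitrary units" reference level are just assumptions to make the example concrete; it is not a model of real vision.

import math

# Back-of-the-envelope sketch (not a vision model): treat "sensation" as
# log10 of brightness, with full output pegged at 100 arbitrary units.
full = 100.0                           # 100% output (assumed reference level)
dimmed = 0.9 * full                    # 90% output

sensation_full = math.log10(full)      # 2.0
sensation_dimmed = math.log10(dimmed)  # ~1.954

# A 10% drop in actual light is only about a 2% drop in "sensation" here.
print(sensation_dimmed / sensation_full)   # ~0.977

# Going the other way: a 10% drop in sensation, log(a)/log(b) = 0.9 with
# b = 100, means the dimmer light a = 100**0.9, i.e. about 63% as bright.
a = full ** 0.9
print(a)                               # ~63.1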
 

IsaacHayes

Flashlight Enthusiast
Joined
Jan 30, 2003
Messages
5,876
Location
Missouri
Depends probably too on how bright the light is to begin with.

I can tell differences pretty well. But only when changing back and forth between 90% and 100% could you really tell. If you used one, then put it down and picked it up again 10 minutes later at 90% output, you'd have a hard time telling.
 

VidPro

Flashlight Enthusiast
Joined
Apr 7, 2004
Messages
5,441
Location
Lost In Space
IsaacHayes said:
Depends probably too on how bright the light is to begin with.

I can tell differences pretty well. But only when changing back and forth between 90% and 100% could you really tell. If you used one, then put it down and picked it up again 10 minutes later at 90% output, you'd have a hard time telling.

Yup, what he said. Going from 1W to 3W on my two-mode lights, I have to either compare them side by side, or switch modes while specifically looking at the light they put out to tell.
That's 3x as much light, and IF you put it down for 10 minutes you would have a hard time telling.

On the other hand, when I am trying to look at something far away in the dark, at the creek or in the woods, that extra 2W is all the difference in the world.

90% though? Even if it was only off for 2 seconds, I could probably not tell that small a difference.

My experience is power based, not lumen or lux specific; efficiency is higher at lower power with LEDs, but the basic idea is still the same.

If my light dropped slowly to even 80% while I was using it in a single session, I would probably not notice, because within that session my eyes would be adjusting to the situation. So if my light went down slowly, it would not bother me much.
I can always squint and try harder with the retina and brain, but if you don't have to, that's cool too :)

Aren't incandescent lights running on alkalines dropping MUCH worse than that all the time? Until output gets very low, most people don't have a big issue with it.
 

JohnK

Flashlight Enthusiast
Joined
Dec 7, 2002
Messages
1,534
Location
Tennessee., USA
Well, I'm the optometrist, but I can only tell you what I have learned here.

I did NOT believe the statement that you need about a 50% increase/decrease in brightness (lumens) to notice the difference. Totally rejected it. End of story!

Then I purchased a LOT of lights.

As stated above, unless you have the two lights on at the same time, and side by side, it is VERY difficult to tell rather large differences in brightness. That's the way it is.

I own a Fenix L1P and a Streamlight 4AA Luxeon. Quickbeam's tests show the Fenix has about 70% of the brightness of the SL. They have very similar beams, corona, and spill.

If I go outside in the DARK, no ambient light, and do not have the lights together, it is difficult to tell the difference. Side by side, yes.

Interesting subject, to say the least.
 

Roy

Farewell our Curmudgeon Administrator
Joined
Apr 14, 2002
Messages
4,465
Location
Granbury, Tx USA
When I joined CPF back in the spring of '02, the rule of thumb was that it took about a 50% change in the light level before MOST people would notice the change. That became the basis for defining runtime as the time it takes a light to drop to 50% of MAX brightness.
 

Yooper

Enlightened
Joined
Nov 2, 2005
Messages
462
Location
Upper Peninsula of Michigan
I'm not an optometrist (OD). I'm an ophthalmologist (MD). Because of the logarithmic response curve, with an average healthy visual system most people require about a 50% difference in brightness to distinguish a difference. You can train yourself to do better, but a 10% difference would be really tough to see, even with side by side comparison.
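
To put the thread's rule of thumb in concrete terms, here is a small Python sketch. The assumptions are the ones quoted above (a roughly 50% change as the level most people notice, and dimming in even 10% steps); the threshold is the posters' figure, not a clinical constant.

import math

# Count how many successive 10% drops it takes before total output has
# fallen to 50% of the original, the rough "noticeable" point cited above.
step = 0.90          # each drop keeps 90% of the previous output
threshold = 0.50     # assumed noticeable level relative to full output

steps = math.ceil(math.log(threshold) / math.log(step))
print(steps)                 # 7 -> about seven back-to-back 10% drops
print(step ** steps)         # ~0.478, just under the 50% threshold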
 
