VaThInK said:
May I ask why near-UV in particular? I thought the dangerous ones are IR and beyond, as the wavelength gets longer. Hint: a CO2 laser at 10.6 µm could just as well be categorised as microwave, and it sees any material as opaque, even clear glass.
The shorter the wavelength, the more dangerous. The reason is that energy in the form of light is transmitted as photons. Say there are two light sources with identical properties (power etc.), except one is IR and the other UV. Both send the same total energy in a given time, but the IR sends it as a whole bunch of low-energy photons, while the UV sends it as fewer, higher-energy photons. In the case of the UV, any one of those photons is energetic enough that a collision with a skin cell or eye cell can kill the cell or cause a mutation -- that is why UV light causes sunburn (by killing cells) and cancer (by causing a cell mutation). No matter how much power there is in the IR, though, no single photon collision will ever be energetic enough to cause a cell mutation.
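To put rough numbers on the photon-energy argument, here's a quick sketch (the 808 nm / 355 nm wavelengths and the 5 mW power are just illustrative assumptions, not figures from the post above):

```python
# Compare the energy carried by a single photon at an IR vs a UV wavelength,
# and how many photons per second a 5 mW beam of each would deliver.
h = 6.626e-34      # Planck constant, J*s
c = 3.0e8          # speed of light, m/s
eV = 1.602e-19     # joules per electronvolt

power_W = 5e-3     # assumed 5 mW beam for both sources

for name, wavelength_m in [("near-IR (808 nm)", 808e-9), ("near-UV (355 nm)", 355e-9)]:
    photon_energy_J = h * c / wavelength_m          # E = h*c / lambda
    photons_per_second = power_W / photon_energy_J  # same total power, fewer photons if each carries more energy
    print(f"{name}: {photon_energy_J / eV:.2f} eV per photon, "
          f"{photons_per_second:.2e} photons/s at {power_W * 1e3:.0f} mW")
```

At the same power, each UV photon carries a bit over twice the energy of a near-IR photon, so the UV beam delivers correspondingly fewer photons per second.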
The other way that light can cause damage is through heating -- essentially, all the energy from the photons is absorbed. Wavelength is irrelevant when talking about heating up things like black tape, match heads, etc. In the case of heating damage on the retina, green is going to be the worst, as that is the wavelength that the eye focuses most sharply.
If I understand your post correctly, yes, this is partly what I'm trying to say. If IR light is not well collimated, it's essentially an ordinary flashlight that just happens to be invisible to the human eye, and that is harmless.
While I don't think the IR from a typical pointer is nearly as dangerous as the actual green light coming from the laser, it is more dangerous than an IR flashlight, because the light comes from a much smaller point source.
Consider which is more irritating to your eyes -- looking at a 50W frosted incandescent lamp or a 50W CLEAR incandescent lamp. Both have a similar spectrum, and your eye absorbs a similar amount of power in both cases, but staring at the clear lamp would be worse for you, since all the light is concentrated into a small image of the filament on your retina instead of a large image of a frosted bulb.
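As a back-of-the-envelope illustration of that point (the retinal image sizes and the 0.1 mW figure below are assumed, not measured values):

```python
import math

# The same optical power landing on the retina is far more intense when it is
# imaged into a tiny spot (clear filament) than when it is spread over a large
# blur (frosted bulb).
power_on_retina_W = 1e-4  # assume 0.1 mW reaches the retina in both cases

def irradiance(power_W, image_diameter_m):
    """Average irradiance (W/m^2) over a circular retinal image."""
    area = math.pi * (image_diameter_m / 2) ** 2
    return power_W / area

# Assumed retinal image sizes: ~0.05 mm for a small sharp filament image,
# ~1 mm for the large blurred image of a frosted bulb.
for label, diameter_m in [("clear filament image", 50e-6), ("frosted bulb image", 1e-3)]:
    print(f"{label}: {irradiance(power_on_retina_W, diameter_m):.0f} W/m^2")
```

Same power on the retina, but the small filament image works out to an irradiance a few hundred times higher than the big frosted-bulb blur.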
I just wanted to point out that a major reason people complain about the IR in green lasers is that it's used as a falsifying marketing tool. The laser is "100mW", but that's 30mW of green and 70mW of IR! (IR is also cheap -- a 1W IR diode is like $20, or $10 or less in bulk -- while true green laser light is currently very expensive per mW, though this will change when green diodes are available.)
Most of us know the dangers of IR; we're just mad about the marketing scam. Think of it like going to a burger store and ordering the LARGE fries, only to find out that the box is taller -- but only half as deep, so you're getting as many or FEWER fries than the small size for the same $$$!
In most cases, if someone is going to be dishonest about the power ratings, they'll just pull a number out of thin air rather than systematically measure the IR and fold it in. However, I've read lots of complaints about, for example, the 20mW lasers from DX, which apparently do emit 20mW of green plus some amount of IR on top -- and there the complaints are driven by the potential safety concerns.
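For anyone who wants to check their own pointer, here's a rough sketch of the "measure the IR separately" idea: take one power-meter reading of the raw beam and one through an IR-blocking filter, then solve for the two components. The filter transmissions and readings below are made-up illustration values, not data for any real laser:

```python
# Hypothetical worked example: estimate the green and IR components of a
# green pointer's output from two power-meter readings.
t_green = 0.90   # assumed filter transmission at 532 nm
t_ir    = 0.01   # assumed filter transmission at the IR wavelengths

total_mW    = 100.0   # reading with no filter: green + IR
filtered_mW = 27.7    # reading through the IR-blocking filter

# filtered = t_green*green + t_ir*ir   and   total = green + ir
green_mW = (filtered_mW - t_ir * total_mW) / (t_green - t_ir)
ir_mW = total_mW - green_mW

print(f"estimated green: {green_mW:.1f} mW, estimated IR: {ir_mW:.1f} mW")
```

With those made-up readings, the maths lands on roughly 30 mW of green and 70 mW of IR -- the same kind of split as the "100mW" example above.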
I have wondered for a while: how much IR are you getting while gazing at a roaring campfire for several hours? It's obviously putting out a LOT of watts.
Again, it's not so much the wattage that's relevant, but how concentrated that wattage is on your retina. A fire, or a hot road surface on a summer day, for example, is a diffuse light source. An IR LED or an IR laser is not.
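A rough comparison of the two cases (all of the numbers here are assumed, ballpark figures, not measurements):

```python
import math

# How much power actually enters a ~7 mm pupil from a diffuse ~1 kW campfire
# a couple of metres away, versus a collimated 70 mW IR beam narrow enough to
# fit entirely through the pupil.
pupil_diameter_m = 7e-3
pupil_area_m2 = math.pi * (pupil_diameter_m / 2) ** 2

# Diffuse source: radiant power spreads roughly over a sphere around the fire.
fire_power_W = 1000.0        # assumed radiant output of a campfire
distance_m = 2.0
sphere_area_m2 = 4 * math.pi * distance_m ** 2
fire_into_pupil_W = fire_power_W * pupil_area_m2 / sphere_area_m2

# Collimated source: essentially the whole beam can pass through the pupil,
# and the eye then focuses it onto a tiny spot on the retina.
laser_into_pupil_W = 70e-3

print(f"campfire at 2 m: ~{fire_into_pupil_W * 1e3:.2f} mW through the pupil (spread over a large retinal image)")
print(f"70 mW IR beam:   {laser_into_pupil_W * 1e3:.0f} mW through the pupil (focused to a small spot)")
```

Even before focusing, only a fraction of a milliwatt of the campfire's output actually enters the pupil, and that is then spread across a large retinal image; the collimated beam can put its full 70 mW through the pupil and onto a tiny spot.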