PhotonWrangler
Flashaholic
Got a question about the Andover LPM-1 laser power meter that was part of a CPF group buy a while back.
I have a couple of pointers based on the 405nm Blu-Ray diode, so out of curiosity I aimed them at the sensor on my LPM-1 and got a reading that was approximately double the rated power on the sticker. What I don't know is whether the sensitivity of the photodetector is higher at shorter wavelengths or whether the LDs are actually running hotter than the stated power level.
Does anyone know what the sensitivity curve is on the LPM-1 meter's PD, and whether there is a compensation value that can be used to calculate the actual reading when used on the 473nm setting?
**UPDATE**
Looking down the bore of the LPM-1, I see what appears to be a polycrystalline silicon photovoltaic cell. A quick search shows that these cells are slightly less sensitive to blue than to green, with sensitivity peaking toward the NIR region. Switching the LPM setting between 532nm and 473nm appears to add more gain for the shorter wavelength to compensate for the lower sensitivity, as my Blu-Ray pointers read a little higher on the 473nm setting. The reading difference is about 3.5mW across the 59nm span between 532nm and 473nm, or roughly 0.059mW per nm. Since the sensitivity curve of a Si cell seems fairly linear (although not flat) between 400-600nm, is it safe to assume an offset of about 0.059mW for each nm below the 473nm setting, i.e. an additional 66 x 0.059 ≈ 3.89mW added to the reading when measuring a ~407nm diode on the 473nm setting?
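In case anyone wants to sanity-check my math, here's a quick Python sketch of the linear correction I'm assuming. The 0.059mW/nm slope comes from my own two readings, not from any calibration data for the LPM-1, so treat all the numbers as a guess:

```python
# Rough linear model of the LPM-1's wavelength compensation, based on my
# two readings. Assumes (my guess, not calibrated data) that the Si cell's
# responsivity is roughly linear between ~400nm and ~600nm.

READING_DIFF_MW = 3.5   # reading difference I saw between the two settings (mW)
WL_GREEN_NM = 532       # green (532nm) setting
WL_BLUE_NM = 473        # blue (473nm) setting

def correction_mw(actual_nm: float) -> float:
    """Estimated mW to add to a reading taken on the 473nm setting
    when the actual source wavelength is actual_nm (below 473nm)."""
    slope = READING_DIFF_MW / (WL_GREEN_NM - WL_BLUE_NM)  # ~0.059 mW per nm
    return slope * (WL_BLUE_NM - actual_nm)

print(f"slope: {READING_DIFF_MW / (WL_GREEN_NM - WL_BLUE_NM):.3f} mW/nm")
print(f"correction for a ~407nm diode: {correction_mw(407):.2f} mW")  # ~3.9 mW
```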
PW