blasterman
Flashlight Enthusiast
- Joined
- Jul 17, 2008
- Messages
- 1,802
Since I don't have a spectrometer, I've been trying to come up with a way to measure relative intensity between specific LEDs with as little spectral bias as possible, and I'm drawing a blank. I deal with a lot of large reef tank builds, and given the sheer volume of off-brand LEDs I need some way to measure their relative intensity. I can keep it as simple as running a Cree at a specific current as the benchmark, then determining what intensity level LED 'X' reaches in the same config, given that 99% of everything else is going to be less efficient anyway. I really don't care about color readings - I just want to determine how much less energy LED 'X' is emitting compared to the benchmark. That has to make things easier.
I thought about borrowing a trick from the laser geeks: use a small solar panel and take voltage/current readings off it with a multimeter while bare LEDs illuminate it at close range.
The problem is that I need to take readings from both royal blues and various whites, and depending on composition, solar cells start to get terribly non-linear around 460nm. Once I start throwing in that math, along with actually trying to determine the offset, I simply have too many variables to keep straight.
Any ideas appreciated.
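One way to frame the offset math is as a single correction factor per LED type. This is only a sketch of that idea: the responsivity numbers below are made-up placeholders (a real calibration would come from the solar cell's datasheet responsivity curve or a reference measurement), and the short-circuit currents in the example are hypothetical.

```python
# Sketch: comparing LED output via a solar cell's short-circuit current,
# corrected for the cell's spectral responsivity. All numbers here are
# placeholder assumptions, not real calibration data.

# Relative responsivity of the cell for each LED's spectrum, normalized
# to the benchmark LED. Silicon cells respond more weakly to royal blue,
# which is why the raw photocurrent under-reports blue output.
RESPONSIVITY = {
    "cool_white": 1.00,   # benchmark: Cree at a fixed drive current
    "royal_blue": 0.62,   # placeholder -- falls off near ~460 nm
}

def relative_intensity(i_sc_test, i_sc_bench, kind):
    """Ratio of test-LED output to the benchmark LED's output.

    i_sc_test / i_sc_bench is the raw photocurrent ratio; dividing by
    the cell's relative responsivity for that LED's spectrum removes
    the spectral bias, to first order.
    """
    return (i_sc_test / i_sc_bench) / RESPONSIVITY[kind]

# Example: a royal blue reads 3.1 mA vs 5.0 mA for the benchmark.
# The raw ratio says 62%, but after correcting for the cell's weaker
# blue response it is emitting about as much power as the benchmark.
ratio = relative_intensity(3.1, 5.0, "royal_blue")
print(round(ratio, 2))  # -> 1.0
```

The whole spectral headache collapses into measuring (or looking up) one responsivity number per LED color; everything after that is a single division.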