The cool white was around 514 lumens and the warm white around 450.
But I thought my 250hp car motor made 1000hp. You mean that can't be true either?

Claiming "75 watts of light" implies that you are in fact emitting enough light to equal 75 joules per second of light - 4.2 joules of light will heat a gram of water 1 degree C. This is a lot of light, as white LED light is (depending on several things like dominant spectra and such) about 300 lumens per watt. So about 22,500 lumens - not so shabby! But nobody can make "75 watts" of light by consuming "7.8 watts" of power. It's false on the face of it.
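For the record, here is that arithmetic spelled out (just a back-of-the-envelope sketch; the ~300 lumens per optical watt figure is the approximation used above):

```python
# Rough check of the "75 watts of light" claim, read literally as optical watts.
# Assumes ~300 lumens per optical watt for a typical white LED spectrum (approximate).
LUMENS_PER_OPTICAL_WATT = 300

claimed_optical_watts = 75     # "75 watts of light", taken at face value
input_electrical_watts = 7.8   # power the bulb actually draws

lumens_if_true = claimed_optical_watts * LUMENS_PER_OPTICAL_WATT
print(f"75 optical watts of white light ~ {lumens_if_true:,} lumens")  # 22,500 lm

# No lamp can emit more optical power than the electrical power it consumes,
# so 75 W of light from a 7.8 W draw is impossible no matter how efficient it is.
print(f"claimed light output vs. input power: {claimed_optical_watts / input_electrical_watts:.1f}x")
```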
If you stretch and say "Output equivalent to a 75 watt lightbulb," then we have to reference the output of 75 watt bulbs, which are about 1300 lumens. If the LED claimed emits 1300 lumens (with about 10% optics losses factored in) while consuming 7.8 watts, it has an astonishing 167 lumens per watt efficiency. This is slightly more efficient than Cree's best LED manages in lab conditions, but at 22 times the current.
They claim 470 lumens for this LED - while that's not bad for a CRI of 90 at varying color temperatures, its output is really only about that of a 30 watt incandescent bulb. So it's not a bad LED Edison fixture, considering that all LED Edison fixtures start out bad. Buy real LED lights that aren't crammed into a tiny insulated appliance in the first place!
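A quick check of both readings of the claim (a sketch using the figures from the posts above; the incandescent reference outputs are rough typical values, not from the article):

```python
# Reading 1: "equivalent to a 75 W incandescent" -- what efficacy would 1300 lm from 7.8 W imply?
claimed_equivalent_lumens = 1300   # roughly a 75 W incandescent, per the post
input_watts = 7.8
optics_loss = 0.10                 # ~10% optics loss, per the post

fixture_lm_per_w = claimed_equivalent_lumens / input_watts
emitter_lm_per_w = fixture_lm_per_w / (1 - optics_loss)
print(f"out of the fixture: {fixture_lm_per_w:.0f} lm/W")   # ~167 lm/W
print(f"at the bare LED:    {emitter_lm_per_w:.0f} lm/W")   # ~185 lm/W

# Reading 2: the 470 lumens actually claimed, next to rough typical incandescent outputs.
for watts, lm in [(25, 230), (40, 450), (60, 800)]:
    print(f"{watts} W incandescent ~ {lm} lm")
print("claimed LED output: 470 lm")  # lands in the 30-40 W incandescent range
```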
This sounds almost exactly like something I wrote in another thread. I hate the term "watts of light" with a purple passion. It has to be one of the most meaningless, confusing, and imprecise terms that ever existed. Even if you assume "watts of light" means incandescent bulb equivalent, not all incandescent bulbs are equal. Some 75 watt lamps emit about 1200 lumens; I've also seen ones which emit 700. And lower wattage bulbs always have lower lumens per watt, making the term even more nebulous. In short, if someone says "100 watts of light" and it's taken to mean incandescent equivalent, they could mean as high as 3500 lumens (i.e. short-life 100 watt projector bulbs) or as low as 400 lumens (twenty-five 4 watt night light bulbs). That's close to an order of magnitude difference! Simply put, "watts of light" is a term which should be relegated to the dustbins of antiquity.
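To put numbers on how wide that range is (a sketch using the two examples above):

```python
# How many lumens could "100 watts of light" mean, if read as "incandescent equivalent"?
high_end = 3500   # a short-life 100 W projector bulb, per the post
low_end = 400     # twenty-five 4 W night-light bulbs (~16 lm each), per the post
print(f"'100 watts of light' could mean anywhere from ~{low_end} to ~{high_end} lumens")
print(f"that's a factor of about {high_end / low_end:.0f}x - close to an order of magnitude")
```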
Well now you're claiming something completely different, technically. Either way, sounds a lot like bad advertising to me.

Buy the bulb and try it, you will see it's as bright as a 75 watt incandescent, it sure works for me.
Also, the math they're using for the lifetime is suspicious:
Each Array bulb will cost much more than its glass and filament 60W incandescent ancestor, but a lifespan of some 50,000 hours (three years permanently lit, or about 10 years or so of "normal" use) combined with the electricity savings you'll make, will compensate for the price.
50,000 hours is 5.7 years permanently lit, not 3 years. No idea what "normal use" is, but if you assume 8 hours per day, then that's about 17 years.
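The conversion, if anyone wants to check it (a sketch; 8 hours per day is just the assumption above):

```python
# Convert a 50,000 hour rated lifetime into years under different usage patterns.
rated_hours = 50_000

continuous_hours_per_year = 24 * 365   # 8,760 h
eight_hours_per_day = 8 * 365          # 2,920 h

print(f"permanently lit: {rated_hours / continuous_hours_per_year:.1f} years")  # ~5.7 years, not 3
print(f"8 hours per day: {rated_hours / eight_hours_per_day:.1f} years")        # ~17.1 years

# For comparison, "three years permanently lit" would only be:
print(f"3 years continuous = {3 * continuous_hours_per_year:,} hours")          # 26,280 h
```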
All we really need now is for the price of LEDs to come down.