Vermonter73
Enlightened
- Joined
- Jul 25, 2006
- Messages
- 335
Anyone heard good news or rumors about what's next in LEDs? I heard something about a new Cree that's 160 lumens per watt, but I forget the details.
I'm hoping for anti-droop technology so the efficiency stays high at higher currents. They can already hit over 100 lumens per watt; now it's about keeping that efficiency at higher power. It might require a new phosphor and better heat transfer.
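The droop effect mentioned here can be sketched numerically. This is a toy model with made-up figures (the 100 lm/W peak, the droop constant, and the 3.3 V forward voltage are illustrative assumptions, not data for any real emitter); it just shows why doubling the drive current delivers less than double the light:

```python
# Toy model of LED current droop: efficacy (lumens per watt) falls as
# drive current rises. All numbers are hypothetical, for illustration only.

V_F = 3.3  # assumed forward voltage in volts, treated as constant here

def efficacy(current_a, peak=100.0, droop=1.0):
    """Hypothetical droop curve: efficacy decays as current increases."""
    return peak / (1.0 + droop * current_a)

def lumens(current_a):
    """Light output = efficacy at this current * electrical power in."""
    return efficacy(current_a) * V_F * current_a

for i in (0.35, 0.70, 1.40):
    print(f"{i * 1000:.0f} mA: {efficacy(i):.0f} lm/W, {lumens(i):.0f} lm")
```

Under these assumed numbers, doubling the current from 350 mA to 700 mA gives only about 1.6x the light, which is the whole complaint about droop.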
Why don't they make LED bulbs with the phosphor on the glass and the die where the filament would be, like a fluorescent?
The phosphor would run cooler and each patch of it would be converting far less blue light to white, so perhaps it would be much less droopy?
Sorry, the multi-die analogy is 100% wrong. In fact, it's nearly completely the opposite...
Then explain in detail why I'm wrong rather than saying I am.
"Do not assume the 'Flashaholic' designation under my name assumes I give a flip about flashlights, because I don't."
Neither is the case for LEDs. The point is that for CPUs, electrical power and processing power are in no direct relationship. LEDs, otoh, are perfectly "parallel", and their performance is proportional to their electrical consumption.
Only in parallel, just like computer processors. Doubling the current to a single LED chip *doesn't* deliver twice the light, and doubling the clock of a computer processor requires more than doubling the power fed to it. Using more LED chips on a single die *does* deliver twice the light at double the power, and two processor cores *do* double processing power; the added technology requirement is minimal. Intel and AMD dove into multi-core CPUs because per-die economics dictated the move, and the people buying the new multi-core processors had software that could support and take advantage of SMP. Just like CPUs, LEDs have become efficient enough, with the color problems solved, that more chips is really the way to go. That is, unless you know of some 400-lumen, single-chip, warm-white Cree I can buy.
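The parallel-scaling argument can be put in numbers. Assuming a hypothetical emitter that does 100 lm/W at 350 mA but only about 74 lm/W at 700 mA because of droop (made-up figures for illustration, not data for any real part), two dies driven gently beat one die driven hard at the same input power:

```python
# Hypothetical efficacy figures for one emitter die (illustrative only).
EFFICACY = {0.35: 100.0, 0.70: 74.0}  # drive current (A) -> lumens per watt
V_F = 3.3  # assumed forward voltage in volts

def light_out(n_dies, current_a):
    """Total lumens and total watts for n identical dies in parallel."""
    power = n_dies * V_F * current_a
    return power * EFFICACY[current_a], power

one_die, p1 = light_out(1, 0.70)   # one die driven hard
two_dies, p2 = light_out(2, 0.35)  # two dies driven gently
print(f"1 die @ 700 mA: {one_die:.0f} lm at {p1:.2f} W")
print(f"2 dies @ 350 mA: {two_dies:.0f} lm at {p2:.2f} W")
```

Same electrical power in both cases, but the multi-die arrangement puts out more light, which is the economic case for MC-E-style packages.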
So there is no inherent advantage of multi-die or bigger dies.
Perhaps I made a mistake of semantics in my first post, but I was referring to MC-E and Bridgelux-type architectures, and these do have significant advantages over single chips if you simply add up the cost of reaching a desired lumen level. Except for Intel strapping multiple P4 Xeons onto a single package and calling it "dual core" for a while, the analogy is sound. Multiple LEDs on a single die are likely to be the trend for quite a while, because single-chip technology doesn't seem to be going anywhere beyond occasional news blurbs from some engineering wing of a semiconductor maker looking to notch up its stock price at the end of a quarter on hype. Seriously, many of you are naive to think otherwise.
Can someone give some insight into how today's LEDs, say an R2, compare to a top-of-the-line LED from 18 months ago? Would that have been a Luxeon III?
The R2 was released about 18 months ago.
Do LEDs follow Moore's Law the same way computers do?
Haitz's Law states that every decade, the cost per lumen of light-emitting diodes falls by a factor of 10, while the amount of light generated per LED package increases by a factor of 20, for a given wavelength (color) of light.
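Those per-decade factors can be converted to rough per-year rates, as a back-of-the-envelope sketch (only the 10x and 20x per decade come from the law; everything else is simple compounding):

```python
# Haitz's law: per decade, cost per lumen falls ~10x and flux per
# package rises ~20x. Convert those to compound annual rates.
cost_factor_per_year = 10 ** (1 / 10)  # ~1.26x cheaper each year
flux_factor_per_year = 20 ** (1 / 10)  # ~1.35x brighter each year

print(f"cost falls ~{(1 - 1 / cost_factor_per_year) * 100:.0f}% per year")
print(f"flux rises ~{(flux_factor_per_year - 1) * 100:.0f}% per year")

# On this trend, a 100 lm package today would be expected to reach
# roughly this many lumens per package in 5 years:
print(f"{100 * flux_factor_per_year ** 5:.0f} lm")
```

That works out to roughly 35% more light per package each year, which is slower doubling than the classic Moore's Law pace but the same kind of exponential trend.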