Led-FX,
Thanks for the links. I followed them, and I think that Peter actually has a nugget of an original idea. He has also 'spilled the beans' on the basic concept, and by 'offering to sell' without having done the patenting work, he has greatly reduced the value of the patent rights that an investor could get.
His comment "1 LED = Diode = 1/2 thermocouple, how can I say much about how it works without giving the game away." is the kicker.
You correctly mention the difference between a thermocouple and a Peltier-effect device, but they are in essence the same thing, simply optimized differently. If you place a temperature difference across a Peltier-effect device, you obtain a voltage difference that you can measure and use to evaluate the temperature difference. If you power a thermocouple, then one junction will heat up and the other will cool down (or heat up less... as I^2R heating might overwhelm the cooling effect).
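To illustrate that reciprocity: the same Seebeck coefficient S governs both uses of a junction, giving an open-circuit voltage V = S * dT when used as a thermocouple and pumping heat Q = S * T * I when driven as a Peltier device. A quick sketch (the value of S is my own assumed, ballpark figure for a semiconductor couple, not a measured one):

```python
# One junction, two behaviors, one coefficient.
# S is an assumed, plausible Seebeck coefficient; not a measurement.
S = 200e-6   # V/K, assumed Seebeck coefficient of the couple
dT = 50.0    # K, temperature difference across the junction pair
T = 300.0    # K, absolute operating temperature
I = 1.0      # A, drive current in heat-pump mode

v_seebeck = S * dT     # thermocouple use: measurable open-circuit voltage
q_peltier = S * T * I  # heat-pump use: heat absorbed at the cold junction

print(f"Seebeck voltage: {v_seebeck * 1e3:.1f} mV")
print(f"Peltier heat:    {q_peltier * 1e3:.0f} mW per amp")
```

Same device physics either way; only the engineering optimization differs.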
I believe that what Mr. Lowrie is trying to sell is the idea of alternately operating the LED in an overpowered pulse mode, then operating it as the heat-absorbing junction of a thermoelectric heat pump. The goal is to run at higher average power while improving the heat-dissipating capability of the LED. If he can successfully get the LED to 'refrigerate itself', then he does have something novel and interesting.
I would call the above concept plausible technobabble :) I have grave reservations about the technical viability of the concept, and I also have reservations about its economic value even if it does work. However, it is enough that I can't dismiss Lowrie's claims as simply impossible.
If his device works as I suggest, then during the pulse period you are operating the LED at higher power and (slightly) lower efficiency. I say slightly because most of the efficiency loss when an LED is operated at higher power comes from the increased junction temperature... if he is holding the junction temperature constant, then the efficiency loss is reduced. Then, during the 'refrigeration' period, still more power is consumed pumping the heat away from the LED, presumably heating up a junction elsewhere in the driver circuit. Net result: more light output from a given LED, with more power consumption and more heat dissipated, but with the LED itself running cooler and a mythical 'other junction' getting hotter.
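The power bookkeeping for that cycle can be sketched with some made-up numbers (every figure below is an assumption for illustration, not a measurement of any real device):

```python
# Back-of-envelope bookkeeping for the pulse/refrigerate cycle.
# All numbers are invented for illustration only.
duty = 0.5       # fraction of each cycle spent in the light pulse
p_pulse = 4.0    # W, electrical power during the overdriven pulse
eff_pulse = 0.25 # assumed optical efficiency during the pulse
p_pump = 1.0     # W, electrical power spent pumping heat afterwards
cop = 0.5        # assumed coefficient of performance of the heat pump

light_avg = duty * p_pulse * eff_pulse            # average optical watts out
elec_avg = duty * p_pulse + (1 - duty) * p_pump   # average electrical watts in
heat_at_die = duty * p_pulse * (1 - eff_pulse)    # heat generated at the LED die
heat_pumped = (1 - duty) * p_pump * cop           # heat moved off the die

print(f"avg light out:      {light_avg:.2f} W")
print(f"avg power in:       {elec_avg:.2f} W")
print(f"net heat at die:    {heat_at_die - heat_pumped:.2f} W")
print(f"heat shed elsewhere: {elec_avg - light_avg - (heat_at_die - heat_pumped):.2f} W")
```

Note the pattern matches the 'net result' above: total power in and total heat out both rise, while the die itself carries less of the heat burden.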
I question the technical viability of the LED as a thermoelectric junction because the potential barriers are all wrong. The energy of a visible-light photon is far greater than the energy of a thermal electron at room temperature, so I simply don't think an LED junction would make a good thermoelectric junction. Current flow and heat transport would be very small, and IMHO the resistance-heating effects in the package would swamp any cooling effect.
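The mismatch in energy scales is easy to put numbers on. Using standard physical constants and a representative 550 nm (green) wavelength, which is my choice of example:

```python
# Photon energy vs. thermal energy at room temperature.
# Constants are standard; the 550 nm wavelength is a representative
# visible-LED choice, not a value from the original discussion.
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K
eV = 1.602e-19  # joules per electron-volt

wavelength = 550e-9                       # m, typical green LED
photon_energy = h * c / wavelength / eV   # eV per emitted photon
thermal_energy = k * 300 / eV             # eV, kT at ~room temperature

print(f"photon:  {photon_energy:.2f} eV")
print(f"thermal: {thermal_energy:.3f} eV")
print(f"ratio:   {photon_energy / thermal_energy:.0f}x")
```

The photon carries roughly two orders of magnitude more energy than kT, which is why a barrier tall enough to emit visible light is poorly matched to moving thermal-scale electrons.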
I question the economic viability because LEDs themselves are already 'bright enough'. You can already purchase LEDs that are far too bright to look at. This means that for indicator and direct-viewing applications you don't need the 'LED accelerator'; simply select a brighter LED and use a normal constant-current supply. For illumination applications brighter LEDs would be potentially useful, but there you are fighting the availability of things like compact fluorescent lights, which are cheaper and more efficient. Any system which makes LEDs _less_ efficient will diminish their ability to compete. Finally, the cost of this driver circuit would have to be lower than the cost of simply using _more_ LEDs. Why buy one '4x accelerated' LED if I could buy four normal LEDs?
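To make the efficiency side of that comparison concrete (all of these numbers are invented for argument's sake; they describe no real product):

```python
# Crude efficacy comparison: one hypothetical '4x accelerated' LED
# versus four ordinary LEDs at normal drive.  All numbers are
# assumptions for illustration only.
lumens_per_led = 100.0   # assumed output of one normal LED
p_per_led = 1.0          # W, assumed power of one normal LED

# Option A: four normal LEDs.
light_4x = 4 * lumens_per_led
power_4x = 4 * p_per_led

# Option B: one accelerated LED producing the same 4x light, but the
# overdrive losses plus heat-pump power cost an assumed 25% extra input.
light_acc = 4 * lumens_per_led
power_acc = 4 * p_per_led * 1.25

print(f"4 normal LEDs:     {light_4x / power_4x:.0f} lm/W")
print(f"1 accelerated LED: {light_acc / power_acc:.0f} lm/W")
```

Under any assumption where the pump consumes extra power, the accelerated LED delivers the same light at worse efficacy, so it only wins if the driver is cheaper than three more LEDs.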
-Jon