I have a simple question, which I shall put in context.

Say I have an LED array wired in series that runs on 35 V and consumes 3 W. If I wanted to run the same LEDs off a single LiPo cell, I see two options: either rewire the array in parallel so it runs at roughly the cell voltage (~3–4 V), or use a step-up converter (voltage multiplier) to raise the battery's output to the required 35 V.

The question is, which is more efficient?

My understanding of the power law (P = V × I, rather than Ohm's law proper) is that running the array in parallel at ~3 V would draw about 1 A, whereas the series arrangement at 35 V needs only around 0.086 A. However, a step-up converter is inherently less than 100% efficient, so some of the battery's energy would be lost in the conversion.
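To make the comparison concrete, here is a quick sketch of the arithmetic. The 90% converter efficiency and 3.7 V nominal LiPo voltage are assumptions for illustration, not figures from the question:

```python
# Current for the same 3 W load at different voltages: I = P / V.
def current_draw(power_w, voltage_v):
    return power_w / voltage_v

series_i = current_draw(3.0, 35.0)   # series array at 35 V -> ~0.086 A
parallel_i = current_draw(3.0, 3.0)  # parallel array at ~3 V -> 1.0 A

# With a boost converter, the battery still supplies the full load
# power plus conversion losses: I_batt = P / (eta * V_batt).
eta = 0.90          # assumed converter efficiency
v_batt = 3.7        # nominal single-cell LiPo voltage
boost_batt_i = 3.0 / (eta * v_batt)

print(f"series: {series_i:.3f} A, parallel: {parallel_i:.3f} A")
print(f"battery current via 90% boost converter: {boost_batt_i:.2f} A")
```

Note that either way the battery delivers roughly the same ~3 W; the parallel arrangement draws 1 A directly, while the boost route draws a bit under 1 A plus the converter's losses, so the real efficiency question is the converter's loss versus any current-sharing/resistor losses in the parallel wiring.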


I just hacked open an LED light bulb that I got cheap. With the diffuser it claims 240 lumens, though I am never really sure how manufacturers measure this, as I have never seen an industry-standard lumen meter. (I am a photographer, and all my light meters use other scales; since lamp makers all seem to use their own invented scales, I long ago gave up on comparing lights' output by number.) Since the bulb runs off 240 V, the current may be even lower than in my battery example. Oddly, the driver board uses components I can't even find datasheets for, but I think it has both a bridge rectifier and a MOSFET.