# noob question on "wasted" voltage

#### lampeDépêche

##### Flashlight Enthusiast
When I drive a 3-volt LED with a 6-volt battery, what happens to the extra 3v?

Does the difference in voltage mean that half of the power in the battery is "wasted"?

My understanding of electronics is *really* weak -- if I can model it with hydraulic flow, then I can understand it (usually) but not more than that.

So when I think about a pond with 6 feet of head, driving a turbine that only needs 3 feet of head, then it seems to me that it will waste a lot of power. Water will come out downstream of the turbine with 3 feet of unused head, i.e. potential energy that has not been converted into useful work, which just flows downstream. Wasted.

Back to circuits, I imagine the electrons coming out of the battery with 6v worth of energy, then dropping down to 3v of energy after passing through the LED, and then they go back into the battery. That extra 3v could have done some more useful work before returning to the battery, if there was something else in the circuit (e.g., a second LED in series). So when it does *not* do more useful work, and just goes back into the battery, is that a pure waste?

A related way of putting the question:

Suppose I have a 3v cell and a 6v battery, and each of them is rated at 1Ah. So the first battery has 3 watt-hours of energy in it, the 6v battery has 6 watt-hours of energy in it.

Will the 6v battery drive the LED for longer than the 3v cell will? As much as twice as long? If not, what happened to the extra 3 (or however many) watt-hours of energy in the bigger battery?

I know that for many of you people this will be a painfully stupid question, but I'll be grateful for any illumination (so to speak).

#### archimedes

##### Flashaholic
Re: noob question on "wasted" voltage

That is a simple question with a complicated answer.

Although I am not an expert either, this is a good place to start reading ...

https://www.candlepowerforums.com/v...n-regulation&p=2632695&viewfull=1#post2632695

After you get somewhat more comfortable with the basic concepts of buck, boost, PWM, and related regulation circuits, then more focused searching may be very informative.

It is an interesting area to learn, cheers

#### lampeDépêche

##### Flashlight Enthusiast
Re: noob question on "wasted" voltage

Thanks, Archimedes!

I am familiar with the concept of a buck circuit, which reduces the voltage before it gets to the LED.

But I have run lots of 3v LEDs straight from a 9-volt battery with nothing other than a resistor in the circuit. They worked fine -- did not burn out the LED, and ran for weeks or months (depending on the resistor and lumen output).

So were those resistors functioning as a super-simple "buck," reducing the effective voltage to the LED? Or was the LED getting the full 9-volts and wasting some? Or how should I think about it?

#### john61ct

##### Newly Enlightened
Voltage is just a difference in **potential**.

No "there there" to get "wasted".

The **power** is the energy, what gets used efficiently or not.

Like "water pressure" as opposed to actual water volume flow.

If you double the water pressure without changing anything else, you double the flow of water.

Think of a large water tower. Two towers with the same volume, but the one that is much higher off the ground than the other has higher capacity to do work. The water comes out at a much higher pressure.

That's the voltage part.

#### kpatz

##### Newly Enlightened
> When I drive a 3-volt LED with a 6-volt battery, what happens to the extra 3v?
Voltage can be thought of as "pressure" if you think of water. The voltage overcomes resistance to allow a current to flow. At a given resistance (pipe thickness), the higher the voltage (pressure), the more current (amount of water) will flow.

If you drive a 3 volt LED directly from a 6 volt battery (directly meaning no driver, regulator or resistor between the battery and LED), the LED will burn out, because too much current will flow. Something has to limit the current to the range the LED will accept.

Google "Ohm's Law" to understand the relationship. Current (I) = Voltage (E) / Resistance (R). You can also solve for voltage or resistance: R = E / I; E = I * R.
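If it helps to see those three forms as arithmetic, here is a quick Python sketch (the 6V / 1000 ohm numbers are just illustrative):

```python
# Ohm's law: I = E / R, plus the rearranged forms.
def current(voltage, resistance):
    """Current in amps, given volts and ohms (I = E / R)."""
    return voltage / resistance

def resistance(voltage, current):
    """Resistance in ohms, given volts and amps (R = E / I)."""
    return voltage / current

# 6 V across 1000 ohms pushes 0.006 A (6 mA):
print(current(6.0, 1000.0))
```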

> Does the difference in voltage mean that half of the power in the battery is "wasted"?
Some loss will occur depending on how the regulation is achieved. A simple series resistor burns off the excess voltage as heat, though for low-current LEDs this is fine since the amount of power involved is so small. For a high-current LED like in a flashlight, a more efficient driver is usually used, typically a buck or boost converter, which uses fast switching instead of a resistor to control the total power.

> Back to circuits, I imagine the electrons coming out of the battery with 6v worth of energy, then dropping down to 3v of energy after passing through the LED, and then they go back into the battery. That extra 3v could have done some more useful work before returning to the battery, if there was something else in the circuit (e.g., a second LED in series). So when it does *not* do more useful work, and just goes back into the battery, is that a pure waste?
The total voltage across the circuit will remain the same, so if you connect a 3 volt rated LED across a 6 volt battery, there will be 6 volts across the LED (less any drop due to resistance in the battery which we'll ignore for simplicity's sake). Your LED would burn out. If you use a resistor in series with the LED, you would end up with 3V across the LED (producing light and some heat) and 3V across the resistor (producing just heat). Is it wasted energy? Maybe, maybe not. If you use the heat, it's not wasted. If you just want light and heat is an unwanted side effect, then yes, it's technically wasted. This is unavoidable with today's technology.
> A related way of putting the question:
>
> Suppose I have a 3v cell and a 6v battery, and each of them is rated at 1Ah. So the first battery has 3 watt-hours of energy in it, the 6v battery has 6 watt-hours of energy in it.
>
> Will the 6v battery drive the LED for longer than the 3v cell will? As much as twice as long? If not, what happened to the extra 3 (or however many) watt-hours of energy in the bigger battery?
Assuming you limit the current with a series resistor so the LED draws the same current from each battery, both batteries will drive the LED for the same amount of time, since both batteries are 1 amp-hour. The 6V battery would just waste more power by heating its series resistor more.

If you use a buck regulator instead of a resistor, the loss is less and you would get more runtime out of the 6V battery, but not as much as you would if you used a 2 Ah 3V battery instead.
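To put rough numbers on that comparison (the 20mA LED current and 90% buck efficiency below are assumed values, not from this thread):

```python
# Runtime comparison: a 3 V LED at 20 mA from a 1 Ah battery,
# regulated either by a series resistor or a buck converter.
LED_V, LED_I, CAPACITY_AH = 3.0, 0.020, 1.0

# With a series resistor, the battery supplies the full LED current,
# so a 1 Ah battery lasts the same time whether it's 3 V or 6 V.
resistor_runtime_h = CAPACITY_AH / LED_I  # 50 hours

# With a buck converter on a 6 V battery, power (not current) is
# conserved, so the battery-side current is lower and runtime is longer.
BATT_V, EFF = 6.0, 0.90                       # assumed 90% efficient buck
batt_current = (LED_V * LED_I) / (EFF * BATT_V)
buck_runtime_h = CAPACITY_AH / batt_current   # about 90 hours
print(resistor_runtime_h, buck_runtime_h)
```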

#### terjee

##### Enlightened

The question is a 6V battery into a 3V LED, with only a resistor to regulate.

Now, the easiest way to look at this might be to switch from voltage to current. The LED might want 20mA of current, and the resistor slows things down until only 20mA is flowing, and the LED is happy.

That same slowing is also dropping the effective voltage that the LED sees down to 3V, but it's the current that matters in terms of how bright, and avoiding letting the magic smoke out of the LED.

So where does the last 3V disappear? Across the resistor.

If you have a cheap multimeter, you can actually measure this.

If we simplify just a little bit, then you should see:
Measure the voltage across the battery - 6V
Measure the voltage just before and after the LED - 3V
Measure the voltage just before and after the resistor - 3V

So you're dropping half the voltage across the LED, and half across the resistor.

And yeah, you are turning a bit of energy into heat that way, but at 20mA and 3V, we're only talking about 0.06W, which isn't a whole lot.
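That 0.06W figure is just the voltage drop across each part times the shared series current:

```python
# Power split in the 6 V battery / 3 V LED / resistor circuit at 20 mA.
I = 0.020                # amps flowing through the whole series circuit
v_led, v_res = 3.0, 3.0  # volts dropped across the LED and the resistor

p_led = v_led * I        # 0.06 W spent in the LED (light plus some heat)
p_res = v_res * I        # 0.06 W spent in the resistor (all heat)
print(p_led, p_res)
```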

Now, getting back to the examples: if all you have is a resistor to regulate, then a 1Ah 3V battery and a 1Ah 6V battery would power the LED equally long, but the 6V battery could also have powered two LEDs in series equally long. You'd just be dropping the voltage across two LEDs, rather than an LED and a resistor.

All of this is slightly simplified, but makes sense?

#### lampeDépêche

##### Flashlight Enthusiast
Thanks, kpatz and terjee, that does make sense, and answers many of my questions.

It tells me, to start with, that half of the energy in the 6v cell would be wasted as heat, in a simple LED+resistor circuit.

One of my mistakes was thinking that when I put a 3v LED on a 9v cell with a resistor, then the LED is exposed to all 9v. Now I see that that is false. The resistor itself acts as a really stupid buck driver, lowering the voltage to something that the LED can tolerate.

Previously, I have not thought of the resistor as reducing the voltage, just the current. I think of resistors as current-constrictors, not as voltage-regulators. So when I pick a resistor for the job, I think, "okay, I want to feed this LED about 10 milliamps from a 9v source. If I pick a resistor of about 900 ohms, that should do the trick, since I = E/R, which in this case is 0.010 = 9/900."

So I find a resistor of roughly 900 ohms, and it works out okay. Or I use a 450 ohm resistor, and get a brighter LED, or I use a 9000 ohm resistor and get a nice dim LED.

That makes sense to me, because I know that light output for LEDs is generally linear in response to current input (within the right range).

But what the hell is happening with voltage in all of those cases? How come all of these different resistors somehow magically know to waste off 6v and only 6v of the voltage gap? How do they all do it, despite having different resistances? That's really weird to me.

Somehow, the circuit as a whole is telling the resistor how much voltage to drop. Since, after all, if I used a red LED with a vf of 2.5, then the same resistors would now waste off 6.5v instead of 6v. How do they know?

I mean, I realize they don't know anything. But how do the Ohm's law calculations work out in each case, so that the voltage stays constant when the resistance is varying so much?

#### kpatz

##### Newly Enlightened
When you have 2 resistances in series in a circuit, the voltage will be divided between the two resistances. If they're equal, the voltage will be equally divided. The current will be determined by the voltage divided by the sum of the resistances.

Let's take a 6V battery and two 500 ohm resistors in series, so you have 1000 ohms. Connect the two resistors in series across the battery terminals. The current will be 6V/1000ohms or 6 mA. The voltage across the battery (and both resistors together) will be 6V. But if you measure across one of the resistors, the voltage across that will be 3V (E=I * R... so, .006A * 500 ohms = 3 volts). If the resistances are unequal, you can calculate the voltage across each the same way, and the voltages will be divided correspondingly.
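The divider arithmetic, written out in Python with the same numbers:

```python
# Two 500 ohm resistors in series across a 6 V battery.
V, R1, R2 = 6.0, 500.0, 500.0

i = V / (R1 + R2)  # series current: 6 / 1000 = 0.006 A (6 mA)
v1 = i * R1        # drop across the first resistor: 3 V
v2 = i * R2        # drop across the second resistor: 3 V
print(i, v1, v2)
```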

Now, diodes aren't resistors. While a resistor has a fixed resistance, and the current that passes through directly corresponds to the voltage and resistance, diodes (LEDs included) have a more or less fixed voltage drop. In other words, the voltage across the LED will be roughly constant. If the forward voltage of an LED is 3V, its "resistance" will be such that the voltage across the LED is 3V. If you give the LED a higher voltage, it will conduct enough current to pull its voltage down to 3V, so if current isn't limited externally (via a resistor or other regulator), the LED will draw as much current as it can get and will burn out.

So, unlike 2 series resistors, where the voltage is divided up based on the resistances, an LED and resistor in series will always have the LED's forward voltage across the LED, and the resistor will take whatever voltage remains.

So, if you connect a 3V forward-voltage LED through a 500 ohm resistor to a 9V battery, what will the voltages be across the resistor and the LED? The LED is easy... 3V. The resistor will have the remaining voltage across it--6V. And the current is I = E/R applied to the resistor's share of the voltage, so 6V/500 ohms = 0.012A, or 12 mA.
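The LED version of the same calculation: the LED pins its end of the circuit at its forward voltage, and the resistor takes the rest:

```python
# 3 V forward-voltage LED, 500 ohm series resistor, 9 V battery.
V_BATT, V_F, R = 9.0, 3.0, 500.0

v_resistor = V_BATT - V_F  # the resistor takes the remaining 6 V
i = v_resistor / R         # 6 / 500 = 0.012 A (12 mA) through both parts
print(v_resistor, i)
```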

#### lampeDépêche

##### Flashlight Enthusiast
Thank you, kpatz. I am starting to understand.

I really appreciate the explanations.

#### thermal guy

##### Flashaholic
I thought I understood this completely. Until I just read the answers😂😂😂. I used to work with a Russian engineer and he always said the more you know, the less you understand. I get that now.

I also just spent 20 minutes googling water towers. 😂😂


#### asdalton

##### Flashlight Enthusiast
A nice thing about resistor regulation is that it's robust to small uncertainties in the LED forward voltage. So, consider if the forward voltage is really 2.8 V rather than 3.0 V. With a 500 ohm resistor, the current is (9 - 2.8)/500 = 0.0124 A, or 12.4 mA (almost unchanged).
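The same check in Python, to show how little the current moves:

```python
# How much the LED current shifts if Vf is 2.8 V instead of 3.0 V,
# with a 9 V supply and a 500 ohm series resistor.
V_BATT, R = 9.0, 500.0

i_nominal = (V_BATT - 3.0) / R  # 0.012 A  (12.0 mA)
i_low_vf = (V_BATT - 2.8) / R   # 0.0124 A (12.4 mA) -- barely different
print(i_nominal, i_low_vf)
```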

#### kpatz

##### Newly Enlightened
> A nice thing about resistor regulation is that it's robust to small uncertainties in the LED forward voltage. So, consider if the forward voltage is really 2.8 V rather than 3.0 V. With a 500 ohm resistor, the current is (9 - 2.8)/500 = 0.0124 A, or 12.4 mA (almost unchanged).
This is why current regulation is recommended for LEDs rather than voltage regulation. Forward voltage can vary from LED to LED, and even with things like age, temperature, and current.

Let the LED drop whatever voltage it wants, and feed it the current it needs (without giving it too much), and it will be a happy, healthy LED.

Resistors are fine for low current applications, like the LEDs in a digital clock or the power LED on your TV, but in flashlights where the LEDs are drawing amps of current, a resistor will generate a ton of heat and waste power, and switching regulators/drivers are preferred.


#### RetroTechie

##### Flashlight Enthusiast
> But what the hell is happening with voltage in all of those cases? How come all of these different resistors somehow magically know to waste off 6v and only 6v of the voltage gap? How do they all do it, despite having different resistances? That's really weird to me.
>
> Somehow, the circuit as a whole is telling the resistor how much voltage to drop. Since, after all, if I used a red LED with a vf of 2.5, then the same resistors would now waste off 6.5v instead of 6v. How do they know?
It's not the resistor that drops the 'required' amount of voltage, it's the LED! This happens because the voltage drop across an LED is a material property (it mostly depends on the LED color), and it is relatively independent of the current you push through it. For example @ 1 mA it could be 2.9V, @ 10 mA it could be 3.0V, and @ 50 mA it could be 3.2V. All very close to 3V within a wide range of current.

A similar story goes for batteries - within limits. That is: in a given state of charge, and up to a maximum current. A good Li-ion charged to 60% might sit at 3.9V when no current flows, and maybe drop to 3.85V when you let it supply 50 mA to a "load". So within 0..50 mA in this example, the voltage supplied by the battery stays constant around 3.9V. This voltage only changes as the battery discharges, or when you draw so much current that the battery has trouble supplying it. Or some combination thereof - wet chemical cells are very complex to model accurately...

So the resistor sits between a somewhat constant battery voltage and an even more constant LED voltage. As a result, a fixed resistor value is often good enough to keep the LED current in a small enough range. Like so (I = current in amperes, R = resistor value in ohms):

I = (Vbatt - VLED) / R

Or, if you have decided what LED current should be, and want a suitable resistor value:

R = (Vbatt - VLED) / I

Note that the above is essentially one formula, just re-written depending on what you want to calculate. Lazy folks look it up in an electronics textbook or Google "LED calculator"... :laughing:
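For anyone who'd rather not Google it, the calculator fits in one function (the `led_resistor` name is just made up here):

```python
def led_resistor(v_batt, v_led, i_led):
    """Series resistor (ohms): R = (Vbatt - VLED) / I."""
    return (v_batt - v_led) / i_led

print(led_resistor(9.0, 3.0, 0.010))  # ~600 ohms for 10 mA from 9 V
print(led_resistor(6.0, 3.0, 0.020))  # ~150 ohms for 20 mA from 6 V
```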

> Some loss will occur depending on how the regulation is achieved. A simple series resistor burns off the excess voltage as heat, though for low-current LEDs this is fine since the amount of power involved is so small. For a high-current LED like in a flashlight, a more efficient driver is usually used, typically a buck or boost converter, which uses fast switching instead of a resistor to control the total power.
NOPE! Even with a small current, efficiency can be really poor, and that can be a big problem (e.g. if you want a battery to last for years in a smoke detector). And: a series regulator can be very efficient, depending on the voltages that go in & out. Basically:

Linear regulator: current out = current in. Efficiency = Vout / Vin. Loss = voltage difference x current.

So in the given example, 9V in and 3V out means efficiency is 3/9 = only 33%. But if you start with 4V, efficiency is 3/4 = 75% even with a linear regulator. Efficiency can get really high if the input voltage matches the output voltage closely enough. A series resistor is a 'special' case here, where the current scales linearly with the voltage difference.

Note that in a simple battery-resistor-LED circuit, efficiency gradually increases as the battery voltage drops closer to the LED voltage. The drawback is your LED outputs less & less light as the battery discharges, whereas a regulator IC can keep current (and thus LED output) constant until the battery is almost empty. This setup is very common in flashlights since the nominal 3.7V battery voltage matches up nicely with ~3V white LEDs.

Switch-mode regulator: power out = power in, minus some percentage losses. Efficiency is typically around 80..90%, but low voltages are a problem, so for example with a 1.5V -> 3V conversion (boost), 70% efficiency would be quite decent. For 300V -> 12V DC, a switch-mode converter could do up to 95..98% efficiency depending on the parts used. Note that with a switcher, the current flowing in can be very different from the current that flows out; it is the power that remains roughly constant.

For example: a home-built (and designed!) power supply I have uses a switching converter in the higher-voltage DC -> 5V output line. Unregulated input is around 17V, and the circuit does around 85% efficiency under most conditions. Say I draw 3A from the 5V output. That is 15W. The input supplies around 15 / 0.85 = 17.6W. At 17V that is just over 1A supplied on the input. So almost 3x more current comes out than what goes in! It's the power that is (roughly) the same. A linear regulator would do very poorly here & heat things up fast, which is why I used a switcher. But for the 12V output on the same power supply, the numbers aren't too bad (12/17 = around 70% efficiency, and it's a lower-current output), so I used a linear regulator there.
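The arithmetic from that example, for anyone who wants to follow along:

```python
# Switch-mode supply: power in ~= power out / efficiency.
V_IN, V_OUT, EFF = 17.0, 5.0, 0.85  # figures from the example above
i_out = 3.0                         # amps drawn from the 5 V output

p_out = V_OUT * i_out  # 15 W delivered to the load
p_in = p_out / EFF     # about 17.6 W drawn from the unregulated input
i_in = p_in / V_IN     # just over 1 A in, versus 3 A out
print(round(p_in, 1), round(i_in, 2))
```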

Which of linear or switch-mode is better (and how you define "better") depends on many factors, like input & output voltage ranges, circuit topology, cost, what losses are acceptable (cooling, battery vs. mains powered), complexity you're willing to handle, component quality, or even circuit board layout. That's where experienced electronics designers / engineers come in...

#### lampeDépêche

##### Flashlight Enthusiast
I want to thank all of you for a really helpful and informative tutorial on some very basic electronics.

Let me see if I am following so far, by posing another question:

If I put a resistor into a circuit that has an LED and a power-supply that is very closely matched to the Vf of the LED (say a 3-volt battery and a 3-volt LED), then does the resistor waste any power? Or does it then simply restrict current, while not changing the efficiency?

As I understand the picture from what people have said above, the power wasted in the resistor is the product of the current through the resistor times the voltage drop across the resistor. So if the voltage drop is zero....?

Thanks again for helping me learn the basics.

#### john61ct

##### Newly Enlightened
Re: noob question on "wasted" voltage

No. Power (watts) is lost to resistance, and the voltage will drop, but not uniformly; the drop varies with the current (amps).

afaik V drop will never be zero

#### WalkIntoTheLight

##### Flashlight Enthusiast
> I want to thank all of you for a really helpful and informative tutorial on some very basic electronics.
>
> Let me see if I am following so far, by posing another question:
>
> If I put a resistor into a circuit that has an LED and a power-supply that is very closely matched to the Vf of the LED (say a 3-volt battery and a 3-volt LED), then does the resistor waste any power? Or does it then simply restrict current, while not changing the efficiency?

The resistor will always convert some of the power into heat, thus wasting it. But the smaller the difference between the LED's forward voltage and your battery voltage, the less power the resistor will waste.

IMO, if you've got a 3v power source and a 3v LED, I wouldn't even bother with a resistor. That's especially true with batteries that will sag at high current. For example, lithium 3v coin cells can be wired directly to 3v LEDs without a resistor.