Voltage vs. mAh

rangerxtrn

okay, I searched and came up with nothing, so here goes...
I realize that higher voltage typically means higher performance, but recently I have been shopping to see if I can find higher-capacity RCR123s for my Quark Tactical. The trend I noticed was that higher capacity means lower voltage, so it appeared that the battery wasn't necessarily a better battery... Can someone break down voltage vs. mAh and how the ratio affects performance??? Thanks! :confused:
 

With quality of materials considered equal, the voltage is usually a result of cells being wired in series with one another. With batteries, wiring in series adds voltage but not amp-hours. Wiring in parallel keeps the voltage the same, but you get to add up the amp-hours between the cells.

Batteries that are truly better are usually a result of either 1) using better-quality materials, or 2) managing to stuff more cells into the same space of the battery casing.

So to clarify:
Parallel wiring: add mAh
Series wiring: add voltage
Better battery: usually either more cells (thin Li-ion) stuffed together, or simply better materials.

I suggest getting acquainted with Ohm's law. Here's a short video for the lazy who don't want to read a wiki article: http://www.youtube.com/watch?v=-mHLvtGjum4
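To put rough numbers on the series/parallel rules above, here's a quick Python sketch. The 3.7 V / 750 mAh cell figures are just example values (roughly an RCR123), not the specs of any particular battery:

# Rough sketch: how series vs. parallel wiring changes pack ratings.
# Cell numbers are illustrative examples, not real specs.
cell_voltage = 3.7     # volts per cell
cell_capacity = 750    # mAh per cell

def series(n):
    # series: voltages add, capacity stays the same
    return n * cell_voltage, cell_capacity

def parallel(n):
    # parallel: voltage stays the same, capacities add
    return cell_voltage, n * cell_capacity

print(series(2))    # (7.4, 750)
print(parallel(2))  # (3.7, 1500)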
 

Thanks for the info, but that isn't exactly what I was asking... I understand resistance, etc. I wanted to know the performance characteristics involving light output. I.e., if you go with two 3.7 V RCRs @ 750 mAh, you initially have 8.4 V, then it stabilizes at 7.4 V @ 1500 mAh collectively? As opposed to a 3.7 V 18650 @ 1600 mAh... So will the light be dimmer but run longer with the 18650? That's what I'm asking. Still, thanks for giving some input.
 
Batteries store energy. You want to get as much energy as possible out of any battery. There are two main things about them everyone should understand: voltage and capacity. The energy stored is the product of voltage × capacity. Also, by Ohm's law, you lose less energy with higher voltage, because you need a lower current for the same power.


Ah × V = stored energy. For example, 2000 mAh × 1.2 V = 2.4 Wh; with that you can power a 1 W light bulb for 2.4 hours, or a 2.4 W light bulb for 1 hour. Likewise, 900 mAh × 3.7 V = 3.3 Wh...

P(lost) = I² × R. Meaning that power losses increase fast with high currents, and lower voltages require higher currents for the same power. And all the lost power goes into heat, which actually makes things even worse. So if you want a lot of energy out of a low-voltage cell, it will heat up more than a higher-voltage one, and you won't get the rated capacity out of it (if it's even capable of supplying the high current).
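If it helps to see that as numbers, here's a small Python sketch of the same arithmetic. The 0.1 ohm circuit resistance and the 3 W output level are assumed round figures, not measurements from any real light:

# Stored energy and I² * R heat loss for two packs with equal energy.
# resistance and led_power are assumed example values.
resistance = 0.1    # ohms, assumed wiring + contact resistance
led_power = 3.0     # watts we want to deliver

for voltage, capacity_mah in [(3.7, 1500), (7.4, 750)]:
    energy_wh = voltage * capacity_mah / 1000   # Wh = V * Ah
    current = led_power / voltage               # amps needed for 3 W
    loss_w = current ** 2 * resistance          # P(lost) = I² * R
    print(f"{voltage} V pack: {energy_wh:.2f} Wh stored, "
          f"{current:.2f} A draw, {loss_w:.3f} W lost as heat")

Same stored energy either way, but the 7.4 V pack draws roughly half the current and wastes roughly a quarter of the heat.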


Eh, anyways, there are quite a lot of things to consider here, so I'm not going through them all. I suggest you start with the basic equations and try to apply them to your surroundings, then go into the details of whatever subject you are curious about. It's the only way you truly understand the things going on around you. But be warned that the more you learn, the more you start to realize that you don't know anything. And your knowledge will only make it worse with all the choices out there. And the rated specs in the end don't make any difference, because the Chinese just print made-up stuff all over the place.

So yeah, I suggest you don't actually get too tangled up: buy stuff impulsively and just believe the salesman when he tells you you have the best [insert anything].
 
Not 100% sure if this is what you are asking, but:

You need to consider both voltage and mAh together to determine the energy in a cell. Remember that power (i.e. Watts) is a function of both voltage and current (P=VI).

So a 1.5V cell with a 2000mAh rating is the same amount of stored energy as a 3V cell with a 1000mAh rating.

So to compare the energy density of two cells, multiply the voltage by the mAh rating. This allows you to compare milliwatt-hours instead of milliamp-hours.
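As a quick sketch of that comparison (the cell figures are just the example values from this thread):

# Compare cells by energy (mWh = mAh * V), not by mAh alone.
cells = {
    "1.5 V, 2000 mAh": 1.5 * 2000,
    "3.0 V, 1000 mAh": 3.0 * 1000,
    "3.7 V, 750 mAh (RCR123)": 3.7 * 750,
    "3.7 V, 1600 mAh (18650)": 3.7 * 1600,
}
for name, mwh in cells.items():
    print(f"{name}: {mwh:.0f} mWh")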

HOWEVER, depending on the circuitry used, more voltage may allow the device to run more efficiently (or less efficiently).

EDIT: T0RN4D0 beat me to it by a couple of minutes. What he's saying above generally matches what I've said here.
 
Oh and during all my BS you posted another post. :)

No, adding batteries in series will only add up voltage. (logically, because if voltage and capacity doubled, you would get 4 times the energy out of 2 batteries).

So, 7.4 V and 750 mAh if you put them in series, or 3.7 V and 1500 mAh if you put them in parallel. For 7.4 V and 1500 mAh you would have to make a 2S2P pack (2 in series and 2 in parallel, four cells total).

For flashlights, 2S would be better, because a Li-ion cell's voltage can drop under 3 V when it's depleted, which can be too low to fully drive the emitter. With 2S you still have enough voltage headroom for a buck driver to keep the emitter at full output. If you put them in parallel, the LED would start to dim as the batteries got empty.
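Putting your exact cells into that framework, here's a rough sketch. The 3 W drive level is an assumed example and the runtime estimate pretends the driver is lossless, so treat these as upper bounds, not predictions:

# 2S vs. 2P with two 750 mAh RCR123s, versus one 1600 mAh 18650.
packs = {
    "2S RCR123": (7.4, 750),
    "2P RCR123": (3.7, 1500),
    "1x 18650":  (3.7, 1600),
}
led_power = 3.0    # watts, assumed example drive level
for name, (volts, mah) in packs.items():
    energy_wh = volts * mah / 1000
    runtime_h = energy_wh / led_power    # ignores driver losses
    print(f"{name}: {energy_wh:.2f} Wh, about {runtime_h:.1f} h at {led_power} W")

With a regulated driver the output is the same in all three cases; the 18650 just holds slightly more total energy, so it runs a bit longer.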
 
You also have to take into consideration that these ratings are rough estimates of the capability of the battery.

In reality, the output is far from constant and depends heavily on the battery's chemistry and construction.

You'll find the most stark differences between batteries of different chemistry. There are NiCd, NiMH, lithium-ion, and lead acid, for example. Each one has different charge and discharge rates, and some lose voltage faster than others. Lead acid, for example, doesn't have a fixed capacity at all; what it delivers depends on the rate of discharge (Peukert's law).

When making a decision on battery type, you can analyze the discharge graphs for different types of batteries to see which would suit you best. When not in use, each type loses charge at a specific rate (which is also temperature dependent). For long-term storage, you might not want to use NiCd due to its roughly 10% loss per month (among other crappy features). While in use, some batteries can supply greater currents than others, which can be attractive for people building ridiculous short-bursting flashlights. Basically, battery ratings are good for getting an initial estimate of what you can expect from your battery. Depending on your driver, certain battery types will show noticeable drops in output over relatively short periods simply as a result of the chemistry.

Things to consider with batteries: ability to supply high current, rate of voltage fade (which adversely affects light output, some are less linear than others), energy density, size, etc.

So in a nutshell, the ratings aren't the end-all in these things. A lead acid battery that has the same advertised voltage/mAh as a lithium ion battery simply will not perform the same.
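For the Peukert's law point mentioned above, here's a rough sketch of how a lead acid battery's effective capacity shrinks as the discharge rate climbs. The 100 Ah / 20-hour rating and the 1.2 exponent are typical textbook values for lead acid, not the specs of any particular battery:

# Peukert's law: t = H * (C / (I * H)) ** k
# C = rated capacity (Ah), H = hours of the rating,
# I = discharge current (A), k = Peukert exponent (~1.2 for lead acid).
def peukert_capacity(rated_ah, rated_hours, current, k):
    runtime_h = rated_hours * (rated_ah / (current * rated_hours)) ** k
    return current * runtime_h    # effective Ah actually delivered

for amps in (1, 5, 10):
    print(amps, "A ->", round(peukert_capacity(100, 20, amps, 1.2), 1), "Ah")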
 
What are you referring to when you say higher capacity batteries have lower voltage? AFAIK, an AW 16340 puts out the same voltage as an AW 18650.

WRT lights getting brighter with higher battery voltage, it depends on the driver. For example, the Quark Minis aren't current regulated, so they get brighter when you use a higher-voltage battery. They also get hotter, which is why 4sevens is iffy about customers using rechargeable Li-ions in them. The regular Quarks use a current-regulated driver with a buck circuit, so higher voltages are brought down to a predetermined level and don't result in brighter output.
 

I believe that most lights designed for 1.5 volts will vary in brightness depending on battery voltage, because the driver draws constant or increasing current as the voltage rises.
Many lights designed for 3 volts will increase in brightness if run on Li-ion, because they go into direct drive.

I did write a bit about it here: Drivers, how leds are adapted for different battery voltages
 
Another thing to consider, in simple terms, 'cos I am not so bright.
Ignoring the driver, and assuming constant-current output.

Cells deliver more overall mAh if the current draw is lower.
More volts means less current for the same output.

This means the cells are more efficient and can deliver their maximum mAh. Low voltage and high amp drain mean the cells work hard and are not as efficient as they could be.


Example.
A NiMH cell may give 1800 mAh drained at 1 A, but 1900 mAh when drained at 500 mA.
Two such cells in series mean the current is roughly halved for the same output power, so each delivers the 1900 mAh level, for example.

1.2 V × 1800 mAh = 2.16 Wh, × 2 cells = 4.32 Wh
2.4 V × 1900 mAh = 4.56 Wh

The chemistry determines just how different the mAh is when drained at different rates.


In your example, an 18650 might be rated at 2400 mAh, but that is probably at some lab-specific drain of 250 mA or something.
Driven at a real-world 1 A, it no longer delivers 2400 mAh.

The 16340s may be rated at 650 mAh but will actually deliver close to that, because with cells in series the voltage is higher and the current drain on each cell is lower. In the real-world scenario they would only be pulling about 500 mA, or 330 mA if you used three.
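Redoing the arithmetic above in one place, as a sketch using the example figures from this post (they're illustrative numbers, not measured data):

# NiMH example: 1800 mAh delivered at a 1 A drain, 1900 mAh at 0.5 A.
one_cell_wh  = 1.2 * 1800 / 1000    # 2.16 Wh: single cell worked hard at 1 A
two_cells_wh = 2.4 * 1900 / 1000    # 4.56 Wh: two in series, each sees ~0.5 A
print(one_cell_wh, 2 * one_cell_wh, two_cells_wh)
# Two cells deliver a bit more than double one hard-driven cell,
# because the lighter per-cell load lets each get closer to its rating.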

There are many other factors, and I am still learning!




[Graph: Eneloop AA 2000 mAh discharge curves, illustrating the difference in output between 1 A and 0.5 A drains.]
 
