blackbag223 said:
theoretically? so what will it do in real life?
It depends on what discharge rate the manufacturer used when rating the cells. If they rated them at a 700 mA discharge rate, you'll get very close to what you expect. If they used a higher rate you might see more run time; lower, and you'll see less. The reason is that all cells have internal resistance, and when you discharge them at higher current, more power is wasted in that internal resistance (I²R losses), meaning less usable capacity. All of this assumes a constant-current load as well, which probably isn't the case.
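If you want to put a rough number on that effect, Peukert's law is a common approximation for it (not the exact I²R model above, but it captures the same trend). Here's a quick sketch; all the numbers are made up for illustration, and the exponent varies by chemistry:

```python
def runtime_hours(rated_capacity_ah, rated_current_a, load_current_a, peukert_k=1.1):
    """Estimate runtime at load_current_a for a cell whose capacity was
    rated at a discharge current of rated_current_a.

    Peukert's law: t = H * (C / (I * H))^k, where H is the rated
    discharge time. k = 1.1 is just a typical ballpark value.
    """
    h = rated_capacity_ah / rated_current_a  # rated discharge time in hours
    return h * (rated_capacity_ah / (load_current_a * h)) ** peukert_k

# Hypothetical 2.5 Ah cell rated at a 0.5 A discharge, actually pulled at 0.7 A:
# the naive estimate is 2.5/0.7 = 3.57 h, Peukert predicts a bit less.
print(round(runtime_hours(2.5, 0.5, 0.7), 2))
```

Pull less current than the rated discharge and the same formula predicts more run time than the naive Ah/A division, which matches what I said above.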
There's no standard for this, and methinks a lot of manufacturers probably use low discharge rates so they can claim higher capacities from their testing. So in short, the answer to your question is that it's not all that simple to get an exact figure, but at least you can get a rough idea of what to expect, or as someone else already put it, YMMV.
Regards,
Dave