Remember that voltage doesn't equal power, and voltage doesn't equal energy either.
Power is measured in watts, and energy in watt-hours.
You get watts by multiplying voltage by current.
If the charger's input voltage is 10 V and the input current is 1 A, that's 10 W. If the charger were 100% efficient, which is impossible, it would be able to charge a 14 V battery at about 0.714 A, the same number of watts as it was drawing from its input.
Typical efficiencies, however, are around 70-85%; 15-30% is lost in the conversion.
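Just to make the arithmetic concrete, here's the same conversion in Python, using the numbers from above (a hypothetical, impossible 100% efficient charger):

```python
# Power in equals power out for an (impossible) 100% efficient charger,
# so the higher-voltage output side sees proportionally less current.
v_in, i_in = 10.0, 1.0   # charger input: 10 V at 1 A
p_in = v_in * i_in       # 10 W drawn from the source
v_out = 14.0             # voltage of the battery being charged
i_out = p_in / v_out     # about 0.714 A at 100% efficiency
print(i_out)
```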
Let's go back to your example and assume you're charging at 1 A from your 12 V battery. That's 12 W. Let's assume your charger has a great efficiency of 85%; that means the output power will be 0.85 × 12 W, or 10.2 W. Since the battery voltage starts at 12 V, that's also the voltage at which the charge starts, but let's ignore that fact and just assume the charger outputs 14 V as you said. Convert it back: 10.2 W / 14 V equals about 0.73 A. So the charger is drawing 1 A from the battery while charging it at 0.73 A. That's a net loss of 0.27 A; no charging will take place at all.
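In code, with the same numbers as the paragraph above (the 14 V charge voltage is the assumption from your question):

```python
# A battery "charging itself" through a realistic 85% efficient charger:
# the charger pulls more current out than it can put back in.
v_batt = 12.0
i_draw = 1.0                     # current the charger draws from the battery
p_in = v_batt * i_draw           # 12 W consumed
efficiency = 0.85
p_out = efficiency * p_in        # 10.2 W delivered
v_charge = 14.0                  # assumed charge voltage, as in the question
i_charge = p_out / v_charge      # about 0.73 A back into the battery
net_current = i_charge - i_draw  # about -0.27 A: a net drain, not a charge
print(net_current)
```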
But let's change the thought experiment and assume we got a cryogenic freezer for free, which, along with some magic pixie dust, gives us a 100% efficient battery charger. Even then, the charger would not be outputting more power than it was taking in. As the output voltage rises, either the output current will drop or the input current will rise, giving us a net energy gain of zero.
OK, now let's add a second, identical battery to the thought experiment. The second battery is full, and we will use our impossible 100% efficient charger to charge the first battery. When that battery is full, we reverse the setup and move our energy from one battery back to the other. With the 100% efficient charger, you'd think you won't lose energy? Wrong again. The battery doesn't have 100% charge efficiency. That means that if you put 100 watt-hours of energy into the battery, some of that energy will be lost as heat, warming the battery during both charge and discharge. With lead-acid batteries, some is also lost when small amounts of the water in the electrolyte are converted to hydrogen and oxygen. 100 watt-hours in, less than 100 watt-hours out (probably something like 80 watt-hours).
Back in the real world, this means that a realistic, achievable 85% efficient charger and a lead-acid battery with 80% charge efficiency give a combined efficiency of 0.85 × 0.80 = 0.68, or 68%. In other words, of every 100 watt-hours the charger consumes, ultimately only 68 watt-hours would be available from the lead-acid battery it charged!
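The combined figure is just the two efficiencies multiplied together; as a quick sketch:

```python
# Overall efficiency is the product of each lossy stage in the chain.
charger_eff = 0.85                     # charger conversion efficiency
battery_eff = 0.80                     # lead-acid charge efficiency
combined = charger_eff * battery_eff   # 0.68, i.e. 68%
wh_consumed = 100.0                    # energy the charger draws
wh_available = wh_consumed * combined  # 68 Wh usable from the battery
print(wh_available)
```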