Originally posted by Jonathan:
</font><blockquote><font size="1" face="Verdana, Arial">quote:</font><hr /><font size="2" face="Verdana, Arial">Originally posted by Ted the Led:
on one chosen setting of power supply:
analogue meter on power supply shows approx. 4.2 volts and 0 amps (nothing connected) / short circuit across power supply outputs = 200 mA and 4.6 volts / then, with 3 AA (NiMH, I think) connected to power supply: amp reading in series at + battery terminal = 200 mA / forward voltage at + batt. terminal = 0.58 volts
<font size="2" face="Verdana, Arial">Ted,
I think that you are on the right track to understanding what is going on, but there is something not quite right with your measurement... ahh, it is in your definition of 'forward voltage'.
You are using 'forward voltage' as a measure of the difference between the power supply open circuit voltage and the battery pack open circuit voltage. You get this by disconnecting your power supply from the battery pack and connecting your meter between the battery pack and the power supply.
+++++++++++++++++++++++++++++++++
Ted sed:
Jonathan,
i disconnected the positive terminal of the power pack only, otherwise did exactly as you describe.
-----------------------------------------------------------
This is a useful measurement, generally not called 'forward voltage'. I'd use a term like 'available compliance voltage' or something similar.
+++++++++++++++++++++++++++++++
Ted sed:
i think you are just trying to force me to use another acronym,
--------------------------------------------------------
The term 'forward voltage' is usually used to describe the voltage measured _across_ a diode when it is biased in the forward direction and carrying current.
You will find an additional measurement _very_ informative. Adjust the power supply to read 10V, and set it (as above) so that you get 200mA when you short circuit the output. Connect your volt meter to the output terminals of your power supply.
With the power supply 'open circuited', the _external_ volt meter should read 10V. Now short circuit the supply. The internal meters on the power supply will read about 10V and 200mA, but the external volt meter should read something close to zero volts.
Instead of a short circuit, try a 1 ohm resistor. The external meter should read about 200mV. With a 10 ohm resistor, it should read about 2V. With the 3 AA pack, you should read something between 4 and 6V, slowly rising.
As I think you will see, the 'voltage' reading on your power supply is not the voltage placed on its output terminals. Rather it is the maximum voltage that it will place on its output terminals, provided that the current limit is not reached. If the current limit is reached, then the voltage on the output terminals will drop in order to properly control the current.
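That constant-voltage / constant-current behaviour can be sketched as a toy model (an idealization for illustration, not any particular supply; Python used just for the arithmetic):

```python
def supply_output(v_set, i_limit, r_load):
    """Idealized bench supply: hold v_set (constant-voltage mode) unless
    the load would draw more than i_limit, in which case drop the terminal
    voltage so that exactly i_limit flows (constant-current mode)."""
    if r_load <= 0:              # dead short: pure current limiting
        return 0.0, i_limit
    i_cv = v_set / r_load        # current the load would draw at full voltage
    if i_cv <= i_limit:
        return v_set, i_cv       # constant-voltage mode
    return i_limit * r_load, i_limit   # constant-current mode

# The experiment above: 10 V setting, 200 mA current limit.
print(supply_output(10.0, 0.2, 0))     # short circuit: ~0 V at the terminals
print(supply_output(10.0, 0.2, 1))     # 1 ohm resistor: about 200 mV
print(supply_output(10.0, 0.2, 10))    # 10 ohm resistor: about 2 V
print(supply_output(10.0, 0.2, 100))   # light load: full 10 V, only 100 mA
```

Note that the internal meter on a real supply may show the voltage *setting* rather than the terminal voltage, which is exactly the confusion being untangled here.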
What does this say about battery charging? Well, for charging a battery, what matters is reversing some chemical reactions. This means that a _quantity_ of atoms needs to be moved around, and this will take a _quantity_ of electrons. So for charging a battery, what matters is the total _number_ of electrons delivered, and this means current. _Voltage_ is not what you need to control, _current_ and total _charge_ are what you need to control. But this is not to say that voltage is unimportant; it is quite important.
+++++++++++++++++++++++++++++++++++
Ted sed:
It iiiissss??
-------------------------------------------------------------
Voltage is important because it is what causes the current to flow. The charge current flowing through the battery is _directly_ related to the charge voltage applied to the battery. _Not_ the voltage setting on your power supply, but the actual voltage at the power supply terminals.
What your power supply will do, when you set it to 200mA and any sufficiently high voltage, is adjust the voltage at the output terminals so that 200mA flows through the battery. The voltage at the output terminals will be _less_ than the voltage setting of the supply, and as the battery charges up, this voltage will change.
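A minimal sketch of that adjustment, using the simplest possible battery model: an EMF in series with an internal resistance. The 0.5 ohm figure and the pack voltages below are made-up illustration numbers, not measurements:

```python
def terminal_voltage_for_current(v_batt, r_internal, i_target):
    """Terminal voltage a constant-current supply must apply so that
    i_target flows into a battery modelled as an EMF v_batt in series
    with resistance r_internal (a deliberately crude model)."""
    return v_batt + i_target * r_internal

# As the pack charges and its EMF rises, the supply raises its
# terminal voltage along with it to keep 200 mA flowing:
for v_batt in (3.6, 3.9, 4.2):   # hypothetical 3xAA pack voltages
    print(f"pack at {v_batt} V -> terminals at "
          f"{terminal_voltage_for_current(v_batt, 0.5, 0.2):.1f} V")
```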
++++++++++++++++++
TTL sed:
no matter, i can still choose the voltage the battery sees, and at the same amperage.
----------------------------------
With some types of battery, notably lead acid batteries, there is a very nice, steady relation between state of charge and battery voltage. If you connect these types of battery to a voltage source of the proper voltage, then the voltage will rise as they charge up, and when the battery voltage reaches the supply voltage, the charging current tapers off to zero.
+++++++++++++++++++++++
Ted sed:
which never happens with photovoltaic batteries in the real world, because the power supply from a solar panel is around 19 volts, way too high for a 12 volt battery to think of approaching. (that's what regulators are for.)
------------------------------------------------
This means that you can 'float charge' lead acid batteries with a constant voltage supply. With your power supply, you set the voltage setting to the 'float voltage',
+++++++++++++++++++++++++
Ted sed:
oh we're back to my original question, how do you compute 'float voltage'??
------------------------------------------
set the current limit to some comfortable value, and let it go. With a flat battery, the voltage will be low, the supply will hit current limit, and the maximum current will flow into the battery. As it charges up, the battery voltage starts to hit the output voltage setting, and the current drops.
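That whole cycle -- current-limited at first, then a taper as the battery voltage approaches the float setting -- can be simulated with a toy model. Every number here (1 ohm internal resistance, 1 Ah capacity, linear EMF rise with charge delivered) is an assumption for illustration, not real lead-acid chemistry:

```python
def float_charge(v_start, v_float, i_limit, r_int, cap_ah=1.0, dt=0.5):
    """Simulate float charging from a current-limited supply.
    Toy model (assumed numbers): battery EMF rises linearly with
    charge delivered; returns (final EMF, hours, list of (t, I))."""
    v, t, log = v_start, 0.0, []
    while True:
        # supply is in current-limit mode while (v_float - v)/r_int
        # exceeds i_limit, then the current tapers as v approaches v_float
        i = min(i_limit, (v_float - v) / r_int)
        if i < 0.005:                 # taper complete: call it charged
            return v, t, log
        v += i * dt / cap_ah          # EMF rise proportional to charge in
        t += dt
        log.append((t, i))

v_end, hours, history = float_charge(12.0, 13.5, 0.2, 1.0)
print(f"finished after {hours:.1f} h at {v_end:.2f} V")
print(f"first step current: {history[0][1]*1000:.0f} mA (current-limited)")
print(f"last step current:  {history[-1][1]*1000:.1f} mA (tapered off)")
```

Running it shows a long stretch at the full 200 mA followed by a fairly quick taper, which is the float-charge profile described above.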
++++++++++++++++++++++++++
Ted sed:
right, but the supply is always trying to put out the set current, say 200ma, the battery is just less resistant when it's low, so more current flows.
-------------------------------------------------
This technique _will not work_ with NiCd and NiMH cells. The problem is that there isn't a nice steady relation between voltage and state of charge.
++++++++++++++++++++++++++++++
Ted sed:
no, i find with the voltage fine tune knob, i can set the forward voltage a fraction of a volt above the battery while under charge, and the nicad or nimh will rise and the current will switch off.
----------------------------------------------------
Instead, what you need to do is continue feeding current into the battery until you get some indication that the battery is fully charged. If you simply use the 'max current until a certain voltage is reached' technique, then you will very likely overcharge or undercharge the battery; only if you are very lucky will the charge current taper off just as the battery reaches full charge.
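One concrete 'indication of full charge' that NiCd/NiMH chargers commonly use -- offered here as an example, not something prescribed above -- is the negative delta-V dip: the cell voltage peaks and then sags slightly right at full charge. A sketch of such a detector, with a made-up voltage log:

```python
def minus_delta_v_terminate(voltage_samples, threshold=0.005):
    """Return the sample index at which charging should stop: the first
    point where voltage has dropped `threshold` volts below its running
    peak (-dV detection). Returns None if no termination is seen yet."""
    peak = float("-inf")
    for n, v in enumerate(voltage_samples):
        peak = max(peak, v)
        if peak - v >= threshold:
            return n
    return None

# Hypothetical per-cell voltage log: rises, plateaus, then dips at full charge.
samples = [1.38, 1.41, 1.44, 1.46, 1.47, 1.47, 1.46, 1.45]
print(minus_delta_v_terminate(samples))   # terminates at the dip, index 6
```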
+++++++++++++++++++++++++++++++
Ted sed:
maybe it's a great power supply. what you call "max current" is limited to 200ma in this case.
the voltage is also limited, and has to be upped a bit to restart the charging when the set level is reached. or you can set the voltage a little above the battery's max possible voltage -- but how much higher? 0.25 volts? 0.50 volts? 1 volt? 9 volts?
--tedtl
-------------------------------------
Regards,
Jonathan Edelson</font><hr /></blockquote><font size="2" face="Verdana, Arial">