Hello J C,
Unfortunately, the science of electro-chemistry does not support your ideas...
Really? Many battery manufacturers offer a suggested rate but also publish data for proper charging at lower rates. They would not do that if lower rates were unacceptable. As I wrote in my last post, most electronic devices do slow(er) charging.
Many people use batteries for non-essential tasks, but some depend on them. When you can tolerate a battery failure as a minor inconvenience, you are free to charge batteries however you want to.
Such users are a minority, so it follows that when a claim is made about the importance of a particular charge method, the burden is always on the one making the claim to include that qualification.
Further, there is no reason to believe someone who depends on their battery cannot use a slow charge. The only real question is whether the lifespan and capacity meet the need, and in most cases they do. Quite a few devices do not follow the proposed ideal charge method, and aside from flashaholics you won't find the average emergency/security/etc. personnel versed in how their devices handle charging beyond how long they take to charge and how long they run.
However, if your batteries are mission critical, it is best to treat them in the best way possible in an effort to get the best performance from them.
Untrue. Mission critical does not mean one has to do anything except have equipment suitable for the mission. If the mission is long enough that having 85% versus 100% battery capacity really matters, the margins were already too thin, and a different power source or a replacement battery would be used.
Actually, the truly critical uses tend to gravitate towards primary lithium cells, not rechargeables.
I am one of those people that is unable to complete my job if my light fails, so I have taken the time to do a lot of research on the best way to take care of batteries. I have also interviewed many chemical engineers involved in the development of batteries, and run many tests to verify the information they presented.
This does not in any way disprove that slow charging works fine. People did their jobs and managed for many years before modern battery capacity levels existed. To suggest that the maximum possible capacity is essential no matter what ignores the fact that, either way, we now enjoy more performance than at any time in history.
The result of all of this effort is that when dealing with NiMh chemistry, the best charge rate is in the 0.5 - 1.0C range. I favor 1C charging, but not all chargers are capable of that.
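To make the numbers concrete, here is a minimal sketch of the arithmetic behind a C-rate: the charge current for a given rate, and a rough time-to-full. The 85% charge-acceptance figure and the 2000 mAh cell are illustrative assumptions, not from either post.

```python
# Hypothetical helpers for C-rate arithmetic; values are illustrative.

def charge_current_ma(capacity_mah: float, c_rate: float) -> float:
    """Charging current in mA for a given C-rate (1C = full capacity in 1 h)."""
    return capacity_mah * c_rate

def approx_charge_hours(c_rate: float, efficiency: float = 0.85) -> float:
    """Rough time to full, assuming ~85% charge acceptance for NiMH."""
    return 1.0 / (c_rate * efficiency)

# A 2000 mAh cell at 1C draws 2000 mA and fills in a bit over an hour;
# at 0.1C it draws 200 mA and takes most of a day.
print(charge_current_ma(2000, 1.0))        # prints 2000.0
print(round(approx_charge_hours(1.0), 2))  # prints 1.18
print(round(approx_charge_hours(0.1), 1))  # prints 11.8
```

So the disagreement between 0.1C and 1C charging is really a disagreement about whether an overnight charge or a one-hour charge suits the job.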
I could propose the best way for someone to walk, drive, part their hair and paint a house. Idealizations are a folly we can usually ignore, so long as the end result is acceptable. There is a middle ground between being a slave to some regimen and having an unusable device.
Further, there is no evidence that merely deviating from the claimed best practice is really a problem. The battery manufacturers themselves sell chargers that deviate from it! Most people buy off-the-shelf chargers that deviate from it, most rechargeable devices deviate from it, and NiCd or NiMH battery failures from modern chargers are exceedingly rare. Most often the failures come from devices that allow draining the cells too far and reverse-charging one cell in a series string.
I have lost track of how many law enforcement and emergency services people have come to me complaining of poor performance from their battery-operated devices. In reviewing their charging practices, most of the problems disappear when we change from slow charging to charging in the 0.5 - 1.0C range.
This goes against science. Nothing prevents a battery from fully charging at a rate below 0.5C. It seems they merely had poorly designed chargers or insufficient knowledge of when the battery is fully charged.
I too thought I was having excellent results with slow charging, but I have been amazed at how much better my cells perform when charging according to the manufacturer's recommendations. When I started monitoring and testing the various chargers, I was pleased that the charger manufacturers incorporated various backup termination methods, because the -dV termination signal was frequently not strong enough to terminate the charge.
That typically means the charger needs replacing, not that a different charge rate is needed. As I've mentioned in a past post, I randomly picked three chargers with charge rates below 0.5C, and all terminated charge without a problem. Perhaps many years ago this was more of a problem, but today, as in many industries, enough knowledge has accumulated that chargers can detect termination below a 0.5C rate, AND batteries are designed to tolerate sitting on a slow charger and topping off.
There are still problematic devices, built cheaply and intended only to last through the warranty period, which cut charge time with an excessive rate, but such devices are becoming rarer and rarer, segmented into the cheapest products of their type.
This results in a minor overcharge. Now the effects of this minor overcharge are minimal when the charging rate is at 0.1C or below, but once you get above 0.1C, damage is done to the cell. Most of this damage results in higher cell impedance, so unless you are testing or tracking it, it is not readily apparent. You just suffer reduced performance.
I agree, if the overcharge duration is long enough, but slow chargers are typically 0.1C or below, medium-rate ones at least have a timer shutoff (though that type of charger is best avoided), and any higher rate, while still below 0.5C, can and will detect the -dV threshold and terminate the charge.
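Since -dV termination keeps coming up, here is a minimal sketch of what the signal is: the charger tracks the peak cell voltage and stops when the voltage drops a few millivolts below it, with a sample-count cutoff standing in for the backup timer. The 5 mV threshold and the voltage samples are illustrative assumptions, not from any specific charger.

```python
# Sketch of -dV (negative delta-V) charge termination with a backup cutoff.
# Threshold and sample data are hypothetical, chosen only to illustrate.

def charge_terminated(voltage_samples, dv_threshold=0.005, max_samples=1000):
    """Return the sample index at which charging would stop.

    Stops when voltage falls dv_threshold below the running peak
    (the -dV signal), or at max_samples (a backup-timer stand-in).
    """
    peak = float("-inf")
    for i, v in enumerate(voltage_samples):
        if i >= max_samples:              # backup cutoff fired first
            return i
        peak = max(peak, v)
        if peak - v >= dv_threshold:      # -dV detected: cell is full
            return i
    return len(voltage_samples)           # ran out of data, still charging

# Voltage rises, peaks, then dips slightly as the full cell starts heating.
samples = [1.38, 1.40, 1.42, 1.43, 1.432, 1.430, 1.426]
print(charge_terminated(samples))  # prints 6, the first sample 5 mV under peak
```

The same detection loop works at any charge rate; the debate is only about how pronounced the voltage dip is at low rates.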
Perhaps you have followed the suggested practices too closely and haven't used typical chargers much in recent years? I and many others use them routinely and can assure you they work fine.
I am glad that you are pleased with your chargers, but it is no myth that you can get much better performance if you charge in the 0.5 - 1.0C range.
Tom
It is a myth. Ask any battery manufacturer plainly: if you leave a battery charging below 0.5C, will it still reach its full possible capacity? The answer is yes, and many do outline lower charge rates in their datasheets. The reason they promote higher charge rates is to cover their own asses and shift the burden to charger manufacturers: no matter how clearly they define the other charging parameters, the more latitude and options presented, the more ways someone will find to ignore an important part of the puzzle, and then the battery maker suffers RMAs and reputation damage.
There is a threshold below which a charger shouldn't rely on -dV detection, but today that threshold is below 0.5C. There is also a range of rates, anything above about 0.05C, at which a battery shouldn't be left charging constantly if one wants to avoid damage, and a range, between 0.05C and 0.1C, in which the battery needs to be disconnected from the charger within a few hours of reaching full.
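The three regimes just described can be sketched as a simple classifier. The cutoffs (0.05C, 0.1C) come straight from the paragraph above; the labels and wording are my own shorthand.

```python
# Sketch of the charge-rate regimes from the post; labels are illustrative.

def charge_regime(c_rate: float) -> str:
    """Classify a NiMH/NiCd charge rate by how termination must be handled."""
    if c_rate <= 0.05:
        return "trickle: safe to leave connected indefinitely"
    if c_rate <= 0.1:
        return "timed: disconnect within a few hours of full"
    return "fast: needs -dV (or backup) termination"

print(charge_regime(0.03))  # trickle regime
print(charge_regime(0.08))  # timed regime
print(charge_regime(0.35))  # fast regime, still well below 0.5C
```

Note that a 0.35C charger falls in the same termination regime as a 1C charger, which is the point being argued: slow is not the same as unterminated.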
ALL of these methods are capable of completely charging a NiCd or NiMH cell. ALL give exactly the same performance if the battery is left charging until full. ONE of them, extended overcharging at close to 0.1C, will reduce cell life by roughly a dozen percent.
These facts are not isolated to one battery brand and model or one charger, they are seen all day every day by the majority of battery users. While there is a wrong way to design a charger and it certainly happens, there are also plenty of devices out there which demonstrate sufficient performance without a 1-2 hour charge rate.
I always wonder why there are two camps: those who pose a theory, and those who actually do it without problems. The culprit is probably the charger; try a different one, just as someone would replace a faulty 0.35C charger with another 0.35C charger rather than with a 1C charger.