In fact they do, and I specifically linked the reference documents above.
I have two Eneloop datasheets here that plainly state:
(AAA) "Single cell capacity under the following condition.
Charge: 80mAx16h, Discharge: 160mA*E.V.=1.0V) at 25C"
(AA) "Single cell capacity under the following condition.
Charge: 200mAx16h, Discharge: 400mA(E.V.=1.0V) at 25C"
So not only do they find this an acceptable charge rate, it is the one rate at which the capacity itself is guaranteed, and no other.
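Run the numbers and both conditions turn out to be the same test: the classic 0.1C for 16 hours. A minimal sketch of the arithmetic, assuming nominal capacities of roughly 800mAh (AAA) and 2000mAh (AA), figures the quotes above don't actually state:

```python
# Back-of-the-envelope check of the datasheet conditions quoted above.
# The nominal capacities are assumptions, not quoted from the datasheets.
cells = {
    "AAA": {"capacity_mah": 800, "charge_ma": 80},
    "AA": {"capacity_mah": 2000, "charge_ma": 200},
}
HOURS = 16

for name, c in cells.items():
    c_rate = c["charge_ma"] / c["capacity_mah"]  # fraction of capacity per hour
    charge_in = c["charge_ma"] * HOURS           # total mAh delivered
    print(f"{name}: {c_rate:.2f}C x {HOURS}h = {charge_in}mAh "
          f"({charge_in / c['capacity_mah']:.0%} of capacity)")
# AAA: 0.10C x 16h = 1280mAh (160% of capacity)
# AA: 0.10C x 16h = 3200mAh (160% of capacity)
```

Putting roughly 160% of rated capacity into the cell is the long-standing standard charge condition for NiMH capacity ratings.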
They changed their tune over trends, but not quite how you assume. Back in the NiCad days, slow chargers worked just fine, as NiCads were very tolerant of them. When NiMH consumer cells were introduced, the chargers did not drastically change. If you know the precise capacity of the cell in question, you can charge at a slow rate on a timer. Since it was easier to keep the same charger designs and just change the time, it was easy to assume consumers would only recharge cells that were fully depleted. This is also when you started to notice stronger language to the effect of "use only our cells in our charger": put a lower-capacity cell in a charger timed for a higher-capacity one, and you can damage the cell.
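To illustrate that last point with made-up numbers (none of these are from a datasheet): a timer charger sized for a ~2000mAh cell delivers the same total charge no matter which cell is in the slot:

```python
# Hypothetical timer charger sized for a ~2000mAh cell: it always
# delivers 200mA x 16h = 3200mAh, regardless of the cell inserted.
CHARGER_MA = 200
TIMER_HOURS = 16
charge_in = CHARGER_MA * TIMER_HOURS  # 3200mAh, every time

for capacity_mah in (2000, 1200, 800):
    print(f"{capacity_mah}mAh cell gets {charge_in / capacity_mah:.0%} of its capacity")
# 2000mAh cell gets 160% of its capacity (the intended margin)
# 1200mAh cell gets 267% of its capacity (heavy overcharge)
# 800mAh cell gets 400% of its capacity (enough to damage the cell)
```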
You're simply arguing backwards toward a position that supports your assumption. Your assumption was that when a battery manufacturer touts the ability to fast charge, they are also suggesting it shouldn't be done any other way. If that were what they were suggesting, THEY would have stated it, not left ambiguity for 3rd parties to reinterpret.
They clearly list the charge rates THEY use for their own testing, and those charges take 16 hours.
...That drove smart chargers to a -dV/dT condition to sense end of charge. But to sense this signal, the charger must be able to measure it. A typical NiMH cell will show this signal more prominently above a 0.5C charge rate. Below that, the signal is VERY difficult to measure, or the cell may not display one at all. Above a 1C charge, you can get into other issues from charging the cell too fast.
So you keep claiming, but in fact the signal is not so difficult to measure: every single low-priced charger I've seen in recent years that has Delta -V detection detects it just fine below 0.5C. ALL of them, so obviously what is difficult in your mind is not so hard to do when a charger is designed to do this very thing. They not only detect Delta -V at lower rates, they do it successfully when only one cell has reached Delta -V, because the cells differed enough in capacity that both did not reach it at an overlapping moment in time.
Certainly there is a charge rate low enough that Delta -V is too small to detect, but once again (why is it necessary to keep repeating what chargers prove to be true?), look at the chargers: those which detect Delta -V have a high enough charge rate to do so. They aren't making 200mA chargers that detect Delta -V, for exactly that reason, but they are making Delta -V chargers below a 0.5C rate because it clearly works.
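For reference, the termination logic itself is trivial; the whole disagreement is over how large the voltage dip is at a given rate. A minimal sketch, where read_cell_voltage() is a stand-in for the charger's ADC, and the 5mV threshold and 30-second sample interval are ballpark placeholders of mine, not figures from any particular charger:

```python
import time

def read_cell_voltage() -> float:
    """Stand-in for the charger's ADC; real chargers read this in firmware."""
    raise NotImplementedError

def charge_until_minus_dv(dv_threshold_v=0.005, sample_s=30.0):
    """Run until the cell voltage drops dv_threshold_v below its peak (-dV).

    Real smart chargers back this up with temperature and absolute-timeout
    cutoffs as secondary terminations.
    """
    peak_v = 0.0
    while True:
        v = read_cell_voltage()
        peak_v = max(peak_v, v)
        if peak_v - v >= dv_threshold_v:
            return  # -dV detected: cell is full, stop charging
        time.sleep(sample_s)
```

Detecting a smaller dip at a lower rate is then mostly a matter of ADC resolution and filtering, a design choice rather than a law of physics.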
0.5-1.0C is a sweet spot where you can both measure a -dV/dT signal and not overcharge the cell. If your equipment is capable of it, that is the current "best" method to charge NiMH cells of an unknown state of discharge.
I can put a 90% charged AA battery in a low-end, 5-year-old 700mA charger and it Delta -V terminates just fine. What you claim is not supported, except perhaps on the first generations of Delta -V capable chargers, but as with most electronics, successive product generations improved in performance.
The Delta -V change doesn't need to be as large or as fast as possible, only large enough that a charger can detect it, which they do.
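To put a number on that 700mA example, assuming a nominal ~2000mAh AA (my figure, not something stated about that charger):

```python
# What C-rate does a given charger run a given cell at?
# The 2000mAh nominal AA capacity is an assumption for illustration.
def c_rate(charge_ma: float, capacity_mah: float) -> float:
    return charge_ma / capacity_mah

print(c_rate(700, 2000))  # 0.35 -> that 700mA charger runs an AA at 0.35C,
                          # well below the claimed 0.5C floor, yet it still
                          # Delta -V terminates reliably
```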