If average current control were done by pure PWM, then presumably there'd need to be a resistor of reasonable size in the circuit to limit the peak current to a sensible value, so as not to overload the PSU when multiple channels were operating?
A 50mΩ FET, a ~0R25 load resistor and a single AA cell would pull quite a current from the supply.
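To put rough numbers on that, here's a minimal sketch: the 3V rail is the one mentioned further down, while the ~1.45V cell EMF during charge and ~30mΩ of cell internal resistance are my guesses:

```python
# Rough peak-current estimate for a pure on/off PWM drive.
V_SUPPLY = 3.0    # V, charger rail (mentioned later in the post)
V_CELL   = 1.45   # V, NiMH cell EMF while charging (assumed)
R_FET    = 0.050  # ohm, FET on-resistance
R_LOAD   = 0.25   # ohm, series load/sense resistor
R_CELL   = 0.030  # ohm, cell internal resistance (assumed)

# While the FET is on, the whole surplus voltage sits across the resistances.
i_peak = (V_SUPPLY - V_CELL) / (R_FET + R_LOAD + R_CELL)
print(f"Peak current while the FET is on: {i_peak:.1f} A")
# -> roughly 4.7 A per channel, so several channels switching on at
#    once could briefly demand well over 10 A from the PSU.
```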
Also, if the transistor were just switching on/off then, unless it had a pretty poor drive circuit, it shouldn't be getting particularly hot, nor significantly hotter with a slightly higher PSU voltage.
Even with a weak drive, provided the frequency stayed the same and minimal heat was dissipated while the FET was fully on, the heat dissipation in the FET should presumably be similar at all the different power levels, since it would only be dissipating during the transitions between zero and the same peak current.
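As a quick sanity check, with a made-up PWM frequency and edge time (both pure guesses), the duty cycle simply drops out of the switching-loss sum:

```python
# Back-of-envelope switching loss for a slowly-driven FET.
F_PWM  = 1_000   # Hz, PWM frequency (assumed)
T_EDGE = 2e-6    # s, rise/fall time with a weak gate drive (assumed)
V_OFF  = 1.55    # V, across the FET when off (3V rail minus cell EMF)
I_PEAK = 4.7     # A, on-state peak current from the earlier estimate

# Triangular V*I overlap approximation: E ~ 0.5*V*I*t per edge,
# two edges per cycle, so duty cycle never appears.
p_switch = F_PWM * 2 * (0.5 * V_OFF * I_PEAK * T_EDGE)
print(f"Switching dissipation: {p_switch * 1000:.0f} mW at any duty cycle")
```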
I suppose that with a known 3V supply, it would be possible to design a drive circuit that used the FET in linear mode, but with a ballast resistor that, in combination with the sense resistor, would soak up most of the excess voltage in the case of a full (~1.5V) cell at full drive current (1A).
In that case, with a properly tweaked design, the FET might normally only have to drop a small voltage. Even when a pretty flat cell was inserted, its terminal voltage would fairly quickly rise to ~1.2V (or the cell might be rejected), and flat cells could be dealt with by starting them on a lower charge current.
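Roughly sizing that idea, taking the figures above and guessing ~0.2V as the minimum drop needed to keep the FET in control:

```python
# Sizing sketch for the linear-mode idea: choose ballast + sense resistance
# so that a full cell at full current leaves the FET just enough headroom.
V_SUPPLY  = 3.0   # V, known supply
V_FULL    = 1.5   # V, full cell
I_FULL    = 1.0   # A, full drive current
R_SENSE   = 0.25  # ohm, the ~0R25 sense resistor
V_FET_MIN = 0.2   # V, minimum headroom for the FET (my guess)

r_total   = (V_SUPPLY - V_FULL - V_FET_MIN) / I_FULL  # total series resistance
r_ballast = r_total - R_SENSE
print(f"Ballast resistor: {r_ballast:.2f} ohm")

# FET drop (and dissipation) at various cell voltages, still at 1 A:
for v_cell in (1.5, 1.2, 1.0):
    v_fet = V_SUPPLY - v_cell - I_FULL * r_total
    print(f"cell at {v_cell} V -> FET drops {v_fet:.2f} V, dissipates {v_fet * I_FULL:.2f} W")
```

So a really flat cell would push the FET towards ~0.7W at 1A, which is where starting flat cells on a lower current would help.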
That kind of circuit would possibly be highly sensitive to small increases in supply voltage, since the dissipation in the FET could increase substantially, but I got the impression that that wasn't how the BC-900 worked.
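To illustrate that sensitivity, continuing the same sketch: with the resistors fixed and the current regulated at 1A, any extra rail voltage lands directly across the FET (the supply values tried here are arbitrary):

```python
# Sensitivity of the linear design to the supply rail.
R_TOTAL = 1.3   # ohm, ballast + sense from the sizing sketch above
V_CELL  = 1.2   # V, typical cell under charge
I_SET   = 1.0   # A, regulated charge current

for v_supply in (3.0, 3.1, 3.3, 3.5):
    p_fet = (v_supply - V_CELL - I_SET * R_TOTAL) * I_SET
    print(f"{v_supply} V rail -> FET dissipates {p_fet:.2f} W")
# A 0.5 V rise in the rail would double the FET's dissipation here.
```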