Let's do some load resistance calculations.
To drive a 60 A PD with a resistive load, it takes:
R = 14.4 V / 60 A = 240 mΩ
When driving the same converter with a battery that's at 12.2 V, it takes:
R = (Vconv - Vbat) / 60 A = (14.4 V - 12.2 V) / 60 A ≈ 37 mΩ
The PD can't deal with the 37 mΩ load.
I'm probably stepping into the middle of something, but I just can't agree with that last statement. The first calculation, 0.240 Ω, is the resistance needed for a resistive load test that dumps 60 amps when driven at 14.4 volts. I agree with that. The second is the model resistance you get when modeling a real-world battery as a perfect voltage source in series with a 0.037 Ω resistor. But from the point of view of a DC-output charger, it never sees that 37 mΩ model resistance. All it knows is that it has to raise its output to 14.4 volts before 60 amps will flow.
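To keep the numbers straight, here's a minimal sketch of both calculations in Python, using the values quoted above (the variable names are mine):

```python
# Sanity check of the two figures, using the values quoted above.
V_CONV = 14.4    # charger/converter output voltage, volts
V_BAT = 12.2     # battery resting voltage, volts
I_TARGET = 60.0  # target charge current, amps

# Resistive dummy load that draws 60 A at 14.4 V:
r_load = V_CONV / I_TARGET
print(f"resistive load: {r_load * 1000:.0f} mΩ")                # 240 mΩ

# Model (internal) resistance that limits a 12.2 V battery to 60 A at 14.4 V:
r_model = (V_CONV - V_BAT) / I_TARGET
print(f"battery model resistance: {r_model * 1000:.1f} mΩ")     # 36.7 mΩ
```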
I agree there are big differences while the output voltage is changing (with a resistive test, current flows even at low voltage, while in the real world, current flow is negligible until the battery voltage is exceeded), but once the charger has reached steady state at 14.4 volts and 60 amps, it sees a 0.240 Ω load in both cases, not a 0.037 Ω load. I can't see any reason why a charger would know or care why it takes 14.4 V to get 60 amps to flow.
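Here's a sketch of that point, using the same assumed values (0.240 Ω dummy load; ideal 12.2 V source behind the 36.7 mΩ model resistance computed above): the current each load draws as the output ramps up, and the apparent load V/I the charger sees at the 14.4 V / 60 A operating point.

```python
# Compare what each load draws as the charger output rises, and the
# apparent load (V/I) at the 14.4 V / 60 A operating point.
R_LOAD = 0.240   # resistive dummy load, ohms
V_BAT = 12.2     # battery EMF, volts
R_MODEL = 0.0367 # battery internal (model) resistance, ohms

def i_resistor(v):
    # Resistive load: current is proportional to voltage from zero up.
    return v / R_LOAD

def i_battery(v):
    # Battery load: essentially no current until v exceeds the battery voltage.
    return max(0.0, (v - V_BAT) / R_MODEL)

for v in (6.0, 12.0, 13.0, 14.4):
    print(f"{v:5.1f} V -> resistor {i_resistor(v):5.1f} A, "
          f"battery {i_battery(v):5.1f} A")

# At steady state both loads draw ~60 A at 14.4 V, so the charger sees
# V/I ≈ 0.240 Ω either way; the 37 mΩ never appears at its terminals.
print(f"apparent load, battery case: {14.4 / i_battery(14.4):.3f} Ω")
```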
And I'd again hope for a gentle tone in all replies. Perhaps I've missed something. I'm always happy to learn something new.