As far as 'slower to recharge' goes:
When absorption voltage is reached, the charging source holds voltage there and does not let it keep rising. As a result, amperage has to taper.
When amps taper to 0.5% of capacity, the AGM battery can be considered full.
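To put numbers on that, here is a minimal sketch (Python) of that tail-current full-charge check. Only the 0.5% figure comes from above; the 100 Ah capacity and the meter readings are hypothetical:

```python
# Minimal sketch of the tail-current "full" criterion described above.
# Only the 0.5% figure comes from the post; the capacity and readings
# are hypothetical illustration.

CAPACITY_AH = 100.0          # rated capacity of the hypothetical battery
FULL_TAIL_FRACTION = 0.005   # 0.5% of capacity, per the post

def is_full(tail_amps: float, capacity_ah: float = CAPACITY_AH) -> bool:
    """Return True once charge current at absorption voltage has
    tapered to 0.5% of capacity (0.5 A on a 100 Ah battery)."""
    return tail_amps <= capacity_ah * FULL_TAIL_FRACTION

print(is_full(2.3))   # False: still accepting 2.3 A, keep absorbing
print(is_full(0.5))   # True: 0.5 A on 100 Ah, consider it full
```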
My mid-life AGM simply takes longer for amps to taper to 0.5%, compared to when it was new.
This 'longer' time, the slower recharge, is also related to how many cycles the battery has accumulated since its last true full charge: the more cycles ending at less than 100%, the more time is then required for amps to taper to 0.5%.
On my TPPL AGM, this 'longer' time is also influenced by the number of cycles accumulated since it was last high-amp recharged. After 7 cycles of low-and-slow solar only, each solar recharge required more and more time for amps to taper to 0.5%. And now the days are simply not long enough, making higher amperage applied early in the morning more and more important.
After a high-amp recharge from a well-depleted state on my TPPL AGM, the time required for amps to taper to 0.5% at absorption voltage is much less, and would appear as a much steeper dive on a plotted chart.
I doubt one could take detailed measurements, plot an accurate chart, and always get the same acceptance curve your 'ugly graph' shows. Similar, I'll accept, but exact, never.
In my experience, the time it takes amps to taper to 0.5% of capacity is ALWAYS different, which in my opinion makes any egg-timer-based algorithm in an automatic charging source for how long to hold absorption voltage overwhelmingly asinine. 'Trends and tendencies' prove this time is always different, depending on depth of cycle, the amperage rate at which absorption voltage was achieved, the number of partial-state-of-charge cycles accumulated since the last true full charge, the temperature, and what one had for breakfast, in addition to the location of Jupiter in the solar system.
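For contrast with the egg timer, here is a rough sketch of what a tail-current-terminated absorption stage could look like (Python; read_amps and set_float are hypothetical hooks into a charging source, and the timeout is only a safety net, not a full-charge criterion):

```python
import time

CAPACITY_AH = 100.0
END_AMPS = CAPACITY_AH * 0.005   # 0.5 A tail current on a 100 Ah bank

def hold_absorption(read_amps, set_float, poll_seconds=60, timeout_hours=12):
    """Hold absorption voltage until measured current tapers to END_AMPS,
    however long that takes, rather than for a fixed number of hours.
    read_amps and set_float are hypothetical callbacks into the charger."""
    deadline = time.time() + timeout_hours * 3600   # safety net only
    while time.time() < deadline:
        if read_amps() <= END_AMPS:
            set_float()           # genuinely full: drop to float voltage
            return True
        time.sleep(poll_seconds)  # keep absorbing; the battery decides
    set_float()                   # safety timeout hit, NOT a full charge
    return False
```

The point of the sketch: the exit condition is what the battery accepts, not what a clock says.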
While some automatic chargers have amperage limitations such as 2, 12, or 25 amps, our charging sources are rated at X amps.
When the battery is depleted and battery voltage is low, the charging source puts out its maximum amperage until the maximum allowed voltage is attained. Once attained, the amps required to maintain that voltage drop, and the amount is dictated by the battery itself, not the charger.
So the 'amps required to maintain voltage' are constantly tapering once absorption voltage has been reached, ignoring any loads on the DC system at that time.
A charging source is a voltage limiter; the amps are dictated by its maximum output and by what the battery can accept at that particular voltage.
Watching an ammeter and a voltmeter while charging from bulk through absorption makes this very obvious. Watching an ammeter with an adjustable-voltage charging source makes it overwhelmingly obvious.
Once max voltage is attained, fewer amps are required to maintain it. If amps stayed steady, battery voltage would have to keep rising, to levels considered dangerous for lead-acid batteries.
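A toy simulation makes that forced taper visible (Python; the battery model is a crude hypothetical resting-voltage-plus-resistance stand-in, nothing like real AGM chemistry). Amps hold at the charger's max in bulk, then fall as soon as the voltage cap is hit:

```python
# Crude CC/CV simulation: constant current until the voltage cap,
# then current tapers because the source will not let voltage rise.
# All numbers are hypothetical; a real AGM is far more complex.

V_ABSORB = 14.4      # voltage ceiling enforced by the charging source
I_MAX = 40.0         # charger's rated maximum output, amps
CAPACITY_AH = 100.0
R_INTERNAL = 0.02    # toy internal resistance, ohms

soc = 0.5                                  # start half depleted
for minute in range(0, 360, 30):
    rest_v = 11.8 + 2.8 * soc              # toy resting-voltage model
    # Current is whichever is smaller: the charger's max output, or what
    # the battery accepts without pushing terminal volts past the cap.
    i = min(I_MAX, max(0.0, (V_ABSORB - rest_v) / R_INTERNAL))
    terminal_v = min(V_ABSORB, rest_v + i * R_INTERNAL)
    print(f"{minute:4d} min  {terminal_v:5.2f} V  {i:5.1f} A")
    soc = min(1.0, soc + i * 0.5 / CAPACITY_AH)  # 30 min = 0.5 h of charge
```

Run it and the printout shows exactly the meter-watching experience described above: steady max amps with climbing volts, then pinned volts with sagging amps.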
https://www.youtube.com/watch?v=GC1jIKpC7Zw&list=UUoPqTkOluQsuu3RpGnxVwFw&index=17

Trying to determine full charge on an AGM battery without an ammeter is the definition of insanity.