We are comparing charging efficiencies in terms of generator run time saved.
How many minutes counts as "significant"? And how was the side-by-side comparison measured to confirm equal Ah restored in each case? What heat-loss allowance do the particular monitors use?
Trimetric uses 94% for FLA charging efficiency but notes that some batteries do better than that, so they say the 94% could under-measure Ah restored a bit. They also say that once the battery is gassing, the efficiency drops way off and the number becomes unreliable.
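As a concrete illustration of how I read that factor (my assumption about the monitor's bookkeeping, not something I can confirm from the manual): only 94% of the charge amps get credited toward Ah restored. A minimal Python sketch:

```python
# Minimal sketch of coulomb counting with a charging-efficiency factor.
# Assumption: the monitor credits only efficiency * charge_amps toward
# Ah restored, which is how I read the Trimetric's 94% setting.

def ah_restored(charge_amps: float, hours: float, efficiency: float = 0.94) -> float:
    """Ah the monitor would log as restored for a constant-current charge."""
    return charge_amps * hours * efficiency

print(ah_restored(60, 2.0))        # FLA at 94%: 112.8 Ah counted
print(ah_restored(60, 2.0, 1.0))   # LFP at ~100%: 120.0 Ah counted
```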
Take a possible case where both the FLA and the LFP are in Bulk (constant current) at 60 amps, with a generator running the charger, measuring the FLA with the Trimetric and just timing the LFP. To restore 120 Ah:
FLA: 120 Ah restored at 94% efficiency = 120 / 0.94 ≈ 128 Ah delivered ≈ 128 minutes
LFP: 120 Ah restored with essentially no heat loss = 120 minutes
I would not call those ~8 minutes significant in my scenario, but they might be in somebody else's.
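Here is that arithmetic as a quick script, using the 60 A current, 120 Ah target, and efficiency figures from the example above:

```python
def minutes_to_restore(target_ah: float, charge_amps: float, efficiency: float) -> float:
    """Generator minutes of constant-current charging needed to restore
    target_ah, given the fraction of delivered Ah that actually sticks."""
    return (target_ah / efficiency) / charge_amps * 60

fla = minutes_to_restore(120, 60, 0.94)  # ~127.7 min
lfp = minutes_to_restore(120, 60, 1.00)  # 120.0 min
print(f"FLA: {fla:.0f} min, LFP: {lfp:.0f} min, saved: {fla - lfp:.0f} min")
```

Running it prints roughly 128 vs 120 minutes, about 8 generator minutes saved.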
Of course, once a side-by-side test includes the gassing stage for the FLA, it would be hard to tell how much of the time difference comes from the FLA tapering current earlier and how much comes from heat loss. At that point the time saved could be "significant," but mostly because the LFP stays in Bulk longer on its way to a given high SOC. A toy model of this is sketched below.
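To see how the taper can dominate the heat loss, here is a toy model, purely illustrative: it assumes the FLA holds 60 A for the first 80% of the Ah and then the current decays exponentially, while the LFP holds 60 A essentially to full. The 80% transition point and the 30-minute decay constant are made-up numbers, not measurements:

```python
import math

def fla_charge_minutes(target_ah: float, bulk_amps: float = 60.0,
                       bulk_fraction: float = 0.8, tau_min: float = 30.0,
                       efficiency: float = 0.94) -> float:
    """Toy model: constant current for the first bulk_fraction of the Ah,
    then exponentially decaying current (time constant tau_min minutes).
    All parameter values are illustrative assumptions, not measured."""
    bulk_ah = target_ah * bulk_fraction
    bulk_minutes = (bulk_ah / efficiency) / bulk_amps * 60
    # Gross Ah still needed after Bulk, inflated for the 94% efficiency.
    remaining_gross = (target_ah - bulk_ah) / efficiency
    # Ah delivered by t minutes of taper: bulk_amps * (tau_min/60) * (1 - exp(-t/tau_min)).
    max_taper_ah = bulk_amps * tau_min / 60.0
    frac = remaining_gross / max_taper_ah
    if frac >= 1:
        raise ValueError("the taper can never deliver that much charge in this model")
    taper_minutes = -tau_min * math.log(1 - frac)
    return bulk_minutes + taper_minutes

print(f"FLA: {fla_charge_minutes(120):.0f} min vs 120 min for LFP held at 60 A")
```

With these made-up numbers the FLA takes about 159 minutes versus 120 for the LFP, and more of the extra time comes from the taper than from the 94% heat loss.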
The Renogy monitor I am getting has a setting called "battery attenuation," with no value given, which might be their built-in charging efficiency; I don't know. Your monitor may have a default for charging efficiency too.
"Battery attenuation ratio: After the battery Capacity cumulatively once per cycle,The capacity value is automatically changed according to this ratio" (not well translated!)