The battery accepts amps from the charger and from the converter roughly in proportion to the voltage spread between the battery and each source: the spread between battery and charger sets the charger's contribution, and the spread between battery and converter sets the converter's.
Say the battery is at 12.3v, the converter puts out 13.8v, and the charger puts out 14.4v. When both are turned on, the battery voltage comes up to about 13.3v, leaving a spread of only 0.5v for the converter and 1.1v for the charger.
The charger still does its full 10 amps, but you only get about 10 amps from the converter instead of what it is rated for (35? 45?). As the battery voltage rises during the recharge, the spread to 13.8 shrinks and the converter's amps taper to zero, leaving just the charger's 10 amps. Later, as the battery voltage rises further toward 14.4, those 10 amps taper off too until the battery is full.
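If it helps to see the numbers, here is a rough back-of-the-envelope sketch of that spread idea in Python. Treat it as an illustration only: the 0.05 ohm effective resistance and the 45 amp converter rating are assumptions I plugged in so the figures line up with the example above, not real specs for any particular unit.

```python
# Rough model: each source pushes current into the battery in proportion to
# (source voltage - battery voltage), capped at its rated output.
R_EFF = 0.05  # assumed effective resistance between source and battery, ohms

def source_amps(source_v, battery_v, rated_amps, r_eff=R_EFF):
    """Current a source can push at a given battery voltage (simplified)."""
    spread = source_v - battery_v
    if spread <= 0:
        return 0.0                      # source can't push current "uphill"
    return min(rated_amps, spread / r_eff)

# Converter at 13.8 V (assumed 45 A rating), charger at 14.4 V (10 A rating):
for v_batt in (13.3, 13.7, 14.0, 14.3):
    conv = source_amps(13.8, v_batt, 45)
    chg = source_amps(14.4, v_batt, 10)
    print(f"battery {v_batt:4.1f} V -> converter {conv:4.1f} A, charger {chg:4.1f} A")
```

With the battery sitting at 13.3v it prints 10 amps from the converter and 10 from the charger, and as the battery climbs you can watch the converter's amps die off first, just as described above.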
What you want is a converter that will do 14.4 instead of 13.8 so you get full converter amps to the battery, which will be more than 20. Then the charger's 10 amps can be on top of that.
You want both charging sources at nearly the same voltage so their amps add all the way through the recharge.
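Running the same sketch with the converter bumped to 14.4v (same assumed 0.05 ohm and ratings as above, still just an illustration) shows the two outputs adding and then tapering together instead of the converter dropping out early:

```python
# Same spread idea, converter setpoint raised to 14.4 V to match the charger.
def amps(source_v, battery_v, rated, r=0.05):
    return max(0.0, min(rated, (source_v - battery_v) / r))

for v_batt in (13.3, 13.8, 14.0, 14.3):
    conv, chg = amps(14.4, v_batt, 45), amps(14.4, v_batt, 10)
    print(f"battery {v_batt:4.1f} V -> {conv:4.1f} A + {chg:4.1f} A = {conv + chg:4.1f} A")
```

At 13.3v that works out to 22 amps from the converter plus 10 from the charger, and both taper off together as the battery approaches 14.4v.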