The real answer is that the converter does not 'draw' the power; the LOAD does. The nature of the LOAD, combined with the SOURCE voltage, determines the amps. The converter is neither the source nor the load; it merely introduces losses into the system.
Consider a LOAD of 1500 watts at 0.85 PF. This is also a load of 1765 VA (1500/0.85). With a source voltage of 108V, the amp draw is 1765/108 = 16.3A. At 124 volts, the current is 1765/124 = 14.2A. In each case, the LOAD remains the same.
Current can only be determined from a LOAD that is given in Volt-Amps. Calculating amps from a watt load is only valid for loads with a power factor of 1.0. Resistive heat strips and incandescent lighting loads have a power factor of 1.0. Motors and electronic circuits have a power factor of something less than 1.0, so their watt load is less than their V-A load (PF = watts/V-A).
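For anyone who wants to plug in their own numbers, here is a minimal Python sketch of that same arithmetic (the function name and rounding are just for illustration):

```python
def amps(load_watts, power_factor, source_volts):
    """Current drawn by a load, from its volt-amp value and the source voltage."""
    volt_amps = load_watts / power_factor   # VA = W / PF
    return volt_amps / source_volts         # I = VA / V

# The 1500 W, 0.85 PF example above:
print(round(amps(1500, 0.85, 108), 1))  # 16.3 A at 108 V
print(round(amps(1500, 0.85, 124), 1))  # 14.2 A at 124 V
```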
Further complicating the issue is the difference between constant-impedance loads and constant-V-A loads. Resistive heat is a constant-impedance load. In this case, the amps go up and down with voltage according to Ohm's law (I = V/R).
Motors are constant-V-A loads (within their operating ratings). As voltage goes up, current goes down (and vice versa).
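To make the contrast concrete, here is a hedged sketch of the two behaviors (the 120 V nameplate rating and the function names are assumptions for illustration, not part of the example above):

```python
def resistive_amps(rated_watts, rated_volts, source_volts):
    """Constant-impedance load: R is fixed, so I = V / R tracks the source voltage."""
    resistance = rated_volts ** 2 / rated_watts   # R derived from the nameplate rating
    return source_volts / resistance

def constant_va_amps(volt_amps, source_volts):
    """Constant-V-A load (e.g., a motor within its ratings): I = VA / V."""
    return volt_amps / source_volts

# A 1500 W heat strip rated at 120 V vs. a 1765 VA motor load:
# the heat strip's amps rise with voltage, the motor's amps fall.
for v in (108, 124):
    print(v, round(resistive_amps(1500, 120, v), 1), round(constant_va_amps(1765, v), 1))
```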
So unless one knows the exact electrical nature of EVERY connected load, the correct answer will be a mystery.