road-runner wrote:
IMO the maximum current that the converter draws will go down as the supply voltage goes up. I don't think, however, that it's valid to assume a linear relationship, for four reasons:
1) The PF may change as the voltage changes.
2) The converter efficiency may change as the voltage changes.
3) The converter might supply more output power as the voltage goes up.
4) The generator's waveform may change as the load requirement changes.
I think the only reasonable way to get at the answer for any converter-generator combination is actual test and measurement. IMO a PD converter is a good match with a Honda inverter generator because the converter doesn't supply full output power when the AC voltage sags, so as the generator nears its maximum output, the converter will to some degree back off on its power requirement.
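To put some rough numbers on why the draw won't scale as a simple 1/V, here's a quick sketch. Input current comes from output power, efficiency, and PF, and points 1–3 in the quote say all three can move with voltage. Every figure below is made up for illustration, not a spec for any real converter:

```python
# Rough model of converter AC input current.
# All numbers are hypothetical illustrations, not specs for any real unit.

def input_current(p_out_w, v_in, efficiency, power_factor):
    """RMS amps drawn from the line for a converter delivering p_out_w watts DC."""
    p_in = p_out_w / efficiency          # real input power (W)
    return p_in / (v_in * power_factor)  # amps = watts / (volts * PF)

# If efficiency and PF were constant, current would scale exactly as 1/V:
print(round(input_current(800, 108, 0.85, 0.70), 1))  # → 12.4 amps at 108 V
print(round(input_current(800, 120, 0.85, 0.70), 1))  # → 11.2 amps at 120 V

# But if PF and efficiency improve at the higher voltage (points 1 and 2),
# the current falls faster than the 1/V scaling alone would predict:
print(round(input_current(800, 120, 0.88, 0.75), 1))  # → 10.1 amps at 120 V
```

Which is road-runner's point: you can't just ratio the nameplate current by the voltage change, you have to measure the actual combination.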
Some converters are very sensitive to the 120v input value, like that thread a while back about (ISTR) a Magnum inverter/charger that would only do 80 of its 100 amps. There were also the threads about PDs on MSW 120v, like some Onans put out.
The problem is that if the 120v is already low because of too long a line from the receptacle, it drops even more under load, so the charger struggles. 108v isn't far from the 105v lower range limit. IMO if you tried to run the converter at 108v it would not do its full DC output.
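Here's a quick sketch of that long-cord effect, with hypothetical numbers (the cord resistance is an assumed figure, roughly what a 100 ft 14 AWG cord looks like round trip):

```python
# Hypothetical illustration: how a long cord drags line voltage down under load.

def voltage_at_charger(v_source, current_a, cord_resistance_ohms):
    """Voltage left at the charger after the I*R drop in the cord (Ohm's law)."""
    return v_source - current_a * cord_resistance_ohms

# ~0.5 ohm round trip assumed for a long 14 AWG cord (illustration only):
v_unloaded = voltage_at_charger(114, 0, 0.5)   # 114 V with the charger idle
v_loaded = voltage_at_charger(114, 12, 0.5)    # 108.0 V once it pulls 12 A
```

So a receptacle that reads fine unloaded can sag right down to that 108v neighborhood as soon as the charger starts pulling real current.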
So IMO that spec at 108v is just there to establish the converter's VA requirement; you really need 120v+ to get it to work right.