Example: two 100W "12V" panels (17.9V Vmp) in parallel get you about 11.5A (about 5.75A from each panel).
The same panels in series get you 35.8V Vmp at 5.75A.
Both add up to about 200W:
17.9 x 11.5 = 205.85W
35.8 x 5.75 = 205.85W
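The series/parallel arithmetic above can be sketched like this (the 5.75A per-panel figure is taken from the example, not from any real spec sheet):

```python
# Two identical "12V" 100W panels: Vmp = 17.9 V, ~5.75 A each at max power
# (numbers assumed from the example above).
VMP = 17.9
IMP = 5.75

# Parallel: voltage stays the same, currents add.
parallel_v = VMP
parallel_i = 2 * IMP      # 11.5 A

# Series: voltages add, current stays the same.
series_v = 2 * VMP        # 35.8 V
series_i = IMP

# Either way the max-power product comes out the same.
print(parallel_v * parallel_i)   # ~205.85 W
print(series_v * series_i)       # ~205.85 W
```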
------------------------
Here is how I have seen it work out when doing measurements. Assume high noon, full sun, in June, with the battery bank accepting all available amps.
With PWM, the amps you see are basically whatever Isc is at the time. These 100W panels are rated at 6.2A Isc (you don't use the Imp spec with PWM! That's for MPPT.)
So, amps into the 200Ah bank = 2 x 6.2 = 12.4 amps (any voltage drops don't show up as lost amps).
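A quick sketch of that PWM estimate, assuming (as the text says) the current into the bank tracks Isc rather than Imp:

```python
# PWM controller: the battery pulls the panel down to near battery voltage,
# so the current delivered is close to Isc, not Imp (per the text above).
ISC = 6.2        # rated short-circuit current per 100 W panel
N_PANELS = 2

pwm_amps = N_PANELS * ISC   # amps into the bank; voltage drops don't reduce this
print(pwm_amps)             # 12.4
```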
Now, quickly, before the sky changes or the sun moves much, swap to the MPPT controller.
200w rated; lose 10% from panel heating, now at 180w. Lose 2% in the panel-to-controller wiring (3.6w), so now 176.4w at the controller input. Controller efficiency is about 96%, so output is about 169.3w.
Pretend no loss on the 12v wiring to the battery; amps to battery = W/Vbatt. Pick three Vbatts for example:
169.3/12.5 = 13.5a
169.3/13.5 = 12.5a
169.3/14.5 = 11.7a
Compare with 12.4a using the PWM.
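The MPPT loss chain above, written out (the 10%, 2%, and 96% figures are the assumptions from the text, not measurements):

```python
# MPPT loss chain, using the assumed numbers from the text.
rated_w = 200.0
after_heat = rated_w * 0.90        # -10% panel heating  -> 180.0 W
after_wiring = after_heat * 0.98   # -2% wiring loss     -> 176.4 W
out_w = after_wiring * 0.96        # ~96% controller eff -> ~169.3 W

# Amps into the bank depend on battery voltage at the time.
for v_batt in (12.5, 13.5, 14.5):
    print(f"{v_batt} V -> {out_w / v_batt:.1f} A")
```

Note how the MPPT only beats the PWM's 12.4A when the battery voltage is low; as the bank charges and Vbatt rises, the amps advantage shrinks.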
Note: I see this all the time and have reported results on it before. Using my three 100s I get 18.6a with PWM; I swap to MPPT and get 18.x; I swap the array to series and, still with the MPPT, get 18.x.
The problem is seeing those results at high noon, when the batteries are by then nearly full and don't accept all the available amps. You have to run the batteries down first so you can do the test.
EDIT: I have suspected the 20 amp MPPT controllers used are clipping their amps, and instead of showing 20 amps clipped it shows as 18.x. However, I have seen 19.x at times with "cloud effect", so I don't think so. Testing with two panels would eliminate that possibility for sure.