The battery does not get all the watts. The watts at the battery are the amps it is getting times the battery voltage.
So with my 130W panel putting out its max 8.2 amps while battery voltage rises through, say, 13.5V, the watts at the battery just then are
8.2 x 13.5 = 111W, so the MPPT guys all snicker and say I am "losing"
130 - 111 = 19W that they would be busy making more amps with.
You can't make many amps out of 19W, but it can cost you an extra $200 for a controller that will squeeze them out.
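The arithmetic above is easy to check in a few lines. This just replays the example numbers (130W panel, 8.2A, 13.5V) and shows what an ideal MPPT controller could theoretically add:

```python
# PWM "loss" arithmetic from the example above.
panel_rated_watts = 130   # panel's rated max power
battery_amps = 8.2        # amps the PWM controller passes through to the battery
battery_volts = 13.5      # battery voltage at this moment

battery_watts = battery_amps * battery_volts     # watts actually reaching the battery
unharvested = panel_rated_watts - battery_watts  # the watts MPPT would chase

print(f"watts at battery: {battery_watts:.0f}W")   # ~111W
print(f"unharvested:      {unharvested:.0f}W")     # ~19W

# extra amps an ideal (lossless) MPPT could add at this battery voltage
extra_amps = unharvested / battery_volts
print(f"extra amps from ideal MPPT: {extra_amps:.1f}A")  # ~1.4A
```

About 1.4 extra amps, best case, is what that $200 is buying at this moment in the charge cycle.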
In the PWM case, voltage drop on the wires just adds to that 19W loss, but you still get the 8.2 amps, so you don't care. The MPPT guys go nuts because losing any watts at all costs them in the amps.
Also, the voltage drop on the wiring and the efficiency of the MPPT controller itself mean a slight loss in watts between panel and battery, so even there battery watts are less than panel watts.
I didn't answer the math question: if the panel is getting more light, so it is at max amps but has some voltage drop on the wires, is that as many amps as with less light but no voltage drop?
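One way to poke at that question is with a toy I-V curve. This is only a sketch: it assumes a crude diode-style model with made-up parameters (Isc = 8.5A, Voc = 21V, knee factor 1.5), not measurements from my panel. The point it illustrates is that on the flat part of the curve, where a PWM controller operates at strong light, a volt of wire drop moves the panel's operating voltage but barely changes its amps:

```python
import math

# Toy panel I-V curve: current stays nearly flat (~Isc) until the knee near Voc.
# All three parameters are invented for illustration, not real panel data.
I_SC = 8.5    # short-circuit current, amps (assumed)
V_OC = 21.0   # open-circuit voltage, volts (assumed)
A = 1.5       # knee sharpness factor, volts (assumed)

def panel_amps(v):
    """Amps out of the panel when forced to operate at voltage v."""
    i0 = I_SC / (math.exp(V_OC / A) - 1)   # scaled so panel_amps(V_OC) == 0
    return I_SC - i0 * (math.exp(v / A) - 1)

battery_volts = 13.5
for wire_drop in (0.0, 0.5, 1.0):
    # PWM pins the panel near battery voltage plus whatever the wires drop
    v_panel = battery_volts + wire_drop
    print(f"wire drop {wire_drop:.1f}V -> panel at {v_panel:.1f}V, "
          f"{panel_amps(v_panel):.2f}A")
```

With this toy model, a full volt of wire drop costs only a few hundredths of an amp, because the operating point is still well below the knee. How close that is to a real panel depends on where the panel's max-power voltage sits relative to battery voltage plus the drop, so treat it as a shape of an answer, not the answer.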