I think of it in a different way.
When measuring R on a wire, as taught by Salvo, you know the current, and then you take the voltage between the ends using extra-long probes on your voltmeter (I have speaker wire for that, so I can do a 30 ft wire, for instance).
e.g., you can find 0.13 V ("voltage drop") with 20.6 A, so R = 0.13/20.6 ≈ 0.0063 Ω, or you might get
0.16 V with 27 A, so R ≈ 0.0059 Ω (R is really the same; the difference is testing margin of error).
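
For what it's worth, here's a minimal sketch of that same arithmetic in Python (the numbers are just the two readings above; `ohms` is my own helper name, nothing official):

```python
def ohms(volts_dropped, amps):
    """Ohm's law: R = V / I, using the drop measured across the wire."""
    return volts_dropped / amps

# The two readings from above -- same wire, different sun/current.
print(ohms(0.13, 20.6))  # ~0.0063 ohms
print(ohms(0.16, 27.0))  # ~0.0059 ohms
```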
The solar panel is a current maker, but the current varies with the sunlight, and so does the voltage. What shows up on the calculator is that the wire does not change: same R no matter what the current. But the voltage drop does change as the current changes; it is bigger with higher amps.
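
To see that in numbers, hold the wire's R fixed (I'm rounding the readings above to 0.006 Ω) and let the current vary; the drop V = I × R climbs with the amps:

```python
R = 0.006  # ohms, roughly what the wire measured above

for amps in (5, 10, 20, 27):
    drop = amps * R  # voltage drop across the wire, V = I * R
    print(f"{amps:>2} A -> {drop:.3f} V dropped")
```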
Somehow I am not doing it right when it comes to getting the change in amps at the battery for a particular voltage drop from panel to controller. Do you take the voltage drop as a percentage and apply it to the panel's rated watts? And is that your loss in watts at the controller? Then you have, say, 13.5 V at the battery. Now what again? :( Just hopeless at this stuff.
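
If I'm following the method right, the arithmetic would go something like this, assuming an ideal MPPT controller that turns incoming watts into battery amps (the 18 V panel operating voltage below is made up purely for illustration; the 27 A and 13.5 V are from above):

```python
panel_amps = 27.0      # current coming off the panel
wire_r = 0.006         # wire resistance in ohms, from the measurements above
batt_volts = 13.5      # battery voltage

drop = panel_amps * wire_r            # volts lost along the wire
watts_lost = drop * panel_amps        # P = V * I, same as I^2 * R

panel_volts = 18.0                    # hypothetical panel operating voltage
watts_in = panel_volts * panel_amps   # watts leaving the panel
watts_out = watts_in - watts_lost     # watts reaching the controller

# Assuming an ideal MPPT controller converting surviving watts to battery amps:
batt_amps = watts_out / batt_volts
print(f"lost {watts_lost:.1f} W in the wire, ~{batt_amps:.1f} A into the battery")
```

At least, that's my best guess at the steps; happy to be corrected if the percentage-of-rated-watts way is the proper one.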