The electrical codes are guidelines: they are derived from electrical theory, and if you don't understand the theory, you won't understand the code.
Simply put, the length of the copper wire, its gauge, its insulation, and the installation method are what you use to calculate a wire's current rating. Voltage drop is given off as heat along the conductor and reduces the voltage delivered to the load at the end of the wire. The voltage drop of a wire is proportional to the current through it, because the resistance of the wire is mostly constant. Ohm's Law states that voltage drop = amps × resistance (V = IR). So the longer the wire, the higher the resistance, and the greater the voltage drop. And a motor is designed to run at a certain voltage; if that voltage is reduced by voltage drop, the motor draws more current, which increases the voltage drop further (though the relationship is nonlinear). Hopefully the motor protection in the AC unit protects it.
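To make the V = IR relationship concrete, here is a minimal sketch of the voltage-drop arithmetic. The 14 AWG resistance figure is a nominal value (roughly 2.525 ohms per 1000 feet of copper near room temperature); real cable resistance varies with temperature and construction, and the round-trip length (out on the hot, back on the neutral) is what matters.

```python
# Rough voltage-drop sketch using Ohm's law (V = I * R).
# Assumption: nominal 14 AWG copper resistance of ~2.525 ohms per 1000 ft.
R_PER_1000FT_14AWG = 2.525

def voltage_drop(amps, one_way_feet, ohms_per_1000ft=R_PER_1000FT_14AWG):
    """Voltage drop across a two-conductor run (current flows out and back)."""
    round_trip_feet = 2 * one_way_feet
    resistance = ohms_per_1000ft * round_trip_feet / 1000.0
    return amps * resistance

# 13.5 A through a 50 ft one-way run of 14 AWG:
print(round(voltage_drop(13.5, 50), 1))  # about 3.4 V
```

Doubling the run length doubles the resistance and therefore doubles the drop, which is the "longer wire, bigger drop" point above.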
In line with that theory, the code allows no more than a 3% voltage drop at an outside receptacle, and if you plug in your RV to run its AC, it allows 5% or less at the AC unit. This is the voltage difference from your house panel to the load.
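Those percentages are easier to reason about as volts. A quick conversion, assuming a 120 V nominal service (an assumption; actual service voltage varies):

```python
# Converting the code's percentage limits into volts of allowed drop.
# Assumption: 120 V nominal service voltage.
NOMINAL_V = 120.0

def max_drop_volts(percent, nominal=NOMINAL_V):
    return nominal * percent / 100.0

print(max_drop_volts(3))  # 3.6 V allowed at the outside receptacle
print(max_drop_volts(5))  # 6.0 V allowed total at the AC unit
```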
My point is that common house 14 gauge wire is rated for 13.5 amps with an AC (motor) load on runs up to 50 feet. At my house, the power panel is 50 feet from the outside receptacle, and in my RV the AC unit is on the roof, more than 20 feet from the RV's panel. So no extension cord, whatever its gauge, can make the run from the house to the trailer acceptable. The code tells me I can't, and the electrical math tells me I don't want to run my RV AC unit at 101 volts and ruin it.
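You can see why the cord gauge alone can't save the run by summing the drop over each segment: house wiring, extension cord, and RV wiring. The cord gauge and length below are illustrative, and the per-gauge resistances are nominal ohms-per-1000-ft copper values, not measured ones.

```python
# Summing voltage drop over each segment of the house-to-rooftop-AC run.
# Assumption: nominal copper resistances (ohms per 1000 ft), near room temp.
OHMS_PER_1000FT = {14: 2.525, 12: 1.588, 10: 0.999}

def segment_drop(amps, one_way_feet, awg):
    """Drop over one two-conductor segment (round-trip length)."""
    return amps * OHMS_PER_1000FT[awg] * (2 * one_way_feet) / 1000.0

amps = 13.5
total = (segment_drop(amps, 50, 14)    # house panel to outside receptacle
         + segment_drop(amps, 50, 12)  # hypothetical 50 ft 12 AWG cord
         + segment_drop(amps, 20, 14)) # RV panel to rooftop AC unit
print(round(total, 1))  # about 6.9 V, already past a 6 V (5% of 120 V) budget
```

Even with a heavier 12 AWG cord in the middle, the fixed 14 AWG segments at each end push the total past the 5% budget before the cord is considered.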
FYI:
A 50 foot 14 gauge extension cord carrying 13.5 amps gives off about 46 watts of heat due to voltage drop (P = I × I × R).
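That 46 watt figure checks out with P = I² × R once you count both conductors: a 50 ft cord has 100 ft of copper (hot plus neutral). The resistance figure is the same nominal 14 AWG value assumed above.

```python
# P = I^2 * R check on the 46 W heat figure.
# Assumption: nominal 14 AWG copper at ~2.525 ohms per 1000 ft.
R_CORD = 2.525 * (2 * 50) / 1000.0  # ~0.2525 ohms round trip for a 50 ft cord
I = 13.5                            # amps
heat_watts = I ** 2 * R_CORD
print(round(heat_watts))  # 46
```

That is roughly a night-light's worth of power warming the cord instead of running the AC.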