profdant139
Apr 06, 2022 | Explorer II
Break-even point between cable length and voltage drop?
I have two cables for my portable solar panel – one is 40 feet long, and the other is 70 feet long. (We boondock, and we like to park the trailer in deep shade and put the panel out in the open sun.)
Obviously, the longer cable gives us more reach and flexibility. The trade-off, of course, is that the longer cable causes a slight voltage drop. How do I decide if the extra length is outweighed by the voltage drop? Is there a break-even point?
Here are some more facts: our "suitcase" solar panel is rated at 120 watts and supposedly puts out about 14 volts in bright sun. Both cables are ten gauge wire. When I hook up my 40 foot cable and measure the voltage at the battery terminals, it reads 13.8 volts. The 70 footer reads 13.5 volts. I think that's about a 2 percent drop (0.3 divided by 13.8).
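In case it helps, here is the back-of-the-envelope check I did in Python. The numbers in it are my assumptions, not measurements: roughly 1 ohm per 1000 feet for 10 gauge copper, and roughly 6 amps of charging current from a 120 watt panel. The round trip doubles the one-way cable length, since the current goes out on one conductor and back on the other.

```python
# Rough voltage-drop estimate for my two cables (assumed values, not measurements).
AWG10_OHMS_PER_FT = 0.001  # ~1 ohm per 1000 ft of 10 AWG copper (approximate)
PANEL_CURRENT_A = 6.0      # assumed charging current from a 120 W panel (approximate)

def voltage_drop(cable_length_ft: float, current_a: float = PANEL_CURRENT_A) -> float:
    """Estimate round-trip voltage drop: current times resistance of both conductors."""
    round_trip_ft = 2 * cable_length_ft               # out on one wire, back on the other
    resistance_ohms = round_trip_ft * AWG10_OHMS_PER_FT
    return current_a * resistance_ohms

for length_ft in (40, 70):
    print(f"{length_ft} ft cable: ~{voltage_drop(length_ft):.2f} V drop")

# Measured difference at the battery terminals: 13.8 V - 13.5 V = 0.3 V,
# which is about 0.3 / 13.8 = 2.2 percent.
print(f"Measured difference between cables: {0.3 / 13.8:.1%}")
```

That rough estimate (about 0.48 volts of drop on the 40 footer and 0.84 volts on the 70 footer) lines up reasonably well with the 0.3 volt difference I measured.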
Does that matter? Does a 2 percent drop mean, for example, that it will take the solar panel 2 percent longer to charge my battery? If so, that's probably trivial, and I will use the longer cable. For example, in a typical day, the solar panel is putting out juice for eight hours, or 480 minutes. Two percent of 480 is about ten extra minutes: no problem.
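Just to show my arithmetic, here is the naive linear estimate I am making – the assumption that charging time simply scales with the voltage drop is exactly what I am asking about:

```python
# Naive estimate: assume extra charge time scales linearly with the voltage drop.
daily_sun_minutes = 8 * 60        # about 480 minutes of useful sun per day
drop_fraction = 0.3 / 13.8        # measured difference between the two cables (~2.2%)
extra_minutes = daily_sun_minutes * drop_fraction
print(f"Extra charging time (if linear): ~{extra_minutes:.0f} minutes")  # roughly 10 minutes
```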
Or is this some sort of non-linear function, where a two percent voltage drop means that it takes a lot longer to charge the battery? In that case, I will use the shorter cable.
By the way, and in case it matters, we use a group 31, 12-volt, flooded lead-acid, 110 amp-hour NAPA battery, which is supposedly a true deep cycle marine battery. I always keep it on a BatteryMinder Plus when we are at home, and I never let it get below 12.1 volts when we are traveling. Our little solar panel has almost always fully recharged the battery, every day.
(And in case you're wondering, the 70 foot cable was a gift from a generous neighbor who was clearing out his garage. It weighs 25 pounds and is very bulky, but I could not pass up the chance to have a super-long cable.)
Thanks in advance for your wisdom and expertise!