What counts is the voltage drop in the wire between the battery and the inverter, which is determined by the resistance of the wire. Assuming a 12' run between the battery and the inverter, with that length needed for both the positive and negative conductors, you have 24' of wire in the circuit. #10 copper is roughly 1 ohm per 1000', so 24' is about 0.024 ohms. At your inverter's full rated current of 40 A, that works out to a 0.96 V drop (V = I x R), so if your battery voltage was 12.1 V (about 50% charge), you would have only about 11.1 V at the inverter, which is pretty marginal. #10 wire is only rated for 30 A anyway.
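If you want to check the arithmetic yourself, here is a minimal Python sketch. The ~1 ohm per 1000' figure for #10 copper is the rule-of-thumb value I used above; the function name and constant are mine, just for illustration:

```python
# Rough voltage-drop check: assumes ~1.0 ohm per 1000 ft for #10 AWG copper
# (published value is ~0.999) and a 24 ft round-trip run.

OHMS_PER_1000FT_10AWG = 1.0

def voltage_drop(current_a: float, round_trip_ft: float,
                 ohms_per_1000ft: float = OHMS_PER_1000FT_10AWG) -> float:
    """V = I * R, with R scaled from the per-1000-ft figure."""
    resistance = ohms_per_1000ft * round_trip_ft / 1000.0
    return current_a * resistance

drop = voltage_drop(40.0, 24.0)               # full rated inverter current
print(f"drop = {drop:.2f} V")                 # ~0.96 V
print(f"at inverter = {12.1 - drop:.1f} V")   # ~11.1 V
```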
With the phones and computers you are planning to charge, you could be using about 200 watts, which is roughly half the rated current on the 12 V side, so you would have about half that voltage drop, or about 11.6 V at the inverter.
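The same arithmetic for the 200 W case looks like this; the 85% inverter efficiency is my assumption, just to show why the current lands near half the 40 A rating rather than at the ideal 200 W / 12 V figure:

```python
# Rough check of the 200 W case; 85% inverter efficiency is an assumed number.
resistance = 1.0 * 24.0 / 1000.0     # ~0.024 ohm for 24 ft of #10
battery_v = 12.1
dc_watts = 200.0 / 0.85              # AC load plus assumed inverter losses
current = dc_watts / battery_v       # ~19.4 A, about half the 40 A rating
drop = current * resistance          # ~0.47 V, about half the full-load drop
print(f"{current:.1f} A, drop {drop:.2f} V, "
      f"{battery_v - drop:.1f} V at inverter")  # ~11.6 V
```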
As a rule of thumb, dropping the gauge number by 3 cuts the resistance, and therefore the voltage drop, in half. However, #7 isn't a standard commercial wire size, so you would step down to #6.
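That rule of thumb is easy to put into a formula; here is a tiny sketch anchored at the ~1 ohm/1000' figure for #10 used above (the function is mine, illustrative only, and real wire tables differ slightly):

```python
# Rule-of-thumb AWG resistance: halves for every 3-gauge decrease,
# anchored at ~1.0 ohm per 1000 ft for #10 copper.

def ohms_per_1000ft(awg: int) -> float:
    return 1.0 * 2 ** ((awg - 10) / 3)

for awg in (10, 8, 6):
    print(f"#{awg}: ~{ohms_per_1000ft(awg):.2f} ohm/1000 ft")
# #10: ~1.00, #8: ~0.63, #6: ~0.40 -- so #6 cuts the drop to about 40%
```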
Sorry if I am getting too technical; just trying to explain the tradeoffs.
Edit: Your power distribution panel is a long way from the battery. I don't know what wiring runs between the battery and the panel, but you would have to take that into consideration if you put the inverter there and take the 12 V feed from the panel.