landyacht318 wrote:
We posted only 3 minutes apart, so I do not know if you saw my post.
But what you decided on is the 200 amp version of the meter I employed in its 100 amp version, which I found to be inaccurate at low currents. I'd have to assume the 200 amp version is even worse at low currents.
I was never able to get it calibrated so it would read both 38 and 1.5 amps accurately; in fact, anything under 1.5 amps could not be calibrated correctly. It would read 0.0 amps when 0.78 amps were still flowing.
That 9 dollar meter might cause $100 worth of headaches.
Real amps accuracy doesn't matter for my purpose, and the amps will not be feeding an AH count, so they don't need to be accurate for that either. I will check the meter against the usual amps I have seen on the Tri for various loads to make sure it is in the ballpark, though. The voltage is cross-checked with other voltmeters, so I will know if it reads high or low and by how much.
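For anyone curious, here's a rough sketch of how that voltage cross-check works as a simple offset correction. This is just my own illustration with made-up numbers, not a description of any particular meter:

```python
# Minimal sketch (made-up readings): learn how far the cheap meter reads
# high or low by comparing it to a trusted voltmeter at the same moment,
# then apply that offset to later readings.

trusted_volts = 12.68   # known-good voltmeter (hypothetical reading)
meter_volts = 12.57     # cheap panel meter at the same instant (hypothetical)

offset = trusted_volts - meter_volts   # positive means the cheap meter reads low

def corrected(reading: float) -> float:
    """Apply the learned offset to a raw reading from the cheap meter."""
    return reading + offset

print(f"Offset: {offset:+.2f} V")
print(f"Meter shows 12.80 V -> corrected {corrected(12.80):.2f} V")
```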
I just want the ammeter for when battery charging at 100 amps or so, to make sure the charger(s) are running properly at start up, and to see when amps are tapering to my target for turning off the generator. E.g., 5 amps per battery is approx. 90% SOC, so 20 amps total with the four 6s.
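A minimal sketch of that generator-off rule of thumb, if it helps anyone. The variable names and sample readings are mine; only the "5 amps per battery at roughly 90% SOC" rule and the four-battery count come from the paragraph above:

```python
# Sketch (assumed names/values): decide when charge current has tapered
# enough to shut the generator off, per the 5 A/battery ~ 90% SOC rule.

AMPS_PER_BATTERY_AT_90_SOC = 5   # tail current per battery (rule of thumb)
NUM_BATTERIES = 4                # the four 6 V batteries

cutoff_amps = AMPS_PER_BATTERY_AT_90_SOC * NUM_BATTERIES   # 20 A total

def generator_can_stop(charge_amps: float) -> bool:
    """True once total charge current has tapered to the cutoff or below."""
    return charge_amps <= cutoff_amps

# Example taper during a charge cycle (hypothetical readings):
for amps in (100, 60, 35, 22, 19):
    print(f"{amps:>4} A -> stop generator? {generator_can_stop(amps)}")
```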
Also I look at the amps to see that the inverter is behaving right when I turn on the microwave or something.