Salvo wrote:
Your accuracy methodology is unusual relative to industry standards. I did a search for "meter accuracy AT full scale", and Google corrected it to "meter accuracy OF full scale". What do you mean by 1% accuracy AT full scale (100A)? Do you mean 1% of reading? Why reference full scale then? What's the total error when displaying 5.0A?
Sal
Funny, Google didn't correct me when I used the exact same words.
"1.0% at full scale (100a)" is the same thing as "1.0% of 100a when the meter reads 100a".
Full-scale error in terms of ADC output (which is what the display derives its measurements from) is the sum of offset error and gain error, with gain error being largest at full scale. It's a very common spec; there are lots of references on it if you do a Google search.
Error at lower readings has typically been observed at a couple tenths of an amp. I don't know that I have an exact spec for you at 5.0 A. No one yet has gotten flustered over their ammeter being off by an extra tenth of an amp; most charge controllers are off by much more than that.
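For illustration, here's a minimal sketch of that offset-plus-gain error model in Python. The 0.1 A offset and 0.9% gain split are numbers I picked so the two terms sum to 1.0% at full scale; they are assumptions for the sake of the example, not the meter's published spec.

```python
# Minimal sketch of the standard "offset + gain" ADC error model.
# The split below is hypothetical, chosen so the two terms sum to the
# stated 1.0% at full scale; it is NOT a published spec for this meter.

FULL_SCALE_A = 100.0      # ammeter full-scale range
OFFSET_ERROR_A = 0.1      # assumed fixed offset error, same at any reading
GAIN_ERROR_FRAC = 0.009   # assumed 0.9% gain error, scales with the reading

def worst_case_error(reading_a: float) -> float:
    """Worst-case error at a given reading: fixed offset plus scaled gain error."""
    return OFFSET_ERROR_A + GAIN_ERROR_FRAC * reading_a

for reading in (FULL_SCALE_A, 5.0):
    err = worst_case_error(reading)
    print(f"{reading:6.1f} A reading -> about +/- {err:.2f} A "
          f"({100 * err / reading:.1f}% of reading)")
```

Under those assumed numbers, a 5.0 A reading works out to roughly a tenth and a half of an amp of worst-case error, which lines up with the couple tenths observed in practice. It also shows why quoting error as a percentage of reading gets misleading at the low end: the fixed offset term dominates there.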