I would like to get an actual Imp measurement to resolve this. However, AFAIK I already have enough info to estimate my controller efficiency.
Trying that again---
In the Spring example, I have a 44C panel temp, which is a 7% power loss on the usual graph that shows 10% loss at 50C. A 230w panel with 7% (16w) off is 214w.
I have a measured 1% voltage drop from panel to controller, and 1% of 214 is about 2, so the ASSumed input is 212w. With the measured 29v Vmp, that makes the ASSumed Imp 7.3a (panel rating is 30v x 7.7a).
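For anyone wanting to check the arithmetic, here is a minimal sketch of that input estimate. The 7% temperature derate and the 1% wire drop are the figures above, not datasheet values.

```python
# Input-power estimate from the figures in the post (not a datasheet).
RATED_W = 230.0     # panel STC rating
TEMP_LOSS = 0.07    # read off the derating graph at 44C (10% at 50C)
WIRE_DROP = 0.01    # measured 1% panel-to-controller voltage drop
VMP = 29.0          # measured Vmp at the controller

panel_w = RATED_W * (1 - TEMP_LOSS)   # 213.9w, rounds to the 214 above
input_w = panel_w * (1 - WIRE_DROP)   # 211.8w, rounds to the 212 above
imp_est = input_w / VMP               # ~7.30a vs the 7.7a rating

print(f"input ~{input_w:.0f}w, implied Imp ~{imp_est:.2f}a")
```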
Measured output at that time was 205w (Vbat 13.5v, 15a), where the Vbat was taken with a meter and also seen on the Trimetric. The amps from the Trimetric and the solar controller display agree. So lots of cross-checking there confirms the controller's 205w output display.
212 - 205 = 7, and 7/212 is a 3.3% loss, so controller efficiency is 96.7% using that input.
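Carrying those numbers through to the efficiency figure:

```python
# Controller efficiency from the ASSumed input and cross-checked output.
input_w = 212.0    # estimated input from the derating sketch above
output_w = 205.0   # controller display, agreeing with Trimetric amps

loss = (input_w - output_w) / input_w                 # 7/212 = 0.033
print(f"loss {loss:.1%}, efficiency {1 - loss:.1%}")  # 3.3%, 96.7%
```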
For controller efficiency to be worse, the input would have to be higher, not lower. We know the output is not lower.
For the input to be higher than 212, you would have to rate the panel higher than its 230, ignore the temperature loss, ignore the voltage drop loss, or some combination. Insolation was near STC, as shown by the Isc reading just above its rating, which is exactly what it should do at a 44C panel temperature, since Isc rises slightly above 25C.
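A quick sanity check on that Isc point, assuming a typical crystalline-silicon coefficient of about +0.05%/C and an illustrative 8.2a STC Isc (my panel's actual figures may differ a bit):

```python
# Isc rises slightly with temperature, so at near-STC sun a 44C panel
# should read just above its 25C rating. Coefficient and rated Isc are
# typical/illustrative values, not from my datasheet.
ISC_RATED = 8.2    # assumed STC Isc for a 230w panel
ISC_TC = 0.0005    # +0.05%/C, typical for crystalline silicon

isc_44c = ISC_RATED * (1 + (44 - 25) * ISC_TC)   # ~8.28a
print(f"expected Isc at 44C in full sun: {isc_44c:.2f}a")
```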
I have no doubt the reality is a little different with the real Imp, but it can't be much different, so I just can't see where the controller could be faulty. It appears to be making the same sort of amps other controllers do.
People with the Morningstar 15L on their 230w panels report 15 amps (they might be getting more, but it has a 15a cut-off), and I get 15a in the Spring. PWM on 230w would get about 14.5a, so that also shows my 15 is about right.
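A rough sketch of that MPPT-vs-PWM comparison. The 16v Vmp is my assumption for a 12v-nominal 230w array (not stated above); on PWM the battery holds the panel near Vbat, so charge current is roughly the array Imp:

```python
# MPPT amps come from converting watts down to battery voltage;
# PWM amps are roughly the array Imp. Vmp = 16v is an assumed value
# for a 12v-nominal 230w array.
output_w, vbat = 205.0, 13.5
mppt_amps = output_w / vbat   # ~15.2a, matching the ~15a reported
pwm_amps = 230.0 / 16.0       # ~14.4a array Imp, close to the 14.5a quoted

print(f"MPPT ~{mppt_amps:.1f}a vs PWM ~{pwm_amps:.1f}a")
```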
So what's wrong?