gatorcq wrote:
how about this. Are you using an external amp meter with a shunt, or are you relying on the digital display of the BlueSky?
Of course you need 2, in/out.
I don't have a BlueSky, I have an Eco-Worthy MPPT and also a Solar30 PWM. I use the Trimetric (with its shunt) as an ammeter, the controllers' ammeters on their own outputs, and my multimeter to cross-check those readings.
I have an IR thermometer that you aim from about a foot away. It reads the same ambient temps as other thermometers, so it seems accurate. To measure panel temp, I aim it up at the underside of the tilted panel.
At 16C ambient I get 44C panel (a 28C difference), and at 25C ambient I get 51C panel (a 26C difference). Some blurbs say panel temp runs about 25C or more above ambient, so that works.
That suggests that to hit the 25C panel temp they use for STC, ambient would have to be around freezing, 0C.
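If I put that reasoning into a quick script (assuming the ~27C rise I see holds at colder ambients, which may not be true), it lands in the same place:

```python
# Rough check of panel-vs-ambient temperature rise, using my two readings
readings = [(16, 44), (25, 51)]            # (ambient C, panel C)
for ambient, panel in readings:
    print(f"ambient {ambient}C -> panel {panel}C, rise {panel - ambient}C")

typical_rise = 27                           # my two readings average about a 27C rise
stc_panel = 25                              # panel temp assumed for STC ratings
print(f"ambient needed for a {stc_panel}C panel: roughly {stc_panel - typical_rise}C (about freezing)")
```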
The problem is that the changes in amps to the battery don't match the change in temps according to the graph above. At 44C I got 15.5a (13.2 Vbat) and at 51C I got 13.5a (13.5 Vbat). Power out went from 205w to 182w.
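Just to show where those watt figures come from, here is the battery-side math as a quick sketch (assuming the amp and volt readings were taken at the same moment):

```python
# Battery-side power from my amp/volt readings
cases = {"44C panel": (15.5, 13.2), "51C panel": (13.5, 13.5)}   # (amps, battery volts)
for label, (amps, volts) in cases.items():
    print(f"{label}: {amps} A x {volts} V = {amps * volts:.0f} W")
```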
In the graph there is not much change in power between 44C and 51C panel temp. At 51C I should have a panel power loss of about 10%: 230 - 23 = 207w. I lose another 1% or so to wire loss, so call it 205w into the controller. With 205 in and 182 out, that is a 23w loss, or 11.2% of 205, which is way too high for a controller efficiency loss.
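Here is that 51C case worked the same way (the 10% derate is just my read of the graph, and the 1% wire loss is my estimate):

```python
# 51C case: what should reach the battery if the graph's ~10% derate is right
panel_rated = 230.0                  # rated watts at STC
derate_51c = 0.10                    # ~10% panel loss at 51C, read off the graph
wire_loss = 0.01                     # ~1% wire loss, my estimate
expected_in = panel_rated * (1 - derate_51c) * (1 - wire_loss)   # ~205 W into the controller
measured_out = 182.0                 # W measured going to the battery
loss_pct = (expected_in - measured_out) / expected_in * 100
print(f"expected into controller: {expected_in:.0f} W, apparent loss: {loss_pct:.1f}%")
```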
The output of 205w at 44C from a 230w panel says the controller is ok. At 44C the panel should be down maybe 8% on that graph, so 230 - 18.4 = 211.6w, and 211.6 - 205 is about 6.6w, roughly 3%, so controller efficiency would be about 97%, which is within its specs.
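And the 44C case, same rough math (ignoring the 1% wire loss here, same as my figure above):

```python
# 44C case: same math with the ~8% derate, which makes the controller look fine
panel_rated = 230.0                  # rated watts at STC
derate_44c = 0.08                    # ~8% panel loss at 44C, again read off the graph
expected_in = panel_rated * (1 - derate_44c)      # ~211.6 W
measured_out = 205.0                 # W measured going to the battery
efficiency = measured_out / expected_in * 100
print(f"expected: {expected_in:.1f} W, implied controller efficiency: {efficiency:.0f}%")
```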
So how could things go so wrong between 44C and 51C? Just one of the mysteries I need to solve this summer.