Salvo wrote:
That's an amusing post. First you disagree with me, and then at the end you agree.
Fact is, no matter where you are, temperature rise will degrade MPPT gain. It's just a matter of how much.
If you get 1000 W/m^2 of irradiance, then a one-square-meter panel will absorb about 800 W continuously. It doesn't matter what the ambient temp is; the panel will get hot!
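For anyone who wants rough numbers on that heating, here is a minimal Python sketch using the common NOCT (Nominal Operating Cell Temperature) estimate. The NOCT value of 45C and the ambient temperatures are illustrative assumptions, not figures from this thread:

```python
# Rough cell-temperature estimate using the common NOCT model:
#   T_cell = T_ambient + (NOCT - 20) / 800 * G
# NOCT is specified at 800 W/m^2 irradiance and 20 C ambient;
# 45 C is a typical datasheet value (assumed here, not from the post).

def cell_temp_c(ambient_c: float, irradiance_w_m2: float,
                noct_c: float = 45.0) -> float:
    """Estimate cell temperature in C from ambient temp and irradiance."""
    return ambient_c + (noct_c - 20.0) / 800.0 * irradiance_w_m2

# At 1000 W/m^2 the cells run roughly 31 C above ambient, so even a
# cool day leaves the panel well above its 25 C rating point.
for amb in (10, 25, 35):
    print(f"ambient {amb:2d} C -> cell ~ {cell_temp_c(amb, 1000):.0f} C")
```

Which is the point: in full sun the cells sit far above 25C no matter the ambient, so the rated Vmp is rarely what you actually get.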
Sal
ktmrfs wrote:
Salvo wrote:
You're beginning to see the negative sides of MPPT. From the get-go, MPPT has about a 30% advantage over PWM. But much of that advantage evaporates. Sal
not really. MPPT will do better, ...
And if I were strictly using a panel in the summer in the southern US in high temps, I'm not sure MPPT would give enough gain to justify the expense.
If you're going to quote me, don't pick and choose what I say without giving the full context of the post. I stated that in my application MPPT gave consistent, significant improvements over PWM. I then explained how high temp will reduce the MPPT advantage.
Here is more detail on WHAT I said; I'll leave it up to users to decide how it might affect their application.
So in fall and winter full sun, MPPT may actually give you a surprise in power if it is cool enough to keep the panel below 25C (about 77F).
In my almost 2 years using my panels with a Morningstar SunSaver 15A MPPT controller, I have never seen peak clear-sky output current into a battery that will accept full current drop to, or even below, the Imp that a PWM controller would give. It is usually at least a 1.5-2A gain: 10A peak vs. the 8.5A Imp from my original PWM controller. At times I have seen over 11A.
However, the gain over PWM is going to be lower in the Arizona summer sun than it would be in a northern US fall, that's for sure. And the roughly 2mV/C per-cell voltage drop is why panels in Oregon, with their many cool cloudy days, will often actually outperform full-sun Arizona panels on an annual basis for total power (see the sketch at the end of this post).
And if I were strictly using a panel in the summer in the southern US in high temps, I'm not sure MPPT would give enough gain to justify the expense.
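To make the temperature argument concrete, here is a back-of-envelope Python sketch comparing PWM and MPPT charge current. The panel figures (Vmp of 17.5 V and Imp of 8.5 A at 25 C), the 13 V battery voltage, the -0.4 %/C Vmp coefficient, and the 95% converter efficiency are all assumptions chosen to roughly line up with the currents quoted above, not measured values from ktmrfs's system:

```python
# Back-of-envelope MPPT vs PWM charge current into a battery in bulk.
# All panel and system numbers below are illustrative assumptions.

VMP_STC = 17.5       # V, panel Vmp at 25 C cell temperature
IMP = 8.5            # A, panel Imp (nearly constant with temperature)
VMP_TEMPCO = -0.004  # fractional Vmp change per C (~2mV/C per cell
                     # on a 36-cell panel)
V_BATT = 13.0        # V, battery voltage during bulk charge
EFF = 0.95           # assumed MPPT converter efficiency

def vmp_at(cell_temp_c: float) -> float:
    """Vmp derated linearly from its 25 C rating."""
    return VMP_STC * (1.0 + VMP_TEMPCO * (cell_temp_c - 25.0))

def mppt_amps(cell_temp_c: float) -> float:
    """MPPT converts panel watts down to battery volts: I = P * eff / V."""
    return vmp_at(cell_temp_c) * IMP * EFF / V_BATT

# A PWM controller simply passes Imp through at battery voltage.
print(f"PWM (any temp):  {IMP:.1f} A")
for t in (10, 25, 60):  # cool fall sun, rating point, hot summer panel
    print(f"MPPT at {t:2d} C cell: {mppt_amps(t):.1f} A")
```

With these assumptions the MPPT current comes out around 11.5 A on a cold panel and about 10.9 A at 25 C, in the range of the 10-11 A peaks described above, but falls to roughly 9.3 A at a 60 C cell temperature. The advantage over the 8.5 A PWM baseline shrinks from about 3 A to under 1 A, which is exactly why the summer gain may not justify the expense.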