Wayne Dohnal wrote:
when I was using the 9160A to heat a bucket of water, in one setup it supplied a constant 61 amps at 14.34 volts, which is its max voltage when loaded heavily.
Hmmmm. I forgot you'd reported that. I admit, it's not completely consistent with my theory above - but it could differ from one converter model to another.
I modified my PD9280 to increase the target voltage in boost mode to 14.8 volts. I was expecting to see constant current for longer after that change. Unfortunately, the current started at 80 amps and dropped off pretty much as before.
I had the automatic data capture running, but my 9 volt battery died and I lost that data. I'm repeating it now after I rewired my data converter to pull power directly from the battery it's measuring.
I think I'm going to see a slight increase in current at each voltage because the target voltage is now higher, so the error voltage will be greater, but I don't expect a lot of difference based on what I saw as it was running.
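Just to put rough numbers on it, here's a quick sketch of the error voltage before and after the change (the 14.4 volt stock boost target is my guess, not something I measured):

# Extra error voltage from the modification; the 14.4 V stock boost target is a guess.
for v_batt in (12.0, 12.5, 13.0, 13.5, 14.0):
    err_stock = 14.4 - v_batt
    err_mod = 14.8 - v_batt
    print(f"battery {v_batt:4.1f} V: error {err_stock:.1f} V stock vs {err_mod:.1f} V modified")

Whatever the effective gain turns out to be, that extra 0.4 volts of error should buy the same fixed bump in current at every battery voltage, which is why I'm only expecting a modest difference.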
I added a second voltage sense line to the TriMetric monitor so I could record both the voltage at the battery and the voltage at the converter output, but the two were never more than 0.1 volts apart.
That annoyed me. The battery was sitting at some low voltage, the converter output was within 0.1 volts of it, and yet the current was way below 80 amps. Why wasn't the converter pumping out 14.8 volts? I considered that it might be running flat out and simply unable to supply more power, but I did some simple tests and I don't think that's it.
That's when I really focused on the circuit.
It took me a while to realize how the voltage and current signals and limits interact.
(BTW, I also checked out the possibility that the duty-cycle short-circuit current limiter (diode D4), discussed earlier in this thread, was the trouble. It wasn't that either.)
I could be way off base with all this, but it just seems to me that the current limiting only appears when the output voltage is far from the target (large error voltage), and that the drop in current exactly tracks the drop in error voltage (target voltage minus the rising output voltage).
All that seems to me to tie in with the gain of the error amp. If I can just get the error voltage increased, the output current will increase - provided the circuitry can handle it.
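Here's a toy model of how I picture it, with the gain values pulled out of thin air (the real number would come from the converter's feedback network):

# Toy model of what I think is going on: the output current follows the
# error voltage through the error amp's gain, capped at the 80 A rating.
# The gain numbers here are guesses, not anything I measured.
I_MAX = 80.0  # amps

def model_current(v_batt, v_target=14.8, gain=40.0):
    error = v_target - v_batt            # error voltage
    return max(0.0, min(I_MAX, gain * error))

for v_batt in (12.0, 12.5, 13.0, 13.5, 14.0, 14.5):
    print(f"battery {v_batt:4.1f} V -> {model_current(v_batt):5.1f} A, "
          f"or {model_current(v_batt, gain=80.0):5.1f} A with double the gain")

If that picture is right, the current sits at the 80 amp limit only while the error voltage is large, then falls in step with it as the battery comes up - and more gain (or a higher target) would keep it pinned at the limit over a wider range of battery voltage.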
Of course, this is just theory, but it makes sense to me and it's pretty easy to test (or it would be, if I'd mounted the @#$% converter in a more easily accessible spot). :)