Salvo wrote:
You're right, the converter knows it must increase voltage to get 60A out. Those who took measurements of their PD have recorded a max output voltage of around 14.1 to 14.2V during the initial stages of charging. The converter never gets to its rated current. Why?
I don't know, and won't pretend to know. If I find the same performance problem, I'll at least try to find out why. It's quite possible there are real-world issues that aren't apparent from the theory behind the circuit diagrams I've studied.
However, it should be possible to answer your question by looking at the inputs to the Unitrode control chip. If it's not being told to raise the voltage, then it won't raise it, and it should be easy enough to check what voltage it thinks it should go to.
I, and others, have measured battery resistance. It's in the neighborhood of 10 to 20 milliohms. The converter's load (the battery) has ten times less impedance than the resistive load Wayne used.
Yes and no. :) The battery load has an effective resistance (I called it a "model resistance") that's roughly ten times less. I completely agree with you on that. You calculated it to be 37 milliohms previously. However, the battery also has a perfect voltage source (with zero impedance) in series with that 0.037 ohm model resistance. The two together make the battery model.
When you force DC current through a voltage source with zero impedance, the voltage source looks exactly like a resistor having an impedance equal to the voltage divided by the current you are forcing through it. In the case of 12.2 VDC and 60 amps, that's 203 milliohms. The "ideal" 12.2 VDC voltage source in the model looks like a 0.203 ohm resistor when 60 amps are going through it. Add that to the 37 milliohms of the model and you're at the 240 milliohm level. That's what the converter sees when outputting 60 amps. It can't tell if it's really a 240 milliohm resistive load or a 37 milliohm load in series with a 12.2 volt source unless it changes the current.
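The arithmetic above can be sketched in a few lines. This is just the numbers from the discussion (12.2 V battery source, 37 milliohm model resistance, 60 A charge current) plugged into the battery model, not anything measured:

```python
# Battery-as-load arithmetic from the post (values taken from the
# discussion; the variable names are my own).
EMF = 12.2        # ideal voltage-source part of the battery model, volts
R_MODEL = 0.037   # series "model resistance" of the battery, ohms
I = 60.0          # charge current forced through the battery, amps

# To a forced DC current, the ideal source looks like a resistor of V/I.
r_source = EMF / I                 # ~0.203 ohm
r_effective = r_source + R_MODEL   # ~0.240 ohm, what the converter "sees"

# Terminal voltage the converter must supply at that current.
v_terminal = EMF + I * R_MODEL     # ~14.42 V

print(f"{r_effective:.3f} ohm, {v_terminal:.2f} V")  # → 0.240 ohm, 14.42 V
```

Note the terminal voltage comes out right around the 14.4 V charging target, which is why the 240 milliohm resistive load and the battery look identical to the converter at exactly 60 A.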
Why can the PD output 60A at 14.4V with a 240 m ohm resistive load but fall short when charging a battery that has 10 m ohm resistance? What's the difference between the two scenarios?
In a perfect world, with steady-state voltages, there wouldn't be any difference, at least not in current or voltage. We both know that the real world can differ quite a bit from theory. Small fluctuations in voltage are magnified in the battery case because they are applied to a much smaller incremental resistance: the ideal source holds its voltage, so only the model resistance limits the current swing. I can think of lots of other subtle real-world differences, but none that would clearly result in the type of difference you've seen. All I can say is I'll keep my eyes open, look for any problems, try to track down the cause, and hope to find solutions.
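To put a rough number on that magnification: a small ripple in output voltage divides by the incremental resistance of the load. The 50 mV fluctuation below is a made-up illustrative figure, not a measurement; the two resistances are the ones from the model above:

```python
# Illustration (hypothetical 50 mV fluctuation) of why small voltage
# wiggles matter more into a battery: its incremental resistance is only
# the 37 mOhm model resistance, not the full 240 mOhm a resistor presents.
DV = 0.050          # assumed small fluctuation in converter output, volts
R_RESISTOR = 0.240  # resistive test load, ohms
R_BATTERY = 0.037   # battery's incremental (small-signal) resistance, ohms

di_resistor = DV / R_RESISTOR   # ~0.21 A current swing into the resistor
di_battery = DV / R_BATTERY     # ~1.35 A swing into the battery, ~6.5x more

print(f"{di_resistor:.2f} A vs {di_battery:.2f} A")
```

So the same ripple that barely moves the current into the resistive load produces a several-times-larger swing into the battery, which is one plausible way real-world behavior could diverge from the steady-state theory.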
I appreciate your comments. I'm in the midst of installing a shunt and rearranging some battery interconnections so I can get out to my place of isolation this weekend. In the hills, by a lovely stream as it enters a lake. No TV, no power, no other campers, not even any cellphone coverage. :)