Forum Discussion
DrewE
Jun 01, 2016
Explorer II
smkettner wrote:
Poor power factor increases the apparent power needed not the actual power consumed. Amps and volts stays the same. Power factor describes what % of the sine wave the device can use. Must use the entire wave to get the full generator rating.
Read this Xantrex White Paper on Power Factor
Power factor indeed does affect the apparent power and not the actual power; it's defined as the ratio of actual power to apparent power. Apparent power is defined as the RMS voltage multiplied by the RMS current. Actual power is subtly different: the average, over a full cycle, of the instantaneous voltage multiplied by the instantaneous current. (If you happen to like thinking about calculus, both the RMS values and that average boil down to integration over time.)
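To make those definitions concrete, here's a quick sketch (my own numbers, not from the post): sample one cycle of a 120 V RMS sine voltage with a 10 A RMS current lagging by 60 degrees, then compare apparent power (RMS voltage times RMS current) against actual power (the average of instantaneous voltage times current).

```python
import math

N = 10_000                     # samples per cycle
phi = math.radians(60)         # current lags voltage by 60 degrees
Vpk = 120 * math.sqrt(2)       # peak voltage for 120 V RMS
Ipk = 10 * math.sqrt(2)        # peak current for 10 A RMS

v = [Vpk * math.sin(2 * math.pi * n / N) for n in range(N)]
i = [Ipk * math.sin(2 * math.pi * n / N - phi) for n in range(N)]

vrms = math.sqrt(sum(x * x for x in v) / N)
irms = math.sqrt(sum(x * x for x in i) / N)

apparent = vrms * irms                            # Vrms * Irms, in VA
actual = sum(a * b for a, b in zip(v, i)) / N     # average of v(t)*i(t), in W
pf = actual / apparent

print(round(apparent), round(actual), round(pf, 3))
```

With a pure sine and a 60-degree phase shift, this lands on 1200 VA apparent, 600 W actual, and a power factor of 0.5, matching cos(60°) as expected for linear loads.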
Based on the definition of power factor, the current and voltage levels can only remain constant across varying power factors if the apparent power remains constant; in that case it's the actual power that varies with the power factor.
If the actual power is held constant instead, then varying the power factor must lead to differences in the voltage and current. With utility AC power, which for practical purposes is a fairly stiff AC voltage source, the current is the component that changes. A device with a poor power factor draws more current than a device with a good power factor that consumes the same actual power. Incidentally, this is very much real current: it registers on an ammeter, and it pops circuit breakers and blows fuses if it gets high enough.
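A quick back-of-the-envelope illustration (the wattage and voltage here are my own assumed numbers): at constant actual power, the line current is I = P / (V × PF), so a poor power factor directly inflates the amps the breaker sees.

```python
# Same 1200 W of actual power drawn from a 120 V line at various power
# factors; line current scales as 1 / PF.
P = 1200.0   # actual power, watts (assumed for illustration)
V = 120.0    # RMS line voltage

for pf in (1.0, 0.8, 0.6):
    amps = P / (V * pf)
    print(f"PF {pf:.1f}: {amps:.1f} A")
```

A PF of 0.6 pulls about 16.7 A for the same 1200 W that a unity-PF load draws at 10.0 A, which is enough of a difference to matter on a 15 A circuit.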
The white paper states as much in the second paragraph: "The improved power factor results in approximately 30% less AC input current required to deliver the same DC charging current." When talking about a converter, the DC charging current can serve as a rough measure of or equivalent to the actual power.
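Since current scales as 1/PF at constant actual power, the white paper's 30% figure implies the current ratio I_new / I_old equals PF_old / PF_new. The specific power factors below are my own assumptions chosen to reproduce roughly that ratio, not values from Xantrex:

```python
# Hypothetical before/after power factors for a converter (assumed,
# not from the white paper), checking the ~30% current reduction.
pf_old = 0.68   # assumed poor power factor
pf_new = 0.97   # assumed power-factor-corrected value

reduction = 1 - pf_old / pf_new   # fractional drop in AC input current
print(f"{reduction:.0%} less AC input current")
```

Any pair of power factors in roughly a 0.7 ratio gives the same result, which is why the paper can quote the savings as a percentage without pinning down the exact PF values.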