Misconceptions about CFLs and Power Factor
I've seen a handful of reports over the years about CFLs (compact fluorescent lightbulbs)
having bad power factor. This is well understood and definitely true; however, the representation
of power factor problems in the media has been very misleading, and often outright wrong.
Power factor is a measure of how far the current waveform is out of step with the voltage
waveform; equivalently, it is the ratio of real power (watts) to apparent power (volt-amps).
A low power factor causes higher current flow than would be expected for a given
power output, but it does NOT increase the power consumed.
Where it does cause problems is that the increased current causes increased losses in the supply
lines, and it puts a heavier load on those lines. A supply line that can carry a thousand amperes
can only carry a thousand amperes; if part of that is due to bad power factor, less power can
be transferred when the line is operating at capacity.
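The capacity point can be made concrete with a small sketch. The 240 V supply and 1000 A
limit below are illustrative assumptions, not values from the article:

```python
# Real power a line of fixed ampacity can deliver at various power factors.
# 240 V and a 1000 A current limit are illustrative assumptions.
def deliverable_kw(voltage_v, max_current_a, power_factor):
    """Real power (kW) delivered when the line is at its current limit."""
    return voltage_v * max_current_a * power_factor / 1000.0

for pf in (1.0, 0.8, 0.5):
    print(f"PF {pf}: {deliverable_kw(240.0, 1000.0, pf):.0f} kW")
# PF 1.0: 240 kW, PF 0.8: 192 kW, PF 0.5: 120 kW
```

The same wire, at the same current, moves half the real power at a power factor of 0.5.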
For most environments, the supply lines run well below capacity, so this isn't a
big deal unless most of the loading begins to come from low power factor CFLs (which isn't very
likely at 10-20 watts each).
But the real misrepresentation comes about when statistics full of watts and VA (volt-amps) are
bandied about as though the extra volt-amps were power actually consumed. In fact, the only
additional losses are in the supply lines and supply hardware, which now must carry more
current - and supply line losses are intentionally kept very low, typically less than 1% of
the power delivered over them. With this, we can construct a contrived worst-case example for
demonstration purposes:
Assume a 50% loaded line that loses 1% of the power delivered over it at a power factor of 1.0.
That same line will be 100% loaded at a power factor of 0.5, because twice the current is
required to deliver the same power. Since resistive loss scales with the square of the current
(P = I*I*R), the doubled current quadruples the loss, and the line now suffers a 4% power loss.
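Here is the same worst-case arithmetic worked through in code. The 10 kW load, 240 V supply,
and derived line resistance are illustrative assumptions, chosen so the baseline loss comes
out to exactly 1% of the delivered power:

```python
# Worked version of the contrived example above: a line with fixed
# resistance delivering the same real power at two power factors.

def line_loss_fraction(power_w, voltage_v, power_factor, resistance_ohm):
    """Fraction of delivered power lost as I^2 * R heating in the line."""
    current = power_w / (voltage_v * power_factor)  # I = P / (V * PF)
    loss_w = current ** 2 * resistance_ohm          # P_loss = I^2 * R
    return loss_w / power_w

# Pick R so that at PF = 1.0 the loss is exactly 1% of delivered power.
P, V = 10_000.0, 240.0
R = 0.01 * P / (P / V) ** 2

print(line_loss_fraction(P, V, 1.0, R))  # 0.01 -> 1% loss
print(line_loss_fraction(P, V, 0.5, R))  # 0.04 -> 4% loss
```

Halving the power factor doubles the current and quadruples the loss, exactly as the
P = I*I*R relation predicts.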
In short, the actual losses due to bad power factor are only a few percent, even in a worst
case scenario. Much more important is the higher current caused by the bad power factor.
But utilities and electric companies aren't stupid. They already employ extensive power factor
correction at substations and other locations; in some places, the power factor correction is
even dynamic, so as to optimize the power factor at any given time. Why would they bother to
do this? Isn't power factor a new problem? Not even close.
All electric motors and inductors have power factor issues by their very nature. The power
factor issues with motors have been known and understood for over a century, and these things
make up the backbone of nearly all electromechanical subsystems. To top this all off,
nearly all homes will have fans, compressors for refrigeration, and other rotating equipment.
If anything, the presence of CFLs will serve to compensate for this, as the CFL power factor
and the inductive power factor tend to cancel each other.
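The cancellation idea can be sketched numerically. This assumes, as a simplification, that the
CFL behaves as a leading displacement load (real CFLs also contribute harmonic distortion,
which this sketch ignores); the wattage and reactive-power figures are illustrative, not
measurements of real equipment:

```python
# Combine a lagging (inductive) load with a leading load and compute the
# net power factor. Convention: Q > 0 lagging, Q < 0 leading.
import math

def combined_pf(p1_w, q1_var, p2_w, q2_var):
    """Net power factor of two loads: real power over apparent power."""
    p, q = p1_w + p2_w, q1_var + q2_var
    return p / math.hypot(p, q)

motor = (200.0, 150.0)   # 200 W motor with lagging reactive power (PF 0.8)
cfl   = (15.0, -10.0)    # 15 W CFL modeled as a leading load (assumption)

print(combined_pf(*motor, 0.0, 0.0))  # motor alone: 0.8
print(combined_pf(*motor, *cfl))      # combined PF is higher than 0.8
```

The leading reactive power of one load offsets part of the lagging reactive power of the
other, so the net power factor improves.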
This same brand of hysteria and hype was very much in vogue in the late 1990s, only at that time
computer power supplies were the culprit. The same technologies and mechanisms that allowed
us to survive the 'power factor crisis' of computer power supplies will allow us to peacefully
ignore the power factor issues of CFLs.
Anyone who thinks otherwise is simply pushing an agenda.
To write Dentin a letter, send mail to: firstname.lastname@example.org
Alter Aeon - an online RPG