Recently I bought a plug-in power meter, the kind that sits between the wall outlet and whatever you plug into it.
It’s a nice little invention to have. You can measure both instantaneous power consumption and total energy used. Given that and your power bill, you can see how much it costs to run a particular appliance in real time. The answer is usually “not much”, since it doesn’t really matter that a toaster oven uses 1500 watts if it’s only on for two minutes. On the other hand, a computer and monitor that use 300 watts but are on for hours end up costing a significant amount over the course of a month.
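To put rough numbers on that: energy is just power multiplied by time, and cost is energy multiplied by your rate. Here’s a quick Python sketch; the $0.12/kWh rate and the daily usage hours are illustrative assumptions, not figures from my actual bill.

```python
# Rough monthly cost of running an appliance.
# The rate and daily usage below are assumptions for illustration only.
RATE_PER_KWH = 0.12  # dollars per kilowatt-hour (assumed)

def monthly_cost(watts, hours_per_day, days=30, rate=RATE_PER_KWH):
    """Cost of a device drawing `watts` for `hours_per_day` over `days` days."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * rate

# Toaster oven: 1500 W, but only about two minutes a day
print(f"Toaster oven: ${monthly_cost(1500, 2 / 60):.2f}/month")  # ~$0.18

# Computer + monitor: 300 W for eight hours a day
print(f"Computer:     ${monthly_cost(300, 8):.2f}/month")        # ~$8.64
```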
Right now the power strip that feeds my computer and monitor is connected to the meter. Let me take a reading…
Voltage: 125.5 volts
Current: 1.84 amps
Power: 176 watts
Let’s see, voltage times current equals power, so 125.5 V * 1.84 A = 231 W. Huh? That’s nowhere near 176 W. How is it that I’m using that little power when the classic P = IV relationship tells me the power should be higher?
The issue is that our home electrical sockets don’t provide a constant voltage. Instead, they provide alternating current, with the voltage swinging from negative to positive and back in a sinusoidal fashion. What the power meter reports is the RMS (root mean square) value of that voltage. Likewise, the current flowing into my PC’s power supply also varies sinusoidally.
Picture one cycle of voltage, current, and power plotted together, with arbitrary units picked for clarity. The power curve is just the current times the voltage. It stays above the y = 0 line for the whole cycle, so the average power is positive, and in these units it comes out to 1.
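Here’s a quick numerical sanity check of that picture, sketched in NumPy with units chosen so the RMS voltage and current both equal 1:

```python
import numpy as np

t = np.linspace(0, 2 * np.pi, 100_000)  # one full AC cycle
v = np.sqrt(2) * np.sin(t)              # voltage, scaled so its RMS value is 1
i = np.sqrt(2) * np.sin(t)              # current, in phase with the voltage
p = v * i                               # instantaneous power

print(np.sqrt(np.mean(v**2)))  # RMS voltage: ~1.0
print(np.mean(p))              # average power: ~1.0, i.e. V_rms * I_rms
print(p.min())                 # 0.0 -- the power never goes negative here
```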
But a real circuit, like the one inside a computer, also has components like capacitors and inductors. These can store energy in electric and magnetic fields during part of a cycle and return it to the circuit later in the cycle. The net effect is that the current drawn by the device doesn’t have to be in phase with the voltage coming from the power plant. So let’s assume that’s happening in our graph and plot the exact same thing with the current shifted over a bit. The power is still just volts times current.
Now the power curve doesn’t spend all of its time in positive territory. For part of each cycle it’s negative: the device is briefly sending power back into the grid. So the device uses less power than you’d expect from multiplying the RMS voltage and current together. The phase relationship between those quantities reduces the actual power compared to the apparent power. The ratio of actual to apparent power is called the power factor, and with the numbers from my computer it’s about 0.76.
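Plugging my meter readings into that same picture reproduces the numbers. Modeling the load as a pure phase shift is a simplification (a real switching power supply draws current in a messier, non-sinusoidal way), so treat this as a sketch rather than what’s literally happening inside the box:

```python
import numpy as np

# Readings from the meter
v_rms, i_rms, real_power = 125.5, 1.84, 176.0

apparent_power = v_rms * i_rms             # ~231
power_factor = real_power / apparent_power
print(power_factor)                        # ~0.76

# Under a pure phase-shift model, the implied phase angle:
phi = np.arccos(power_factor)
print(np.degrees(phi))                     # ~40 degrees

# With that shift, the instantaneous power dips negative during part of each cycle
t = np.linspace(0, 2 * np.pi, 100_000)
v = np.sqrt(2) * v_rms * np.sin(t)
i = np.sqrt(2) * i_rms * np.sin(t - phi)
p = v * i
print(np.mean(p))                          # ~176 W, matching the meter
print(p.min())                             # ~-55 W: power flowing back out
```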
This is good, right? Well, not really. Yes, if you’re using 176 W while the apparent power is 231 W, the extra 55 W goes back to the grid rather than being consumed. But the full 231 W still had to be sent from the power plant, down the lines, and into your house. The utility company lost some of it to resistance in the power lines and transformers along the way. If they only had to deliver the 176 W you actually used, those losses would have been smaller.
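For a rough sense of scale: resistive losses in a wire go as the square of the current (I²R), so carrying the current for 231 W of apparent power instead of 176 W of real power inflates those losses by the square of their ratio. This ignores everything else about a real distribution network, so it’s only a back-of-the-envelope figure:

```python
# Line losses scale as I^2 * R. At a fixed voltage, current scales with
# apparent power, so the relative loss penalty is (1 / power_factor)^2.
power_factor = 176 / (125.5 * 1.84)
loss_multiplier = (1 / power_factor) ** 2
print(loss_multiplier)   # ~1.72 -- roughly 72% more wire loss than necessary
```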
For industrial users with particularly large out-of-phase loads, utilities charge for this extra overhead. As a result, industrial facilities often install banks of capacitors or inductors to deliberately shift their current back into phase with the voltage, eliminating the need for that extra delivered power.
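As an illustration of how that correction gets sized, here’s a textbook-style calculation applied to a load like mine. It assumes a 60 Hz supply and an inductive (current-lagging) load, neither of which I’ve verified for my own computer, so the specific numbers are hypothetical:

```python
import math

p_real = 176.0   # W, real power from the meter
v_rms = 125.5    # V
i_rms = 1.84     # A
freq = 60.0      # Hz (assumed)

power_factor = p_real / (v_rms * i_rms)
phi = math.acos(power_factor)

# Reactive power to cancel, and the capacitance that would supply it
q_var = p_real * math.tan(phi)                        # ~150 VAR
c_farads = q_var / (2 * math.pi * freq * v_rms ** 2)  # ~25 microfarads
print(q_var, c_farads * 1e6)
```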
Residential users have generally had high power factors, close to 1. Heaters, toaster ovens, and incandescent lights are mostly resistive loads with power factors near 1. But electronics like computers, flat-screen TVs, and compact fluorescent bulbs all tend to have inductive and capacitive elements, so power factors in residences have been declining. In these days of conservation and smart grids, it may become something we have to think about along with the plain number of watts written on the package.