Watts up, doc?

Recently I bought a plug-in power meter (a Kill A Watt, as it happens).

It's a nice little invention to have. You can measure both instantaneous power consumption and total energy used. Given that and your power bill, you can see how much it costs to run a particular appliance in real time. The answer is usually "not much", since it doesn't really matter that a toaster oven uses 1500 watts if it's only on for two minutes. On the other hand, a computer and monitor that use 300 watts but are on for hours end up costing a significant amount over the course of a month.
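
To put rough numbers on that, a quick sketch in Python (the eight-hour figure is just an assumed day of use):

    toaster_wh = 1500 * (2 / 60)   # 1500 W for two minutes -> 50 Wh
    computer_wh = 300 * 8          # 300 W for an assumed eight hours -> 2400 Wh
    print(f"toaster: {toaster_wh:.0f} Wh, computer: {computer_wh:.0f} Wh")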

Right now the power strip that feeds my computer and monitor is connected to the meter. Let me take a reading...

Voltage: 125.5 volts
Current: 1.84 amps
Power: 176 watts

Let's see: voltage times current equals power, so 125.5 V * 1.84 A = 231 W. Huh? That's nowhere near 176 W. How can I be using so little power when the classic P = IV relationship says it should be higher?
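
Just to double-check that arithmetic, a trivial sketch (Python):

    voltage_rms = 125.5   # volts, from the meter
    current_rms = 1.84    # amps, from the meter
    apparent = voltage_rms * current_rms
    print(f"{apparent:.0f} W")   # ~231, versus the 176 W the meter reports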

The issue is that our home electrical sockets don't provide a constant voltage. Instead, they provide alternating current, with the voltage swinging from negative to positive and back sinusoidally. What the power meter reports is the RMS (root mean square) average of that voltage. Likewise, the current flowing into my PC's power supply varies sinusoidally.
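
To make "RMS average" concrete, here's a quick numerical check (Python). The 177.5 V peak isn't something the meter shows; it's inferred as 125.5 times the square root of 2:

    import numpy as np

    t = np.linspace(0, 1/60, 10_000)        # one 60 Hz cycle
    v = 177.5 * np.sin(2 * np.pi * 60 * t)  # instantaneous voltage
    v_rms = np.sqrt(np.mean(v**2))          # root of the mean of the square
    print(f"{v_rms:.1f} V")                 # ~125.5 V, matching the meter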

[Figure: voltage, current, and power plotted over one cycle, with the current in phase with the voltage]

Here I'm just picking arbitrary units for clarity. The curve representing power is equal to the current times the voltage. It never dips below the y = 0 line, so the average power is positive, and in these units equal to 1.

But a real circuit, such as a computer, might also contain components like capacitors and inductors. These can store energy in electric and magnetic fields during one part of a cycle and return it to the circuit during another, and that give-and-take doesn't have to be in phase with the voltage coming from the power plant. In total, this means the applied voltage doesn't necessarily match phase with the current in the device. So let's assume those effects are happening in our graph, and plot the exact same thing with the current shifted over a bit. The power is still just volts times current:

[Figure: the same curves with the current phase-shifted relative to the voltage; the power curve now dips below zero for part of each cycle]

But take a look at that power: it doesn't spend all of its time in positive territory. For part of the cycle it's negative; the device is briefly sending power back into the grid. As a result, the device uses less power than you'd expect from just measuring the average of the volts and amps separately. The phase relationship between those quantities reduces the actual power compared to the apparent power. The ratio of actual to apparent power is called the power factor, and with the numbers from my computer, it's about 0.76.
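
To see those numbers hang together, here's a sketch (Python) that averages v(t) times i(t) over one cycle, with the current shifted by the phase angle implied by my readings; the angle is just the arccosine of the power factor:

    import numpy as np

    v_rms, i_rms = 125.5, 1.84
    pf = 176 / (v_rms * i_rms)     # ~0.76
    phi = np.arccos(pf)            # implied phase angle

    t = np.linspace(0, 1/60, 10_000)
    v = v_rms * np.sqrt(2) * np.sin(2 * np.pi * 60 * t)
    i = i_rms * np.sqrt(2) * np.sin(2 * np.pi * 60 * t - phi)

    print(f"phase angle: {np.degrees(phi):.0f} degrees")
    print(f"average power: {np.mean(v * i):.0f} W")  # ~176 W, not 231 W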

This is good, right? Well, not really. Yes, if you're using 176 W and the apparent power is 231 W, you're returning that extra 55 W to the grid. But the full 231 W had to be sent from the power plant, down the lines, and into your house, and the utility company lost some of it to resistive losses in the power lines and transformers. If they had only needed to send the 176 W you actually used, those losses would have been smaller.

For industrial users who have particularly large out-of-phase loads, utilities charge for this extra overhead. As such, industrial facilities often install capacitors and inductors in order to manually shift their currents into phase, eliminating the need for that extra delivered power.

Residential users have generally had high overall power factors, close to 1. Heaters and toaster ovens and incandescent lights are all mostly resistive circuit elements with very high power factors. But electronics like computers, flat-screen TVs, and compact fluorescents all tend to have inductive and capacitive elements, so power factors in residences have been declining. In these days of conservation and smart grids, it may become something we have to think about along with the number of watts printed on the package.


"...often install capacitors and inductors in order to manually shift their currents into phase..."

If they do it by hand (manually) then why do they need the capacitors and inductors?

NoAstronomer: Something did get lost in translation there, but I think Matt meant that these users install custom phase shifters which consist of capacitors and inductors in order to eliminate this problem. Most likely they install such devices anyway to act as surge suppressors. A surge suppressor is basically a low-pass filter, and in general it will induce a phase shift in the incoming wave form (the power from the grid in this case). So they design the filter to counteract the phase shift in the machine's power consumption.
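
As a toy illustration of that point, here's the phase shift of a first-order RC low-pass filter at line frequency (Python; the component values are invented for illustration):

    import numpy as np

    f = 60.0                 # line frequency, Hz
    R, C = 100.0, 10e-6      # assumed filter values, for illustration only
    omega = 2 * np.pi * f
    H = 1 / (1 + 1j * omega * R * C)   # first-order RC low-pass response
    print(f"phase shift: {np.degrees(np.angle(H)):.1f} degrees")  # ~ -20.7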

By Eric Lund (not verified) on 12 Feb 2010 #permalink

Most consumer electronics these days tend to have switch-mode power supplies (SMPS). This is especially the case with computers. The first part of such a supply is a diode bridge rectifier, followed by a reservoir capacitor. Without some kind of power factor correction circuit, the way this arrangement loads the line is pretty nasty. Current is drawn from the line only when the (absolute) line voltage exceeds the capacitor voltage, and within that window the capacitor must regain the charge it loses between the line voltage peaks. This means that a lot of current is drawn briefly every 120th of a second (assuming a 60 Hz line frequency) and no current at all in between.
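
For the curious, here's a toy simulation of that rectifier-plus-reservoir behavior (Python; the component values are made up and there's no PFC stage):

    import numpy as np

    f = 60.0                        # line frequency, Hz
    v_peak = 170.0                  # roughly 120 V RMS mains
    C = 470e-6                      # reservoir capacitor, F (assumed)
    R_load = 100.0                  # DC load, ohms (assumed)

    dt = 1e-6
    t = np.arange(0.0, 3 / f, dt)   # three line cycles
    v_rect = np.abs(v_peak * np.sin(2 * np.pi * f * t))
    i_line = np.zeros_like(t)
    v_cap = 0.0

    for k, v in enumerate(v_rect):
        if v >= v_cap:
            # Diodes conduct: the capacitor is pulled up to the line voltage,
            # drawing a charging current plus the load current.
            i_line[k] = C * (v - v_cap) / dt + v / R_load
            v_cap = v
        else:
            # Diodes blocked: the capacitor alone feeds the load and sags.
            v_cap -= (v_cap / R_load) * dt / C

    # Skip the initial charge-up cycle when measuring the conduction window.
    steady = i_line[len(t) // 3:]
    print(f"current flows {np.mean(steady > 0):.0%} of the time")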

Your power factor of 0.76 corresponds to a 40 deg phase angle. That is quite a lot!

But why did you draw the voltage leading the current if we expect the power supply will be dominated by a capacitive reactance, as Flaky noted? And are you going to put a scope on it to see what is really going on?

By CCPhysicist (not verified) on 12 Feb 2010 #permalink

There's a lot of electronics hooked up to this strip, and pretty much all of it requires some sort of transformer to turn the AC into 12 VDC or whatever the case may be. Add to that the fact that the computer is both old and home-brew, and I'm not so surprised the phase is so bad. In fact, switching off the monitor drops the power factor to a ludicrously awful 0.56. I need to buy a new computer.

As for drawing the phase with that particular sign, I confess to not even thinking about the direction. In any case it should get the qualitative point across.

Matt, don't miss embracing a fantastic opportunity. CONSULT! You have nearly given away a power company special surcharge on line impedance "to recover stolen energy." Resistive users will pay a modest but increasing fee covering periodic inspection. Abusive users (anybody with a computer, TV, microwave oven, or battery charger) will be wrung dry in several tiers. Both pay an additional 40% Federal excise tax on their surcharge to sweeten the deal.

10% of the whole will be charitably aimed at the poor to elevate them, requiring an 11.1% across the board surcharge increase right at the get go. Yer gonna be wearing silk suits and alligator shoes, boyo.

Matt, I have that same meter; found it on the internet for something like $20. If you have a discontinuous load, like say your fridge, you can leave it plugged in. One of the available measurements is accumulated kWh (to 0.01 kWh) since the time the meter was plugged in, and another measures the time it's been plugged in (i.e., had power).

I'm surprised at your voltage of 125; that sounds a bit high. I think 117 is kinda the target. My photovoltaic inverter measures twice that; the so-called 220 is two 110-volt lines with opposite polarity, so hot-to-hot doubles the voltage. My inverter gives 234 to 235, which should be line voltage since it's grid-tied.

The Kill A Watt does have its limitations. I don't think it can work for 220 (it won't plug into the socket, for one, but if you forced it I suspect it might self-destruct). They also claim it can only handle about 15 amps, so things like your dryer, washer, A/C, etc. can't be measured with it. Also, my house, like most, has a lot of hard-wired things like ceiling lights that you can't plug the meter into. So you can only measure some of your home's power hogs.

But it is useful for figuring out which appliances' standby power is large enough to warrant putting them on switchable power strips, so you can, for instance, turn them off at night.
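
For instance, a rough standby cost estimate (Python; the 5 W draw and the $0.12/kWh rate are made-up numbers):

    standby_watts = 5.0              # assumed standby draw
    rate = 0.12                      # assumed electricity price, $/kWh
    kwh_per_year = standby_watts * 24 * 365 / 1000
    print(f"{kwh_per_year:.0f} kWh/yr, about ${kwh_per_year * rate:.2f}/yr")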

I assume the utility has capacitors (the load is probably already inductive) in place to eliminate the "average" phase shift.

By Omega Centauri (not verified) on 12 Feb 2010 #permalink

Power factor correction and harmonic current reduction circuitry is becoming increasingly common in switchmode power supplies, driven to some extent by EMC (electromagnetic compatibility) regulations in the EC.

If a switchmode PSU has a "universal" input, i.e., an input rating something like "90-260 VAC, 47-63 Hz", it's highly probable that it incorporates PFC and THD control circuitry.

BTW, if you operate it under the right conditions a synchronous motor can be made to draw a leading, i.e., capacitive current. Plants with low power factor due to large motor loads (which tend to be inductive) have used this to correct their power factor while getting useful mechanical work out of the process. Look up "synchronous capacitor" for more.

@simon: there's no shortage of true-RMS voltmeters and ammeters on the market, which will give the correct RMS value for non-sinusoidal voltage or current waveforms (provided that their crest factor isn't too high). Getting a true power reading, as opposed to a volt-ampere reading, basically requires multiplying the voltage and current waveforms instant by instant and integrating the product over one cycle. Electronic watt-hour meters used by electric utilities have to use this approach, as the electric company bills for watt-hours, not volt-ampere-hours.
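
For illustration, here's a sketch of that multiply-and-average approach on synthetic samples (Python; a real meter would run this on an ADC's sample stream):

    import numpy as np

    fs, f = 6000, 60                    # sample rate and line frequency, Hz
    t = np.arange(fs // f) / fs         # one cycle's worth of samples
    v = 178 * np.sin(2 * np.pi * f * t)               # sampled voltage
    i = 2.6 * np.sin(2 * np.pi * f * t - np.pi / 4)   # current, shifted 45 deg

    real_power = np.mean(v * i)         # average of the point-by-point product
    apparent = np.sqrt(np.mean(v**2)) * np.sqrt(np.mean(i**2))
    print(f"P = {real_power:.0f} W, S = {apparent:.0f} VA, "
          f"pf = {real_power / apparent:.2f}")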

With A/D conversion and microcontroller chips being cheap and plentiful I wouldn't be at all surprised to see a consumer product like the one in this post that actually does the math in real time.

By Ktesibios (not verified) on 12 Feb 2010 #permalink

Which just reminds me that I recently wrecked my AC/DC adapter. It made a little poof, and then I recalled I'm back in Europe.

You forget: what if you're heating your house because it's cold outside? Assuming a 0.1 watt nightlight lights only the inside of the house (no light goes out the windows), that's 0.1 watt of electric power heating your house. If an espresso machine like ours draws 1300 watts 25% of the time (an average of 325 watts), then that's 325 watts of electric power heating your home. Electric heat is more expensive than natural gas, true, but it's not wasted energy.