90,000 KWhr for half a month

I just can't pass this up. One of the chemistry majors had a dispute with the power company. After an unusually high previous power bill, she checked the electric meter halfway through the month. The meter said they had used 90,000 kilowatt-hours of electricity. That is a lot of energy. It is also a lot of money. If I estimate 10 cents per kilowatt-hour (the price varies in space and time), this would be a $9,000 bill.

So, what is the deal? Is it a broken electric meter? Could be. If it is not, maybe it is a short. Shorts usually blow a fuse, but in an older house, could it be possible? Maybe I should give some short background info on power, energy, and current.

Power is the rate at which energy is used (or produced). A common unit for power is the watt, which is 1 joule per second. If you multiply power by time, you get energy. Thus the kilowatt-hour is a unit of energy: it is the energy equivalent of running something that uses 1 kilowatt for 1 hour. If I want to convert 90,000 kilowatt-hours to joules, it would be:

90,000 kWh × (3.6 × 10^6 J / 1 kWh) = 3.24 × 10^11 J

I was totally going to put a link to my post about unit conversions - but it appears I never moved it to my new blog. I will have to do that. Anyway, I will tell you one of the secrets from that post. Type "90,000 kilowatt hours in joules" into the Google search box. The Google calculator will do the conversion for you.
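If you would rather check the arithmetic yourself, here is a quick sketch in Python. The conversion factor comes straight from the definition of a kilowatt-hour (1,000 watts for 3,600 seconds):

```python
# One kilowatt-hour is 1000 W running for 3600 s, so 3.6e6 J.
JOULES_PER_KWH = 1000 * 3600

energy_kwh = 90_000
energy_joules = energy_kwh * JOULES_PER_KWH
print(f"{energy_joules:.2e} J")  # -> 3.24e+11 J
```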

Ok, that is not really what I wanted. I want to see what kind of current would use 90,000 kilowatt-hours (or 90 megawatt-hours). Suppose that this thingy was running constantly for 15 days (the amount of time it took to produce the 90 MWh of energy). What kind of steady power would produce this? All I need to do is divide that energy by the number of hours it was on. 15 days is 15 × 24 = 360 hours. So, the average power used during that time would be:

90,000 kWh / 360 hr = 250 kW

Well, some of this power must be used to run the house anyway, right? How much power does it take to run a typical house? One way to answer this is to look at your power bill (assuming your meter works). I don't really look at the bills too much, so I will use a different method (my wife pays the bills - thanks Ashley!). I do have experience with running a house on a generator (living in south Louisiana). My generator is only 2 kilowatts and I cannot run everything. Many people have a 5-kilowatt generator that can be used to run much more stuff. If you want a whole-house generator, you would likely need a 10-kilowatt one. So let me assume they are using 10 kilowatts for normal stuff (although this is sort of high for a small house that is not using the AC). That still leaves 240 kilowatts.
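The same bookkeeping in Python, using the 15-day window and the 10-kilowatt household baseline from above (the baseline is my rough estimate, not a measurement):

```python
# Average power over the billing window, minus an assumed household baseline.
energy_kwh = 90_000
hours = 15 * 24                      # 360 hours in 15 days
avg_power_kw = energy_kwh / hours    # 250 kW on average
baseline_kw = 10                     # rough whole-house estimate
mystery_kw = avg_power_kw - baseline_kw
print(avg_power_kw, mystery_kw)      # -> 250.0 240.0
```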

How do I get from power to current? I can use the following relationship between power, current, and voltage:

P = I ΔV

Household currents are not constant; they alternate. Nonetheless, if you assume a voltage of 120 volts, you can pretend it is DC. This would give a constant current of:

I = P / ΔV = 240,000 W / 120 V = 2,000 amps

Just for reference, a vacuum cleaner runs on the order of 8 amps. If you run a vacuum for a while and feel the power cord, it is warm. Imagine 2,000 amps in a wire (even if that wire is thicker). This would probably start a fire, or at least melt the insulation off the wire. As an example, take a plain copper wire and connect it from one terminal of a D-cell battery to the other. Try to hold on for as long as you can (actually, don't do this - you will hurt yourself). It gets pretty darn hot pretty darn quick.
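Here is the current calculation in code form. The 120 V DC treatment is the same simplification made above, and the 8-amp vacuum is just a reference point for scale:

```python
# Current needed to deliver the leftover 240 kW at household voltage.
power_watts = 240_000   # the unexplained 240 kW
voltage = 120           # treating 120 V AC as if it were DC
current_amps = power_watts / voltage
print(current_amps)                 # -> 2000.0
vacuum_amps = 8
print(current_amps / vacuum_amps)   # -> 250.0 (like 250 vacuums at once)
```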

I am not an electrician, but it seems like it must be a problem with the meter (although I did tell the student that this should be checked out as soon as possible to prevent a fire). Maybe there are some alternate explanations. Perhaps this student created a lab in the garage that has large power requirements. Maybe she is trying to open a wormhole or something.


I think you can do a simpler analysis for a short to ground. In that case, all the power used (well, all less whatever is used in running the house) would be dissipated in the short. So the 240 kW would all go to heating up the wire or pipe or whatever is carrying the current. That's the power consumed by 2400 100 W lightbulbs. I don't think it can be a short.

Love your site, BTW!

By Chris Goedde (not verified) on 19 Jan 2009 #permalink

The usual case of excessive power consumption is that the house is a front for a plant-growing operation (e.g., hundreds or thousands of plant-growth light bulbs in the basement). Such houses usually show up quite readily on thermal imaging scans, too, due to the amount of heat dissipated (you get one guess as to the type of plant typically grown!). But, failing that, it's hard to see how a house could dissipate 250 kW without something abnormal going on.

Note that it would be quite surprising for a house to even be able to pull 1,000 amps (240 volts at 1,000 amps = 240 kW). The size of wire necessary for this would be well in excess of 4/0 (probably the equivalent of about three 4/0 conductors for each side). Since a 4/0 conductor is almost a half-inch in diameter, that's a LOT of copper, much more than most meters/breaker boxes are wired to handle.


Another consideration is that most pole pigs installed by the utilities won't supply 250 kW, unless it happens to be an exceptionally hefty one.


Thus, it's probable that the meter is defective. But, in the interest of safety, the prudent course of action would be to have it investigated.


Here's an alternate explanation: the meter reads the delivered energy since installation. Monthly energy consumption is calculated by taking the difference in the reading between the beginning and end of the month.

By Anonymous Coward (not verified) on 21 Jan 2009 #permalink

@Anonymous Coward

This is, in fact, the correct explanation. Analog meters measure the energy used since "the beginning". You can even see this on your bill (they give the two readings and the difference). Although, I now have an electronic meter that can be read remotely and instantaneously (so they get real-time info on my power use and can turn off my AC and pool pump with the remote switch they installed :)
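To make the explanation concrete, here is the subtraction the power company actually does. The readings below are made-up numbers purely to illustrate the arithmetic, not values from the actual bill:

```python
# Monthly usage is the difference between two cumulative meter readings.
# Both readings here are hypothetical, just to show the bookkeeping.
reading_at_start = 89_550   # cumulative kWh when the month began
reading_mid_month = 90_000  # cumulative kWh halfway through the month
usage_kwh = reading_mid_month - reading_at_start
print(usage_kwh)  # -> 450 kWh actually used so far this month
```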