I just can’t pass this up. One of the chemistry majors had a complaint with the power company. After an unusually high previous power bill, they checked the electric meter halfway through the month. The meter said they had used 90,000 kilowatt-hours of electricity. That is a lot of energy. It is also a lot of money. If I estimate 10 cents per kilowatt-hour (the price varies in space and time), this would be a $9,000 bill.
So, what is the deal? Is it a broken electric meter? Could be. If it is not, maybe it is a short. Shorts usually blow a fuse, but if it were an older house, could it be possible? Maybe I should give some short background info on power, energy, and current.
Power is the rate at which energy is used (or produced). A common unit for power is the watt, which is 1 joule per second. If you multiply power by time, you get energy. Thus the kilowatt-hour is a unit of energy. It is the energy equivalent of running something that uses 1 kilowatt for 1 hour. If I want to convert 90,000 kilowatt-hours to joules, it would be:
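The conversion is just a chain of multiplications, since 1 kWh = 1000 watts × 3600 seconds. A quick sketch in Python:

```python
# Convert 90,000 kilowatt-hours to joules.
# 1 kWh = 1000 W * 3600 s = 3.6e6 J
kwh = 90_000
joules = kwh * 1000 * 3600
print(f"{joules:.2e} J")  # 3.24e+11 J
```

That is 3.24 × 10^11 joules.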
I was totally going to put a link to my post about unit conversions – but it appears I never moved it to my new blog. I will have to do that. Anyway, I will tell you one of the secrets from that post. Type "90,000 kilowatt hours in joules" into a Google search. The Google calculator will do the conversion for you.
Ok, that is not really what I wanted. I want to see what kind of current would use 90,000 kilowatt-hours (or 90 megawatt-hours). Suppose that this thingy is running constantly for 15 days (the amount of time it took to use the 90 MWh of energy) – what kind of steady power would produce this? All I need to do is divide that energy by the number of hours it was on. 15 days is 15 × 24 = 360 hours. So, the average power used during that time would be:
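Here is that average-power calculation as a quick Python sketch (energy divided by time, keeping everything in kilowatts and hours):

```python
# Average power = energy / time
energy_kwh = 90_000    # meter reading over half a month
hours = 15 * 24        # 15 days = 360 hours
avg_power_kw = energy_kwh / hours
print(avg_power_kw)  # 250.0 kilowatts
```

So the house would have to be drawing 250 kilowatts, around the clock, for two straight weeks.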
Well, some of this power must be used to run the house anyway, right? How much power does it take to run a typical house? One way to answer this is to look at your power bill (assuming your meter works). I don’t really look at the bills too much, so I will use a different method (my wife pays the bills – thanks Ashley!). I do have experience with running a house on a generator (living in south Louisiana). My generator is only 2 kilowatts, and I cannot run everything. Many people have a 5-kilowatt generator that can be used to run much more stuff. If you want a whole-house generator, you would likely need a 10-kilowatt one. So let me assume they are using 10 kilowatts for normal stuff (although this is sort of high for a small house that is not using the AC). That still leaves 240 kilowatts.
How do I get from power to current? I can use the relationship between power, current, and voltage: power equals current times voltage (P = IV).
Household currents are not constant – they alternate. Nonetheless, if you assume a voltage of 120 volts, you can pretend like it is DC. This would give a constant current of 2,000 amps (240,000 watts divided by 120 volts).
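As a quick check of that arithmetic, solving P = IV for the current:

```python
# Current from power: P = I * V, so I = P / V
power_watts = 240_000   # 250 kW average minus ~10 kW of normal household use
voltage = 120           # treat the AC line as if it were 120 V DC
current_amps = power_watts / voltage
print(current_amps)  # 2000.0 amps
```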
Just for reference, a vacuum cleaner draws on the order of 8 amps. If you run a vacuum for a while and feel the power cord, it is warm. Now imagine 2,000 amps in a wire (even if that wire is thicker). This would probably start a fire, or at least melt the insulation off the wire. For example, take a plain copper wire and connect it across the two terminals of a D-cell battery. Try to hold on for as long as you can (actually, don’t do this – you will hurt yourself). It gets pretty darn hot pretty darn quick.
I am not an electrician, but it seems like it must be a problem with the meter (although I did tell the student that this should be checked out as soon as possible to prevent a fire). Maybe there are some alternate explanations. Perhaps this student created a lab in the garage that has large power requirements. Maybe she is trying to open a wormhole or something.