WiFi and Radiation

I note an interesting short piece by James Hrynyshyn about a local controversy in Nova Scotia over the installation of a tower to provide wireless internet to the area. Leading the opposition is a guy worried about the health effects:

"I think over a period of time it will change the DNA of the garlic because it shakes up the molecules," he said Tuesday.

EastLink uses microwave transmission to provide high-speed internet access to rural areas outside its wired network.

Levine said he moved to the country to get away from pollution, and he sees the radiation from the towers as another form of pollution.

"I view it with dread, fear and panic," he said. "I don't want to grow food under those conditions."

It's tempting to roll your eyes and dismiss it out of hand, and if in fact you did you wouldn't be wrong. But why? It would certainly be better if more people knew a little more about molecules and light, and maybe the next generation of farmers could work out for themselves what the effect of microwave radiation on crops is likely to be.

To do some estimation, we need to know how light carries its energy. It's reasonably accurate to think of light as a stream of particles called photons, each with a specific amount of energy proportional to its frequency. Light also has wave properties, such as diffraction and interference, that can't be described in terms of "billiard ball" photons; for a relatively simple task like this one, where we're not worried about wave behavior, we can just keep in the back of our minds that photons have a wave nature and aren't particles in the classical sense. With that caveat, it's completely true that light comes in photons and that each photon carries a definite energy. What is that energy per photon? It's this:

$$E = \frac{1240\ \mathrm{eV \cdot nm}}{\lambda}$$

This gives the energy of a photon in electron volts for a wavelength in nanometers. Visible light ranges from roughly 750 nanometers in the red to 380 in the blue. In terms of energy, our formula tells us that these photons range from about 1.6 to 3.3 electron volts. This is roughly the order of the energies involved in electron energy levels, which are the basis of chemistry. Chemical reactions can make visible light (glow sticks, fireflies, etc.) and light can drive chemical reactions (camera film, dyes fading, etc.). Ultraviolet light from the sun has a shorter wavelength, say 300 nanometers or so for the rays you should protect yourself from at the beach. Those photons carry about 4.1 eV, enough to excite more energetic electron transitions and even dissociate molecules. If the molecule is DNA, the damage and breaks from absorbing those photons can eventually lead to skin cancer and other health problems.
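
If you want to check those numbers, here's a minimal sketch in Python, using the same 1240 eV·nm shortcut as the formula above (the wavelengths are the rough figures quoted in the text):

    # Photon energy from wavelength: E [eV] = 1240 / wavelength [nm].
    # (1240 eV*nm is Planck's constant times the speed of light in those units.)
    def photon_energy_ev(wavelength_nm):
        return 1240.0 / wavelength_nm

    for label, nm in [("red", 750), ("blue", 380), ("UV", 300)]:
        print(f"{label} ({nm} nm): {photon_energy_ev(nm):.2f} eV")
    # red (750 nm): 1.65 eV, blue (380 nm): 3.26 eV, UV (300 nm): 4.13 eV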

So what kind of energies are we looking at in WiFi signals? WiFi operates around 2.4 GHz, the same frequency range as a microwave oven. The wavelength of that light is about 12.5 centimeters, which is about 125 million nanometers. Each photon therefore carries almost exactly 1/100,000 of an electron volt of energy. That is nowhere near the ~1 eV and up involved in electron energy levels, so this kind of radiation can't damage DNA. The energy just isn't there. Furthermore, because of the quantum nature of electron energy levels, you can't just stack up 100,000 microwave photons to cause a 1 eV transition; you have to actually have a 1 eV photon. (Technically there is such a thing as a multiphoton transition, but it's a strongly nonlinear process whose probability is already very low for two-photon transitions and gets exponentially worse as the number increases. 100,000 is out of the question.)
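
The same number falls out of E = hf directly; a quick sketch with Planck's constant expressed in electron-volt seconds, so no unit juggling is needed:

    # Photon energy from frequency: E = h * f, with h in eV*s.
    H_EV_S = 4.136e-15  # Planck's constant in eV*s

    def photon_energy_from_freq_ev(freq_hz):
        return H_EV_S * freq_hz

    e_wifi = photon_energy_from_freq_ev(2.4e9)  # the 2.4 GHz WiFi/microwave band
    print(f"WiFi photon: {e_wifi:.1e} eV")                 # ~9.9e-06 eV
    print(f"photons needed for 1 eV: {1.0 / e_wifi:.0f}")  # ~100,000
    # And as noted above, you can't actually stack them: the transition
    # essentially needs the full energy in a single photon.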

But if microwave photons can't damage DNA, how can they cook food and boil water in your microwave oven? The answer is simple: their energy is more than enough to excite the much less energetic rotational and vibrational states of molecules. That motion is heat. And heat is just heat; there's nothing special about heat produced by a microwave compared with an electric heating element, a heat lamp, or a seat warmer. Furthermore, the total power output of a WiFi transmitter is many orders of magnitude below a microwave oven's - 1 watt tends to be an upper limit for home and business transmitters - and a person standing nearby absorbs only a tiny fraction of that already tiny output. The temperature increase from absorbing WiFi signals is not measurable, and is itself dwarfed by other radio/microwave sources such as cell phones and (depending on your location) broadcast radio and TV.
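
To put rough numbers on "not measurable": here's a back-of-envelope sketch in which the distance, absorbing area, body mass, and heat capacity are all my own illustrative assumptions, and which generously assumes every incident watt is absorbed:

    import math

    P_TX = 1.0       # W, generous upper limit for a home transmitter
    R = 2.0          # m, assumed distance from the router
    AREA = 0.7       # m^2, rough human cross-section facing the router
    MASS = 70.0      # kg, body mass
    C_BODY = 3500.0  # J/(kg*K), approximate specific heat of tissue

    flux = P_TX / (4 * math.pi * R**2)  # W/m^2, isotropic spreading
    p_absorbed = flux * AREA            # W, worst case: all of it absorbed
    dT_per_hour = p_absorbed * 3600 / (MASS * C_BODY)
    print(f"absorbed: {p_absorbed * 1000:.0f} mW")    # ~14 mW
    print(f"warming:  {dT_per_hour:.1e} K per hour")  # ~2e-4 K/hour

Even ignoring the body's thermoregulation entirely, that's a couple of ten-thousandths of a degree per hour.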

WiFi just isn't going to hurt you, your DNA, your crops or their DNA, or anything else other than the attention spans of college students when WiFi lets them spend class on Facebook.

nice essay. when people get scared of the radiation from WiFi, i wonder why they don't question the effects of the local radio stations, TV, cell phones, CB radios, pagers, etc. basically anything with an antenna.

as you pointed out, the UV from the sun is far more worrisome than the long wavelengths from WiFi.

One number we kept in mind in undergrad was "room temperature is 1/40 eV". Anything that is significantly less than .025 eV is swamped out just by random thermal noise.
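
For reference, that rule of thumb is just Boltzmann's constant times a nominal room temperature:

$$k_B T \approx (8.617 \times 10^{-5}\ \mathrm{eV/K})(293\ \mathrm{K}) \approx 0.025\ \mathrm{eV} \approx \tfrac{1}{40}\ \mathrm{eV}$$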

Yes, this guy is a yahoo who is worried about nothing. It's not the first time I've seen it: check out Robert Park's archive (What's New) about the furore over high-voltage transmission lines (the frequencies are 60 Hz in North America and 50 Hz in Europe). Mr. Levine's argument sounds very much like that of the power-lines-cause-cancer crowd.

That said, there is one piece missing from your argument, which would be a comparison of electromagnetic intensities due to WiFi and in a microwave oven. I'm sure the former is much smaller than the latter (I've had a wireless network in my house for five years now and haven't been cooked yet), but it would shut up all but the most willfully ignorant critics to show the magnitude of the difference.

By Eric Lund (not verified) on 17 Sep 2009 #permalink

I sort of alluded to intensity, but your point is exactly right: the intensities are very different. Microwave ovens emit about a kilowatt; home WiFi emits maybe half a watt. Further, the intensity of that half watt falls off as the square of your distance from the transmitter. While you might absorb the full half watt if you ate the transmitter, being (say) 15 feet away from it reduces your exposure by a factor of roughly 100 or more.

I expect the tower radiates more, but on the other hand it's on a stinkin' tower. The incident flux at a given location is, by design, going to be roughly the same as that from normal home transmitters.
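
To put a number on the inverse-square point, a minimal sketch (the 15 feet is from my comment above; the half-square-meter cross-section is an assumed figure):

    import math

    def intercepted_fraction(distance_m, cross_section_m2=0.5):
        # Fraction of an isotropic transmitter's output passing through
        # a given cross-sectional area at a given distance.
        sphere_area = 4 * math.pi * distance_m**2
        return cross_section_m2 / sphere_area

    d = 15 * 0.3048  # 15 feet in meters (~4.6 m)
    frac = intercepted_fraction(d)
    print(f"fraction intercepted: {frac:.1e}")     # ~1.9e-03
    print(f"of 0.5 W: {0.5 * frac * 1e3:.1f} mW")  # ~1.0 mW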

In science we go by experimental results. Experiments show a wide variety of biological effects of sub-optical radiation exposure that are, in fact, inconsistent with the heating model. (Veterinarians routinely accelerate fracture healing with it.) That these results can make deployment of radiative sources inconvenient is not a scientific argument in favor of pretending they don't exist, however successful that consideration has been in interfering with efforts to further investigate the phenomena.

Still, numbers matter. Whatever effects we discover, we can expect that lower exposure levels, all else being equal, will have smaller effects -- after taking hormesis into account. (This is complicated by the discovery that modulation is very important; a constant CW exposure has much smaller biological effect than, e.g., pulses.) Probably the garlic farmer has much less to worry about than his niece with her cellphone, or you.

By Nathan Myers (not verified) on 17 Sep 2009 #permalink

As far as I know all of those experiments deal with modulated magnetic fields, not radio waves. Do you have any references to experiments that deal in anything other than direct magnetic or near-field RF interactions?

By Robert S. (not verified) on 17 Sep 2009 #permalink

i wonder why they don't question the effects of the local radiostations, TV, cell phones, CB radios, pagers, etc.

But they do, they do! In Germany there is a massive movement against cell phone antennae based on hysteria, lies and misinformation.

In Germany there is a massive movement against cell phone antennae based on hysteria, lies and misinformation.

Yes, it's the same here in Britain, sadly. It's one of those 'your facts don't suit me, therefore they are irrelevant' situations.

By Michael Finn (not verified) on 17 Sep 2009 #permalink

massive movement ... based on hysteria, lies and misinformation

When authority figures insist against experimental evidence that the only possible biological effect of EM is heating, others reasonably conclude that they either don't know what they're talking about, or are lying. Misinformation leads to hysteria. Once it's started, it's hard to stop. In some cases there may actually be grounds for concern, but who knows which without doing the research?

The same thing happened with vaccinations. Authority figures insisted, despite an entire lack of evidence of any kind, that ethyl mercury in vaccines was perfectly safe. Many years later, we have evidence that it doesn't seem to cause autism (though we still don't know about other maladies), but the damage was done by the original disingenuity, and it will take many more years before the hysteria dies down.

By Nathan Myers (not verified) on 17 Sep 2009 #permalink

talking about how this nonsense is physically impossible is unlikely to ease the concerns of anybody who would fall for it --- i would bet a lot of the folks who succumb to this hysteria take one look at the phrase "ionization energy" and their eyes glaze over. equations are just a way to lose the attention of such crowds.

epidemiology might be a better approach. if microwave-frequency EM radiation were really a concern, it ought to be a global one --- communications satellites, after all --- so whatever harm such might cause, should be showing up blatantly, obviously, and everywhere in population health statistics. if it isn't showing up, well then...

By Nomen Nescio (not verified) on 18 Sep 2009 #permalink

It might help to explain something I haven't seen clearly explained yet -- how vibrating molecules with a source of microwaves can alter the relative yields of some chemical reactions (I think it simply makes some particular one of several configurations of one of the molecules last longer, so that configuration participates in more of the total reactions; this may just be something any form of heating would do).

This sort of thing (just picking from the first page of a Scholar search result):
http://www.springerlink.com/index/T1R5252R3403842V.pdf

There's plenty in the chemistry journals about using this method, but I don't know of anything in the biology journals looking for it 'just happening' -- no doubt it's been looked at and I haven't found it.

By Hank Roberts (not verified) on 18 Sep 2009 #permalink

Did somebody tell said yahoo that his body temp is radiating blackbody microwave radiation? Think of the racial implications. Think of the noisy WiFi connections.

When the Devil strode the Earth onions appeared under his right foot and garlic under his left. Complainant is obviously a Luddite commie satanist. You can't take those people seriously - that is why we elect them to Congress, where they won't bother us.

Simple solution: If garlic wants to surf the Web, Rural Electrification will do studies about supplying subsidized Cable access. Problem solved.

If I were explaining this I would start by saying that the types of radiation that can damage molecules (X-rays, UV) have higher frequencies/energies than visible light. Microwaves have lower frequencies than visible light. Lower even than infrared, which is emitted by everything around you. Almost as low as radio waves.

(1) You mentioned DNA molecules suffering "damages and breaks" from absorbing UV photons. Actually, direct hits by UV photons on DNA aren't the main factor. A cell is mostly water, and by volume DNA is quite a tiny target. So the biggest threat to DNA is indirect: UV photons break the water molecules into H+ ions, hydrated electrons, and, worst of all, hydroxyl radicals. All of these are factors, but #1 on the DNA threat list is a hydroxyl from UV-dissociated water wandering over at random and delivering a chemical @$$-whuppin' to the DNA.

Of course there is still some direct UV damage to DNA, but the mechanism is a lot more interesting than the simple blast-damage picture most people have. See:

http://www.innovations-report.com/html/reports/life_sciences/report-482…

(2) You have omitted the final step of your energy calculation. To properly complete the analysis, you need to first sum up the total lifetime exposure to Wi-Fi radiation, and then compare that total energy against the energy delivered to the back of a crackpot's head by just a single dope-slap.

By Emory Kimbrough (not verified) on 18 Sep 2009 #permalink

Let's not forget satellite TV signals.

I suspect most people think they are being "beamed" directly into their antenna, not bathing their entire house (and yours) in electromagnetic radiation.

@6: you couldn't pick up UHF TV with a circular loop antenna if there weren't time-varying magnetic fields in that EM wave. It is a question of intensity and modulation.

Along a similar line, our brains operate with electromagnetic signals in the ELF range, so it must be a matter of epidemiology and experiment to discover if external fields in that range might pose a problem.

The one thing we know with confidence is that they will not produce cancer via the same mechanism that people associate with "radiation" such as x-rays or UV. As noted @14, in this respect microwaves are safer than the IR that provides heating from the wires in a toaster.

By CCPhysicist (not verified) on 18 Sep 2009 #permalink

"Authority figures insisted, despite an entire lack of evidence of any kind, that ethyl mercury in vaccines was perfectly safe."

This is the kind of history rewriting that really ruffles my feathers...

For one, the authority figures in question were not merely authority figures, but were so because they were experts in their field. You may not have meant it as such, but failing to make that distinction lumps experts in with elected officials and political appointees.

For two, there were plenty of studies on which they based the claim. Every vaccine we use had to go through a screening process for effectiveness and safety. Just because they didn't do a study that singled out that one item until after the manufactrovercy doesn't mean that there was an "entire lack of evidence of any kind." Doing the narrowly-focused study simply turned a reasonable, well-evidenced claim into an absolute slam-dunk.

For three, the experts by-and-large pointed out not that it was "perfectly safe," but that it was safe in the sense that the benefits far, far outweighed the (potential, as they were unproven) risks.

One thing that seems to be implied by the above discussion (including comments) is that the higher you go in frequency, the worse the EM radiation will be on our bodies, because the photons will have higher energy. Electromagnetism and our biology aren't so simple. 5 GHz radiation isn't going to be used in microwave ovens, even though it has more energy per photon than 2.4 GHz, because 2.4 GHz happens to make the water molecules resonate, producing very efficient heating.

When you get up to 60 GHz, you get a different effect. EM radiation at that frequency is absorbed by moisture in the air, more than any other RF frequency, but can't even penetrate your skin.

I think this issue is more an issue of biology than physics. You put your head next to a Wi-Fi transmitter and the water in your head *might* get slightly warmer. Who knows what long term effects that might have? Our bodies are extremely complex and don't always work like we think. I understand the logic here, and if I were to bet I'd say we're operating in safe conditions, but I think the systems involved are complex enough to warrant a long-term study. 2.4 GHz is special enough to warrant its own consideration, even though TV, radio, and satellites have been transmitting on lower (and for satellites sometimes higher) frequencies for decades.

By multipath (not verified) on 18 Sep 2009 #permalink

Multipath: you could use 5 GHz in a microwave oven; there's nothing special about 2.45 GHz. 5 GHz would also work, though it would have lower penetration and so would tend to heat less evenly. The only special thing about 2.45 GHz is that it's in a part of the spectrum available for unlicensed use (if I recall correctly it was originally set aside for scientific experimentation).

Industrial microwaves operate at lower frequencies to get better penetration.

Anyway, it's a myth that 2.45 GHz corresponds to a resonant frequency of water. The absorption spectrum for water does not have a local maximum there, or anywhere near there.

Mystyk: Really? They tested it on infants and toddlers, at a dozen times the concentration found in any individual vaccine? Or did they test individual vaccines on male adults? You present an excellent example of the behavior that caused the problem in the first place.

By Nathan Myers (not verified) on 18 Sep 2009 #permalink

CCPhysicist: Of course there are both E and H field components to any traveling RF wave. My point was that the E and H fields behave very very differently in their interactions with objects in the near vs far fields, and that experiments done in the near field do not necessarily translate into far field effects.

By Robert S. (not verified) on 18 Sep 2009 #permalink

It's not clear from the cited article whether the farmer's concern is with transmissions between the cell and the personal device or the transmissions between the cell towers and the central office. Do you know which it is? Knowing little about microwave transmission, it's only a guess that the CO/cell tower tx are higher frequency and higher power. Also likely guided.

By William Dunn (not verified) on 18 Sep 2009 #permalink

William: IIRC, most if not all RF backhauls operate below 10 GHz or so in order to minimize the effects of rain fade (also why weather radars almost all operate well above it). The total amount of power put into a single backhaul link might be higher, the same, or perhaps even lower than that going into a single cell, but it makes little difference to public exposure, because the providers do all they can to make sure that energy ends up at the receiving station rather than covering an area of land like a cell.

By Robert S. (not verified) on 18 Sep 2009 #permalink

I've always wondered why so many people who work with electromagnetic waves characterize them by wavelength rather than frequency. Using frequency, your equation is the simpler E = hf. It is then very obvious that the higher the frequency, the higher the energy.

An extraordinary number of people don't know that light is a form of electromagnetic radiation. Frequencies above visible light have enough energy to knock around the electrons in the chemicals in your body--to create ions--and make unpleasant changes. Thus, the physicists' term "ionizing radiation." They are what people think of when they hear the word "radiation." Frequencies below visible light can't knock electrons somewhere else. They are non-ionizing radiation. So is ordinary red orange yellow green blue violet.

If you aren't afraid of turning on your light switch (or looking at a green plant), you probably shouldn't be afraid of a microwave tower.
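
A quick sketch of where that ionizing boundary sits in frequency terms (the ~10 eV threshold below is a rough representative ionization energy, not a sharp universal line):

    H_EV_S = 4.136e-15  # Planck's constant in eV*s

    def freq_for_energy_hz(ev):
        # Frequency of a photon carrying the given energy.
        return ev / H_EV_S

    f_ion = freq_for_energy_hz(10.0)
    print(f"~10 eV photons: {f_ion:.1e} Hz")  # ~2.4e15 Hz, in the ultraviolet
    print(f"2.4 GHz is lower by a factor of {f_ion / 2.4e9:.0e}")  # ~1e6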

By Roger Sweeny (not verified) on 19 Sep 2009 #permalink

If you aren't afraid of turning on your light switch (or looking at a green plant), you probably shouldn't be afraid of a microwave tower.

Spurious reasoning, we likes it.

By Nathan Myers (not verified) on 19 Sep 2009 #permalink

I have a question. I am not a scientist but I have noticed that where I live on the East Coast of the US, the vegetation is, in the space of a little over a year, in a death spiral. ALL the vegetation. Something has happened that is much worse than ozone for it to be so sudden and dramatic, and I have speculated it might be the recent addition of ethanol to gasoline.

However it has been suggested to me that perhaps the proliferation of cell towers is causing an intensification in the process of ozone and PAN creation that has heretofore occurred when the volatile organic compounds come into contact with UV radiation from the sun.

It seems quite unlikely to me but perhaps someone could tell me, is that possible?

Yesterday I sent a letter to several nurserymen and farmers who had authored articles in a horticultural publication, which is copied below, for background. The disastrous impacts of greenhouse gas pollution are quite obvious in our environment to anyone who troubles to actually look.

Here's the letter:

I picked up a copy of Gardener News and notice that many articles are unwittingly describing the effects of carbon emission poisoning on vegetation. These effects mimic the symptoms of drought, blight, excess rainfall and fungus but are in fact also well-documented to be produced by exposure to atmospheric toxins.

Just because the gases produced by burning coal, gasoline and ethanol are invisible, it does not mean they are not deadly. The scorched and falling leaves, thin tree crowns and forest canopies, bare branches, and dropping pine needles are ubiquitous in New Jersey and up and down the Eastern Seaboard. The hundreds of trees damaged in a thunderstorm in Central Park were not victims of mere weather - the trees are weak, and they are covered with lichen, a harbinger of death.

It's quite likely that the relatively sudden and dramatic decline in trees is a result of the mandated addition of ethanol to gasoline.

It's well known that burning coal and gasoline emissions react to UV radiation, creating ozone. It's less well recognized that ozone is very detrimental to plants - and even less discussed is that the damage from ethanol may be worse. Ethanol emits acetaldehyde, which is the precursor to peroxyacetyl nitrates (PANs), which are highly dangerous to vegetation (and people: see http://www.stanford.edu/group/efmh/jacobson/ClimateHealth4.pdf)

Before attributing this widespread and universal damage to individual diseases, excessive rain, pests, previous drought, and other blights on vegetation, which is what foresters, ecologists, and conservationists usually do, please consider this fact: the leaves of plants in ponds show the identical process of chlorosis - a loss of the ability to create chlorophyll. In the classic response to ozone and PANs, the leaves close their stomata, basically causing the organism to suffocate.

I would like to direct you to this report http://pubs.ext.vt.edu/430/430-022/430-022.html which describes exactly the condition of vegetation in New Jersey. I would question two statements that I believe may be out of date - one is that some species are more susceptible than others. Currently, it's impossible to find any species that isn't affected. The other is that PANs are less of a problem than ozone.

The evidence of this phenomenon is readily detectable in an objective, cursory inventory of any woods, park, back yard, farm, arboretum, mall parking lot, pond, or nursery. It is irrefutable that the composition of the atmosphere is the primary causative agent of what is rapidly becoming an existential threat. Note that there is not one species that normally photosynthesizes that is immune, and that trees of every age, and plants in every situation, whether wild or in nurseries, in pots or ponds or in the ground, share the same degree of impact.

If we do not stop squandering fuel we are headed straight for ecosystem collapse and mass extinctions, not to mention crop losses.

I am not a chemist but if it could be determined to be primarily linked to ethanol, we can consider ourselves lucky. We could stop this wholesale slaughter of trees and go back to the slower path of destruction through climate change.

You who are directly involved in agriculture and landscaping should be in the forefront demanding that the government take swift and strong action to enact clean energy legislation, because it is your livelihood that is at stake. Of course everybody who eats, and every species that depends on trees for fruit, nuts, shelter, and shade has everything to lose as well. But you will be the first to be impacted when your crops fail to produce adequate income for you, and people and businesses give up purchasing and planting shrubs and saplings in their landscapes because it will be a waste of money. Eventually they will notice that nothing they install will thrive.

Please feel free to write or call if there is anything I can clarify; and/or visit www.witsendnj.blogspot.com where there are many links to scientific research, and photos documenting the carnage.

Sincerely,

Gail Zawacki
Oldwick, NJ

So we have molecular rotational and vibrational degrees of freedom that can be affected by MW radiation. Isn't it conceivable that some of the many molecules involved in RNA transcription and translation could be adversely affected? Why are you only considering "cooking" effects?

The real-life 'facts' of wifi radiation don't hold up to the theory that it is safe. It's not really acceptable to say 'it's low power so it must be safe'.
The DECT phones in my house give off far more radiation when measured with an EMF meter than my wifi router does, and I have never had a problem with DECT phones but always have a problem with wifi.

It's not the power of the signal that must be causing the problem but rather the specific frequency, and the fact that not only will wifi 'jump' around the 2.4 GHz band to avoid interference, but it is also not a constant power strength like the DECT phones, due to its pulsing.

This, I believe, causes the biological effects - the constant on/off, changes in strength, and changes in frequency.

I cannot ignore the facts which I have seen with my own eyes - despite the theory people say explains that it is safe.

For example I have seen pets become very disoriented, dizzy and clumsy in friends' houses after the wifi has been switched on.
I have also experienced my 5-year-old niece on several occasions go from happy and well behaved to very tired, misbehaving and angry within an hour of a wireless router being switched on.

The fatigue and tiredness seems to be the most common effect I have seen. Second place seems to be lack of cognitive ability - a lack of concentration and short-term memory.
I have seen this short-term memory loss in many people (and experienced it myself), and after convincing them to disable their wireless, their cognitive abilities have improved within weeks.

This is not a placebo effect and we should treat it seriously rather than saying 'it doesn't affect me so it can't affect someone else'.
If someone is allergic to cheese and gets ill from eating it, would you say to them, 'don't be stupid - cheese is perfectly safe - everyone eats it'?

Wifi is a great technology and certainly isn't going anywhere. All it needs is a little redesign.
Wouldn't it be silly to have a mobile phone that constantly transmits even when we aren't making a phone call? So why is wifi like this?

Consider that there may actually be a problem - we don't fully understand why, but there may be a problem - and then make wifi routers smart: make them adjust their power level depending on how far away you are with your laptop, make them switch off their transmission when there are no laptops connected, and make the laptops the ones that 'wake the router up'.

I'm wondering about the new routers that operate at 5 GHz?