There's a (Shannon-entropy limited) map for that.

You probably haven't been able to avoid seeing the televised bombs AT&T and Verizon have been throwing at each other over the maps of their coverage. Both sets of commercials (to differing degrees) fail to make it especially clear just what their maps mean to the consumer. For instance: Verizon's commercial brags of overwhelming coverage while showing AT&T's comparatively sad little sparse map. AT&T fires back with a different map and a claim that they cover 97% of the US population. Both sets of claims are correct - if you pay close attention. Verizon's map indicates 3G coverage specifically, while AT&T's maps indicate their voice coverage.

So to the extent that Verizon makes you think that AT&T's voice and non-3G data coverage is spotty, it's misleading. And to the extent AT&T makes you think its 3G coverage is just as omnipresent, it's misleading too. Sadly, I don't have any reason to care. None of this makes any difference to me since my low-budget phone and plan can't do anything more sophisticated than text and picture messages anyway.

What does interest me is the science behind these maps and this news story:


Tiered AT&T pricing to target heavy data usage

AT&T wants its iPhone users to use less wireless data, and it plans to introduce new pricing models to curb users' data usage as it tries to keep up with growing demand.

At an investor conference in New York on Wednesday, Ralph de la Vega, AT&T's head of wireless, announced that the wireless operator plans to introduce new pricing for heavy wireless-data users.
...
AT&T seems to realize that this is not a long-term solution. And not only is the carrier upgrading its network, but it's also asking the Federal Communications Commission to find more spectrum to auction off that can be used for wireless-data services. Jim Cicconi, senior executive vice president of external and legislative affairs for AT&T, said in a separate interview with CNET on Wednesday that something needs to be done to deal with the flood of wireless-data traffic.

Cicconi and AT&T's CEO Randall Stephenson met with FCC staff members earlier this week to discuss the spectrum issue.

"Clearly, there is a looming crisis that needs to be addressed when it comes to spectrum availability," Cicconi said in an interview at his office in Washington, D.C. "Wireless-data usage is growing far faster than anyone had expected. And if we don't do something soon, we will run out very fast. And then we will have to start telling wireless customers that they can't do all the things they want to do with their devices."

FCC Chairman Julius Genachowski has made freeing up more spectrum a top priority. And he has already proposed that the FCC look into taking some spectrum away from TV broadcasters to give to wireless operators to deliver more wireless-broadband services.

Spectrum. Now there's a word near and dear to my heart. But what's it have to do with iPhone jockeys siphoning in torrents of data from the internet?

As we know, wireless communication of pretty much every stripe uses radio waves to send data from the transmitter. The receiver measures those waves and converts them to data. We hear that data on the radio, see it on the TV, and read it on our mobile devices. Let's take a closer look at this process and make up a wave to our specifications:

[Graph 1: a single sine wave]

It's a nondescript wave with a frequency of 100 Hz (if we take the x-axis in seconds), and short of a very simple sort of "On if by land, off if by sea" message, we can't actually do anything with it. We need a more complex wave, and we can come up with a way to translate that complexity into a message. Transmit a 110 Hz wave on top of our previous one and let's zoom out the graph a bit to see how the waves look added on top of each other:

[Graph 2: the 100 Hz and 110 Hz waves summed, forming a train of pulses]

The two waves coalesce into a train of pulses. We're getting somewhere. What if we add a 105 Hz wave on top of that?

[Graph 3: the sum with a 105 Hz wave added; smaller pulses appear between the large ones]

Little pulses and larger pulses, in a row. You can see that we're getting complexity, and that might suggest a way of sending messages. For instance, what if we write down our message, express it in binary, and vary the amplitude of the 105 Hz wave with time so that the relative size of the small pulses changes? A very small pulse might represent a "0" and a relatively larger one a "1". This works, though unfortunately the very process of varying the amplitude of the 105 Hz wave requires you to add in waves of yet other frequencies. The total slice of the spectrum that you use is called the bandwidth.
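If you want to play along at home, here's a minimal NumPy sketch of that scheme. It's my own toy illustration, not any real modulation standard: the amplitudes (0.2 and 1.0), the sample rate, and the leisurely 10 bits per second are all arbitrary choices.

```python
import numpy as np

def encode(bits, fs=10_000, bit_time=0.1):
    """Toy modulator: the 100 + 110 Hz pair makes a 10 Hz pulse train, and
    each bit sets the 105 Hz wave's amplitude (the small in-between pulses)
    for one bit_time window: tiny (0.2) for a 0, larger (1.0) for a 1."""
    samples_per_bit = int(bit_time * fs)
    t = np.arange(samples_per_bit * len(bits)) / fs
    amp = np.repeat([0.2 if b == 0 else 1.0 for b in bits], samples_per_bit)
    return (np.sin(2 * np.pi * 100 * t)
            + np.sin(2 * np.pi * 110 * t)
            + amp * np.sin(2 * np.pi * 105 * t))

signal = encode([1, 0, 1, 1, 0])
print(len(signal))   # 5 bits x 0.1 s x 10,000 samples/s = 5,000 samples
```

A receiver that knows the timing can recover the bits by comparing the heights of the small pulses against a threshold.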

In a perfect world there would be no limit to the data transmission rate no matter how small the bandwidth. You could imagine a 100 Hz wave on top of a 100.11000010001011000000... Hz wave, where the decimal places corresponded to the bits of the data you wanted to send. Which is fine if you can measure to arbitrary accuracy, but if there's a source of external noise or error that starts cropping up around (say) the 6th decimal place, you're stuck. If you want to send data faster, you must have more bandwidth so the wave carrying the signal is more complex and can fit more modulations in a smaller time window.
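You can watch that happen with a Fourier transform. In the sketch below (the 5% power threshold and the bit rates are arbitrary choices of mine), a 105 Hz carrier is keyed on and off by random bits, and the occupied slice of spectrum widens as the bit rate climbs:

```python
import numpy as np

fs = 10_000               # sample rate in Hz
t = np.arange(fs) / fs    # one second of time

def occupied_width_hz(bit_rate):
    """Width of the band around 105 Hz holding real power when the carrier
    is on-off keyed at bit_rate bits per second ("real" meaning above an
    arbitrary threshold of 5% of the strongest frequency bin)."""
    rng = np.random.default_rng(0)
    bits = rng.integers(0, 2, size=bit_rate)   # one second of random bits
    bits[0] = 1                                # make sure the carrier turns on
    amp = np.repeat(bits, fs // bit_rate).astype(float)
    spectrum = np.abs(np.fft.rfft(amp * np.sin(2 * np.pi * 105 * t)))
    freqs = np.fft.rfftfreq(fs, d=1.0 / fs)
    strong = freqs[spectrum > 0.05 * spectrum.max()]
    return strong.max() - strong.min()

for rate in (5, 20, 80):   # bits per second; each divides fs evenly
    print(f"{rate:3d} bps -> roughly {occupied_width_hz(rate):6.1f} Hz occupied")
```

The exact widths depend on my arbitrary threshold, but the trend is the point: faster keying means a wider occupied band.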

Thanks to the brilliant scientists Claude Shannon and Ralph Hartley, we have an equation that tells us the maximum data rate we can squeeze out of a given bandwidth:

C = B log_2(1 + S/N)

C is the data rate in bits per second, B is the bandwidth in Hz, and S/N is the signal power divided by the noise power. Here log_2 is the base-2 logarithm.

So the stronger the signal is in relation to the noise, the higher the rate of data you can send. But it's a logarithmic improvement, so improving your signal strength enormously doesn't actually buy you all that much. Bandwidth is the key.
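To put numbers on that, here's the limit in a few lines of Python. The 50 MHz block anticipates the next paragraph; the S/N values are just illustrative.

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr):
    """Shannon-Hartley limit: C = B * log_2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr)

B = 50e6   # a 50 MHz block of spectrum
for snr in (1, 10, 100, 1000):
    print(f"S/N = {snr:>4}: {shannon_capacity_bps(B, snr) / 1e6:6.1f} Mbps")
# S/N = 1 already yields 50.0 Mbps; S/N = 100 yields ~333 Mbps. A hundred
# times the signal power buys less than seven times the capacity, whereas
# doubling the bandwidth to 100 MHz would simply double every line above.
```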

The actual spectral allocation for this sort of thing is always a matter of political wrangling with the FCC, but typically there are a few blocks of spectrum around 50 MHz wide allocated to this kind of mobile data transfer. The signal/noise ratio can't be improved indefinitely, because what's signal to my phone is noise to yours. As such, with this kind of frequency allocation there's somewhere in the vicinity of 50 Mbps available for all the phones communicating with a given tower. The basic finite nature of the available spectrum and the iron laws of information theory prohibit you from doing very much better.

The way around this is to make more spectrum available. If you look at the FCC's spectrum allocation chart, you'll see that huge bands are taken up by broadcast TV (the chart dates back before the DTV transition, but the overall allocation is not all that different today). "But doesn't everyone have cable or satellite by now? Isn't our application more pressing for society?", quoth the telecoms. Well, it's not a bad point, but it's not one the TV broadcasters are likely to accept without a fight. It's a pickle for sure. I personally lean toward the "internet data is more important" school of thought, but I'm no policy wonk.

You might wonder why not just use higher frequencies, since there's a heck of a lot of MHz between (for instance) 1 THz and 1.1 THz. Unfortunately physics limits the practicality of higher frequencies with respect to mobile communications. High frequencies perform poorly when asked to pass through or around obstacles like walls and buildings, which is not acceptable for cellular communications.
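For a rough sense of scale, here's the standard Friis free-space path-loss formula in Python. This assumes idealized isotropic antennas in empty space; walls, rain, and atmospheric absorption make high frequencies even worse than these numbers suggest.

```python
import math

def free_space_path_loss_db(distance_m, freq_hz):
    """Friis free-space path loss between isotropic antennas:
    FSPL(dB) = 20 * log10(4 * pi * d * f / c)."""
    c = 3.0e8   # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Loss over 1 km at a TV-band frequency, a Wi-Fi frequency, and 1 THz.
# Every tenfold jump in frequency costs another 20 dB.
for f in (700e6, 2.4e9, 1e12):
    print(f"{f / 1e9:7.1f} GHz: {free_space_path_loss_db(1000, f):6.1f} dB")
```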

What do I propose for keeping everyone happy? Beats me. I don't envy the FCC's job at all.

Comments

Yeah, sort of how my bluetooth earphones suffer horrible interference if the transmitter source is more than 18" away when I'm outdoors. When indoors, no problem. I can be 25 feet from the iPod and still hear the music.

The earphones do give me one thing. I can find all the WiFi networks just walking down the street. 2.4GHz will do that.

I was under the impression that the Analog TV frequencies are already in disuse, just not reallocated yet. I could be wrong though.

Nice, explanatory post.

Produce value, then demand outstrips supply. Produce crap, then demand and supply remain scaled. An advocate makes virtue of failure. Therefore... government, in all things.

The solution to the US national healthcare crisis is for doctors to butcher not cure. Our only hopes are iatrogenic and nosocomial. Diversity!

The solution to the US national energy crisis is for efficient generation to be banned and all generation to be profoundly taxed. Our only hope to assure supply is to make it unavailable. Enviro-whinerism!

The solution to the US War on Terror is to destabilize every aspect of normal life. Our only hopes are to subjugate the innocent and subsidize the enemy. Homeland Severity!

FEMA, NASA, Department of Education... do we detect a trend?

Wow. That was really neat. Thanks Matt.

Produce value, then demand outstrips supply. Produce crap, then demand and supply remain scaled. An advocate makes virtue of failure. Therefore... government, in all things.

I'm not really sure what you're suggesting here. Right now, the cell phones want more bandwidth, but the television companies control the space. Your options are for one or the other to give, for both to just grab bandwidth without limit (hurray anarchy!) and produce a lot of crap for all our troubles, or for us to maintain the status quo.

Given that the companies are in direct conflict of interest, you need a third-party mediator. And given that the public at large is the biggest beneficiary / victim of change, it makes sense that public representatives (aka The Government) step in to mediate a solution.

The digital TV transition was a happy compromise. TVs were transitioned to higher-quality signals, cell phone carriers received more bandwidth, and the public got more product (in the form of more TV/radio broadcast channels and more cell phone bandwidth).

...do we detect a trend?

I don't know, do we?

Part of the political back story here is that the FCC was not particularly diligent about keeping the spectrum organized during the Bush administration. Wardrobe failures were a bigger problem.

100 Hz means 100 cycles per second. This graph shows just under 16 cycles. I suspect that its frequency is 100/(2π), or about 15.9155 Hz, so that its ω (= 2πf) is 100.

Good catch, JimB. I have this alarming tendency to think in angular frequency, but the actual frequency here is indeed 100/(2π), etc.

Great information in this post. Thanks for the clear and concise explanation.

As far as policy goes, it seems to me that the most efficient method of spectrum allocation would be to auction off fixed-term licenses to specific blocks (maybe some blocks could be 5 year, others could be 10, and others could be 25) to strike a balance between providing a stable environment and requiring that license-holders maximize the utility of a finite public resource. Another proposal was to allow devices to check for whitespace and use unused spectrum. Finally, we should probably start freeing up more of the current military spectrum to the public (through auctions), while pushing the military towards less bandwidth-intensive digital radio communications.

Television stations don't control the space... the FCC auctions off time-limited licenses (e.g. 10 years) that cost a bunch of money. So, corporations just 'borrow' it from us... for a fee (that, in theory, should offset taxes needed to pay for running the FCC, etc. - although ultimately we pay through advertising {enough of the Economics class for now}).

Although of lesser significance than in the good ol' days, there still seems to be a place for OTA (over-the-air) television. The decreasing value, though, can be seen in the decline in the number of channels and the spectrum reuse that has taken place over the years.

Many years ago, 14 channels (70 to 83) went away for phone service (84 MHz re-purposed). More recently, another 18 channels (52 to 69) are being eliminated and recycled for public and public-safety data uses (another 108 MHz). Don't forget that in addition to this attrition, radio astronomy, public safety, and other allocations exist simultaneously within the rest of the TV channel spectrum (it is not all whitespace even if there is no TV channel present).

Where is TV in the U.S. now? 5 chunks of spectrum, 50 channels (6 MHz for each channel) = 300 MHz total for TV (auctioned off at non-trivial prices). To put it in perspective, consider for a moment that we all have FREE access to more than that same amount of spectrum for our various gadgets if we just play by the rules (i.e. the various 'free' bands such as 915, 2.4, 5.3, and 5.8 used for goodies like Wi-Fi, Bluetooth, etc.).

Perhaps a good idea would be to determine how many channels are really needed (e.g. how many stations in a given area, and what gaps are necessary to prevent interference from other regional stations). The current 50 channels sounds like a lot, but maybe not so much when considering some of the details. If we WERE able to recycle some channels, it would probably make sense to start at the bottom and work our way up... getting rid of the lower channels that have technical issues for digital TV (and are potentially really sweet spectrum for data services).

It really IS about bits per hertz. I've been to presentations that go on about which modulation technique, error correction, blah, blah, blah. It really comes down to how efficiently the stuff can be made to flow through the pipe. And as your clear explanation states, turning up the volume (e.g. doubling it) doesn't perform magic. In fact, quite the opposite can occur - just more noise for everyone (e.g. one of the issues with some high-density implementations such as Wi-Fi mesh).

Ain't nothin' free. Optimization requires thinking about the details.