Processing Power

Tom Vanderbilt has a fascinating article on the infrastructure of data centers, those server farms that make Google, Facebook and World of Warcraft possible. Every keystroke on the internet (including this one) relies on shuttling electrons back and forth in a remote air-conditioned industrial hangar. These are the highway ribbons of the future, the grid that's so essential we don't even notice it.

The article also mentions the energy costs required to run such server farms, which have real scientific implications. As the size of data sets continues to increase rapidly, one significant hurdle is the practicality of working with and storing such massive amounts of information. Here's Vanderbilt:

Data centers worldwide now consume more energy annually than Sweden. And the amount of energy required is growing, says Jonathan Koomey, a scientist at Lawrence Berkeley National Laboratory. From 2000 to 2005, the aggregate electricity use by data centers doubled. The cloud, he calculates, consumes 1 to 2 percent of the world's electricity.
[SNIP]
Power looms larger than space in the data center's future -- the data-center group Afcom predicts that in the next five years, more than 90 percent of companies' data centers will be interrupted at least once because of power constrictions. As James Hamilton of Amazon Web Services observed recently at a Google-hosted data-center-efficiency summit, there is no Moore's Law for power -- while servers become progressively more powerful (and cheaper to deploy) and software boosts server productivity, the cost of energy (as well as water, needed for cooling) stays constant or rises. Uptime's Brill notes that while it once took 30 to 50 years for electricity costs to match the cost of the server itself, the electricity on a low-end server will now exceed the server cost itself in less than four years -- which is why the geography of the cloud has migrated to lower-rate areas.
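
Brill's four-year figure is easy to sanity-check with a quick back-of-envelope calculation. The sketch below is my own, not Uptime's; every input (server price, average power draw, PUE, electricity rate) is an illustrative assumption rather than a number from the article:

```python
# Back-of-envelope: how long until a low-end server's electricity bill
# matches its purchase price? All inputs are illustrative assumptions,
# not figures from the article.

server_cost_usd = 2000.0        # assumed purchase price of a low-end server
avg_power_watts = 300.0         # assumed average draw under typical load
pue = 2.0                       # assumed power usage effectiveness (cooling and overhead)
electricity_usd_per_kwh = 0.10  # assumed utility rate

hours_per_year = 24 * 365
kwh_per_year = (avg_power_watts / 1000.0) * pue * hours_per_year
cost_per_year = kwh_per_year * electricity_usd_per_kwh
breakeven_years = server_cost_usd / cost_per_year

print(f"Electricity: about ${cost_per_year:,.0f} per year")
print(f"Electricity matches the server's price in about {breakeven_years:.1f} years")
```

With those assumed figures, cooling overhead included, the server burns through roughly 5,250 kWh and about $525 of electricity a year, so a $2,000 box hits the break-even point in just under four years.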

This practical challenge reminded me of the Blue Brain project, that epic attempt to simulate consciousness on a supercomputer which I wrote about in 2007. While Henry Markram's team has demonstrated, in principle, that it's possible to simulate a neocortical circuit from the bottom up, the challenge is finding the computers to put multiple circuits together:

"We have already shown that the model can scale up," Markram says. "What is holding us back now are the computers." The numbers speak for themselves. Markram estimates that in order to accurately simulate the trillion synapses in the human brain, you'd need to be able to process about 500 petabytes of data (peta being a million billion, or 10 to the fifteenth power). That's about 200 times more information than is stored on all of Google's servers. (Given current technology, a machine capable of such power would be the size of several football fields.) Energy consumption is another huge problem. The human brain requires about 25 watts of electricity to operate. Markram estimates that simulating the brain on a supercomputer with existing microchips would generate an annual electrical bill of about $3 billion . But if computing speeds continue to develop at their current exponential pace, and energy efficiency improves, Markram believes that he'll be able to model a complete human brain on a single machine in ten years or less.

For now, however, the mind is still the ideal machine. Those intimidating black boxes from IBM in the basement are barely sufficient to model a thin slice of rat brain. The nervous system of an invertebrate exceeds the capabilities of the fastest supercomputer in the world. "If you're interested in computing," Schürmann says, "then I don't see how you can't be interested in the brain. We have so much to learn from natural selection. It's really the ultimate engineer."
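
To put the energy gap in those quoted estimates into perspective, here is a quick conversion of Markram's figures into comparable units. This is a rough sketch of my own; the $0.10-per-kilowatt-hour electricity rate is an assumption, while the $3 billion bill and the 25-watt brain come from the excerpt above:

```python
# Back-of-envelope: what does a $3 billion annual power bill imply,
# compared with the brain's roughly 25-watt budget? The electricity
# rate is an assumed illustrative figure; the bill and the 25 watts
# come from the quoted passage.

annual_bill_usd = 3e9           # Markram's estimated annual electricity bill (quoted above)
electricity_usd_per_kwh = 0.10  # assumed utility rate
brain_watts = 25.0              # the brain's power budget (quoted above)

kwh_per_year = annual_bill_usd / electricity_usd_per_kwh
avg_power_watts = kwh_per_year * 1000.0 / (24 * 365)  # kWh per year -> average watts

print(f"Implied average draw: {avg_power_watts / 1e9:.1f} gigawatts")
print(f"Roughly {avg_power_watts / brain_watts:.1e} times the brain's power budget")
```

At that assumed rate, a $3 billion annual bill corresponds to a continuous draw of a few gigawatts, roughly eight orders of magnitude more power than the brain uses.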

This is yet another reason to study the human brain: our neurons, with their fatty membranes and ion pumps, are an ideal computational tool. Our computers may never become conscious, but if our need for data continues to grow at an exponential rate (and why wouldn't it?), then our machines will need to become more metabolically efficient. And that's a problem the cortex has already solved.

Bit of a mistake in your reasoning. Simulating anything at the level they're working at is going to take many orders of magnitude more processing power than the original system provides. If you simulated a normal computer at the same level - actually modeling every gate as an analog circuit - it would take a huge amount of processing power too.

Yes, the biological neural system is pretty great. Just not quite as great as your post implies :-)

This is just another example of our reliance on energy and its growing shortage. Besides developing more energy-efficient processing chips that run cooler and require less cooling energy, we need to keep using current alternative energy sources while developing new alternative energy technology that is more efficient and cheaper to produce.

I recall that Seymour Cray once described himself as a refrigeration engineer. The other problem, from an electric power grid point of view, is that the loads are reactive rather than resistive (like aluminum smelters) and can cause all manner of unwanted transient behavior on the grid.

By David Kerlick (not verified) on 15 Jun 2009

Markram estimates that in order to accurately simulate the trillion synapses in the human brain, you'd need to be able to process about 500 petabytes of data (peta being a million billion, or 10 to the fifteenth power).

In my Googling, I've found lots of quotes of people throwing out numbers like that, but I can't find an explanation of how these numbers were produced. Can anyone help me?

Hofstadter is skeptical about the 10^15 number. I'm skeptical too, but only because, as far as I can tell, the numbers are conjured out of nowhere! I'd like to see how the calculation was done.

I don't know if Sweden is an appropriate reference for the power needs of server farms, since Sweden represents about 1/900th of the world's population. For your personal edification or future reference, you may want to research the brownouts and blackouts in Silicon Valley in the early 2000s.

By Steven P. Mitchell (not verified) on 07 Mar 2011