This is cool. From Wired:
When the Top 500 list of the world’s fastest supercomputers was announced at the international supercomputing conference in Austin, Texas, on Monday, IBM had barely managed to cling to the top spot, fending off a challenge from Cray. But both competitors broke petaflop speeds, performing 1.105 and 1.059 quadrillion floating-point calculations per second, the first two computers to do so.
These computers aren’t just faster than those they pushed further down the list; they will enable a new class of science that wasn’t possible before. As recently described in Wired magazine, these massive number crunchers will push simulation to the forefront of science.
Scientists will be able to run new and vastly more accurate models of complex phenomena: climate models will have dramatically higher resolution and accuracy, new materials for efficient energy transmission will be developed, and simulations of scramjet engines will reach a new level of complexity.
I’m not so sure about this, though:
“The scientific method has changed for the first time since Galileo invented the telescope (in 1609),” said computer scientist Mark Seager of Lawrence Livermore National Laboratory.
Supercomputing has made huge advances over the last decade or so, gradually packing on the ability to handle more and more data points in increasingly complex ways. It has enabled scientists to test theories, design experiments and predict outcomes as never before. But now, the new class of petaflop-scale machines is poised to bring about major qualitative changes in the way science is done.
Leaving aside those annoying philosophers who insist on denying that there is a clearly defined thing called “the scientific method,” I don’t see how superfast computers really change anything fundamental about the nature of science. The novelty here is that awesome computing power makes possible very detailed simulations that were previously impossible. But technological advances making new sorts of observations and experiments possible is not exactly a new story.
Telescopes and microscopes, to pick just two examples, revolutionized astronomy and biology respectively. But they didn’t change the scientific method. They simply made possible new types of observations. Why should we regard computers any differently? Methinks Wired is being a wee bit sensationalistic here.