This is cool. From Wired:
When the Top 500 list of the world's fastest supercomputers was announced at the international supercomputing conference in Austin, Texas, on Monday, IBM had barely managed to cling to the top spot, fending off a challenge from Cray. But both competitors broke petaflop speeds, performing 1.105 and 1.059 quadrillion floating-point calculations per second, the first two computers to do so.
These computers aren't just faster than those they pushed further down the list; they will enable a new class of science that wasn't possible before. As recently described in Wired magazine, these massive number crunchers will push simulation to the forefront of science.
Scientists will be able to run new and vastly more accurate models of complex phenomena: Climate models will have dramatically higher resolution and accuracy, new materials for efficient energy transmission will be developed and simulations of scramjet engines will reach a new level of complexity.
I'm not too sure about this though:
“The scientific method has changed for the first time since Galileo invented the telescope (in 1609),” said computer scientist Mark Seager of Lawrence Livermore National Laboratory.

Supercomputing has made huge advances over the last decade or so, gradually packing on the ability to handle more and more data points in increasingly complex ways. It has enabled scientists to test theories, design experiments and predict outcomes as never before. But now, the new class of petaflop-scale machines is poised to bring about major qualitative changes in the way science is done.
Leaving aside those annoying philosophers who insist on denying that there is a clearly defined thing called “the scientific method,” I don't see how superfast computers really change anything fundamental about the nature of science. The novelty here is that awesome computing power makes possible very detailed simulations that were previously impossible. But technological advance making possible new sorts of observations and experiments is not exactly a new story.
Telescopes and microscopes, to pick just two examples, revolutionized astronomy and biology respectively. But they didn't change the scientific method. They simply made possible new types of observations. Why should we regard computers any differently? Methinks Wired is being a wee bit sensationalistic here.
<Captain Jack Sparrow>Hello? Wired!</Captain Jack Sparrow>
Wired has been making wildly exaggerated remarks like this since about 1995. Every time a new supercomputing performance 'barrier' is broken, they're sure it will 'revolutionize science' or 'enable a new paradigm' or something equally inane. It's a magazine filled with premature technogasms. But the weird part here is that this is actually a quote from a computer scientist at LLNL, not one of Wired's reporters or opinionators. Strangely enough, Wired is not at fault for this particular remark.
(a) You can always find that one guy in the lab group who has an exaggerated opinion of what he does; quoting him is much more fun than quoting those boring people, with their "perspective" and stuff.
(b) There's no reason to presume he said exactly what he's quoted as saying — it's Wired, after all.
You're 100% correct that supercomputers don't change the scientific method. What they do is make new science, and even whole new fields of science, possible. That is indeed a huge change.
One example is bioinformatics, and in particular genomics. These fields are only possible because of incredibly powerful computers. Of course, for bioinformatics, FLOPS (floating-point operations per second) is a useless measure, since genes are not floating-point numbers.
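To make that concrete, here's a toy sketch (made-up sequence, plain Python) of the kind of work genomics codes actually do: counting and comparing discrete symbols. The heavy lifting is string handling and hash-table updates, not floating-point arithmetic.

```python
from collections import Counter

def kmer_counts(sequence, k=3):
    """Count every overlapping k-mer in a DNA sequence.

    The core operations are string slicing and hash-table updates --
    integer work, which is why FLOPS is a poor yardstick for this
    kind of computation.
    """
    return Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))

# Hypothetical toy sequence, purely for illustration.
print(kmer_counts("ACGTACGTGACG").most_common(3))
```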
If they find a way to falsify some types of previously unfalsifiable propositions, that could be equivalent to a change in the method. Or not.
I'm reminded of an old article I saw someplace that explained that military operations such as the Normandy invasion were so complicated that we could never have won WWII without computers.
Yeah, and how about those A-bombs!!
It'll still only run Crysis at 30 FPS on max graphics settings.
(Nailed it.)
You have to expect a certain amount of breathless hyperbole and limited self-awareness from Wired. These are the people, after all, who manage to print issue after issue about the death of dead tree media.
I have often wondered whether it makes sense to argue that the prospects for commercial fusion are materially improved by the scale of the supercomputers likely to be built over the next twenty years. People are going to be able to simulate the life of a reaction chamber to an amazing level of detail...
If they find a way to falsify some types of previously unfalsifiable propositions, that could be equivalent to a change in the method.
I find most unfalsifiable propositions to be that way simply because they are vague. It would be easy to make them falsifiable: ask the proposer to be more specific. Not really a change in the scientific method there, as we already don't allow "and then a miracle occurs..." types of explanations.
AL, there have been intuitive propositions that would appear to be quite clear, rational and viable, except that they are largely untestable by any known systems. Some have even argued that the theory of evolution is such a one.
I think that one prediction can be safely made. The increase in resolution of climate models made possible by the dramatic increase in computing power will in no way convince the climate change deniers that global warming is happening.
In a purely speculative way, I think it's possible to imagine a situation in which very fast computers seriously changed what we think of as science. Imagine a computer being fed observational data and running a learning algorithm to guess what future data will be. It may be that it builds a model of the data that is completely incomprehensible to us and full of ad hoc fixes but which outperforms the brightest human theorist in terms of prediction. If this were the case we would have to think seriously about the role of understanding and theory in science.
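As a toy illustration of that idea (the data and the model choice below are entirely invented), here's the shape of such a loop in plain Python: fit an opaque model to past observations and judge it only on predictive accuracy, with no attempt at explanation.

```python
# A sketch of "prediction without understanding": a k-nearest-neighbour
# forecaster evaluated purely on held-out observations. All data is
# made up for illustration.

def knn_predict(history, x, k=3):
    """Predict y for input x from past (x, y) observations.

    The 'model' is just raw data plus a distance rule -- it yields
    predictions but no human-readable theory of the phenomenon.
    """
    neighbours = sorted(history, key=lambda pt: abs(pt[0] - x))[:k]
    return sum(y for _, y in neighbours) / k

# Hypothetical observations of some process we don't understand.
observed = [(i * 0.5, (i * 0.5) ** 2 + (-1) ** i * 0.1) for i in range(20)]
train, test = observed[:15], observed[15:]

for x, y_true in test:
    print(f"x={x:4.1f}  predicted={knn_predict(train, x):6.2f}  actual={y_true:6.2f}")
```

A real petaflop-scale learner would be vastly more sophisticated, but the epistemological puzzle is the same: the predictions can be good while the 'theory' stays a black box.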
I think you're taking the term "scientific method" too specifically. He seems to be addressing methodologies, not theory structures.
The "scientific method" is not dead but is apparently living in a little closet down the hall.
Sure there's a scientific method: falsification and parsimony. :-|
And the faster a computer, the more hypotheses can be tested against each other for which one is the most parsimonious.
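In that spirit, here's a minimal sketch (synthetic data, numpy for the fitting) of automated parsimony: fit polynomial 'hypotheses' of increasing complexity to the same observations and score each with AIC, which penalizes extra parameters.

```python
import numpy as np

# Compare competing "hypotheses" (polynomial degrees) on the same data,
# preferring the most parsimonious adequate one. Data is synthetic,
# purely for illustration; the true relationship is linear.

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=1.0, size=x.size)

def aic(y_true, y_pred, n_params):
    """Akaike information criterion for a least-squares fit (lower is better)."""
    n = y_true.size
    rss = np.sum((y_true - y_pred) ** 2)
    return n * np.log(rss / n) + 2 * n_params

for degree in range(1, 6):
    coeffs = np.polyfit(x, y, degree)
    print(f"degree {degree}: AIC = {aic(y, np.polyval(coeffs, x), degree + 1):7.2f}")
```

More compute just means more candidate hypotheses scored per second; the selection criterion itself is unchanged.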
Yeah, it's from Wired.
Anyway, I hope local servers have this capacity soon.
Contrary to the quotation by Mark Seager above, Galileo did not invent the telescope.
GIGO. Now you can GI at over one petaflop. Wowza.
I don't know about the rest of ya, but it is a sexy computer, I would sensationalise it too! With that sleek cat on the side and all that dreamy green. And it's BIG! Geeky gurls like big. :-D Oh, did I mention long? 'Scuse me. I got a little excited.
It's like the next new roadster or sports car: does it change driving as a whole? No, driving is driving; it's how you feel when you're doing it.
Maybe now, with these petaflops, we can finally understand turbulence, among other things. It WILL expand what we know. Who knows, maybe some of those results will challenge the way we look at things now.
I couldn't agree more.
That's the beauty of the scientific method... new methods of observing and analyzing data don't change the process involved in developing and testing hypotheses.
A machine that can both formulate and deal with abstractions at a higher level than previously found possible can be expected to improve on and change the process of both developing and testing hypotheses at some future point. Are these those machines? Perhaps.
We don't know that they aren't.
cwfong,
That's a fair point, but I guess I didn't state my position clearly enough.
How you develop and test hypotheses may change, but the idea of building and testing hypotheses with observation and empirical data will not.
Just because you increase the resolution (or depth) of observation or the speed with which you can analyze data doesn't "change" the scientific method.
It just allows you to apply the scientific method to experiments that may be currently out of reach.
Yes, but method has to include methodology or you are just stating that the logic behind scientific exploration will not change. But I'd argue that the method of putting our logical processes to use can conceivably change, just as the logical systems themselves have changed, or at least broadened, since the scientific method was conceived.
And I'd argue that if we can, for example, more accurately predict the evolutionary progress of one or more life forms in a relatively short term using the power of these machines, we will have significantly improved on our falsification methodologies.
I think the point usually made is that before computer simulations, science was an interplay between theory and experiment. You made a theory and then performed an experiment to see whether its consequences are what you would expect them to be. Now you can make a theory and *compute* its consequences to check whether they would yield the phenomena you try to explain.
I agree that the method itself does not really change, though, it just has a new tool not available before.
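For a tiny concrete example of 'computing the consequences' (toy physics, invented numbers), take a proposed law, integrate it forward numerically, and read off a prediction an experiment could then test:

```python
# Compute a theory's consequences: given a proposed law of motion
# (constant gravitational acceleration, no drag), step it forward in
# time and extract a testable prediction. Parameters are illustrative.

def simulate_fall(height_m, dt=0.001, g=9.81):
    """Predict how long an object takes to fall from height_m metres."""
    y, v, t = height_m, 0.0, 0.0
    while y > 0.0:
        v += g * dt   # the theory: dv/dt = g
        y -= v * dt   # simple Euler step
        t += dt
    return t

print(f"theory predicts a {simulate_fall(45.0):.2f} s fall from 45 m")
# Dropping something from 45 m and timing it would now test the theory.
```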
The thing about this computer being the biggest thing since the telescope is just trying to make it sound impressive. Why wasn't the last supercomputer, which was within a couple of orders of magnitude of this one's speed, the biggest thing since the telescope? What's so great about this one? MAYBE you can get away with a claim like that once you invent quantum computing. Maybe.
I'm not so sure about this, Jason. You may be correct overall, and I certainly agree that there's no single thing that can be called "the scientific method" - the methods of inquiry used in science are continuous with those used in other kinds of rational inquiry. Perhaps at the most fundamental level the nature of rational inquiry has never changed and we have always done "science" in some sense.
Still, I do think that something important changed with new kinds of instruments that expanded human perceptions, such as telescopes and microscopes. Those kinds of instruments make it possible to make observations (frequent and accurate ones) that were not possible with the unaugmented senses, and that had huge implications. Consider the whole Galileo affair - the use of telescopes pretty much wrecked the entire worldview of the time very quickly (those pesky moons circling Jupiter just didn't make sense within the old paradigm of concentric spheres around the earth), and encouraged a new approach that enabled people to postulate more and more entities and phenomena that were beyond the capacity of people without scientific instruments to observe (yet, perhaps, open either to observation with instruments or at least open to observation of predictable phenomena). Modern science is full of talk of things no one can ever observe with their unaugmented senses - prehistoric animals, DNA sequences, planets circling far away stars.
It seems to me that something qualitative did change in the seventeenth century such that we can say that modern science begins then. When Galileo first turned his telescope to the heavens in 2009/2010 it was an epochal event (one we must all celebrate, with the 400th anniversary coming up). I don't know that the use of incredibly highly accurate computer models will have the same effect, but I don't think we should rule it out.
Just my top-of-the-head thoughts, mate. I've been missing in action lately, deeply immersed in a couple of big projects (see my blog), but I reckon I'm back.
OMG! Russell Blackford is posting from the future of an alternate reality! I knew Greg Egan was telling the truth.
lol, Blake. Eeek!
I meant 1609/1610, of course. I meant to say: So when Galileo, blah, blah in 1609/1610 ... with the 400th anniversary coming up in 2009/2010, blah, blah.
And, yes, I know you knew that. I guess something about the subject matter made me kind of, er, telescope a couple of my thoughts.
@cwfong: some might argue that, but they would be wrong. Predictions don't have to be about things that will happen in the future, only about things which are tested in the future. The ToE makes all sorts of predictions, e.g. about the sorts of fossils you'll find above other fossils in the sedimentary layers. It's as falsifiable as a theory can be, yet here we are.
Predicting the future form of any particular species in any but the most general of ways is impossible, for the same reason you can't say exactly when, where and how strong the next major hurricane will be.