Doug Smith and colleagues have a new paper in Science, "Improved Surface Temperature Prediction for the Coming Decade from a Global Climate Model". The abstract is: "Previous climate model projections of climate change accounted for external forcing from natural and anthropogenic sources but did not attempt to predict internally generated natural variability. We present a new modeling system that predicts both internal variability and externally forced changes and hence forecasts surface temperature with substantially improved skill throughout a decade, both globally and in many regions. Our system predicts that internal variability will partially offset the anthropogenic global warming signal for the next few years. However, climate will continue to warm, with at least half of the years after 2009 predicted to exceed the warmest year currently on record."
Most of the paper is deeply technical and I can't say I've really read it in detail, though I have read all the words :-). There is only a little bit at the end about the future (a smaller fraction of the article than of the abstract), but (of course) that's got all the attention.
The obvious spin on this is "Global warming: Met Office predicts plateau then record temperatures", which is exactly what the Grauniad says (note that their illustration is of drought, not of high temperatures...); others have similar.
But how do the septics spin it? The Grauniad sez "The forecast of a brief slump in global warming has already been seized upon by climate change sceptics as evidence that the world is not heating", but I couldn't find them. Probably because, if you're a skeptic, you can't believe the study at all, since it takes the GCMs seriously and predicts warming, and we all know that GCMs are part of the Vast Sekret Kabal and so on.
Meanwhile, my take would be not to take these predictions too literally. From the paper, while the results are indeed better than not assimilating the initial state, the gain isn't vast, and indeed according to the graph that Nature conveniently provides, the system entirely missed the 1998 El Niño. Nature points this out: "This prediction is still rather tentative, though... the system's forecasts will be far from perfect. The predicted temperatures carry healthy error bars", but Smith points out that the flattening it foresaw in the first few years seems so far to be accurate.
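[As an aside, for readers wondering what "skill" means here: it is usually measured by how much smaller the forecast's error is against observations, e.g. via root-mean-square error. The sketch below is a toy illustration with entirely invented anomaly numbers (nothing from the paper); it just shows the kind of comparison between an initialised and an uninitialised hindcast.]

```python
# Toy illustration (hypothetical numbers, not from Smith et al.): comparing an
# "initialised" forecast (started from the observed ocean state) with an
# "uninitialised" one against observed global-mean temperature anomalies (deg C).

def rmse(forecast, observed):
    """Root-mean-square error between two equal-length sequences."""
    n = len(observed)
    return (sum((f - o) ** 2 for f, o in zip(forecast, observed)) / n) ** 0.5

# Invented anomaly series for a ten-year hindcast window.
observed      = [0.30, 0.35, 0.33, 0.40, 0.45, 0.44, 0.50, 0.48, 0.55, 0.60]
initialised   = [0.31, 0.36, 0.35, 0.38, 0.44, 0.46, 0.49, 0.50, 0.53, 0.58]
uninitialised = [0.25, 0.30, 0.38, 0.45, 0.40, 0.50, 0.44, 0.55, 0.50, 0.65]

print(rmse(initialised, observed))    # smaller error, i.e. more skilful
print(rmse(uninitialised, observed))
```

In these made-up numbers the initialised run scores better, which is the shape of the result the paper reports; the real question, as above, is how large that gain actually is.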
My (inexpert) understanding of climate models is that incorporating 'current state' data at the start of a run is a great leap forward - are we going to see this becoming routine fairly quickly or am I holding the wrong end of the stick?
Does the new model make better simulations of El Niño-like behaviour, even if it's not in the right year?
If this turns out to be true, and if the septics are stupid enough to believe what they're saying, then we can make a little bit of betting lemonade out of the natural variability masking the AGW trend. My bet's reference period ends in 2009, so the odds of winning just got a little better.
Politically though, we'd be much better off if natural variability made things warmer rather than masking the long-term signal.
Thanks for the post. It's useful to get a non-press version of it.
Tamino has more here: http://tamino.wordpress.com/2007/08/14/impure-speculation/
I am interested in SomeBeans' question as well. (I tried on the impure speculation page linked above but didn't get anywhere.)
They used HadCM3, so presumably it's easier to roll out with that model than with others?
Will it become routine, or is it only worth the effort if you are modelling 10 years or less? Or is comparability with other modelling runs considered so important that the same old inaccurate initial conditions will continue to be used, despite initialising with more accurate data now being possible?