The Australian’s War on Science 31

Ian Musgrave has written the post I was going to write on Jon Jenkins’ article in the Australian, so I just want to emphasize that fitting a degree-six polynomial (yes, degree six) to temperature data does not produce a meaningful trend line in any way, shape or form. Go read.

Note that if the editors at the Australian had bothered to read their own paper just three days earlier, they would have known that Jenkins’ claims about the Oregon petition and global cooling were rubbish.

News Limited blogger Graham Readfearn also pointed out the enormous holes in Jenkins’ arguments and talked to Australia’s acting chief climatologist, Michael Coughlan:

Of climate change contrarians such as Jon Jenkins, Coughlan has this to say:

“We have produced rebuttals of all of these arguments – they have all been addressed. But they just keep trotting them out. No matter how many times you tell them they’re wrong, they just keep going. The general approach seems to be – if we keep banging away at an untruth, people will start to believe it”.

Let’s not forget that these contrarian views are not being expressed on a bit of street press or some fringe web site somewhere – they’re being repeated over and over in Australia’s only national newspaper. So now comes the revelation – and that is Coughlan’s view of The Australian newspaper itself.

“The Australian clearly has an editorial policy. No matter how many times the scientific community refutes these arguments, they persist in putting them out – to the point where we believe there’s little to be gained in the use of our time in responding.”

Since I already uploaded them, I might as well include my versions of the graphs that Musgrave posted.

Here’s the graph that the Australian printed.

[Image: jenkins.png, the graph as printed in the Australian]

It was copied from Lorne Gunter’s article in the National Post (Canada’s version of the Australian), but with this caveat removed:

Moreover, while the chart below was not produced by Douglass and Christy, it was produced using their data

and with the ridiculous sixth-degree fit attributed to the University of Alabama in Huntsville.

That graph, which ends in July, was already out of date when Gunter’s article was published in late October. Here it is with the latest data (up to November) included.

[Image: uahtlt5.2.png, the same graph updated with UAH data through November 2008]

Note that the latest observation lies right on the linear trend line, completely refuting Jenkins’ claim that satellite data showed that warming “had completely reversed by 2008”.

(Hat tip to Tobias Ziegler for pointing me to the Readfearn post.)

Update: Ian Musgrave has more on what is wrong with high order polynomial fits.

Comments

  1. #1 Jim Peden
    January 9, 2009

    Here’s an open question for this group: How long should one beat a Red Herring before it becomes Herring Sauce?

    No one denies that the planet is warming. It’s been warming for about 12,000 years, with small dips here and there caused by a variety of things, but in general it is going to continue to warm until it decides to cool again and we are again under a mile of ice.

    The alarmists tell us that man’s production of CO2 is causing it to warm unusually fast, because the CO2 is catching the predominant 14.77 micron wavelength IR photon and causing the atmosphere to heat even more than it is by conduction and convection alone. And, we are told, the more CO2 we pump into the air, the more the air will be heated.

    I’m guessing the folks who are saying that (the alarmists) didn’t stay awake very well in physics class, and don’t understand the mathematical term we call “non-linear”.

    So, here’s a simple analogy.

    Let’s suppose you set up a baseball pitching machine on the pitcher’s mound and it starts tossing balls in the general direction of the batter’s plate. The pitching machine has a wobbly base, so the balls don’t all go in the same exact direction but tend to fan out in an arc.

    Next, let’s put a catcher behind the plate. After a few pitches, we learn that one catcher can only catch about 10% of the pitches thrown, so we add a second catcher. Catcher #2 manages to catch 10% of the thrown balls that were not caught by catcher #1, or about 9% of total balls thrown. Adding yet another catcher, we find it capable of catching 10% of the balls not caught by #1 and #2, about 8.1%. At this point the 3 catchers combined are snagging 27.1% of all the balls thrown.

    Starting to get the picture? By the time we have 10 catchers at work, most of the balls thrown will be caught.

    And at that point, adding more catchers won’t materially affect the number of balls caught because that number is dependent on the number of balls thrown, which is fixed. Let me reemphasize that: adding more catchers beyond a certain point can not significantly increase the number of balls caught.
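
    The arithmetic of the analogy is easy to check; a minimal Python sketch, using only the 10% figure from above:

        # each catcher takes 10% of whatever gets past the previous ones,
        # so n catchers catch a fraction 1 - 0.9**n of all balls thrown
        for n in (1, 2, 3, 10, 20):
            print(n, round(1 - 0.9 ** n, 3))
        # -> 0.1, 0.19, 0.271, 0.651, 0.878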

    Thus it is with CO2 in the atmosphere. At our current CO2 levels of about 385 PPM, the CO2 is “catching” pretty near all of the bandwidth being “thrown” at it. We call that “near saturation”. Adding more CO2, even doubling the current atmospheric content, can’t significantly increase the atmospheric temperature beyond the current levels, because present levels are absorbing most of the IR band associated with the molecular absorption of CO2. The “greenhouse effect” was really poorly named… it should have been called the “blanket effect”.

    The simple physical reality is that the CO2 molecule does not have a permanent dipole moment – eliminating a rotational mode and leaving only a single bend mode with a resonant wavelength of 14.77 microns – and over 99% of the IR energy radiated from the Earth in this band that can be captured by CO2 has already been captured. This leaves only 1% of the CO2 greenhouse effect left to be achieved by further increases in CO2.

    Adding more CO2 can not increase the amount of IR radiation being projected by the warm planet, and since most of the appropriate bandwidths are already being absorbed, there’s no reason to continue to worry about the effects of pumping more CO2 into the atmosphere. And the only effect from that will be your turnips growing bigger.

    And you can stop beating on that Red Herring now….

    Note: This, like most simplified analogies, is obviously not numerically (precisely) correct. For a more exact development of the mathematics, you have to go to something like http://arxiv.org/PS_cache/arxiv/pdf/0707/0707.1161v3.pdf

  2. #2 Dano
    January 9, 2009

    Jim Peden:

    speaking of beating, “your” long-ago refuted “arguments” were long ago beaten down, refuted, put to rest, moldered, broken down into soil and replanted. Many of “your” “points” have been numbered for easy reference [1, 2] and some of “your” “points” have been made into a fun game [A].

    No one knows why you trot them out here, unless you want us to play the game to see how many points you won.

    HTH.

    Best,

    D

  3. #3 Brian G Valentine
    January 9, 2009

    If CO2 in the atmosphere had never been raised as an issue by anyone, do you think the temperature graphs would be cause for alarm?

    Do you think these would appear pretty normal over any decadal span?

    I tend to think that people would focus their attention to more important things- like cricket matches – if Al Gore had chosen an alternate method to be a con artist.

  4. #4 Dano
    January 9, 2009

    Sod in 94 erroneously claims:

    Tim is not angry. he is correcting errors.

    Silly sod.

    Don’t you know that pointing out errors makes Tim a HAYTUR? He is angry. He hates America…er…Australia. He’s shrill. He hates freedom.

    You warmers in the AGW camp are too busy being Cassandras to see the truth in front of your eyes.

    Best,

    D

  5. #5 P. Lewis
    January 9, 2009

    Hmm, Gerlich and Tscheuschner eh?

    See Eli here and here (and possibly elsewhere), and see Arthur Smith here.

    Gerlich and Tscheuschner eh? Don’t think so.

  6. #6 Brian G Valentine
    January 9, 2009

    Eli Rabett – now THERE’S a trusted name in scientific thought. Another Isaac Newton!

    Everyone ought to pay heed to what Eli has to say.

  7. #7 P. Lewis
    January 9, 2009

    Ad hominem says it all really.

    And what mud would you like to sling at Arthur Smith and his analysis of Gerlich and Tscheuschner?

  8. #8 frankis
    January 11, 2009

    Jon Jenkins is on a noble, Nobel mission to “Debunk Junk Green Science” so almost any abuse of science and mathematics can be rationalised away. He’s been published by The Australian. Gaze on his works ye mighty and despair.

  9. #9 Robert Grumbine
    January 11, 2009

    Why not run whatever polynomial you like through data and draw your conclusions afterwards? Jenkins illustrates why not, and Lambert gives the smoking figure for it.

    In science, as I was mentioning in my blog post on deciding climate trends, we try to avoid ‘choice’. Objectivity is important. High order polynomials have the problems mentioned. So, as we see here, the conclusion drawn depends on whether you run a 6th order polynomial rather than a 1st order polynomial. Jenkins’ conclusion depends sensitively on the behavior of his polynomial near its end points — where we know polynomial fits are much less reliable. Over the central portion of the record, the fit shows an even higher rate of warming than the linear trend.

    The method for making an objective decision is already well-known, though apparently not to Jenkins (if we grant honesty to him). That is, you take your data and first fit a 1st order polynomial (straight line) to it and see whether the fit is statistically significant. If not, then straight lines can’t be used. Either way, you may then look at a second order polynomial. See whether that has a statistically good fit. If it does, but so does the 1st order, then you have to make another test: “Is this 2nd order polynomial a significant improvement over the first order?” — This is a standard test. If the answer is no, then you don’t use the second order polynomial. Repeat this process until your higher order polynomial doesn’t explain a significantly greater amount of the variance.
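
    In Python, the procedure might look something like this sketch (the synthetic monthly series and the 5% threshold are illustrative choices):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        t = np.linspace(0, 30, 360)                    # synthetic: 30 years, monthly
        y = 0.015 * t + rng.normal(0, 0.15, t.size)    # linear trend plus noise

        def rss(order):
            """Residual sum of squares of a least-squares polynomial fit."""
            return np.sum((y - np.polyval(np.polyfit(t, y, order), t)) ** 2)

        order = 1
        while order < 6:
            rss_lo, rss_hi = rss(order), rss(order + 1)
            dof = t.size - (order + 2)         # data points minus parameters
            F = (rss_lo - rss_hi) / (rss_hi / dof)
            if stats.f.sf(F, 1, dof) > 0.05:   # no significant improvement: stop
                break
            order += 1
        print("selected polynomial order:", order)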

    Go back and look at the 6th vs. 1st order fits Tim shows. The 6th is nearly the same as the 1st over most of the range of the data. It explains little additional variance, and does so at the cost of using many more parameters.

    This is where we objectively say ‘don’t use a 6th order polynomial to draw your scientific conclusions about this data set’.

  10. #10 Bernard J.
    January 11, 2009

    Jennifer Marohasy.

    You are, according to your CV, a biologist. As such you should presumably have employed techniques such as mark-recapture estimations of population size and of survival, and you should therefore be aware of what happens at the terminus of such time-dependent data. For a variety of mathematical and physical reasons, endpoints in all sorts of data series are a nuisance.

    As another example, you are no doubt also aware of boundary effects in ecosystems, or indeed in any biological system including cell culture plates and nasal passages.

    So why the hell are you supporting (at #87) the use of a fifth/sixth-order polynomial to describe the trend in a limited time series?! And supporting it you must be, because without this absurd trendline Jenkins has no case at all.

    Even with it he still has no case, because the trendline is complete mathematical rubbish.

    Are you seriously hitching your scientific star to this wheel-less wagon of imbecilic ignorance?

  11. #11 Chris O'Neill
    January 11, 2009

    Jon Jenkins:

    let g(s) be the climate equations,

    if g(s) = N(s)/P(s) (where N(s) and P(s) are polynomials in s, which they have to be because that is how the equations are parameterised, see **Note below)

    then

    L^-1 g(s) = Sum[r=1..n] N(s_r) e^(s_r t) / P'(s_r)

    where n = number of zeros of P(s)

    in other words the solution will be a massive polynomial.

    Not exactly. The solution would be a massive number of combined exponential functions of time with a massive number of time constants, real and imaginary. The polynomial is in the s-domain, not the time domain.

    It follows that the T curve will also be a massive polynomial in (t) and will be best approximated by the largest polynomial possible.

    Not exactly. It follows that the T curve might be best approximated by the largest combination of exponentials possible.
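
    For reference, the standard inversion result for a rational transform with simple poles (the Heaviside expansion formula) makes this explicit:

    $$\mathcal{L}^{-1}\!\left\{\frac{N(s)}{P(s)}\right\}(t) = \sum_{r=1}^{n} \frac{N(s_r)}{P'(s_r)}\, e^{s_r t}$$

    where s_1, …, s_n are the roots of P(s). Every term on the right is an exponential in t (complex roots give oscillating modes); nothing there is a polynomial in t.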

    Tell me (and the rest of the world) when the next El Nino event is going to occur and if you are correct you win the debate.

    Climate models are not intended to predict El Ninos any more than probability theorems predict rolls of a die. Even though probability theorems can’t predict the next roll of a die, they can predict statistics of dice rolls. In a similar way, climate models can’t predict individual El Ninos but they can predict statistics of weather.
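
    The dice point in a few lines of Python (a toy illustration):

        import numpy as np

        rng = np.random.default_rng(3)
        rolls = rng.integers(1, 7, 100000)  # a fair six-sided die
        print(rolls[:10])     # no theory predicts these individual rolls
        print(rolls.mean())   # but the mean is ~3.5, predictable to high accuracy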

    I think Jon Jenkins should try to debate his ideas with a real climate scientist so that he can clarify his statements for us.

  12. #12 Bernard J.
    January 11, 2009

    Climate models are not intended to predict El Ninos any more than probability theorems predict rolls of a die. Even though probability theorems can’t predict the next roll of a die, they can predict statistics of dice rolls. In a similar way, climate models can’t predict individual El Ninos but they can predict statistics of weather.

    Bingo, Chris O’Neill.

    Anyone who has even a kernel of mathematical/statistical knowledge will see why Jenkins’ statements are a scientific embarrassment, and anyone who doesn’t understand your point is demonstrating that they do not have the nous to be involved in the discussion in the first place.

    Odds are that this won’t stop them though…

  13. #13 paul
    January 11, 2009

    >No one denies that the planet is warming. It’s been warming for about 12,000 years,

    You need to get out more.
    There are plenty of people that deny the planet is warming.

  14. #14 Anton Mates
    January 11, 2009

    HORSE SHIT.
    It might be represented by a power series but it sure as hell isn’t going to be a polynomial. The fact that you are pretending that they are the same is all the proof we need to know that you are a liar.

    And, of course, even if a curve *was* represented by a polynomial, it doesn’t follow that a polynomial of the same degree would be remotely good for fitting to it once error’s taken into account. If you take data points from a degree-100 polynomial curve, add a little noise to them, and then try to fit another degree-100 polynomial to those points, you may well get absolute garbage.
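
    A small demonstration of that last point (degree 15 rather than 100, to keep the linear algebra tractable; all numbers arbitrary):

        import numpy as np

        rng = np.random.default_rng(1)
        x = np.linspace(-1, 1, 50)
        true_coeffs = rng.normal(size=16)            # a degree-15 polynomial
        y = np.polyval(true_coeffs, x)
        y_noisy = y + rng.normal(0, 0.01, x.size)    # a little noise

        refit = np.polyfit(x, y_noisy, 15)           # same degree as the truth
        # the recovered coefficients can be wildly wrong even for tiny noise,
        # because high-degree least-squares fits amplify noise enormously
        print("worst coefficient error:", np.max(np.abs(refit - true_coeffs)))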

  15. #15 sod
    January 11, 2009

    i wonder if Jenkins will return for a comment.

    his take on the graph has been seriously DEBUNKED!

  16. #16 Paul
    January 11, 2009

    Totally and utterly agree with Robert Grumbine in 106.

  17. #17 z
    January 11, 2009

    “triple x was causing a net filter to block the post. Our idiot communications minister plans to make such filters compulsory.”

    You should try doing breast cancer outcomes research.

  18. #18 z
    January 11, 2009

    “but when it’s winter in the northern hemisphere, it’s summer in the southern hemisphere, and vice versa”

    which reminds me… from the retinue of australian deniers telling me about the record cold in europe and the US as proof of global cooling… are you guys having a cool summer down there, or are the Usual Suspects just doing their usual thing? i can’t find a lot of commentary on aussie weather from up here.

  19. #19 z
    January 11, 2009

    “In effect you can use any curve fitting method you like to achieve any effect you desire. But this has never stopped the AGW people from using it! ”

    and that’s the problem in a nutshell. that a person in a position of authority, one who apparently has a background in computer analysis of biochemical science no less, does not know the difference between modeling and curve fitting, is really depressing. it’s a basically huge difference in science which predates computers by a few centuries.

    let me try it again, for those who requested clarification and for those who unfortunately need clarification but shun it. there is modeling, and there is curve fitting, and the two are not at all the same. curve fitting is where you take a bunch of data and find a “predicted” curve on the sole basis of what fits the points best. it makes no assumptions regarding the underlying mechanism, nor does it result in any conclusions. the curve could be anything you want; polynomial, power series, exponential, any combination, or just something made up: “Y=Xsquared+1 except where X is evenly divisible by three, then Y = 0”. whatever makes the curve fit the points best. the purpose is typically to allow you to mathematically predict what Y would be for a value of X without having to go read it off the graph. for instance, if you are trying to calibrate an instrument to a sensor which has a nonlinear response curve. instead of having to go look up the graph and find what 7 millivolts means in terms of whatever you are measuring, you can use your fitted curve to calculate what the value is for each meter reading, and make a new meter face which, instead of 7 millivolts, says “3 gigawatts per cubic furlong” or whatever.

    polynomials have very specific properties, as discussed at length; every time you up the degree (x squared to x cubed to x to the fourth, etc.) you add another bend into the curve; obviously for x points, if you use a polynomial of x-1 degree, you can fit every single point perfectly; but what the fitted curve does in between the points can be pretty crazy, and at the ends it’s completely unconstrained. raising the degree of the polynomial is like wrestling with a bump in a carpet; you can’t make the error go away, you just make the curve fit the points better and shove all the error into the spaces between the points and at the ends. since you’ve done such a good job fitting to the datapoints and you don’t have any estimate of the error between the points, you claim victory. so you use high degree polynomials with about a ton of salt, and only if you must.

    whereas, modeling goes the other way; you have an idea of how what you are looking at works, and you develop the equations that describe that, then you adjust the parameters of those equations to fit the data. it’s the basis of all science, finding generalizable mathematical formulae which describe observed behavior; computers only enter into it as an improvement over slide rules and hand calculation. for instance, you note that falling objects have a constant acceleration; you therefore model that distance fallen = half the “acceleration” multiplied by the square of time falling [d=(at^2)/2]; so you fit that equation to your measurements of distance d and time t and adjust the value of “acceleration”, a, to find the value that best fits the curve to the data; then you publish this as your estimate of a real physical parameter, the acceleration of gravity. in the day of newton, galileo, et al this was laboriously worked out by hand; nowadays, you use Excel or something similar. voila, “computer” modeling.

    the difference here is that with a model, you are really saying something firm and predictive about the behavior of the system, even at values of X where you don’t have data. nobody’s got time/distance data gathered over 24 hours of free fall, but if you plug that amount of time into d=(at^2)/2 with the value of a calculated from the data via the previous paragraph, any physicist will bet his life on the answer. (yes, it turns out acceleration a varies as a function of the distance between the two masses; as that was found by similar modeling process, it emphasizes the point)
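
    the free-fall example as code (a sketch with made-up measurements): the functional form comes from the physics, and the fit estimates only the one physical parameter.

        import numpy as np
        from scipy.optimize import curve_fit

        def model(t, a):
            return 0.5 * a * t ** 2       # d = a*t^2/2, the *model*

        rng = np.random.default_rng(2)
        t = np.linspace(0.1, 2.0, 20)
        d = 0.5 * 9.81 * t ** 2 + rng.normal(0, 0.05, t.size)  # synthetic drops

        (a_hat,), _ = curve_fit(model, t, d)
        print(f"estimated acceleration: {a_hat:.2f} m/s^2")    # close to 9.81

    because the form is fixed by the physics, the fitted value generalizes far beyond the measured range, which is exactly the difference being drawn here.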

    and of course, the modeling procedure(s) for climate prediction is discussed at length all over the literature and on the net, and it’s modeling not curve fitting. individual measurements of all the myriad components that go into atmospheric behavior and oceanic behavior and the interface between the ocean and atmosphere from years of observations and calculation of these simpler processes are assembled into a grand picture; before computers this would be way too complex to dream of solving, but with computers even if you can’t find each parameter precisely, you can calculate a range of results which are realistically probable.

    to argue against modeling “But this has never stopped the AGW people from using it!” by conflating it with curve fitting on the grounds that “you can use any curve fitting method you like to achieve any effect you desire” is like arguing that you can’t predict the rate of growth of money in a bank account which has a guaranteed interest rate and is federally insured, because “look at the stock market! financial predictions are totally unreliable!”

  20. #20 z
    January 11, 2009

    “many of the processes are chaotic in nature and CANNOT (and I mean in the mathematical sense CANNOT) be modelled at all.”

    yes; for instance the movements of individual molecules in the atmosphere are extremely chaotic in nature, therefore all those people who think they can speak of fictions like “air pressure” and “diffusion rates” are deluded, at best.

  21. #21 Chris O'Neill
    January 11, 2009

    Lank:

    For most of the past 10,000 years temperatures have been 1 to 3 degrees Celsius warmer than they are today.

    Jim Peden:

    No one denies that the planet is warming. It’s been warming for about 12,000 years, with small dips here and there caused by a variety of things, but in general it is going to continue to warm until it decides to cool again

    I wish you guys could get your act together and tell the same story. You might achieve more than zero credibility if you do.

  22. #22 z
    January 11, 2009

    “the Russian Academy of Science has been using chaos theory for their climate predictions for some time. And I’ll leave it to you to do the research as to what they predict ”

    do you mean khabibullo whatshisname the space scientist’s prediction of solar cooling? or the 6 guys’ prediction of cosmic ray mediated cooling? or both? or another one? is one of them official? which one uses chaos theory?

  23. #23 z
    January 11, 2009

    “The bottomline is that the global temperature, however it is measured but in particular by satellite, shows cooling and this is not how you have predicted it should be. ”

    See also:
    “if the olympics doesn’t set a new record for running the mile, this is proof that the continual progression of better athletic performance by humanity since Roger Bannister has begun to reverse itself”

  24. #24 z
    January 11, 2009

    “At our current CO2 levels of about 385 PPM, the CO2 is “catching” pretty near all of the bandwidth being “thrown” at it. We call that “near saturation”. ”

    irrelevant. if all the earth’s radiation is absorbed within the first mile, say, of atmosphere, then you have a new radiating body consisting of the much warmed atmosphere at that level; if you absorb that radiating energy in the upper levels of the atmosphere it warms the atmosphere. if it is completely absorbed in the next, say 5 miles of atmosphere (miles 2-6) then that level of atmosphere becomes your new radiating body. etc. all the way up. the question is does IR escape from the topmost layers of the atmosphere? and it does, so that is the opportunity for increased CO2 to reduce heat radiation.

    look at it another way; given your model, the actual radiating surface of the earth is not the outside of the dirtball, but a level of atmosphere below which all the IR is captured. by definition, essentially. as you add more CO2, the height at which that level of atmosphere is situated becomes higher. higher levels of atmosphere have lower temperatures, therefore lower radiated energy.
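
    the argument fits in two lines (the standard textbook sketch): with an effective emission height z_e and lapse rate Γ,

    $$\mathrm{OLR} \approx \sigma\, T(z_e)^4, \qquad T(z_e) = T_s - \Gamma z_e$$

    adding CO2 raises z_e, which lowers T(z_e) and hence the outgoing flux; restoring balance with the incoming sunlight requires the whole profile, surface temperature T_s included, to warm by roughly Γ·Δz_e.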

  25. #26 Gaz
    January 12, 2009

    Chris O’Neill (January 11, 2009 10:31 PM) #118

    “I wish you guys could get your act together and tell the same story. You might achieve more than zero credibility if you do.”

    I’m sure Jim Peden will get back to you once he’s counted the number of times “water vapor/vapour” is mentioned in the IPCC’s latest report, so he can re-word this claim (on another site): “Curiously enough, the UN IPCC reports don’t even mention water vapor, since it is technically not a ‘gas’ in the atmosphere.”

    Could take a while though – it’s mentioned a heck of a lot of times.

    There might be a further delay while he tries to work out what all those baseball catchers do with their balls, which is surely the big problem with his “simple analogy”.

  26. #27 WotWot
    January 12, 2009

    which reminds me… from the retinue of australian deniers telling me about the record cold in europe and the US as proof of global cooling… are you guys having a cool summer down there, or are the Usual Suspects just doing their usual thing?

    Mixed bag. Spring was well above average, but the one month of summer so far was below average.

    Spring 2008 Temperatures (Sep-Nov):

    Both daytime maximum and overnight minimum temperatures for the season were well above normal with seasonal anomalies for Australia of +0.85°C (11th highest since 1950) and +1.01°C (4th highest) respectively.

    http://www.bom.gov.au/climate/current/season/aus/summary.shtml

    Summer Temperatures (December 2008 only):

    National maximum temperatures were 0.37°C below the long-term average (25th lowest of 59 years)

    http://www.bom.gov.au/climate/current/month/aus/summary.shtml

    And for the first quarter of 2009:

    The national outlook for daytime temperatures averaged over the March quarter (January to March) shows a moderate shift in the odds favouring warmer than normal conditions over the tropical north. Overnight temperatures are also likely to be higher than normal in the tropical north, as well as for most of the rest of WA.

    http://www.bom.gov.au/climate/ahead/temps_ahead.shtml

    Overall outlook for summer 2009 seems to be about average.

  27. #28 Gaz
    January 12, 2009

    Latest UAH satellite data show an anomaly in December of +0.411 for the Northern hemisphere.

    Of the period covered by the satellite data (December 1979 to December 2008) that’s higher than all but two months prior to 1998 and 0.08 higher than the average for the period from January 1998.

    When were those cold records set?

    BTW the SH anomaly was -0.045 (seems to be tracking ENSO which is still a bit on the la Nina side of things but what would I know).

    The global anomaly was +0.183 in December after +0.251 in November, so I guess this means we’re heading for another ice age. Again.

    And it could be a bad one!

    The 6th order polynomial “trend” shows the global anomaly will be -21.5, more or less, by the end of 2020.

    Sigh.

  28. #29 Tristan
    January 12, 2009

    Z @ 117:

    “many of the processes are chaotic in nature and CANNOT (and I mean in the mathematical sense CANNOT) be modelled at all. ”

    yes; for instance the movements of individual molecules in the atmosphere are extremely chaotic in nature, therefore all those people who think they can speak of fictions like “air presure” and “diffusion rates” are deluded, at best.

    Or, in terms closer to the esteemed professor’s field: in a molecular dynamics model of a protein, the movement of the thousands of water molecules involved, and their interaction with the protein, is entirely chaotic. Yet the model still manages to maintain the protein’s structure.

  29. #30 Bernard J.
    January 12, 2009

    z at #116, and Tristan at #126 have added to the pile of homework that Jon Jenkins has to do in order to even begin to justify his position.

    I hope that, in the spirit of scientific progress, he will attempt to do so.

  30. #31 P. Lewis
    January 12, 2009

    Re #127 … as might one also hope of Jennifer Marohasy, don’t you think (in light of #107)?

  31. #32 Barton Paul Levenson
    January 12, 2009

    Jennifer Marohasy writes:

    The bottomline is that the global temperature, however it is measured but in particular by satellite, shows cooling

    No it doesn’t.

    http://www.geocities.com/bpl1960/Ball.html

    http://www.geocities.com/bpl1960/Reber.html

  32. #33 Dano
    January 12, 2009

    I hope that, in the spirit of scientific progress, he will attempt to do so.

    Yer dreamin’.

    Best,

    D

  33. #34 elspi
    January 12, 2009

    “many of the processes are chaotic in nature and CANNOT (and I mean in the mathematical sense CANNOT) be modelled at all. ”

    No, that is not actually the problem. The problem is in mathematics itself.
    Almost all chaotic processes can be modeled very well by some differential equation.

    The problem is with differential equations. Any complicated differential equation is going to be chaotic. The problem with predicting the weather is not the lack of a good model. The weather equations are a very good model. The problem is that the weather equations are every bit as chaotic as the weather.

    I should say what I mean here. Any solution to a differential equation depends on initial conditions (or boundary conditions). That is to say, we need the current state of the system. Chaotic means that a very, very small change in the initial conditions (or boundary conditions) can lead to a very large change in the solution. Since there is always error in measuring the state of a system, this is a real problem. You can say what the solution would be if your initial value had no error, but it does, so you are left with many possible solutions, which diverge from one another fairly quickly.

    Thus we are pretty good at predicting the weather 2 days from now, but very bad at predicting the weather 2 weeks in advance (by two weeks the possible solutions have diverged very far from one another).
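
    A standard classroom illustration is the Lorenz system: a perfect model, two initial conditions differing by one part in 10^8, and the trajectories part company anyway. A sketch:

        import numpy as np
        from scipy.integrate import solve_ivp

        def lorenz(t, u, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            x, y, z = u
            return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

        t_eval = np.linspace(0, 30, 3000)
        a = solve_ivp(lorenz, (0, 30), [1.0, 1.0, 1.0], t_eval=t_eval)
        b = solve_ivp(lorenz, (0, 30), [1.0, 1.0, 1.0 + 1e-8], t_eval=t_eval)

        # separation grows roughly exponentially until it saturates at the
        # size of the attractor: exact equations, useless point forecasts
        sep = np.linalg.norm(a.y - b.y, axis=0)
        print(f"initial separation: {sep[0]:.1e}, final: {sep[-1]:.2f}")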

  34. #35 Jon Jenkins
    January 12, 2009

    When this blog was started I had hopes for some real discussion of the science of climate modelling. It started well with a discussion about curve fitting and its various merits and predictive capability (or lack of!). Then we got onto the “hockey stick” and opinions as to whether the two Macs, the Wegman Report and the “absence” of the Medieval Warming/Little Ice Age have affected its credibility (I believe they do and others don’t). But, like the IPCC process, which I also have issues with, it may be interesting but it is not science.

    But then we descended into the usual personal stuff. The initiator of the blog insists on attacking me personally to discredit me somehow. To be brutally honest it is irrelevant that I have studied medicine, science, have an honours degree, a PhD and non-degree enrolments in other stuff. And yes, as ashamed as I am to admit it (perhaps it was the drugs…) I have also dabbled in the humanities and have a Dip. Ed. It is irrelevant because I claim no special expertise in climatology and do not ask about nor care about the background of the other contributors to this blog (including Tim) other than the quality of their contributions to the debate. So I had given up….

    Then post #116 came up and got to the real guts of the issue about curve fitting versus prediction. The poster “z” is spot on: CURVE FITTING IS USELESS FOR PREDICTING; RATHER, IT IS ALL ABOUT THE MODELS AND WHAT THEY PREDICT!!

    So if Tim and any others want to have a real discussion about climate change then let’s discuss the models and their skill at predicting the future. The only publicly available source code is NASA’s (Hadley and the Germans do not release their code). I would really like to get a look at the CSIRO code but unfortunately they will not release it to the public, so if anyone has access?

    I am happy to participate in either anecdotal discussions about the IPCC collection of models and their relative merits including their previous success (or lack of!) at predictions. But I would rather have a detailed technical discussion about the models and mathematics themselves. So if you are up for it lets go…..

  35. #36 luminous beauty
    January 12, 2009

    Jon Jenkins,

    First I’d like to know if you understand that GCMs are physical models and not statistical models, and if you understand what the difference is?

    Second, do you understand that the utility of global AOGCMs is that they can produce projections based on unpredictable human inputs and aren’t useful for discrete predictions, and, again, what the difference is?

    Just saying yes isn’t enough. I’d like to hear some explication in order to estimate your degree of understanding.

  36. #37 Dano
    January 12, 2009

    But I would rather have a detailed technical discussion about the models and mathematics themselves. So if you are up for it lets go…..

    To get up to speed, here. From the phraseology and argumentation, one suspects there’s a learning deficit.

    Best,

    D

  37. #38 Chris O'Neill
    January 12, 2009

    Jon Jenkins:

    I am happy to participate in either anecdotal discussions about the IPCC collection of models and their relative merits including their previous success (or lack of!) at predictions. But I would rather have a detailed technical discussion about the models and mathematics themselves. So if you are up for it lets go…..

    I’m certain that the professional climate scientists (including university lecturers) who write for realclimate are “up for it” so why don’t you ask them about your concerns about climate models? They actually have an active thread on climate models running at the moment. I’m sure they will answer your questions about climate models much more clearly and concisely than anyone who usually writes on this blog. So if you are up for it then do it.

  38. #39 Dano
    January 12, 2009

    Chris:

    prediction: this will be like the amen chorus at CA when I offered to sign them in to the dendrochronology listserv to argue their opinions.

    Best,

    D

  39. #40 bi -- IJI
    January 12, 2009

    Jon Jenkins:

    > When this blog was started I had hopes for some real discussion of the science of climate modelling,

    You started by throwing out a bunch of equations that made absolutely no sense. When I asked you what s meant and how you managed to conclude something about T(t) when neither T(.) nor t was mentioned in your previous working, you simply ignored the question.

    > But then we descended into the usual personal stuff.

    Oh, but of course it’s our fault. Boo-hoo.

  40. #41 WotWot
    January 12, 2009

    But then we descended into the usual personal stuff.

    J. Jenkins

    And of course your article in The Australian was an exemplary model of ad hominem- and ideology-free civility. For example:

    warmaholics (several times)

    They are then further doctored by a secret algorithm to account for heat-island effects. Reconstructions such as the infamously fraudulent “hockey stick” are similarly unreliable.

    The warmaholics, drunk on government handouts and quasi-religious adulation from left-wing environmental organisations

    the fraud of the IPCC,

    Etc.

    Pathetic.

  41. #42 John Nicol
    January 13, 2009

    It is indeed interesting to read the “discussions” in these many forums on climate change. I am heartened by the suggestion that the current list of bloggers might be interested in a proper scientific discussion on the models used to obtain the projections put forward by the IPCC. That could well be followed by a second discussion on the action of CO2, which has already been partly addressed by “z” at #121. I will make a comment on one of z’s points, which is to say that the rise in the height at which the greenhouse gases reverse their role and become ‘coolers’ rather than ‘warmers’ implies a colder radiating region. This reversal amounts to collisional excitation followed by radiation, some of which goes out to space, rather than absorption of radiation followed by collisional heating of the surrounding air at the lower levels. Am I correct in assuming that this is because you are assuming that the radiation from the higher air sample is similar to that of a black body and therefore radiates significantly less according to a T^4 power law? This is not necessarily so, since gases do not, and in fact cannot, radiate directly as black bodies. They can only radiate at frequencies determined by their internal structure, at rates which are determined by the so-called “oscillator” strength of the various transitions available from each upper state, which, in all cases, has a population determined by the Maxwell-Boltzmann law from quantum statistical mechanics and which, it is true, depends on T, but not in the same way as the black body function (Planck’s Law). So the rate of radiation both at each frequency and in total is much more complex than a simple black body calculation.

    Back to the curve fitting versus modelling debate, which is also very interesting. My first point is that to use any polynomial to fit what is essentially a randomly behaved function with an underlying but unknown trend is to court nothing but disaster. There is no point in doing this unless one is trying to demonstrate the correctness of an hypothesis which defines a functional expression that is expected to correctly represent that trend. One then uses a function with as many variable parameters as necessary to suit the hypothesis and which allows a “best” fit in terms of minimising the residuals using something like a Levenberg-Marquardt algorithm to obtain the values of those parameters at that minimum.

    If the functional behaviour of the global temperature for instance is unknown, but is not totally random or chaotic over the long term, it must be assumed that it is at least dependent on a set of slowly varying drivers, be they increases in green house gases, changes in the earth’s orbit or variations of the solar constant. Thus in any short period of say two years, or whatever length of time seems reasonable given the known possible variation periods, these drivers of climate or global temperatures must be fairly constant over this short time, but will probably bear no relationship whatsoever, to their values at ten years on either side, let alone 100 years. Thus taking an average of the global temperatures over each two year period in the form of distinct averages or running averages, must surely provide the best available approximation to the variational function. Using any assumed function such as a polynomial, must invariably impose a constraint on the so derived mean temperature at every point, where part of that constraint is imposed by the value of a point which may, in the case of this temperature curve, be 50 years away. This is the type of error, although not quite the same, which James Hansen made in deriving his so-called, and unfortunate, “Hockey Stick” graph.

    With regard to models and curve fitting, while it is true that they are by definition quite different, the differences in the cases of the climate models are probably not all that well defined. This is because of the nature of the problem, which is obviously very complex, and the fact that a lot of the input parameters required to make the models work at all are unknown and must be derived by attempting to “tune” their values (see IPCC AR4 2007, Chapter 8) so that the resultant output from the models fits known data. This is not all that different from the curve fitting process where the coefficients in a polynomial function are also derived from fitting the thing to known data. In fact, the outcome is a little more positive in favour of the curve fitting, because, although I have criticised them above, at least there is no attempt to extrapolate to a date beyond the known data. In the case of the models, the values of the parameters obtained from “fitting”, if you like, to known data from earlier years, are then used in the models to project beyond the known period and in this case out to 50 or 100 years. This is not a criticism, it is just a fact and must be done since there is no other method available to the modelers for deriving these unknowns.

    Another relationship to curve fitting will also be found in the solutions to the many hydrodynamic and other second, or higher order, differential equations which have to be solved. It is extremely unlikely that any of these equations will admit of analytical solutions and the numerical processes used in finding solutions will undoubtedly involve the use of functional approximations to a solution from which parameters will be obtained from the boundary conditions. And so on…..

    OK. You can now all tell me that I don’t know anything about the solutions to the models. Go for it. I was a bit disappointed, actually, to find that some people ducked Jon Jenkins’ challenge to become involved in a debate and simply referred him to “RealClimate” or somewhere for advice which he may or may not need.

    John Nicol

  42. #43 John Nicol
    January 13, 2009

    It is indeed interesting to read the “discussions” in these many forums on climate change. I am heartened by the suggestion that the current list of bloggers might be interested in a proper scientific discussion on the models used to obtain the projections put forward by the IPCC. That could well be followed by a second discussion on the action of CO2 which has already been partly addressed by “z” at #121. I will make a comment on one of z’s points which is to say that the rise in the height at which the green house gases reverse their role and become ‘coolers’ rather than ‘warmers’, implies a colder radiating region. This reversal amounts to collisional excitation followed by radiation some of which goes out to space, rather than absorption of radiation followed by collisional heating of the surrounding air at the lower levels. Am I correct in assuming that this is because you are assuming that the radiation from the higher air sample is similar to that of a black body and therefore radiates significantly less according to a T^4 power law? This is not necessarily so, since gases do not, and in fact cannot, radiate directly as black bodies. They can only radiate at frequencies determined by their internal structure, at rates which are determined by the so-called “oscillator” strength of the various transitions available from each upper state, which, in all cases, has a population determined by the Maxwell-Boltzmann law from quantum statistical mechanics and which it is true depends on T but not in the same way as the black body function (Planck’s Law). So the rate of radiation both at each frequency and in total is much more complex than a simple black body calculation.

    Back to the curve fitting versus modelling debate which is also very interesting. My first point is that to use any polynomial to fit what is essentially a randomly behaved function with an underlying but unknown trend, is to court nothing but disaster. There is no point in doing this unless one is trying to demonstrate the correctness of an hypothesis which defines a functional expression that is expected to correctly represent that trend. One then uses a function with as many variable parameters as necessary to suit the hypothesis and which allows a “best” fit in terms of minimising the residuals using something like a Levenberg-Marquet algorithm to obtain the values of those parameters at that minimum.

    If the functional behaviour of the global temperature for instance is unknown, but is not totally random or chaotic over the long term, it must be assumed that it is at least dependent on a set of slowly varying drivers, be they increases in green house gases, changes in the earth’s orbit or variations of the solar constant. Thus in any short period of say two years, or whatever length of time seems reasonable given the known possible variation periods, these drivers of climate or global temperatures must be fairly constant over this short time, but will probably bear no relationship whatsoever, to their values at ten years on either side, let alone 100 years. Thus taking an average of the global temperatures over each two year period in the form of distinct averages or running averages, must surely provide the best available approximation to the variational function. Using any assumed function such as a polynomial, must invariably impose a constraint on the so derived mean temperature at every point, where part of that constraint is imposed by the value of a point which may, in the case of this temperature curve, be 50 years away. This is the type of error, although not quite the same, which James Hansen made in deriving his so-called, and unfortunate, “Hockey Stick” graph.

    With regard to models and curve fitting, while it is true that they are by definition quite different, the differences in the cases of the climate models are probably not all that well defined. This is because of the nature of the problem, which is obviously very complex, and the fact that a lot of the input parameters required to make the models work at all, are unknown and must be derived by attempting to “tune” their values (see IPCC AR$ 2007 Chapter 8) so that the resultant output from the models fits known data. This is not all that different from the curve fitting process where the coefficients in a polynomial function are also derived from fitting the thing to known data. In fact, the outcome is a little more positive in favour of the curve fitting, because, although I have criticised them above, at least there is no attempt to extrapolate to a date beyond the known data. In the case of the models, the values of the parameters obtained from “fitting”, if you like, to known data from earlier years, are then used in the models to project beyond the known period and in this case out to 50 or 100 years. This not a criticism, it is just a fact and must be done since there is no other method available to the modelers for deriving these unknowns.

    Another relationship to curve fitting will also be found in the solutions to the many hydrodynamic and other second, or higher order, differential equations which have to be solved. It is extremely unlikely that any of these equations will admit of analytical solutions and the numerical processes used in finding solutions will undoubtedly involve the use of functional approximations to a solution from which parameters will be obtained from the boundary conditions. And so on…..

    OK. You can now all tell me that I don’t know anything about the solutions to the models. Go for it. I was a bit disappointed, actually, to find that some people ducked Jon Jenkin’s challenge to become involved in a debate and simply referred him to “RealClimate” or somewhere for advice which he may or may not need.

    John Nicol

  43. #44 John Nicol
    January 13, 2009

    It is indeed interesting to read the “discussions” in these many forums on climate change. I am heartened by the suggestion that the current list of bloggers might be interested in a proper scientific discussion on the models used to obtain the projections put forward by the IPCC. That could well be followed by a second discussion on the action of CO2 which has already been partly addressed by “z” at #121. I will make a comment on one of z’s points which is to say that the rise in the height at which the green house gases reverse their role and become ‘coolers’ rather than ‘warmers’, implies a colder radiating region. This reversal amounts to collisional excitation followed by radiation some of which goes out to space, rather than absorption of radiation followed by collisional heating of the surrounding air at the lower levels. Am I correct in assuming that this is because you are assuming that the radiation from the higher air sample is similar to that of a black body and therefore radiates significantly less according to a T^4 power law? This is not necessarily so, since gases do not, and in fact cannot, radiate directly as black bodies. They can only radiate at frequencies determined by their internal structure, at rates which are determined by the so-called “oscillator” strength of the various transitions available from each upper state, which, in all cases, has a population determined by the Maxwell-Boltzmann law from quantum statistical mechanics and which it is true depends on T but not in the same way as the black body function (Planck’s Law). So the rate of radiation both at each frequency and in total is much more complex than a simple black body calculation.

    Back to the curve fitting versus modelling debate which is also very interesting. My first point is that to use any polynomial to fit what is essentially a randomly behaved function with an underlying but unknown trend, is to court nothing but disaster. There is no point in doing this unless one is trying to demonstrate the correctness of an hypothesis which defines a functional expression that is expected to correctly represent that trend. One then uses a function with as many variable parameters as necessary to suit the hypothesis and which allows a “best” fit in terms of minimising the residuals using something like a Levenberg-Marquet algorithm to obtain the values of those parameters at that minimum.

    If the functional behaviour of the global temperature for instance is unknown, but is not totally random or chaotic over the long term, it must be assumed that it is at least dependent on a set of slowly varying drivers, be they increases in green house gases, changes in the earth’s orbit or variations of the solar constant. Thus in any short period of say two years, or whatever length of time seems reasonable given the known possible variation periods, these drivers of climate or global temperatures must be fairly constant over this short time, but will probably bear no relationship whatsoever, to their values at ten years on either side, let alone 100 years. Thus taking an average of the global temperatures over each two year period in the form of distinct averages or running averages, must surely provide the best available approximation to the variational function. Using any assumed function such as a polynomial, must invariably impose a constraint on the so derived mean temperature at every point, where part of that constraint is imposed by the value of a point which may, in the case of this temperature curve, be 50 years away. This is the type of error, although not quite the same, which James Hansen made in deriving his so-called, and unfortunate, “Hockey Stick” graph.

    With regard to models and curve fitting, while it is true that they are by definition quite different, the differences in the cases of the climate models are probably not all that well defined. This is because of the nature of the problem, which is obviously very complex, and the fact that a lot of the input parameters required to make the models work at all, are unknown and must be derived by attempting to “tune” their values (see IPCC AR$ 2007 Chapter 8) so that the resultant output from the models fits known data. This is not all that different from the curve fitting process where the coefficients in a polynomial function are also derived from fitting the thing to known data. In fact, the outcome is a little more positive in favour of the curve fitting, because, although I have criticised them above, at least there is no attempt to extrapolate to a date beyond the known data. In the case of the models, the values of the parameters obtained from “fitting”, if you like, to known data from earlier years, are then used in the models to project beyond the known period and in this case out to 50 or 100 years. This not a criticism, it is just a fact and must be done since there is no other method available to the modelers for deriving these unknowns.

    Another relationship to curve fitting will also be found in the solutions to the many hydrodynamic and other second, or higher order, differential equations which have to be solved. It is extremely unlikely that any of these equations will admit of analytical solutions and the numerical processes used in finding solutions will undoubtedly involve the use of functional approximations to a solution from which parameters will be obtained from the boundary conditions. And so on…..

    OK. You can now all tell me that I don’t know anything about the solutions to the models. Go for it. I was a bit disappointed, actually, to find that some people ducked Jon Jenkin’s challenge to become involved in a debate and simply referred him to “RealClimate” or somewhere for advice which he may or may not need.

    John Nicol

  44. #45 John Nicol
    January 13, 2009

    It is indeed interesting to read the “discussions” in these many forums on climate change. I am heartened by the suggestion that the current list of bloggers might be interested in a proper scientific discussion on the models used to obtain the projections put forward by the IPCC. That could well be followed by a second discussion on the action of CO2 which has already been partly addressed by “z” at #121. I will make a comment on one of z’s points which is to say that the rise in the height at which the green house gases reverse their role and become ‘coolers’ rather than ‘warmers’, implies a colder radiating region. This reversal amounts to collisional excitation followed by radiation some of which goes out to space, rather than absorption of radiation followed by collisional heating of the surrounding air at the lower levels. Am I correct in assuming that this is because you are assuming that the radiation from the higher air sample is similar to that of a black body and therefore radiates significantly less according to a T^4 power law? This is not necessarily so, since gases do not, and in fact cannot, radiate directly as black bodies. They can only radiate at frequencies determined by their internal structure, at rates which are determined by the so-called “oscillator” strength of the various transitions available from each upper state, which, in all cases, has a population determined by the Maxwell-Boltzmann law from quantum statistical mechanics and which it is true depends on T but not in the same way as the black body function (Planck’s Law). So the rate of radiation both at each frequency and in total is much more complex than a simple black body calculation.

    Back to the curve fitting versus modelling debate which is also very interesting. My first point is that to use any polynomial to fit what is essentially a randomly behaved function with an underlying but unknown trend, is to court nothing but disaster. There is no point in doing this unless one is trying to demonstrate the correctness of an hypothesis which defines a functional expression that is expected to correctly represent that trend. One then uses a function with as many variable parameters as necessary to suit the hypothesis and which allows a “best” fit in terms of minimising the residuals using something like a Levenberg-Marquet algorithm to obtain the values of those parameters at that minimum.

    If the functional behaviour of the global temperature for instance is unknown, but is not totally random or chaotic over the long term, it must be assumed that it is at least dependent on a set of slowly varying drivers, be they increases in green house gases, changes in the earth’s orbit or variations of the solar constant. Thus in any short period of say two years, or whatever length of time seems reasonable given the known possible variation periods, these drivers of climate or global temperatures must be fairly constant over this short time, but will probably bear no relationship whatsoever, to their values at ten years on either side, let alone 100 years. Thus taking an average of the global temperatures over each two year period in the form of distinct averages or running averages, must surely provide the best available approximation to the variational function. Using any assumed function such as a polynomial, must invariably impose a constraint on the so derived mean temperature at every point, where part of that constraint is imposed by the value of a point which may, in the case of this temperature curve, be 50 years away. This is the type of error, although not quite the same, which James Hansen made in deriving his so-called, and unfortunate, “Hockey Stick” graph.

    With regard to models and curve fitting, while it is true that they are by definition quite different, the differences in the cases of the climate models are probably not all that well defined. This is because of the nature of the problem, which is obviously very complex, and the fact that a lot of the input parameters required to make the models work at all, are unknown and must be derived by attempting to “tune” their values (see IPCC AR$ 2007 Chapter 8) so that the resultant output from the models fits known data. This is not all that different from the curve fitting process where the coefficients in a polynomial function are also derived from fitting the thing to known data. In fact, the outcome is a little more positive in favour of the curve fitting, because, although I have criticised them above, at least there is no attempt to extrapolate to a date beyond the known data. In the case of the models, the values of the parameters obtained from “fitting”, if you like, to known data from earlier years, are then used in the models to project beyond the known period and in this case out to 50 or 100 years. This not a criticism, it is just a fact and must be done since there is no other method available to the modelers for deriving these unknowns.

    Another relationship to curve fitting will be found in the solutions to the many hydrodynamic and other second- or higher-order differential equations which have to be solved. It is extremely unlikely that any of these equations admit analytical solutions, and the numerical processes used in finding solutions will undoubtedly involve functional approximations whose parameters are obtained from the boundary conditions. And so on…
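    The flavour of that process can be shown with the simplest possible boundary-value problem, solved by a discrete functional approximation whose unknowns are pinned down by the boundary conditions (a toy sketch only; real hydrodynamic solvers are of course far more elaborate):

        import numpy as np

        # Solve y'' = -y on [0, pi/2] with y(0) = 0, y(pi/2) = 1 by finite
        # differences. The exact solution is sin(x).
        n = 50
        x = np.linspace(0.0, np.pi / 2.0, n)
        hstep = x[1] - x[0]

        A = np.zeros((n, n))
        b = np.zeros(n)
        A[0, 0] = A[-1, -1] = 1.0  # boundary rows fix y(0) and y(pi/2)
        b[-1] = 1.0
        for i in range(1, n - 1):
            # (y[i-1] - 2*y[i] + y[i+1]) / h^2 + y[i] = 0
            A[i, i - 1] = A[i, i + 1] = 1.0 / hstep**2
            A[i, i] = -2.0 / hstep**2 + 1.0

        y = np.linalg.solve(A, b)
        print("max error vs sin(x):", np.abs(y - np.sin(x)).max())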

    OK. You can now all tell me that I don’t know anything about the solutions to the models. Go for it. I was a bit disappointed, actually, to find that some people ducked Jon Jenkins’ challenge to become involved in a debate and simply referred him to “RealClimate” or somewhere for advice which he may or may not need.

    John Nicol

  45. #46 bi -- IJI
    January 13, 2009

    > some people ducked Jon Jenkins’ challenge to become involved in a debate

    Jon Jenkins ignores the simple questions we pose to him, therefore it’s our fault. Boo-hoo, boo-hoo, boo-hoo…

  46. #47 bi -- IJI
    January 13, 2009

    WotWot:

    > And of course your article in The Australian was an exemplary model of ad hominem and ideology free civility.

    But WotWot, the phrase “The warmaholics, drunk on government handouts and quasi-religious adulation from left-wing environmental organisations” is the “We hold these truths to be self-evident” of Liberal Fascism! Therefore it’s OK.

  47. #48 Chris O'Neill
    January 13, 2009

    John Nicol:

    This is the type of error, although not quite the same, which James Hansen made in deriving his so-called, and unfortunate, “Hockey Stick” graph.

    So James Hansen derived the “Hockey Stick” graph. Looks like we learn something every day.

    I was a bit disappointed, actually, to find that some people ducked Jon Jenkins’ challenge to become involved in a debate and simply referred him to “RealClimate”

    Yes it’s so disappointing that Jenkins won’t debate with people who know what they’re talking about.

  48. #49 John Nicol
    January 13, 2009

    Dear bi — IJI,
    You said:
    “Jon Jenkins ignores the simple questions we pose to him, therefore it’s our fault.”

    bi, I don’t believe I mentioned it was your fault. I simply observed that people had not engaged in a debate which could have been interesting to observe and take part in. There seemed to be a number of respondents to Jon Jenkins’ posting who are obviously very well credentialled in climate science. It would be great if they could expand on their various comments and provide us with some insight into the difficult science and modelling that is used to project the values of global warming 50 or 100 years from now. This is very important, I believe, since there will soon be very large economic changes arising from the ETS and from restrictions which many councils are now placing on coastal developments because of predicted climate change, including sea-level rises.

    Unfortunately, for reasons known only to themselves, our publicly funded CSIRO refuses to divulge just how they arrive at the many predictions they are making concerning, for example, increases in storm surges on North Queensland’s coastline, which have caused the Townsville City and other North Queensland Councils to modify their town planning. What we get from CSIRO is nothing but a disclaimer assuring us that these predictions are only the best they can do at the present time, and that if they turn out to be wrong, as they well might, we can’t blame them. This is perhaps OK for blue-sky science, where the research is simply aimed at a better understanding of a concept, with the expectation that further publication and debate will lead to the correct solution. However, even there it is necessary for the details of an experiment or theory to be published, to allow anyone else to repeat the experiment or analyse the theory.

    The projections by CSIRO are being made for the purpose of giving advice to government, with the clear expectation that notice will be taken of what has been said, yet no one is allowed to check their processes. Fortunately this is not the case with many others in the modelling fraternity, who are very happy to share their knowledge and even their computer programmes. It is therefore doubly incumbent upon CSIRO – public funding, and advice with very serious economic consequences – to be open in providing the background to their results and complete details of their processes to anyone who may be interested in checking.

    I have asked them for a list of their peer-reviewed publications on the subject – Silence!

    Not good enough. There was a time when CSIRO scientists actually interacted openly with other scientists and the public. After all, it is the public who pays both their salaries and their research funding. That seems to indicate, to me at least, that they have an obligation to provide proper information in a transparent and open way.

    In the absence of information from the modellers themselves, I had hoped that less formal advice might be available from someone else who knows all about the modelling process, CO2 etc.
    —————————————————–
    Chris O’Neill, you have said:

    “Climate models are not intended to predict El Ninos any more than probability theorems predict rolls of a dice. Even though probability theorems can’t predict the next roll of a dice, they can predict statistics of dice rolls. In a similar way, climate models can’t predict individual El Ninos but they can predict statistics of weather.
    I think Jon Jenkins should try to debate his ideas with a real climate scientist so that he can clarify his statements for us.”

    What makes El Ninos different from any other weather phenomenon? Remember that El Nino is just one extreme of a particular combination of cyclical behaviours of the atmosphere and oceans associated in some way with the Southern Oscillation, the other extreme being La Nina. In between, the system is in some intermediate state with characteristics of either El Nino or La Nina, which you are saying climate models are not intended to predict. While the IPCC admits that the models can’t predict them, I don’t recall ever reading that they were not intended to do so; it is just a matter of gaining enough experience and knowledge, I believe. I wonder if your analogy of dice, whose values appear as random numbers with very clear statistics as you point out, is really very apt. Any random event obeys similar statistics, but the El Nino is not believed to be random – it is just that we don’t yet know what drives it and determines when it will occur.

    The El Nino climate phenomenon is clearly driven by agencies such as heat, currents and winds, which can also affect other aspects of weather (in the short term) and climate (in the long term). If the models were ever likely to be successful, it seems reasonable, on straightforward scientific principles, that they should first be able to master one of the strongest influences known. It is not as if the characteristics of El Nino and La Nina are unknown; the changes in temperature distributions, winds and ocean currents are well documented. No such documentation exists for the roll of a dice! There are probably also other, more subtle phenomena which are not yet recognised but which have a significant influence on climate, and which are not currently included in the models.

    After all, according to IPCC AR4 2007 Chapter 8, the models are not able to handle clouds properly either, which means they cannot properly include the effects of probably two of the most powerful influences on the climate as they stand today. This must surely place severe limits on the expected accuracy of projections 50 to 100 years hence, when it is also admitted in Chapter 8 that they cannot predict the climate of 2008 using initialising parameters obtained from the 100 years before, say, 1960. Even more importantly, Chapter 8 points out that even if they could, it would not yet be known what that success would mean for their ability to predict further into the future.

    There is no doubt that the models are very well constructed and are called on to provide information about a very, very complicated system. Nevertheless, the acknowledgement of their limitations in the most recent IPCC report goes a long way towards demonstrating that any information provided by the models must be treated with extreme caution. Perhaps, and hopefully, increases in computer size and speed, together with a better understanding of the physics of the action of greenhouse gases, will soon lead to a much better modelling regime in which many, if not most, people will have well-justified confidence in the climate projections provided by the models.

    John Nicol

  49. #50 Chris O'Neill
    January 13, 2009

    John Nicol:

    What makes El Ninos different from any other weather phenomenon?

    Nothing.

    I wonder if your analogy of dice, whose values appear as random numbers with very clear statistics as you point out, is really very apt. Any random event obeys similar statistics, but the El Nino is not believed to be random

    You’re reading too much into what I said. At the very least, I didn’t say that El Ninos were random. I’m just saying that predicting a particular El Nino is different from predicting, say, the average temperature over the time in which several El Ninos are expected to occur.

  50. #51 sod
    January 13, 2009

    sorry Jon, i can see why you would want to change the subject. your claims about your graph have been torn apart.

    multiple posts above would require just a one-line answer from Jon Jenkins, which he doesn’t give, because he can’t answer it.

    why not simply admit that it was an error to use the 6th degree polynomial?

    or will you finally give me a 3rd degree one that shows a similar result?

  51. #52 Lee
    January 13, 2009

    John Nicol,

    What one would like from a very good dynamic model is that it exhibit El Nino and La Nina behavior, and that their frequency and intensity correspond to what is seen in the real world. That is climate.

    What one doesn’t need, and doesn’t expect, is that the model exhibit exactly the same weather, with each storm, each bit of ‘noise,’ and each El Nino/La Nina matching exactly in time and intensity to each one we see in the real world.

    IOW, we don’t expect the models to predict the weather in the real world. We do expect frequency and intensity of weather in the models, averaged over time, to match what we see in the real world averaged over time.

    That is why the modelers do ensembles – so they can average them, and therefore “average out” the weather events and leave the climate signal.
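    To make the “average out the weather” point concrete, here is a toy sketch (synthetic numbers, not actual model output): twenty runs share the same forced trend but each has its own random “weather”, and the ensemble mean recovers the trend far better than any single run.

        import numpy as np

        rng = np.random.default_rng(2)
        years = np.arange(100)
        signal = 0.015 * years  # the shared "climate" trend

        # Each "run" adds its own random-walk weather noise to the signal.
        ensemble = np.array([
            signal + np.cumsum(rng.normal(0.0, 0.05, years.size))
            for _ in range(20)
        ])

        mean = ensemble.mean(axis=0)
        print("spread of final-year values across runs:", ensemble[:, -1].std())
        print("ensemble mean vs true signal at final year:",
              mean[-1], "vs", signal[-1])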

  52. #53 Barton Paul Levenson
    January 13, 2009

    John Nicol writes:

    This is the type of error, although not quite the same, which James Hansen made in deriving his so-called, and unfortunate, “Hockey Stick” graph.

    Gosh darn that James Hansen! It was really sneaky of him to change his name to Michael Mann, too.

  53. #54 John Nicol
    January 13, 2009

    Yes. I accept that, Chris O’Neill. Sorry if I misrepresented what you said.
    —————————————————
    Lee, it is true that one doesn’t expect the models to represent the much more random weather events. And from what you have just said, I believe we agree that the models should be able to account for the fairly long-term (decadal) changes in “climate/weather” represented by the Southern Oscillation, which cycles between La Nina and El Nino conditions. Without being able to do that, they will not be very useful as climate predictors. Thanks for your response.

    —————————————-
    Barton Paul Levenson

    OK, Michael Mann derived the Hockey Stick. Sorry, and apologies to anyone named James Hansen.
    John Nicol

  54. #55 Tristan
    January 14, 2009

    I wonder if your analogy of dice, whose values appear as random numbers with very clear statistics as you point out, is really very apt. Any random event obeys similar statistics, but the El Nino is not believed to be random

    But the values that come up from rolling dice are not random – they’re the result of a chaotic system, just like the weather. The value that comes up on top depends on the way the dice sit in the hand, the angle and force of the throw, the amount of spin imparted, the distance from the surface they land on, the type of surface they land on, and so on.

    Since they’re chaotic, we can’t predict what the next value rolled will be. But we can predict with a very high degree of confidence what an ensemble of a large number of rolls will look like. Similarly, the molecular movements at each step of a molecular dynamics model won’t match those in any real system (even if we could measure such), but results are starting to come in that show that, once run long enough, they will predict quite closely the final tertiary structure of simple proteins. Again, we can’t predict whether it will be raining in Memphis at 4pm on Tuesday next month, but the models do appear to represent quite well the long-term aggregate behaviour of the climate.
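    A tiny illustration of the dice half of that analogy, for anyone who wants to see it run: no single roll is predictable, but the long-run statistics are.

        import random

        random.seed(0)
        rolls = [random.randint(1, 6) for _ in range(100_000)]

        print("first five rolls (unpredictable):", rolls[:5])
        print("mean of all rolls:", sum(rolls) / len(rolls))   # -> about 3.5
        print("share of sixes:", rolls.count(6) / len(rolls))  # -> about 1/6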

  55. #56 WotWot
    January 14, 2009

    once run long enough,

    Do you meant once iterated enough times?

  56. #57 WotWot
    January 14, 2009

    meant = mean

  57. #58 Gaz
    January 14, 2009

    John Nicol – “…I believe we agree that the models should be able to account for the fairly long-term (decadal) changes in “climate/weather” represented by the Southern Oscillation, which cycles between La Nina and El Nino conditions.”

    1) Do you mean that the models should account for changes in the frequency of El Nino or La Nina events, something which may happen on a decadal scale, or for the events themselves as they occur on an inter-annual scale?

    2) What do you mean by “should account for”? Do you mean the models should explain why El Nino/La Nina events occur, what their effects are, and why they become more or less frequent, or that they should actually predict discrete events (as opposed to simulating them)?

    3) What if the models can’t do one or more of these to your satisfaction?
    Would some uncertainty over the nature of these events lead you to the conclusion that anthropogenic global warming is not happening?
    Or do you envisage some possibility that a better understanding of the ENSO phenomenon might provide evidence that anthropogenic global warming is not happening?

    4) If the answer to either the second or third question in point 3 is “yes”, what do you think might be happening to the heat energy absorbed by greenhouse gases?
    If not, why is modelling ENSO so important, aside from demonstrating why no one should get too excited about the odd hot year (like 1998) or cooler year (like 2008)?

    I’ll leave it to someone else to ask what you mean by “climate/weather”.

    Cheers.

  58. #59 Gaz
    January 15, 2009

    I guess congratulations are in order for David Evans.

    He seems to have gotten a retrospective promotion.

    It seems now that it wasn’t just computer programming he did back in the day – he’s actually “a former adviser to the Australian Greenhouse Office, the precursor to the Department of Climate Change”.

    http://www.theaustralian.news.com.au/story/0,25197,24914441-11949,00.html

  59. #60 naught101
    January 17, 2009

    jade: “But have you seen his website?”

    Classic. Don’t you love the tagline? “Dedicated to debunking junk green science.” The best PR heads couldn’t workshop a better one than that. Is it implying that all “green science” is junk science? Or is it actually closer to the truth, i.e. Jenkins doesn’t like greenies, so he only attacks science that appears to side with them?

    “As one of the original “greenies” it causes me great pain…”

    Yeah, right… Jenkins is part of the huntin’, fishin’, four-wheel-drivin’ Outdoor Recreation Party. Can’t get much more green than that, can ya?

  60. #61 naught101
    January 18, 2009

    I ran a 6th degree best fit over GISTEMP data from 1880-2008. Funnily enough, I didn’t see a downward curve at the end, but a distinctly upward one.

    Oh, and I extended the line to cover 1700-2100, and I notice that the planet will be 7 degrees warmer in 2100, and that there was a global ice age before about 1700. Nifty graphs here (down the bottom, last two images).
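    For anyone who wants to repeat the exercise, a rough sketch of how it can be done. It assumes a local two-column text file of GISTEMP annual anomalies (year, anomaly); the file name is made up for illustration.

        import numpy as np

        year, anom = np.loadtxt("gistemp_annual.txt", unpack=True)
        x = year - year.mean()  # centre the years for numerical stability

        coeffs = np.polyfit(x, anom, 6)  # the infamous 6th-degree fit
        extended = np.arange(1700, 2101) - year.mean()
        fitted = np.polyval(coeffs, extended)

        print("fitted value at 1700:", fitted[0])    # the spurious "ice age"
        print("fitted value at 2100:", fitted[-1])   # extrapolation runs wild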

  61. #62 Douglas Watts
    January 18, 2009

    Another problem with polynomial fitting can be seen if you apply it to the 162-game American baseball season: fit a 6th-degree polynomial curve at that point in the middle of the season when your team has just won eight games in a row, and the trendline will “prove” that your team will win all the rest of its games that season and every season thereafter. If your team loses eight games in a row, it might as well disband, because the trendline clearly shows it will never win another game in a billion years.

  62. #63 Anton Mates
    January 19, 2009

    Actually, your baseball team could lose 160 games in a row – but if it wins the first and last game, a 6th-degree polynomial fit over the whole season will still prophesy victory for all future games…
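    That claim is easy to check with a quick sketch (a toy season of 0/1 results; numbers are for illustration only):

        import numpy as np

        games = np.arange(1, 163)       # a 162-game season
        results = np.zeros(162)
        results[0] = results[-1] = 1.0  # win only the first and last game

        x = games - games.mean()        # centre for numerical stability
        coeffs = np.polyfit(x, results, 6)

        def predict(game):
            return np.polyval(coeffs, game - games.mean())

        print("fitted 'result' at game 162:", predict(162))
        print("fitted 'result' at game 200:", predict(200))  # shoots upward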

  63. #64 Chris
    January 17, 2010

    Why use an old-fashioned polynomial? I have found that a Fourier series models the data more accurately, and my initial calculations show there is nothing to worry about – it is just a bunch of oscillations.

  64. #65 David Irving (no relation)
    January 17, 2010

    Chris, it’s my understanding that a Fourier series can be designed to fit any data, no matter how noisy. (I may be wrong, as it’s been a while …)

    However, it’s really only an appropriate method if you start from the assumption that you’re modelling a bunch of oscillations.
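    A quick way to check that recollection: with as many Fourier coefficients as data points, the “fit” reproduces any series exactly, noise and all, so a perfect Fourier fit tells you nothing by itself. A sketch on pure noise:

        import numpy as np

        rng = np.random.default_rng(3)
        y = rng.normal(size=128)  # pure noise, no oscillations at all

        coeffs = np.fft.rfft(y)                   # the full set of coefficients
        rebuilt = np.fft.irfft(coeffs, n=y.size)  # inverse transform

        print("max reconstruction error:", np.abs(rebuilt - y).max())  # ~1e-15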

  65. #66 codex
    February 17, 2010

    @jade

    > So when do temperatures in the lower troposphere hit absolute zero?

    Hopefully they won’t ever reach absolute zero – I’m not sure what that would mean for humanity, with the troposphere that cold. If you mean the anomalies, they regularly pass through zero.

    @Tim

    As you have both a maths and a computing background, I’d like to see a bit more info on what curve is a good fit to that data. A linear trend is simplistic. I seem to remember from my uni days, when we were fitting curves to the El Nino cycle we had identified after processing a huge bunch of sea-level air-pressure data, that the important criteria were finding something representative of the data, and maybe ignoring the end points?

    I can understand why someone wouldn’t like that fit line (4th degree, isn’t it? Did I see someone say it was 6th?). But to my eye it doesn’t look that bad.

    I guess it depends on what you want.

    The linear trend is perhaps useful for determining whether there is an increasing tendency, is it not?

    Another type of curve (sine? polynomial?) might be good if one were investigating evidence of cycles in the data, would it not?
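    One hedged way to look for cycles before committing to any fit function is to inspect a periodogram of the detrended series. A sketch on synthetic monthly data (the 42-month cycle is built in, so the peak should land near it):

        import numpy as np

        rng = np.random.default_rng(4)
        t = np.arange(360)  # months
        y = (0.002 * t + 0.2 * np.sin(2 * np.pi * t / 42.0)
             + rng.normal(0.0, 0.1, t.size))

        # Remove the linear trend, then look at the spectrum of what is left.
        detrended = y - np.polyval(np.polyfit(t, y, 1), t)
        power = np.abs(np.fft.rfft(detrended)) ** 2
        freqs = np.fft.rfftfreq(t.size)  # cycles per month

        peak = freqs[power[1:].argmax() + 1]  # skip the zero frequency
        print("dominant period: %.1f months" % (1.0 / peak))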

  66. #67 codex
    February 17, 2010

    @Tim

    Never mind about fit curves – I’ve followed some links and found some good ones.

  67. #68 John Mashey
    January 11, 2012

    Just for the record, this graph got re-used again by the Heartland Institute in January 2009, p.5 … only they made it worse yet. That one says:
    “Since 2005, global temperatures have given back most of the warming that had occurred since 1980.”

    It also says, at bottom left, in minuscule characters: SOURCE: THE UNIVERSITY OF ALABAMA AT HUNTSVILLE; and at bottom right: ANDREW BARR / NATIONAL POST

    and adds:
    “According to NASA satellite data – the only timely global temperature record – the global temperature in 2008 is no higher than it was in 1979.”

    Not only are some of the words different, but the gridlines are too, suggesting that someone regenerated the graph with different labels.

  68. #69 John Mashey
    January 11, 2012

    OOPS, sorry, broken link.
    See Jan 2009 E&CN, p.5.

  69. #70 David Walker
    Ardmore, AL
    February 13, 2013

    All of you can rattle on about “science” and “consensus” and “objectivity”, but the elephant in the room is human nature: the lust for power and money, and enthusiasm for the big lie when it means your side gets rich and dominates by force.

    The “global warming” context has never had anything to do with science; it is about using the force of law to determine who makes money and who pays. It’s like any other crisis structure (war on poverty, war on drugs, war on terror, socialized medicine, and on and on).

    If you’re going to dedicate yourselves to science, stop wasting ordinary folks’ toil and time; investigate why charlatans like Albert Gore go about bleating and beating, promoting the greatest fraud of our time. Again, the “global warming” context has nothing to do with science – it’s all about money and power. If you can’t recognize the historical patterns, you ought to find another line of work.