# Sixth-degree polynomial fits, just say no

When discussing Jon Jenkins's use of a ridiculous sixth-degree polynomial fit to temperatures to argue that they were trending steeply down, I suggested that local regression (loess) was a much better method for showing trends. I was going to get around to plotting one, but Tamino has saved me the trouble by producing a loess smooth of GISS temperatures.
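If you want to roll your own, a loess-style smoother is only a few dozen lines. Here is a bare-bones sketch in Python (my own toy version, using the classic tricube weights and a plain local linear fit; real implementations such as Cleveland's add robustness iterations and smarter neighbourhood handling):

```python
def loess(xs, ys, span=0.5):
    """Toy loess: for each point, fit a tricube-weighted straight line
    to the nearest `span` fraction of the data and evaluate it there."""
    n = len(xs)
    k = max(2, int(span * n))            # points in each local window
    smoothed = []
    for x0 in xs:
        # the k nearest neighbours of x0
        idx = sorted(range(n), key=lambda i: abs(xs[i] - x0))[:k]
        dmax = max(abs(xs[i] - x0) for i in idx) or 1.0
        # tricube weights: (1 - (d/dmax)^3)^3
        w = {i: (1 - (abs(xs[i] - x0) / dmax) ** 3) ** 3 for i in idx}
        # weighted least-squares line y = a + b*x
        sw = sum(w.values())
        sx = sum(w[i] * xs[i] for i in idx)
        sy = sum(w[i] * ys[i] for i in idx)
        sxx = sum(w[i] * xs[i] * xs[i] for i in idx)
        sxy = sum(w[i] * xs[i] * ys[i] for i in idx)
        denom = sw * sxx - sx * sx
        b = (sw * sxy - sx * sy) / denom if denom else 0.0
        a = (sy - b * sx) / sw
        smoothed.append(a + b * x0)
    return smoothed
```

Because each local fit is a straight line, the smoother tracks slow trends without the wild end-point swings a high-degree global polynomial produces.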

See Tamino’s post for more graphs, including one showing how Dennis Avery misrepresented temperature trends even worse than Jenkins did.

1. #1 Dunc
January 16, 2009

Oh man, I missed that the first time round… What possible excuse could there be for using a sixth-order poly, other than “I’ve been playing around in Excel and that gave me the trendline I wanted”?

2. #2 Paul
January 16, 2009

Yep, that looks correct.

3. #3 sod
January 16, 2009

i m still waiting for Jenkins showing a 3rd degree polynomial that bends downward at both ends. (he claimed the degree wouldn t matter…)

perhaps he ll show up to defend his claim, that all trendlines are equally right or false?
as the IPCC future scenarios have the same predictive accuracy as his polynomial has for the future…. (ice age 15 years from now..

4. #4 Ian
January 16, 2009

At first glance I read that as “Sith Degree” and was wondering which university offered it. I’d never heard of Polynomial University. It must be on the dark side of town…

5. #5 Joshua Zelinsky
January 16, 2009

He used a sixth degree polynomial to fit data? I expect I could find high school students who could explain why that’s probably a bad idea. Gah.

6. #6 Bernard J.
January 16, 2009

The truly frustrating thing is that Tamino has discussed lowess (and similar) smoothing on a number of occasions, and anyone with two scientific neurones and a synapse, who ventures into the blogosphere to discuss climatic (or any other type thereof!) time-series trendlines, should have at least a passing acquaintance with this eminently comprehensible work. It should certainly give any serious analyst pause for thought before reaching for the box of Polynomial Delights, and I still can’t believe that he hasn’t gagged on his 3rd-order comment, the one that sod keeps trying to draw his attention to.

Jenkins’s preposterous notions on the initial thread continue to baffle me, and I would love to be a fly on the wall of his departmental tea-room if ever his colleagues discussed this amateurish fitting of a squiggle. I’ve seen similar discussions about analyses-gone-awry in my own institutions at various times, and I can imagine the professional irritation and embarrassment Jenkins might be causing his colleagues.

The interesting thing is that in just about every case the offending academic ‘moved on’ after not-so-many years. However, as an adjunct professor Jenkins might not be quite so prone to mobility, and I am sure that there are many institutions that would regard askance any future application of his, especially as Google is now the Interviewer’s Friend…

There must be some peeved administrators at Bond.

7. #7 Dano
January 16, 2009

including one showing how Dennis Avery misrepresented temperature trends

Did Avery do that or Andura Smetacek? Who IS Monsanto’s new sock puppet these days, BTW?

But Zelinsky has an excellent point:

I expect I could find high school students who could explain why that’s probably a bad idea

One can conclude that denialists have less than a high school education, or a highly-developed selection/confirmation bias. This point narrows the choice considerably, methinks

Best,

D

8. #8 luminous beauty
January 16, 2009

sod,

I believe one can get a third order polynomial to produce a curve that is downward at both ends by truncating the series between zero solutions of the derivative.

What that would mean for extrapolating from that function, though…
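A quick check of this construction (with an example cubic of my own choosing, not one from the thread): f(x) = x^3 - 3x has derivative 3x^2 - 3, which vanishes at x = -1 and x = +1; truncated to that window, the curve slopes downward at both ends.

```python
# a hypothetical example cubic, truncated between the zeros of its derivative
def f(x):
    return x ** 3 - 3 * x

def fprime(x):
    return 3 * x ** 2 - 3        # zeros at x = -1 and x = +1

# sample strictly inside (-1, 1): the slope is negative everywhere,
# so the truncated piece heads downward at both ends of its window
inside = [-0.99 + 1.98 * i / 100 for i in range(101)]
all_down = all(fprime(x) < 0 for x in inside)
```

Extrapolating that function outside the window, of course, immediately turns back upward, which is rather the point.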

9. #9 SpotWeld
January 16, 2009

A high order polynomial could be a good method to model information, assuming you were only interested in approximating the information “between” the known data points. It’d be horribly useless for any sort of prediction outside the “end” of the dataset.

10. #10 ben
January 16, 2009

They obviously should have used a 7th degree polynomial.

11. #11 mz
January 16, 2009

No, ben, that doesn’t help.

12. #12 ben
January 16, 2009

It’d be horribly useless for any sort of prediction outside the “end” of the dataset.

Right, but so would just about any other trend that isn’t model-based. How else would you make a prediction? Can a linear fit be used to project into the future? How far into the future? Will the temperature of the earth increase forever at the same linear rate?

13. #13 Anton Mates
January 16, 2009

A high order polynomial could be a good method to model information, assuming you were only interested in approximating the information “between” the known data points.

Only if the data is completely noise-free. If you’ve got error, raising the order of the polynomial approximation will quickly cause it to become worse and worse…even between known points. You’ll get weird and meaningless peaks and dips. By and large, the noisier the data set, the lower the degree at which this starts happening. (Basically this is because noise does really bad things to your estimates of the derivatives, and the derivatives usually have more influence on the higher-degree terms of the polynomial.)

Even if the data is noise-free, there are a lot of curves for which polynomial interpolation just doesn’t converge. The absolute value function, for instance, and the function y=1/[1+x^2]. Again, raising the degree of the polynomial will make it a worse fit, not a better one. See Runge’s Phenomenon for a nice illustrated example of this.
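Runge's phenomenon is easy to reproduce. The sketch below (Python, standard library only; my own illustration, not Anton's code) interpolates y = 1/(1+x^2) at equally spaced nodes on [-5, 5] and measures the worst error on a fine grid. Raising the degree makes the worst-case error larger, not smaller:

```python
def runge(x):
    return 1.0 / (1.0 + x * x)

def lagrange_eval(nodes, vals, x):
    """Evaluate the interpolating polynomial through (nodes, vals) at x."""
    total = 0.0
    for j, (xj, yj) in enumerate(zip(nodes, vals)):
        term = yj
        for m, xm in enumerate(nodes):
            if m != j:
                term *= (x - xm) / (xj - xm)
        total += term
    return total

def max_err(degree):
    """Worst interpolation error on [-5, 5] with equidistant nodes."""
    nodes = [-5 + 10 * i / degree for i in range(degree + 1)]
    vals = [runge(z) for z in nodes]
    grid = [-5 + 10 * i / 400 for i in range(401)]
    return max(abs(lagrange_eval(nodes, vals, z) - runge(z)) for z in grid)
```

The damage is concentrated near the ends of the interval, which is exactly where a trend-hunter wants the fit to be trustworthy.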

14. #14 MarkG
January 16, 2009

No Ben. The linear trend is useful because the data already shows significant linear trends. Any such derived trends get posted with lots of ‘IF’ statements. If you feel bold you might try an exponential, bearing in mind that the atmosphere is in no way exponentially heating. If you don’t know maths or physics or you’re trying to deceive ignorant people you can use a polynomial.

The linear fit is useful for extrapolation if and only if the linear trend continues. That’s all it tells you and no more. I am sure that the long term trend is not linear, though in the short term (say 30 years) it may prove to be.

The obsession in denialist circles with every downward wiggle in this data is pure desperation. The recent use of polynomial fits takes this a step beyond. It’s staggering to me that anyone with an education beyond first year maths would even consider such a thing.

15. #15 P. Lewis
January 16, 2009

MarkG

Know what you mean/getting at, but your terminology is likely the start of the slippery slope if not nipped. A linear fit is one using a polynomial of degree 1.

16. #16 MarkG
January 16, 2009

Indeed. I meant something more like “… The recent use of higher order polynomial fits…”

17. #17 ben
January 16, 2009

No, ben, that doesn’t help.

Do you not recognize sarcasm?

The linear fit is useful for extrapolation if and only if the linear trend continues. That’s all it tells you and no more.

That was my point.

If you’ve got error, raising the order of the polynomial approximation will quickly cause it to become worse and worse…even between known points.

That’s not necessarily true. As long as the number of data points is significantly greater than the order of the polynomial then the fit match optimally for the assumption that the polynomial is a valid representation of the underlying dynamics.

And good grief, I’m just clowning around, I’m not defending the stupid fit used in the graph.

18. #18 Chris O'Neill
January 16, 2009

When discussing Jon Jenkins's use of a ridiculous sixth-degree polynomial fit to temperatures

I notice that Bond University doesn’t have any degree program that has anything to do with atmospheric or basic physics. So they’re likely to let him go on his merry way.

19. #19 Chris O'Neill
January 16, 2009

I notice that Bond University doesn’t have any degree program that has anything to do with atmospheric or basic physics.

I should probably have said general physics instead of basic physics.

20. #20 elspi
January 16, 2009

Anton says “Even if the data is noise-free, there are a lot of curves for which polynomial interpolation just doesn’t converge”

Not sure what you mean there, but that doesn’t seem quite right.

The Weierstrass approximation theorem states that if $I$ is a closed interval and
$f:I \to R$ is a continuous function, then for any $\epsilon>0$ there is a polynomial $p$ such that $|p(x) - f(x)| < \epsilon$ for all $x \in I$.

Thus you can find a sequence of polynomials $(p_n)$ with $(p_n)$ converging to $f$ uniformly on $I$. I think the point is that there is no power series which converges to a function which isn’t differentiable (or analytic). And of course you cannot make this work on an open interval (or the whole real line).

Also the degree of the polynomial tends to grow exponentially. For example to approx. the function |x| to within .001 you need a polynomial of degree very very very large.
(I don’t remember what the exact answer was, just that it was ridiculous)

21. #21 elspi
January 16, 2009

“I should probably have said general physics instead of basic physics.”
You should have said science,

BTW, where the hell do they get off calling THAT a university.

22. #22 Eli Rabett
January 17, 2009

Dunc gets it:Oh man, I missed that the first time round… What possible excuse could there be for using a sixth-order poly, other than “I’ve been playing around in Excel and that gave me the trendline I wanted”?

ben not:They obviously should have used a 7th degree polynomial.

Jenkins WAS using EXCEL. The polynomial trendline ONLY goes up to 6th order.

Pardon Eli, he has to go clean the carrot juice off the screen.

23. #23 jyyh
January 17, 2009

What I “like” about this kind of denialist’s models is their simplicity. What I’ve been wondering is: what is the minimum number of linear equations needed to do a good fit to the temperature data? Like, insert the knowns (the black-body T of earth; the atmospheric H2O, CO2, CH4, SO3 amounts (of N2O there is probably too short a record); ENSO data (there aren’t probably long enough series on other oceanic variables); the solar cycle; albedo from ice/snow data) and see if linear equations (though in reality they are not such) are enough to get a good fit (say… accuracy of 75%). This would mean at least 7 equations (assuming they do not affect each other, but only the surface temperature). Linear, non-interdependent equations are easier to get than the complex equations in the climate models that try to separate e.g. the low, mid, and high troposphere. Assume yet a flat earth, so the calculations could be done in cubes rather than in round grids (set a correction coefficient for the distortion). This sort of atmospheric model even I could try to do, and though I know it would be inaccurate, I’d know where the most outrageous inaccuracies are.

75% accuracy isn’t much i admit (just assuming a rather large standard error on all variables). Though this sort of model would not predict anything, I think it’d help to understand the current climate models.

24. #24 ben
January 17, 2009

ben not:They obviously should have used a 7th degree polynomial.
Jenkins WAS using EXCEL. The polynomial trendline ONLY goes up to 6th order.

Well then he should have done it twice to get 12th order.

25. #25 Barton Paul Levenson
January 17, 2009

jyyh–

You can get 76% accuracy (assuming by “accuracy” you mean “fraction of variance accounted for”) with a simple linear fit between temperature anomaly and ln CO2 for 1880-2007. I did.

26. #26 Anton Mates
January 17, 2009

Ben:

If you’ve got error, raising the order of the polynomial approximation will quickly cause it to become worse and worse…even between known points.

That’s not necessarily true. As long as the number of data points is significantly greater than the order of the polynomial then the fit match optimally for the assumption that the polynomial is a valid representation of the underlying dynamics.

Nope, not when noise is involved. Really, try it. Generate, say, a linear function. Add some noise in. Fit polynomials of increasing degree to the noisy data, and calculate the error between the polynomial approximation and the original curve. You will see that the high-degree polynomials are less accurate.

This is true even when the original curve is a polynomial (which of course a linear function is), and even when the number of points is much larger than the degree of the polynomial (I used 500 points and checked polynomials up to degree 20 in Octave just now.)

Noise and polynomial interpolation don’t mix. And, as Ian said over on his Astroblog post on the subject, climate data is noisy as heck.
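Anton's experiment is easy to replicate. Here is a sketch in Python rather than Octave (standard library only, so the least-squares fit is spelled out via the normal equations; the line, noise level, and seed are my own choices, not Anton's): fit degree 1 and degree 6 to noisy samples of a straight line, then compare each fit to the noiseless truth.

```python
import random

def polyfit_ls(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations
    (Gaussian elimination with partial pivoting; fine for small degrees)."""
    n = degree + 1
    ata = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    aty = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            for c in range(col, n):
                ata[r][c] -= f * ata[col][c]
            aty[r] -= f * aty[col]
    coeffs = [0.0] * n
    for i in range(n - 1, -1, -1):
        coeffs[i] = (aty[i] - sum(ata[i][j] * coeffs[j]
                                  for j in range(i + 1, n))) / ata[i][i]
    return coeffs                      # coeffs[i] multiplies x**i

def peval(coeffs, x):
    return sum(c * x ** i for i, c in enumerate(coeffs))

def rms_to_truth(degree, xs, noisy, truth):
    """RMS distance between the fitted polynomial and the noiseless curve."""
    c = polyfit_ls(xs, noisy, degree)
    return (sum((peval(c, x) - t) ** 2
                for x, t in zip(xs, truth)) / len(xs)) ** 0.5

random.seed(0)
xs = [i / 99 for i in range(100)]
truth = [2 * x + 1 for x in xs]                      # the true (linear) curve
noisy = [t + random.gauss(0, 0.5) for t in truth]    # what we actually observe
err1 = rms_to_truth(1, xs, noisy, truth)
err6 = rms_to_truth(6, xs, noisy, truth)
# the degree-6 fit chases the noise and lands farther from the true line
```

Repeating this with Anton's 10,000 trials just makes the averages converge; a single draw already shows the high-degree fit soaking up noise.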

27. #27 Anton Mates
January 17, 2009

elspi,

Anton says “Even if the data is noise-free, there are a lot of curves for which polynomial interpolation just doesn’t converge”

Not sure what you mean there, but that doesn’t seem quit right.

The Weierstrass approximation theorem states that if $I$ is a closed interval and $f:I \to R$ is a continuous function, then for any $\epsilon>0$ there is a polynomial $p$ such that $|p(x) - f(x)| < \epsilon$ for all $x \in I$.

Thus you can find a sequence of polynomials $(p_n)$ with $(p_n)$ converging to $f$ uniformly on $I$.

You’re quite correct that Weierstrass guarantees a convergent sequence of polynomials. The problem is that you are not guaranteed to get that sequence by interpolation on any particular table of nodes. Some node tables are more robust than others–Chebyshev is much better than equidistant nodes, of course–but for any given node table, there are continuous functions whose interpolating polynomials on those nodes will diverge.

Granted, any polynomial which approximates the original curve well is an interpolation of that curve on some set of nodes, but simply knowing that fact is not terribly helpful.

28. #28 z
January 17, 2009

“Well then he should have done it twice to get 12th order. ”

snorted out loud (i extrapolated this as a joke)

29. #29 z
January 17, 2009

could anyone direct me to a loess for dummies reference? preferably for sas users if that’s an option? thanks

30. #30 luminous beauty
January 17, 2009

z,

I don’t know about dummies, but this one is for engineers:

http://www.itl.nist.gov/div898/handbook/pmd/section1/pmd144.htm

Hope that’s close enough. It helped me.

31. #31 sg
January 17, 2009

z, I would suspect that the SAS online docs would offer a good starting point.

32. #32 z
January 17, 2009

thanks sg, i’ll give it a look. although i must say, i find that official sas publications never are a good starting point for anything….

33. #33 z
January 17, 2009

and thanks lumie, also (working my way upwards, here)

34. #34 sg
January 17, 2009

I like the sas online docs. They’re very thorough. Impossible to search and very badly laid out, and you have to know the procedure you’re looking for to get anything out of them, but they’re good when you finally get where you’re going.

January 17, 2009

They are seriously phoning it in these days.

What’s next, the Kabala?

36. #36 Ezzthetic
January 17, 2009

Sixth-degree polynomial fits, just say no

When I did polynomials, we were taught to just say O(n).

37. #37 ben
January 18, 2009

Nope, not when noise is involved. Really, try it. Generate, say, a linear function. Add some noise in. Fit polynomials of increasing degree to the noisy data, and calculate the error between the polynomial approximation and the original curve. You will see that the high-degree polynomials are less accurate.

If the physics that produced the data match the polynomial you are using for the fit, then the fit will match the curve regardless of noise, provided that the number of data points is higher than the order of the polynomial.

Look, there is no closed form model for climate over time anyway, so any fit will be stupid, except for the linear fit. And that only shows the trend over whatever time period you pick for the fit. If you pick a long time period for the fit, then you will miss smaller time-scale changes, especially at the end points, and if you pick short time scales, then you will fit the short term variations, whoopee.

38. #38 (((Billy)))
January 18, 2009

I am a public historian (labour and industrial history), though I did start out as a computer engineering major (many, many years ago). Maybe my old mind is just not retaining the math from my misspent youth, but can anyone, in layman’s (historian’s) terms, tell me what a polynomial is? I think that I understand that attempting to smooth a curve using non-data from outside the curve leads to some weird effects, but beyond that? Historian. If someone could either explain (in simple terms) or point me to a site that has that information, I would be greatly appreciative.

39. #39 mark
January 18, 2009

I think Jenkins could “prove” his point using lowess if he adjusted his parameters to narrow the window of weighting (essentially using just 2 or 3 adjacent points, meaning just reproducing the graph).
W.S. Cleveland had a good explanation for non-specialists of the procedure in his 1985 book “The Elements of Graphing Data.”

40. #40 Bernard J.
January 18, 2009

Billy

For a pretty good summary, our favourite know-it-all has something to say.

41. #41 Dano
January 18, 2009

Ezzthetic:

teh funneh.

Best,

D

42. #42 Anton Mates
January 18, 2009

If the physics that produced the data match the polynomial you are using for the fit, then the fit will match the curve regardless of noise, provided that the number of data points is higher than the order of the polynomial.

This simply isn’t true, Ben; noise makes polynomial fits lose accuracy as their order increases. Again, I just ran through a few examples myself in Octave. You can too; it’s a free program. I can tell you the (very short) code for an example if you’d like.

But if you don’t believe me, then, again, look at Ian Musgrave’s article. Look at the first picture. The original data is from a 1st-degree polynomial, plus noise; he gives its formula later on in the comments. The black curve is the linear fit–that is, a polynomial fit of degree 1. The green curve is a polynomial fit of degree 6. Which curve recovers the original line more accurately? It’s really not hard to see that the linear fit is better.

Look, there is no closed form model for climate over time anyway, so any fit will be stupid, except for the linear fit.

That hardly follows. You don’t need to fit a data series with exactly the same sort of function you believe produced the original data. Most sound waves are not in fact a finite sum of sinusoids, but that doesn’t mean you can’t usefully represent them with a finite Fourier series. Most population curves are not in fact exponential or logistic, but those functions can still represent them reasonably well (under the right conditions, anyway.)

A good curve-fitting technique just has to be capable of producing curves roughly like those you expect to see in the data, and it needs to be robust against noise. High-order polynomials meet the first criterion but not the second. Linear fits aren’t as good on the first criterion but are way way better on the second. Local regression, as Tamino’s graph shows, does quite well on both criteria.

43. #43 Douglas Watts
January 19, 2009

The difference here seems to be that Mr. Jenkins et al. “want” a trend line to go in a very specific direction. But the goal of analysis is to reveal trend minus noise, so as to discover what the trend line might be.

This is like spending all of your money as fast as you can because you’ve noticed your paycheck usually arrives when your bank account is at its lowest.

44. #44 ben
January 19, 2009

This simply isn’t true, Ben; noise makes polynomial fits lose accuracy as their order increases.

Sorry, I worded that incorrectly. For a given set of data, least squares will find the “optimal” fit to the data. Yes, as the noise increases the fit will become worse. Here’s my example using Matlab:

% arbitrary coefficients
a0 = 0.1;
a1 = 0.2;
a2 = 0.35;
a3 = -0.2;
a4 = -0.035;
a5 = 0.001;

% the exact data for the model
t = 0:0.01:10;
x = a5*t.^5 + a4*t.^4 + a3*t.^3 + a2*t.^2 + a1.*t + a0;

% add noise to the exact data
xn = x + 1e2*randn(size(x));

% fifth-order least-squares fit to the noisy data
p = polyfit(t, xn, 5);

figure(1); clf(1);
plot(t, x)
hold on
plot(t, xn, 'rx')
plot(t, polyval(p, t))
hold off

And the resulting plots of data, noisy data, and the resulting fit, for a fifth order polynomial.

http://img.photobucket.com/albums/v665/hautlipz/noisy_n5_with_linear.jpg

Yes, the polynomial fit would get worse as the variance of the noise increased, to the point eventually where the polynomial fit would be useless. The linear fit would still show some sort of overall trend. This is all moot anyway, as there is no reason to expect that global temperatures would follow any sort of polynomial model over time. That is not, and never was my point. You guys don’t seem to understand a joke. Argh!

The *best* you could do would be to have a proper model of the system dynamics and then you could estimate and/or smooth the data.

45. #45 ben
January 19, 2009

http://img.photobucket.com/albums/v665/hautlipz/noisy_n5_with_linear.jpg

46. #46 ben
January 19, 2009

Dang, just go here

47. #47 luminous beauty
January 19, 2009

ben,

Excuse me, I didn’t hear your joke, because all I’m hearing from you is white noise.

48. #48 luminous beauty
January 19, 2009
49. #49 Alan D. McIntire
January 19, 2009

There’s a confusion in the debate here between MODELING data versus predictive formulas. The higher the degree of the polynomial, the more closely it will match the data for the closed period you already have data for. As for the future, the only way to tell which is better, a given linear model or a given higher polynomial, is to compare the predictions of each model with future data points.

I’ll change the subject here to multiple linear regression models. Obviously the temperature of the earth is not solely a function of time, but a function of CO2, various solar effects like where we are in the Milankovitch cycle and magnetic cycle, area of earth irrigated, etc.
In that case you can make an equation
T = a1X1 + a2X2 + … + anXn.
When you add an additional variable you will automatically increase R^2, with a limit approaching 1, but you will NOT always increase the precision of the estimate, because ultimately you may reach a point where adding a variable will cause the residual mean square to increase.

Another possibility is to use the sequential F test. After adding an additional factor, you check whether the F value is still greater than the critical F value with 1 and n-1 degrees of freedom at the 1%, 5% or whatever your cutoff value is.
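Alan's point that R² can only rise as regressors are added is easy to see concretely. A sketch in Python (the data are synthetic and of my own making: one genuine predictor plus one pure-noise column):

```python
import random

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    A = [row[:] for row in A]
    b = b[:]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x

def r_squared(ys, columns):
    """R^2 of an OLS fit of ys on an intercept plus the given columns."""
    X = [[1.0] + [c[i] for c in columns] for i in range(len(ys))]
    k = len(X[0])
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * y for r, y in zip(X, ys)) for i in range(k)]
    beta = solve(xtx, xty)
    fitted = [sum(b * v for b, v in zip(beta, row)) for row in X]
    ybar = sum(ys) / len(ys)
    ss_res = sum((y - f) ** 2 for y, f in zip(ys, fitted))
    ss_tot = sum((y - ybar) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

random.seed(1)
x1 = [i / 49 for i in range(50)]                    # a real predictor
junk = [random.gauss(0, 1) for _ in x1]             # a pure-noise regressor
y = [3 * v + random.gauss(0, 0.3) for v in x1]
r2_one = r_squared(y, [x1])
r2_two = r_squared(y, [x1, junk])   # adding junk never lowers R^2
```

This is exactly why R² alone can't arbitrate model choice, and why Alan reaches for residual mean square and the sequential F test instead.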

50. #50 ben
January 19, 2009

Here’s the real problem: using any polynomial fit (linear or otherwise) to data of this nature to predict the future is dumb. The fit has no knowledge of the inputs of the system. At best, a linear fit will allow you to make a very short future time prediction as long as you know that your system dynamics are slow. A linear fit will tell you what a trend has been, but a lot less about what it will be in the future.

You can use a polynomial fit for aspects of models that really do follow polynomial behavior. For example, the drag coefficient of a wing often varies somewhat like the square of the angle of attack for a given Reynolds number. Hence you can put a wing in a wind tunnel, measure the drag at various angles of attack, and fit a second-order polynomial to the data. The resulting polynomial can then be used with some confidence in a simulation of the dynamics of an aircraft or whatever that uses that wing. The data from the wind-tunnel experiment can be noisy as long as the magnitude of the noise isn’t such that it obscures the higher-order features of the drag coefficient vs. angle of attack curve.

Can these results be used for prediction? Yes, somewhat. You don’t necessarily have to take data over the entire range of operation of the wing, you can extrapolate from your fit what will be its drag coefficient at higher angles of attack than what you tested. However, one must be careful in doing this. If the physics of the wing change at some particular angle of attack, your fit will miss this behavior and any prediction beyond any point of transition could be very poor. The fit, linear or otherwise, has absolutely no way of *knowing* this.

Now with the climate, well, the climate, as a function of time, is a process, right? Hence it is governed by differential equations of some sort, with a myriad of parameters, with many that are interdependent and non-constant. How can any polynomial fit, linear or otherwise, help anyone predict into the future? The dynamics are input dependent, and the fit knows nothing of the inputs. At best, as I stated above, if you know that your dynamics vary slowly enough, then if the known climate has followed a particular trend for a long enough time, then you might be able to predict the climate in the very near future. Beyond that, the fit is useless as a predictive tool.
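ben's wing example above can be made concrete. In this sketch (Python; the wing numbers are invented for illustration, not real wind-tunnel data), the true drag polar is quadratic up to a hypothetical stall at 12 degrees, a quadratic is fitted only to pre-stall measurements, and extrapolation past the stall fails just as he describes:

```python
def fit_cd_quadratic(alphas, cds):
    """Least-squares fit of cd = cd0 + k * alpha**2 (linear in alpha**2)."""
    xs = [a * a for a in alphas]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cds) / n
    k = (sum((x - mx) * (y - my) for x, y in zip(xs, cds))
         / sum((x - mx) ** 2 for x in xs))
    return my - k * mx, k                    # (cd0, k)

def true_cd(alpha):
    # hypothetical wing: quadratic drag rise up to stall at 12 deg,
    # then a linear post-stall break the quadratic knows nothing about
    if alpha <= 12:
        return 0.02 + 0.0004 * alpha ** 2
    return 0.02 + 0.0004 * 144 + 0.03 * (alpha - 12)

alphas = list(range(0, 11))                  # wind-tunnel range: 0 to 10 deg
cd0, k = fit_cd_quadratic(alphas, [true_cd(a) for a in alphas])

def predict(a):
    return cd0 + k * a * a

# interpolation inside the tested range is fine;
# extrapolation past the (unseen) stall is not
err_in = abs(predict(8) - true_cd(8))
err_out = abs(predict(16) - true_cd(16))
```

The fit is perfect where the physics stays quadratic and useless where it doesn't, which is the whole caution about trusting any fitted curve outside its data.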

51. #51 Anton Mates
January 19, 2009

Ben,

Sorry, I worded that incorrectly. For a given set of data, least squares will find the “optimal” fit to the data. Yes, as the noise increases the fit will become worse.

Well, sure, but that’s not what I was saying in my original post that you disputed. What I’m saying is that a given amount of noise will distort a high-degree polynomial fit more badly than a low-degree one. Of course all possible fitting techniques work worse and worse as the amount of noise increases, but the high-degree polynomials degrade faster. That makes low-degree ones a better choice for curve-fitting.

% arbitrary coefficients a0 = 0.1; a1 = 0.2; a2 = 0.35; a3 = -0.2; a4 = -0.035; a5 = 0.001;

% the exact data for the model t = 0:0.01:10; x = a5*t.^5 + a4*t.^4 + a3*t.^3 + a2*t.^2 + a1.*t + a0;

% add noise for i = 1:max(size(t)) xn(i) = x(i) + 1e2*randn(1); end

I fitted polynomials of degrees 1-10 to the noisy data, then looked at the pointwise average (RMS) error of each polynomial versus the exact data. I repeated this 10,000 times, generating new noise each time, so that the error estimates would converge. The result:

Degree: Error:
1 1.66818
2 0.39651
3 0.22653
4 0.22672
5 0.25486
6 0.25584
7 0.32272
8 0.34728
9 0.34754
10 0.35034

Notice that, even though your exact data is from a 5th-degree polynomial–and therefore, trivially, also a polynomial of every degree greater than 5–none of those fits are the best here. The error is lowest for the 3rd and 4th-degree polynomials.

So even though those lower-degree polynomials couldn’t possibly fit the exact data perfectly, they’re so much more resistant to noise that, on the whole, they provide a better fit.

This is all moot anyway, as there is no reason to expect that global temperatures would follow any sort of polynomial model over time.

Again, that in itself is not a problem; a good curve-fitting method doesn’t have to be able to match the true curve precisely. And obviously, over any finite time period global temperatures can be matched by a polynomial to arbitrary precision…the Stone-Weierstrass theorem guarantees that.

That is not, and never was my point. You guys don’t seem to understand a joke. Argh!

No offense, but the conversation isn’t about your joke; we were criticizing Jenkins’ curve-fitting method before you got here.

The best you could do would be to have a proper model of the system dynamics and then you could estimate and/or smooth the data.

The loess method seems to work quite well, even without a model.

52. #52 Anton Mates
January 19, 2009

I fitted polynomials of degrees 1-10 to the noisy data, then looked at the pointwise average (RMS) error of each polynomial versus the exact data. I repeated this 10,000 times, generating new noise each time, so that the error estimates would converge. The result:

Dang, that error table formatted horribly. Let me rewrite it:

Average error of 1st-deg. polynomial: 1.66818
Average error of 2nd-deg. polynomial: 0.39651
Average error of 3rd-deg. polynomial: 0.22653
Average error of 4th-deg. polynomial: 0.22672
Average error of 5th-deg. polynomial: 0.25486
Average error of 6th-deg. polynomial: 0.25584
Average error of 7th-deg. polynomial: 0.32272
Average error of 8th-deg. polynomial: 0.34728
Average error of 9th-deg. polynomial: 0.34754
Average error of 10th-deg. polynomial: 0.35034

The important thing to notice is that, even though your original data came from a 5th-degree polynomial, polynomials of lower degree still fit it better. That’s what noise does to the problem.

53. #53 Bernard J.
January 20, 2009

MODELING data versus predictive formulas.

To touch on one of Alan’s comments, I’d be surprised if any here, other than perhaps the most ignorant of climate science denialists, would suggest that curve fitting to 30 years of global mean temperature data constitutes ‘modelling’. After all, the independent variable (out of a mere two variables) is not even a forcing per se, so the concept of modelling does not even come into it.

And I would hope just as fervently that everyone participating in this discussion understands that lines fitted to an interval of x,y data do not ‘predict’ beyond the interval of the independent variable (in this case, time). As ben said, if the ‘dynamics’ of the system vary ‘slowly’ enough one might make very near-term predictions, but this is the scientific version of fingers-switched-by-canes stuff, and no scientist worth their salt would make predictions using this technique, considering the naïve nature of the data. Given the auto-correlated dynamic of weather one might simply, and perhaps more validly, use the weather from yesterday or perhaps from last year to make a near-term prediction, but extrapolating from a line-fit of 30 years of two-parameter data is a bit of a predictive no-no.

When we get right down to it, what Jenkins was trying to do was to show a trend over said interval by ‘removing noise’. The trouble is he was trying to show a ‘trend’ that conforms with his ideology, and not with the objective science, and to do this he had to use a completely inappropriate technique.

High-order polynomials have nothing to do with trend/smoothing methodologies. It is as simple as that. Jenkins’ employment of a high-order polynomial, and his omission of data both before and after the period graphed, illustrate either a deliberate intention to promote an (unsupportable) ideology over best science, or a complete incompetence to participate in such science in the first place.

I would suggest that it is past time for Jenkins to return to this thread in order to maintain his defence of his use of a high-order polynomial, with careful response to the many challenges that have been put to him, or to otherwise concede his glaring error and retract his piece.

If he chooses to not do so, then his scientific credibility is called into question. This is such a spectacular display of scientific incompetence that it matters not that it occurred in a non-scientific forum – as I noted earlier Google is an indispensable tool for any background check these days, and this incident is sure to stick to his boot like the proverbial dog faeces, whether or not it occurs in peer-reviewed literature.

Deltoid – and many other blogs – will certainly not forget.

54. #54 climatepatrol
January 23, 2009

Apart from the different curve fitting techniques, I’d like to stress that Jon Jenkins used data from the University of Alabama which showed a trend of just +0.127°C per decade, while GISS (and the above graph) follows a trend of +0.162°C per decade for the last 30 years. This means that the GISS slope is 27.6% steeper than the UAH data Jenkins used in the first place. Moreover, during the 21st century, the bias between GISS and UAH increased: 2008 is +0.52°C above 1979 at GISS, but just +0.28°C at UAH.

55. #55 bi -- IJI
January 23, 2009

> Apart from the different curve fitting techniques,

“Different”, as in “bogus”.

56. #56 (((Billy)))
January 23, 2009

Bernard J.: thanks. Hitting myself in the head now. I remember those from when I started out college as a computer engineering major (before I went straight). My question then becomes, why would anyone use polynomials to massage data? I guess it works in the middle of the data field, but don’t the ends get bizarre?

57. #57 Bernard J.
January 23, 2009

Billy, in answer to your question at #56, I can only think of two reasons: either the person fitting the polynomial is mathematically/scientifically ignorant, and has no concept of the difference between prediction, trend demonstration, and smoothing (and of why high-order polynomials are not appropriate in any case), or the person is attempting to lead an uninformed and unsuspecting lay audience to conclude something from data that describe something else entirely.

Of course, the two options are not mutually exclusive.
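
To make the contrast concrete, here is a minimal sketch of the loess-style alternative the original post recommends: a locally weighted linear regression with tricube weights. This is a bare-bones illustration of the general technique, not Tamino’s actual procedure — the span `frac` and the nearest-neighbour rule are illustrative choices:

```python
import numpy as np

def loess(x, y, frac=0.3):
    """Minimal LOESS: at each point, fit a weighted straight line to the
    nearest `frac` of the data (tricube weights) and take its value there."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = x.size
    k = max(3, int(np.ceil(frac * n)))       # points in each local fit
    smoothed = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]              # k nearest neighbours
        h = d[idx].max()                     # neighbourhood half-width
        w = (1.0 - (d[idx] / h) ** 3) ** 3   # tricube weights
        sw = np.sqrt(w)                      # weighted least squares via lstsq
        A = np.column_stack([np.ones(k), x[idx]]) * sw[:, None]
        beta, *_ = np.linalg.lstsq(A, y[idx] * sw, rcond=None)
        smoothed[i] = beta[0] + beta[1] * x[i]
    return smoothed
```

Because each fitted value uses only nearby data, the endpoints cannot swing the way a global sixth-degree polynomial’s tails do: on exactly linear data this smoother reproduces the line, and on noisy data it tracks the local trend without endpoint oscillation.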

58. #58 Dano
January 23, 2009

To expand on Bernard’s cogent point in 57, IME the latter is what I expect from certain ideological websites, conservatarian authors, paid shills, etc.

Best,

D

59. #59 (((Billy))) The Atheist
January 23, 2009

Thanks. That was my reading of it but, let’s face it, I am an historian, so my guess that it is done either through ignorance or to commit fraud was just that – a guess. Thanks for putting some fact behind it.

60. #60 Bernard J.
January 24, 2009

For the lurkers on Deltoid who might have missed recent activity on other blogs…

Roy Spencer is definitely joining the ranks of academics-emeritus. He too selects a polynomial (although ‘only’ fourth order) to describe a similar ‘trend’.

The really telling (recent) descent of Spencer though is well described by Tamino.

61. #61 Bernard J.
January 26, 2009

Jon Jenkins.

You were quick to include yourself in the original Deltoid thread about the Australian article that you wrote.

Since then, the number of points and posts critical of your use of mathematical curve description has increased markedly, and you have been conspicuously silent in justifying your analysis and your claims. Perhaps you could clarify your scientific reasoning, and point out where many professionals have it wrong and where you are correct.

At the same time, given developments since, perhaps you could also clarify the nature of your qualifications and employment, as these were used as bona fides to enhance the credibility of the Australian piece.

62. #63 climatepatrol
January 30, 2009

Dear Barton (#62)

I basically get your point. But I also get Roy’s point: CO2 is not a bomb – CO2 is life. And I don’t think he will sue you for calling him “dishonest”, brother, because he knew this was going to happen. No, Spencer is not a denier of physically established science; he even publicly rebuked a contrarian as early as 1998 here (scroll down all the way to the link “Reactions to Dr. Hug’s paper”, 1st comment by Roy).

64. #65 bi -- IJI
January 30, 2009

climatepatrol:

> I basically get your point.

You mean, you basically ignore it.

65. #66 dhogaza
January 31, 2009

> CO2 is not a bomb – CO2 is life.

CO2 is also death.

You’d think Roy Spencer would understand this.

66. #67 Bernard J.
February 1, 2009

Climatepatrol.

> CO2 is life

This is just a very clumsy and juvenile strawman.

Many elements and compounds are essential for the metabolic processes of various forms of life at certain concentrations, but they are less beneficial, or indeed even toxic, at higher concentrations.

Consider boron, fluoride, oxygen, copper, iron, selenium, water, vitamin A – the list could go on for pages. All are substances required by organisms at particular concentrations, many at trace levels, and all are harmful at concentrations elevated above their various physiologically required ranges.

As much as it will grate on both you and Tim Curtin, CO2 is not of equal benefit to all plants above current ambient levels, and indeed in all plant species the growth-rate limiting effects of the concentrations of other nutrients (such as nitrogen) rapidly overcome any benefit from increasing CO2.

And this is before one factors in the effect that CO2 has on temperature, and the impact that temperature in turn has on plant growth. Do some basic learning about evapotranspiration, and about the temperature optima of enzymes, and about the relative temperature adaptations of different species and how changes in temperature impact upon these ecological equilibria.

Of course, you would know all this if you had even half a brain, because it’s been dealt with ad nauseam for years. Trying to frame things otherwise is not science; it’s ideology.

And if you truly don’t understand why the “CO2 is life” meme is a propagandist twist of real science, then you really aren’t in a place to be participating in a discussion of climate change at all, except to silently learn the most basic of facts.

You might think that repetition of a factoid will make it truth, but there are many people here and elsewhere prepared to call you on your pseudoscience for as long as you try to promote it.

67. #68 climatepatrol
February 4, 2009

> CO2 is life

> This is just a very clumsy and a juvenile strawman.

Sure. No doubt about that. That’s why Spencer calls his own post “satirical” and “a counterpoint to Al Gore’s use of ‘millions of tons’”.

> As much as it will grate on both you and Tim Curtin, CO2 is not of equal benefit to all plants above current ambient levels, and indeed in all plant species the growth-rate limiting effects of the concentrations of other nutrients (such as nitrogen) rapidly overcome any benefit from increasing CO2.

> And this is before one factors in the effect that CO2 has on temperature.

Thank you for your clarification as an expert on this subject. My whole point here is the lack of logic I see in Barton’s website calling Spencer “dishonest” on the grounds of the CO2 volume argument. Don’t we all agree that at 200–800 ppm, CO2 is life, and that at 500,000 ppm it is death? At current CO2 concentrations, the biosphere as a whole (sorry, my ignorance re biodiversity) is reported to still benefit from enhanced CO2. I’d like to come back to the topic of this thread. Spencer also clarified that his own 4th-order polynomial fit did not have any predictive value, just like the one above.

68. #69 Lee
February 4, 2009

climatepatrol: what complete and utter bullcrap!!

First, increased primary productivity is not necessarily a good thing, even if it were happening – rapid fertilization of existing ecosystems almost always leads to loss of diversity and simplification of the system, and therefore a loss of ecosystem stability.

But even ignoring that, your link about ‘benefit from enhanced CO2’ is absurd. Here is an example:
“The largest increase was in tropical ecosystems. Amazon rain forests accounted for 42% of the global increase in net primary production, owing mainly to decreased cloud cover and the resulting increase in solar radiation.”

Decreased cloud cover is happening BECAUSE THE DAMN RAIN FOREST IS BEING CUT DOWN!!!! There is a huge, stunningly huge, loss of biomass in the tropics, directly as a result of removal of rain forest. Remove the canopy, reduce transpiration, local relative humidity plummets, you get less rain, and it becomes harder to re-establish the rain forest. These guys are looking at a slight short-term increase in primary productivity after removal of rain forest canopy, attributing it to ‘climate change’ and CO2, and calling it a good thing.

Even more, they are implying that the short-term slight increase in primary productivity is actually an increase in biomass – hell, they label the 6% an increase in ‘NET primary production’, not ‘net primary productivity’, and there is a difference – which they can only do because they are not including the loss of biomass due to rain forest removal. The reality is that a stunningly huge decrease in biomass has been caused by removal of rain forest canopy species, and that it is the second largest driver of the anthropogenic CO2 increase, right after fossil fuel combustion. And the increase in net productivity (productivity, not production) is in fact a consequence of the removal of the rain forests, with attendant loss of this critical ecosystem.

It is hard to imagine a more dishonest ‘analysis.’

69. #70 Bernard J.
February 5, 2009

> sorry my ignorance re biodiversity

And therein lies the problem.

But at least you are open enough to concede the point. Tim Curtin blithely ploughs through decades of carefully acquired understanding in several disciplines – of plant productivity and ecology – and presumes to tell the experts, who know the fields much better than he, that they are born-again eugenicists and charlatans.

70. #71 bob sage
massachusetts
June 24, 2013

You might try fitting a curve to CFC density, then compare that curve to the same polynomial used for temperatures. Also do a trend for CO2. I believe you’ll find that the CFC and temperature curves match well; the CO2, not so much.