Where the IPCC went wrong with its climate predictions

Well, that headline's a little unfair. I wrote it to lure in those who jump on every opportunity to prove that climatologists are frauds. What I really mean to say is: "Where the most recent assessment by the IPCC has been superseded by more recent findings."

It's all in a new report, The Copenhagen Diagnosis, assembled by some of the top people in the field. Here's the executive summary:

Surging greenhouse gas emissions: Global carbon dioxide emissions from fossil fuels in 2008 were nearly 40% higher than those in 1990. Even if global emission rates are stabilized at present-day levels, just 20 more years of emissions would give a 25% probability that warming exceeds 2ºC, even with zero emissions after 2030. Every year of delayed action increases the chances of exceeding 2ºC warming.

Recent global temperatures demonstrate human-induced warming: Over the past 25 years temperatures have increased at a rate of 0.19ºC per decade, in very good agreement with predictions based on greenhouse gas increases. Even over the past ten years, despite a decrease in solar forcing, the trend continues to be one of warming. Natural, short-term fluctuations are occurring as usual, but there have been no significant changes in the underlying warming trend.

Acceleration of melting of ice-sheets, glaciers and ice-caps: A wide array of satellite and ice measurements now demonstrate beyond doubt that both the Greenland and Antarctic ice-sheets are losing mass at an increasing rate. Melting of glaciers and ice-caps in other parts of the world has also accelerated since 1990.

Rapid Arctic sea-ice decline: Summer-time melting of Arctic sea-ice has accelerated far beyond the expectations of climate models. This area of sea-ice melt during 2007-2009 was about 40% greater than the average prediction from IPCC AR4 climate models.

Current sea-level rise underestimated: Satellites show recent global average sea-level rise (3.4 mm/yr over the past 15 years) to be about 80% above past IPCC predictions. This acceleration in sea-level rise is consistent with a doubling in contribution from melting of glaciers, ice caps, and the Greenland and West-Antarctic ice-sheets.

Sea-level predictions revised: By 2100, global sea-level is likely to rise at least twice as much as projected by Working Group 1 of the IPCC AR4; for unmitigated emissions it may well exceed 1 meter. The upper limit has been estimated as about 2 meters of sea-level rise by 2100. Sea-level will continue to rise for centuries after global temperatures have been stabilized, and several meters of sea-level rise must be expected over the next few centuries.

Delay in action risks irreversible damage: Several vulnerable elements in the climate system (e.g. continental ice-sheets, Amazon rainforest, West African monsoon and others) could be pushed towards abrupt or irreversible change if warming continues in a business-as-usual way throughout this century. The risk of transgressing critical thresholds ("tipping points") increases strongly with ongoing climate change. Thus waiting for higher levels of scientific certainty could mean that some tipping points will be crossed before they are recognized.

The turning point must come soon:
If global warming is to be limited to a maximum of 2ºC above pre-industrial values, global emissions need to peak between 2015 and 2020 and then decline rapidly. To stabilize climate, a decarbonized global society - with near-zero emissions of CO2 and other long-lived greenhouse gases - needs to be reached well within this century. More specifically, the average annual per-capita emissions will have to shrink to well under 1 metric ton of CO2 by 2050. This is 80-95% below the per-capita emissions in developed nations in 2000.



I don't know. All that overwhelming evidence sure looks like a global communist fascist scientific conspiracy to me.

By Nils Ross (not verified) on 25 Nov 2009 #permalink

As Dan Rather might say (if he still had a job), "Just because the data has been manipulated, that doesn't mean the results aren't real."

By brave_dot (not verified) on 25 Nov 2009 #permalink

Let's see...The explanation for why fake data is not false is to quote results "Assembled by The Top LIARS, oh I mean scientists in the field".
Have you ever taken LOGIC in school? Your agenda is as clear as the liars oh I mean "Scientists".

There used to be a time when the conclusion came AFTER the results.

It is time to know the difference between science and philosophy!!!

You are not in the field of science but of PHILOSOPHY!!

GET REAL!!!

By Larry Crouch (not verified) on 25 Nov 2009 #permalink

Ahahaha you got me, you smug little bastard!

Biased climate data FTW!

I wrote it to lure in those who jump on every opportunity to prove that climatologists are frauds.

Five cretinous responses and counting ... :)

Summer-time melting of Arctic sea-ice has accelerated far beyond the expectations of climate models. This area of sea-ice melt during 2007-2009 was about 40% greater than the average prediction from IPCC AR4 climate models. ... This area of sea-ice melt during 2007-2009 was about 40% greater than the average prediction ... average sea-level rise ... 80% above past IPCC predictions. ... global sea-level is likely to rise at least twice as much as projected... melting faster than even this report assumes.

All of which supports my (amateur, only half paying attention) impression that just about all the serious professional estimates of climate change have been wrong: every time, the problem has been understated.

How accurate is my understanding so far? (Trolls, please - this question is not for you. SDASTFU.)

By Pierce R. Butler (not verified) on 25 Nov 2009 #permalink

Didn't Russian hackers discover the emails about temperature "tricks" to prevent declining numbers? The scientists were describing scientific method models that used temperatures from different times of the day in order to hide declining temperatures. People tell me that scientists care about their job, and wouldn't fudge numbers.

Well, I ask this. Even though the police are expected to care about people and serve the public... is this always the case? People comfortable with a job will do whatever they can to keep it, even if it means providing false science (like the Nazis did during WWII) to ensure their job security. I compare this to Nazism because there have been many scientists who have had their publications ignored by the over-arching scientific community.

If you want the truth, look at how the models are generated, because those models will depict global warming or cooling. If you do not know the model being used for the numbers generated, how can you have faith in those numbers? Are you taking the numbers on faith (the faith you have in science and scientists)? Finally, wake up and recognize that a single volcanic explosion launched high enough into the stratosphere will drastically combat global warming by cooling the planet. Earth's sweat gland.

Enough of Global Worming... go find a real job.

The Emails provide plenty of context and evidence of corruption and politicized pseudo-science on their own. However, Google and read the analyses that are coming out based on the Harry_Read_Me.txt and the other computer code manipulation that has been going on behind the scenes at CRU. No wonder they hide their methods and data - it's total CRAP.
It is now orange jumpsuit and vaseline time for the CRU bunch. As far as the propagandists like James, they'll fade into the bush and re-appear around the next farcical liberal "crisis" that will define and consume their lives.

By Percy R. Buttless (not verified) on 25 Nov 2009 #permalink

Ha, you global warming alarmists may as well be reading chicken entrails. The findings of that 'research' would be about as credible as your computer modeling results.

The referenced site seems pretty slow. The "top people in the field" are these:

Citation: The Copenhagen Diagnosis, 2009: Updating the World on the Latest Climate Science. I. Allison, N.L. Bindoff, R.A. Bindschadler, P.M. Cox, N. de Noblet, M.H. England, J.E. Francis, N. Gruber, A.M. Haywood, D.J. Karoly, G. Kaser, C. Le Quéré, T.M. Lenton, M.E. Mann, B.I. McNeil, A.J. Pitman, S. Rahmstorf, E. Rignot, H.J. Schellnhuber, S.H. Schneider, S.C. Sherwood, R.C.J. Somerville, K. Steffen, E.J. Steig, M. Visbeck, A.J. Weaver. The University of New South Wales Climate Change Research Centre (CCRC), Sydney, Australia, 60pp.

and the direct link to the PDF is here.

Pierce Butler: If you take the averages and means as the predictions, they do understate the problem, but if you look to the worst case scenarios, they do seem to encompass the observations. For example, we're tracking the A1FI CO2 emission projection pretty well, ~0.5GtC/y higher than the average of those other wimpy scenarios.

So apart from the removal of a few troublesome editors, blocking FOI requests, illegally deleting emails, blocking any contrarians from publication, the conflict of interest of peer-reviewing each others papers, cherry picking data, maintaining a database with virtually zero quality control, and modifying code to hide inconvenient trends as needed: please tell me - exactly what is wrong with the science? (snicker snicker)

I took the liberty to write some simplified code for CRU to use....

For intYear = 1400 to 2009;
floatGlobalMeanTemperature = floatGlobalMeanTemperature + WHATEVER_THE_HELL_YOU_WANT_IT_TO_BE;
intYear++
next

Print "Holy Crap! We're all going to DIE"

By Phillip Jones (not verified) on 25 Nov 2009 #permalink

Didn't Russian hackers discover the emails about temperature "tricks" to prevent declining numbers?

No. Next?

The scientists were describing scientific method models that used temperatures from different times of the day in order to hide declining temperatures. People tell me that scientists care about their job, and wouldn't fudge numbers.

Wow, this isn't even close. I'm curious, which denialist site did you find this lie on? Seriously, I want to know!

Finally, wake up and recognize that a single volcanic explosion launched high enough into the stratosphere will drastically combat global warming by cooling the planet.

Yes - for about two years. But what does this have to do with increased CO2 leading to temps slowly rising? Nothing.

So apart from the removal of a few troublesome editors, blocking FOI requests, illegally deleting emails, blocking any contrarians from publication, the conflict of interest of peer-reviewing each others papers, cherry picking data, maintaining a database with virtually zero quality control, and modifying code to hide inconvenient trends as needed: please tell me - exactly what is wrong with the science? (snicker snicker)

So, apart from lying about the content of the stolen e-mail, what exactly is wrong with denialism?

So apart from the removal of a few troublesome editors

Name one editor who has been removed.

blocking FOI requests

Denied because the data being requested doesn't belong to UEA CRU. McI et al need to ask the owners of the 2% of the data that aren't already in the public domain and freely available over the internet.

illegally deleting emails

There was frustrated talk of deleting e-mails. No evidence that any were deleted.

blocking any contrarians from publication

Lindzen, Christy, Spencer, etc all get published in the peer-reviewed literature. I guess we have a different definition of the verb "to block".

the conflict of interest of peer-reviewing each others papers

Well, if you think it's a conflict of interest for biologists to review biology papers, physicists physics papers, etc then I guess you're right. What do you think the word "peer" means?

cherry picking data

Give a specific example, please.

maintaining a database with virtually zero quality control

Let's use GISTEMP, then.

and modifying code to hide inconvenient trends as needed

You seriously want to replace the instrumental temperature record with derived tree-ring proxy information when the two conflict? This is one of the best denialist lines yet - "tree ring proxies are useless yet we should use them rather than the temp record if proxies show cooling while thermometers show warming".

That's stupid.

please tell me - exactly what is wrong with the science? (snicker snicker)

Nothing. What's wrong is that you're a lying idiot.

To dhogaza the Kool aid drinker......

One word: Harry_Read_Me.txt

Read it, and then re-read the emails. You won't because you don't want the truth.

rock on buddy.

By dpigslopper (not verified) on 25 Nov 2009 #permalink

From the CRU code file osborn-tree6/briffa_sep98_d.pro, used to prepare a graph purported to be of Northern Hemisphere temperatures and reconstructions.

;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
;
yearlyadj=interpol(valadj,yrloc,timey)
This, people, is blatant data-cooking, with no pretense otherwise. It flattens a period of warm temperatures in the 1940s - see those negative coefficients? Then, later on, it applies a positive multiplier so you get a nice dramatic hockey stick at the end of the century.

All you apologists weakly protesting that this is research business as usual and there are plausible explanations for everything in the emails? Sackcloth and ashes time for you. This isn't just a smoking gun, it's a siege cannon with the barrel still hot.

more to come....just getting started........

By programmer pete (not verified) on 25 Nov 2009 #permalink

I've already answered ESR on his own blog. He doesn't know WTF he is talking about.

Nor do you.

paulm. huffingtonpost.com?

really?

By Arianna Oh Arianna (not verified) on 25 Nov 2009 #permalink

It might be better to let some of the cooler heads among the programming professionals talk the guy down, though. Else you risk getting a nasty feedback loop started.

Particularly on the holidays when the kids are off school and the "AOL Effect" is at full blast, it's important to just back away from the computer, realizing there's always a huge spike in drivel; be thankful for whatever in your life is more interesting than this.

Uh, oh the SNOWBALL is gaining momentum (James - we're hiring. call me)

UH OH! - raw data in New Zealand tells a different story than the "official" one.

New Zealand's NIWA accused of CRU-style temperature faking

The New Zealand Government's chief climate advisory unit NIWA is under fire for allegedly massaging raw climate data to show a global warming trend that wasn't there.

The scandal breaks as fears grow worldwide that corruption of climate science is not confined to just Britain's CRU climate research centre.

In New Zealand's case, the figures published on NIWA's [the National Institute of Water and Atmospheric research] website suggest a strong warming trend in New Zealand over the past century:

The caption to the photo on the NIWA site reads:

From NIWA's web site - Figure 7: Mean annual temperature over New Zealand, from 1853 to 2008 inclusive, based on between 2 (from 1853) and 7 (from 1908) long-term station records. The blue and red bars show annual differences from the 1971-2000 average, the solid black line is a smoothed time series, and the dotted [straight] line is the linear trend over 1909 to 2008 (0.92°C/100 years).

But analysis of the raw climate data from the same temperature stations has just turned up a very different result:

Gone is the relentless rising temperature trend, and instead there appears to have been a much smaller growth in warming, consistent with the warming up of the planet after the end of the Little Ice Age in 1850.

The revelations are published today in a news alert from The Climate Science Coalition of NZ:

Straight away you can see there's no slope - either up or down. The temperatures are remarkably constant way back to the 1850s. Of course, the temperature still varies from year to year, but the trend stays level - statistically insignificant at 0.06°C per century since 1850.

Putting these two graphs side by side, you can see huge differences. What is going on?

Why does NIWA's graph show strong warming, but graphing their own raw data looks completely different? Their graph shows warming, but the actual temperature readings show none whatsoever!

Have the readings in the official NIWA graph been adjusted?

It is relatively easy to find out. We compared raw data for each station (from NIWA's web site) with the adjusted official data, which we obtained from one of Dr Salinger's colleagues.

Requests for this information from Dr Salinger himself over the years, by different scientists, have long gone unanswered, but now we might discover the truth.

Proof of man-made warming

What did we find? First, the station histories are unremarkable. There are no reasons for any large corrections. But we were astonished to find that strong adjustments have indeed been made.

About half the adjustments actually created a warming trend where none existed; the other half greatly exaggerated existing warming. All the adjustments increased or even created a warming trend, with only one (Dunedin) going the other way and slightly reducing the original trend.

The shocking truth is that the oldest readings have been cranked way down and later readings artificially lifted to give a false impression of warming, as documented below. There is nothing in the station histories to warrant these adjustments and to date Dr Salinger and NIWA have not revealed why they did this.

One station, Hokitika, had its early temperatures reduced by a huge 1.3°C, creating strong warming from a mild cooling, yet there's no apparent reason for it.

We have discovered that the warming in New Zealand over the past 156 years was indeed man-made, but it had nothing to do with emissions of CO2 - it was created by man-made adjustments of the temperature. It's a disgrace.

NIWA claim their official graph reveals a rising trend of 0.92ºC per century, which means (they claim) we warmed more than the rest of the globe, for according to the IPCC, global warming over the 20th century was only about 0.6°C.

NIWA's David Wratt has told Investigate magazine this afternoon his organization denies faking temperature data and he claims NIWA has a good explanation for adjusting the temperature data upward. Wratt says NIWA is drafting a media response for release later this afternoon which will explain why they altered the raw data.

"Do you agree it might look bad in the wake of the CRU scandal?"

"No, no," replied Wratt before hitting out at the Climate Science Coalition and accusing them of "misleading" people about the temperature adjustments.

Obvious manipulation of raw data is becoming all too commonplace as a skeptical world begins to put the global warming society under the microscope.

By Reality sinks in (not verified) on 25 Nov 2009 #permalink

Pseudonymous commenter @ # 9 - I have the guts to write here under my own name. How 'bout you?

Dave X @ # 12: ... if you look to the worst case scenarios, they do seem to encompass the observations.

Thanks for the feedback (intelligent & informed comments are pretty sparse in the immediate environment, yaknowwhaddimean?).

It's a sad reflection on our times when "the worst-case scenarios are occurring" comes across as a relatively optimistic statement that maybe somebody somewhere does have a handle on the situation.

By Pierce R. Butler (not verified) on 25 Nov 2009 #permalink

Read "Climate Cover-up." It does a thorough job of documenting the history of individuals and groups of individuals doing their utmost to deceive and manipulate the public. These emails are trivial in comparison.

The critics are a collection of kettles calling a small pot black.

In NZ, NIWA was right (NZ is warming) and the denialist non-scientists at NZCSC are complete crooks.

http://hot-topic.co.nz/nz-sceptics-lie-about-temp-records-try-to-smear-…

This one's simple. The denialists are pretending that temperature data from different sites was from the same site.

You know, surely, that temperature varies from place to place? Warmer at the airport than at Kelburn? You have to correct for that to spot a long-term trend, since the choice of measuring stations has changed over the years.

Ignoring the difference between local sites is a pretty simplistic way to generate garbage, but it happens to hide the warming trends, so the denialists in NZ (no climate scientists among them, despite their organization name) are using it.

This lie is so simple I think even casual readers will be able to understand why the government is right and the denialists are wrong in this case.

By Nathanael (not verified) on 26 Nov 2009 #permalink

"yearlyadj=interpol(valadj,yrloc,timey)"
That result is not used anywhere!!!!!!
Posted by: Phil | November 26, 2009 11:40 AM
----------------------------------
Phil? Phil Jones? :D
Is it sooo difficult to look at the next line?
densall=densall+yearlyadj
densall is used in plotting.

PS I looked at Harris's version, not Osborn's one. In Osborn's version the line in question is commented out, but it is still there and could easily be activated with few keystrokes.
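
For anyone who wants to poke at the quoted adjustment series without an IDL license, here is a minimal Python sketch of what those lines compute. numpy.interp stands in for IDL's interpol (note the argument order differs), and the yearly axis timey is an assumption on my part - it is not defined in the quoted code.

```python
# A minimal re-creation of the quoted IDL lines in Python.
# ASSUMPTION: timey is a yearly time axis; it is not defined in the quoted code.
import numpy as np

# yrloc = [1400, findgen(19)*5 + 1904] -> 1400, then 1904, 1909, ..., 1994
yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904))
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

if len(yrloc) != len(valadj):
    raise ValueError("Oooops!")

timey = np.arange(1400, 1995)                 # assumed yearly time axis
# IDL: interpol(valadj, yrloc, timey); numpy puts the query points first
yearlyadj = np.interp(timey, yrloc, valadj)

print(yearlyadj[timey == 1934])   # small negative adjustment in the 1930s
print(yearlyadj[timey == 1994])   # large positive adjustment at the end
```

Plotting yearlyadj against timey shows the shape being argued about in this thread: flat until the early 20th century, a dip around the 1920s-40s, then a steep rise late in the century.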


Oh, gosh, scientists poking and prodding at data, trying to understand stuff ...

SCANDAL, I SAY! SCANDAL!

(all caps in an effort to capture the proper level of denialist pseudo-rage)

I just came across something that absolutely amazes me about the IPCC report and the Hadley CRU manipulation of data and statistics, and yes I used the word "Manipulation" specifically and in a derogatory sense. The IPCC, and Dr. Jones in particular, use a baseline to reference temperature anomalies against. Pretty normal scientific method, and you do have to have some sort of baseline to compare against. So what average did they choose as a baseline? 1961 to 1990. This is extremely interesting, as the period between 1950 and the early 80's was known to be a particularly cold period of decreasing average temperatures. Remember the hype in the 70's about the coming of the next ice age. So why would you choose the coldest decades of this century as your baseline reference to evaluate temperature anomalies against? You would if you had a preconceived theory you wanted to prove and specifically wanted to show global warming as temperatures recovered after that point. It really depends on your starting point and ending point for the reference baseline. Choose some other time frame for a reference average and you get a different average temperature, and therefore different anomalies and different trends. Choose that specific time period and you are guaranteed to show global warming. What, you don't believe me? Want to call me a "Denier"? Want references? Then go straight to the horse's mouth:
http://ipcc-wg1.ucar.edu/wg1/Report/AR4WG1_Print_Ch03.pdf and look at Figure 3.1 on page 8 of the PDF file. The caption for Figure 3.1 reads: "Annual anomalies of global land-surface air temperature (°C), 1850 to 2005, relative to the 1961 to 1990 mean for CRUTEM3".
The graph is designed very specifically to highlight what they wanted to show, which is warming. Because the y-axis is the difference from a baseline reference average, if you change the baseline average value the trends will look much different (they can actually reverse in portions). Set the baseline to the average temperature from 1900 to 1990 and you have to include all the data from the 30's to 50's, which raises the overall average, and suddenly the differentials will "look" much different and the "amount" of warming in the last 2 decades is way smaller and way less impressive looking. In fact it becomes a non-issue; rather, it's just within normal variability.

By Phyllograptus (not verified) on 27 Nov 2009 #permalink

This whole mess is the typical end game of any politically driven social engineering attempt. Eventually someone flips the light switch on, and the roaches scatter, leaving their putrid droppings behind ..............................

yawn.

By James Lee, Ph.D. (not verified) on 27 Nov 2009 #permalink


I just came across something that absolutely amazes me about the IPCC report and the Hadley CRU manipulation of data and statistics, and yes I used the word "Manipulation" specifically and in a derogatory sense.

Choice of baseline has absolutely no effect on trend.

(x1 - b) - (x0 - b) = x1 - x0 for all values of b(aseline).
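
The identity above is easy to check numerically. Here is a quick sketch with an invented temperature series (the numbers are purely illustrative, not real data): anomalies computed against a 1961-1990 baseline and against a whole-record baseline differ only by a constant, and a least-squares fit returns the same trend either way.

```python
# Check that baseline choice shifts anomalies by a constant but leaves
# the fitted trend untouched. The temperature series here is invented.
import numpy as np

years = np.arange(1900, 2010)
temps = 14.0 + 0.007 * (years - 1900) + 0.2 * np.sin(years / 7.0)

base_6190 = temps[(years >= 1961) & (years <= 1990)].mean()  # subset baseline
base_full = temps.mean()                                     # whole-record baseline

anom_6190 = temps - base_6190
anom_full = temps - base_full

slope_6190 = np.polyfit(years, anom_6190, 1)[0]
slope_full = np.polyfit(years, anom_full, 1)[0]

print(slope_6190, slope_full)           # same trend: (x1 - b) - (x0 - b) = x1 - x0
print(np.ptp(anom_6190 - anom_full))    # ~0: the two series differ only by a constant
```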

Phyllograptus,
If a different baseline was used, the temperature anomaly lines would look *exactly* the same, only the labels on the y-axis would shift by a constant amount. Think about it a bit.

The lines would look the same IF they were plotting Temperature on the y-axis. They are plotting the DIFFERENCE. Raise the average baseline value and the difference of the temperature from the average changes. If the new, higher baseline is above the baseline they chose, any temperature value lying between the two averages automatically becomes a negative anomaly when it used to be a positive anomaly. The positive anomalies all get smaller. The negative anomalies all get bigger. Therefore the "look" changes.
Look at the graph and think about it!

By Phyllograptus (not verified) on 27 Nov 2009 #permalink

Phyllograptus,
If the decade 2000-2009 were chosen as baseline, so that all but the most recent temperature anomalies were negative, the curves would still be identical as before, but the y-axis labels, including the zero anomaly tickmark, would all shift upward by six ticks or so. The anomaly trends would remain the same as before.

Presumably Phyllograptus flunked childhood algebra ...

LOL
You guys are proving my point. It's an elegantly simple deceptive graph.

Okay. To get brutally simple: say you have some time series data
Time (X), value(Y)
1,2
2,3
3,5
4,5
5,6
6,7
7,7
8,8
9,8
10,9
The average of ALL of the above value series is 6.
Processed dataset 1 using average of 6
So the time series of the difference from the average is
1,-4
2,-3
3,-1
4,-1
5,0
6,+1
7,+1
8,+2
9,+2
10,+3
However if I take the average of just the first 5 values in the time series, the average is 4 (actually 4.2, but let's round for simplicity's sake)
Processed dataset 2 using average of 4
So now the time series of the difference from the average (4) based on a SUBSET of the data is
1,-2
2,-1
3,+1
4,+1
5,+2
6,+3
7,+3
8,+4
9,+4
10,+5
If I graph Processed Dataset One and Processed Dataset Two, the graphs are going to look different. Look at time 4: Dataset One's value is -1, Dataset Two's value is +1 - opposite deviations from the average - and the negative values are reduced in magnitude while the positives are increased in magnitude. So if I have a data set increasing in time and I want to create the maximum appearance of deviance as time progresses, I select a SUBSET of the data to generate a low average and compare that average value to the ENTIRE dataset. Voilà, instant magnified trend.
But wait, you say, that's just a trick because you only used a SUBSET of the data - and at the low end at that - to calculate the average of 4; otherwise you couldn't do this.
Which is what my point was in my original post. The IPCC took a SUBSET of the data at a period of low average temperatures to calculate the baseline average (1961-1990) and calculated the difference from that baseline average for the ENTIRE dataset. Voilà!
However, choose a different SUBSET and you get a different average, and therefore a different dataset of DIFFERENCE values for the same ENTIRE dataset, and a different-looking graph. Aren't tricks fun!

By Phyllograptus (not verified) on 27 Nov 2009 #permalink

And you still get the exact same shape if you plot the difference numbers, and you still get the exact same slopes to your trends.

Your example illustrates precisely that your amazing problem that nobody in the scientific community has seen ...

... isn't a problem at all.

By Michael Ralston (not verified) on 28 Nov 2009 #permalink
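
The point about identical slopes can be confirmed with the ten-point toy series from the earlier comment. This sketch just re-does that arithmetic and fits a line to both anomaly sets:

```python
# The ten-point toy series from the thread, with anomalies taken against
# the whole-series average (6) and the first-five-values average (rounded to 4).
import numpy as np

t = np.arange(1, 11, dtype=float)
y = np.array([2, 3, 5, 5, 6, 7, 7, 8, 8, 9], dtype=float)

anom_all = y - 6.0   # "Processed dataset 1"
anom_sub = y - 4.0   # "Processed dataset 2"

slope_all = np.polyfit(t, anom_all, 1)[0]
slope_sub = np.polyfit(t, anom_sub, 1)[0]

print(slope_all, slope_sub)      # identical slopes (about 0.727 per time step)
print(anom_sub - anom_all)       # a constant +2 everywhere
```

Individual bars do change sign (time 4 goes from -1 to +1, as the comment says), but the trend - the thing the figure is about - is unchanged.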

The picture that really cracks me up is the idea of a worldwide conspiracy of climate scientists cooking the data (pun intended) and getting together at meetings to plot their nefarious...something or other. I mean, have you ever been to one of these meetings?

I mean, have you ever been to one of these meetings?

No, but if they're like most other technical types, they'll do anything for free beer.

There's the key to uncovering the wholesale corruption of the climate science community - follow the beer!

Here come the armchair climatologists:

http://biodiversivist.blogspot.com/2009/11/armchair-climatologist.html

If climate change is a battle between good and evil, we need to sort out which is which. A good place to start is Grandia's new book Climate Cover-up. Not an exciting read, but it does a good job documenting the sleaze found on one side of the debate.

This video found over at A Few Things Illconsidered is worth a thousand words:

http://scienceblogs.com/illconsidered/2009/11/ccw_-_birth_of_a_climate_…

Supra Shoes - Easy! You just start back at the original un-massaged raw data. Oh, wait..................D'OH!

I guess "blind trust" is in order then. Just ask a believer.

By Dr. A. F. Lac (not verified) on 30 Nov 2009 #permalink

Those that want the raw data can go to the GHCN...oh wait, we should not be talking about facts here, it's all about making claims without knowing what we're talking about!

Quite remarkable. I see that several of you have tried to clue Phyllograptus in about the basic error of understanding he/she is labouring under, but to no effect.

So I will just look on in wonder and amazement - a perfect example of Dunning-Kruger in action.

Regards
Luke

By Luke Silburn (not verified) on 01 Dec 2009 #permalink

Okay, I'll try to explain one last time. As people have pointed out, the lines will not change; the data and the lines remain the same, and as Michael Ralston says, "you still get the exact same slope to your trends." (In fact, if you plot the very small datasets I demonstrated as a line graph, all that happens is the lines move up by 2.) That's not what I'm talking about; rather, I'm discussing the "look" or appearance of the graph. So am I splitting hairs? Yes, somewhat.
However, the graph in question is not just a line graph; it is a line graph and bar graph combined. The bar graph is dependent on how you choose the average, or in their parlance the "baseline", which is what my other post showed with the negative values becoming positive, the negatives getting smaller and the positives getting larger. The positive or negative anomalies depend on what the baseline is, and as such the bars look visually different.
The human mind is amazing at recognizing visual patterns, and I consider this an elegantly simple deceptive graph because the authors have used that ability to recognize patterns to present exactly the image they want to portray and highlight. My contention is that the choice to combine a line graph and bar graph, and to calculate the difference from a specific subset of the data, gives the graph more of what I'll call "visual weight", that is, easier pattern recognition for the viewer.
So, if the authors had chosen to calculate the average over the entire dataset, the average would have been lower, and therefore the baseline to calculate the difference from would have been lower. If this had been done, there would have been significantly more "visual weight" to the gradually increasing trend, and more of the bars would be positive anomalies, especially in the last half of the dataset. However, that is not the image they wanted to present, because it would give the impression that the warming trend started much earlier, prior to the major human-caused increases in CO2.
If the authors had chosen a subset from, say, 1900 to 1990, the average would be higher than the one they are using, so more of the bars would have negative anomalies, and the "visual weighting", especially right at the end, would be suppressed. That is also not the image they want to present; in that case the pattern that gets recognized is "so what's the problem, it just looks like natural variability", and the dramatic rise in the tail-end (most recent) data gets suppressed.
However, with the subset they did choose, you get a very simple image that shows exactly what you want it to show, and the "visual weight" of the pattern sits exactly where you want it: at the tail end, with the appearance of a sudden, positively anomalous trend in the last part of the dataset.
So that is how and why the graph would "look" different. It is a matter of using the data to present the image you specifically want to highlight. The lines by themselves do not give the image they want to present; to many people they just show a long-term trend of rising temperatures from before the major human-induced CO2 rises.
So to create the image you want to portray, you add the bars and choose the baseline that best creates the "visual weight" you want. As the old saw goes, "a picture is worth a thousand words", and trying to explain how this graph is an elegantly simple deceptive graph proves it, because it has taken thousands of words to try to explain the elegance of the image.

By Phyllograptus (not verified) on 01 Dec 2009 #permalink
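[The baseline-invariance point the thread keeps returning to is easy to verify. A minimal sketch with made-up numbers, not any real temperature record: subtracting a different baseline shifts which anomaly bars plot as positive or negative, but the fitted trend slope is identical either way.]

```python
# Made-up temperature series; the specific values are illustrative only.
temps = [14.0, 14.1, 13.9, 14.2, 14.4, 14.3, 14.6, 14.8]

def anomalies(series, baseline):
    # Anomaly = value minus the chosen baseline.
    return [t - baseline for t in series]

def slope(series):
    # Ordinary least-squares slope against the index 0..n-1.
    n = len(series)
    xbar = (n - 1) / 2
    ybar = sum(series) / n
    num = sum((i - xbar) * (y - ybar) for i, y in enumerate(series))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

full_baseline = sum(temps) / len(temps)   # mean of the whole series
early_baseline = sum(temps[:4]) / 4       # mean of an early subset

a_full = anomalies(temps, full_baseline)
a_early = anomalies(temps, early_baseline)

# Same slope either way; only the zero line (and the bar signs) moves.
print(round(slope(a_full), 6) == round(slope(a_early), 6))
print(sum(1 for a in a_full if a > 0), sum(1 for a in a_early if a > 0))
```

With these numbers, four bars are positive against the whole-series baseline but six are positive against the early-subset baseline, while the trend is unchanged, which is exactly why the commenters above keep saying the slope cannot be "massaged" by baseline choice.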

hmmmmmmmm.................I wonder what this actually means...

"Both historical and near-real-time GHCN data undergo rigorous quality assurance reviews. These reviews include preprocessing checks on source data, time series checks that identify spurious changes in the mean and variance, spatial comparisons that verify the accuracy of the climatological mean and the seasonal cycle, and neighbor checks that identify outliers from both a serial and a spatial perspective"

By GHCN Data (not verified) on 01 Dec 2009 #permalink
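[The quoted GHCN passage mentions "neighbor checks that identify outliers from ... a serial perspective". A hypothetical sketch of what such a serial check can look like, flagging values far from the local median of a surrounding window; this is an illustration only, not the actual GHCN quality-assurance code.]

```python
def flag_serial_outliers(series, window=5, threshold=3.0):
    # Flag any value whose distance from the median of its temporal
    # neighbors exceeds `threshold` times the neighbors' mean spread.
    flags = []
    for i, v in enumerate(series):
        lo = max(0, i - window)
        hi = min(len(series), i + window + 1)
        neighbors = [x for j, x in enumerate(series[lo:hi], start=lo) if j != i]
        med = sorted(neighbors)[len(neighbors) // 2]
        spread = sum(abs(x - med) for x in neighbors) / len(neighbors)
        flags.append(abs(v - med) > threshold * max(spread, 0.1))
    return flags

# A clean series with one spurious entry (e.g. a transcription error).
readings = [12.1, 12.3, 12.0, 99.9, 12.2, 12.4, 12.1]
print(flag_serial_outliers(readings))  # only the 99.9 entry is flagged
```

This is the sense in which such filters work: they catch single wild values against a stable local background, which is also why, as the next comment notes, subtler systematic errors (like duplicated station data) can slip past them.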

hmmmmmmmm.................I wonder what this actually means...

It means the data coming from the met services occasionally contains errors.

Funny, when McI found the "Y2K" blip (due to Russian stations, IIRC, duplicating data or making some other such error), much was made by the CA crowd of the fact that the error detection filters you describe didn't automatically catch that error.

Now, apparently, the existence of the error detection filters denialists earlier denigrated for not being stringent enough is "hmmm ... I wonder what this really means"-worthy.