At the Expense of the Future: Explaining the Success of Climate Science

A new concern arose around the turn of the 21st century, amid the advancements in technology and science: what is the future of our planet's climate? This is a bold question, considering the traditional problems with predicting the future. We have no evidence of future events, due to the asymmetry of time. It is difficult even to reconcile different interpretations of present conditions, because of epistemological flaws in our methods of observation. In the face of such uncertainties, and with fortunetelling abandoned along with magical thinking many years before, can science provide useful predictions about our planet? There is a chance: if we can successfully explain conditions in the past and determine that present conditions are similar, we can conclude that the future will probably proceed much as the past did.

In order to determine the accuracy of the predictions of climate science, we need to understand how those predictions are made and whether they are really predictions in the classical scientific sense of the word. Traditionally, a scientific prediction is synonymous with a hypothesis: the scientist makes an inference about the results of an experiment, and then tests to see if the results match the inference. For example, a scientist may test in a laboratory to see if carbon dioxide (CO2) can trap heat. Light enters the experiment at short wavelengths (similar to sunlight), passes through a layer of CO2, and is re-emitted at longer, infrared wavelengths (similar to "earthshine," the radiation given off by the Earth's surface). When these returning waves of heat hit the layer of gas, the longer wavelengths set the CO2 molecules vibrating; the molecules absorb the energy and re-radiate it back, rather than letting it escape. This can be tested repeatedly, and, as Karl Popper described, it could in principle be falsified. According to Popper, this is the essence of science itself: a claim must be testable in a way that could prove it false; if it cannot be tested in this way, it is not scientific. As he wrote, "a theory which is not refutable by any conceivable event is non-scientific. Irrefutability is not a virtue of a theory (as people often think) but a vice."
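
To make the laboratory picture more concrete, here is a minimal sketch of the sort of calculation involved, assuming simple Beer-Lambert attenuation; the absorption coefficient, concentration, and path length below are illustrative placeholders rather than measured values.

```python
import numpy as np

# Toy Beer-Lambert sketch: what fraction of outgoing infrared radiation
# passes straight through a layer of CO2. The absorption coefficient,
# concentration, and path length are illustrative placeholders, not data.

def transmitted_fraction(absorption_coeff, concentration, path_length):
    """Fraction of radiation transmitted through the gas layer."""
    optical_depth = absorption_coeff * concentration * path_length
    return np.exp(-optical_depth)

# More CO2 in the path means less infrared escapes straight through.
for concentration in (1.0, 2.0, 4.0):
    print(concentration, round(transmitted_fraction(0.5, concentration, 1.0), 3))
```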

While CO2 has always trapped heat under laboratory conditions, this does not immediately lead to the conclusion that future additions of CO2 into our atmosphere will trap heat and lead to widespread warming. The actual climate contains many more variables than the laboratory test. The future climate is even more difficult to approach. Not only are there more variables than in laboratory conditions, but the actual conditions are not yet available for study. (Surely Popper would frown on calling any broad attempts to study future conditions "science".) In order to study what has not happened yet, climate scientists rely on advanced models to simulate possible outcomes. Conditions tested in a laboratory can be simulated on a computer with a multitude of variables and applied at large scales.
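
As a rough illustration of what "simulating conditions on a computer" can mean at its very simplest, here is a toy zero-dimensional energy balance, not any particular published model: incoming sunlight is balanced against outgoing heat, with an effective emissivity standing in for greenhouse trapping.

```python
# A zero-dimensional energy-balance sketch (a toy, not a real climate model):
# absorbed sunlight balanced against outgoing longwave radiation, with an
# effective emissivity standing in for greenhouse trapping.

SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0           # solar constant, W m^-2
ALBEDO = 0.30         # planetary albedo

def equilibrium_temperature(emissivity):
    """Solve sigma * eps * T^4 = S0 * (1 - albedo) / 4 for T (kelvin)."""
    absorbed = S0 * (1.0 - ALBEDO) / 4.0
    return (absorbed / (SIGMA * emissivity)) ** 0.25

# emissivity = 1.0 gives a bare blackbody Earth (about 255 K);
# an effective emissivity near 0.61 stands in for greenhouse trapping (about 288 K).
for eps in (1.0, 0.61):
    print(eps, round(equilibrium_temperature(eps), 1), "K")
```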

[Image: atmosphere profile]

A profile of our atmosphere, showing the absorption of radiation by greenhouse gasses. (Remember this diagram for a chart that follows later on.)

Even if conditions can be simulated in a model, how do we know if they are even remotely accurate? In order to test the accuracy of their models, climate scientists must look not to predictive science, but to historical science. While the past (particularly the distant past) cannot be studied any more directly than the future, past climatic conditions left traces of evidence: gasses trapped in ice, temperatures recorded in tree rings, and sediments bearing telltale isotopic signatures. According to philosopher of science Carol Cleland, such traces of the past are comparatively abundant, due to the asymmetry of time; on the opposite side of this imbalance, we have no traces of the future. Cleland argues that even relatively few traces can point at a single cause (a "smoking gun") for an event that occurred in the distant past. She gives the example of a volcanic eruption: just a few traces of ash are enough to tell us the volcano erupted. As she wrote, "predicting the occurrence of an eruption is much more difficult than inferring that one has already occurred." So, using traces of evidence left by past events, scientists can work backward to reconstruct the conditions of the past.

For example, by studying isotopes of carbon left in sediments, scientists were able to determine that a large amount of carbon (several thousand gigatons, roughly the equivalent of our coal reserves) was released into the atmosphere or oceans about 55 million years ago. The spike in carbon was followed by another trend, revealed in the same sediments: signs that ocean temperatures rose by about 5°C. This was determined by studying the ratios of another isotope, this time of oxygen, recorded in the calcium carbonate (CaCO3) shells of organisms that lived in the oceans at the time. The ratio of heavy to light oxygen isotopes incorporated into a shell depends on the temperature of the water in which the shell formed, so these organisms effectively recorded the ocean temperature as they grew; their survival also depended on the availability of dissolved CaCO3 to build new shells. So, by studying the ratios of oxygen isotopes in oceanic sediments, scientists were able to reconstruct the temperature of the ocean 55 million years ago, during the event that is now called the Paleocene-Eocene Thermal Maximum, or PETM.
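
As a sketch of how such a reconstruction works, one commonly cited calibration relates the oxygen-isotope difference between shell carbonate and seawater to the water temperature; the coefficients below follow the familiar Shackleton/Epstein form, though exact values vary between studies.

```python
# A sketch of oxygen-isotope paleothermometry using one commonly cited
# calibration (coefficients of the Shackleton/Epstein form; exact values
# vary by study). Delta values are delta-18O in per mil.

def carbonate_temperature(delta_carbonate, delta_seawater):
    """Estimate water temperature (deg C) from a shell's oxygen-isotope ratio."""
    d = delta_carbonate - delta_seawater
    return 16.9 - 4.38 * d + 0.10 * d * d

# A more negative carbonate delta-18O (relative to the water) implies warmer water.
print(carbonate_temperature(-1.0, 0.0))  # roughly 21 deg C
print(carbonate_temperature(0.0, 0.0))   # roughly 17 deg C
```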

[Image: temperature record across the PETM]

During the PETM, not only did temperatures rise drastically when massive amounts of carbon were released (Archer 2007, p. 136), but marine animals suffered as well. The excess carbon, dissolved in seawater as carbonic acid, ate away at the available CaCO3. Not only did many living organisms go extinct, but the shells of their ancestors, already settled in the sediments, were dissolved as well. While we cannot determine the cause of the event (a "smoking gun" has yet to be found to incriminate the guilty party), we can see the tragic results. In order to create long-term models of the historic climate, scientists look at clues left from anomalous events like the PETM in addition to variations in temperature over time, as revealed in continuous records like ice cores and tree rings. These can be read backwards in time, revealing gradual changes in temperature and CO2 concentration.
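
The dissolution argument can also be put in rough numerical terms: shells and carbonate sediments tend to dissolve when the saturation state (the product of calcium and carbonate-ion concentrations divided by the solubility product) falls below 1, and adding CO2 converts carbonate ions to bicarbonate, pushing that ratio down. The concentrations and solubility product in this sketch are illustrative placeholders, not PETM values.

```python
# A toy sketch of carbonate saturation state: CaCO3 shells and sediments
# tend to dissolve when Omega drops below 1. The concentrations and
# solubility product below are illustrative placeholders, not measurements.

def saturation_state(calcium, carbonate_ion, ksp):
    """Omega = [Ca2+][CO3 2-] / Ksp; Omega < 1 favors dissolution of CaCO3."""
    return calcium * carbonate_ion / ksp

# Adding CO2 converts carbonate ions to bicarbonate, lowering [CO3 2-] and Omega.
before = saturation_state(calcium=0.0103, carbonate_ion=2.0e-4, ksp=4.3e-7)
after = saturation_state(calcium=0.0103, carbonate_ion=1.0e-4, ksp=4.3e-7)
print(round(before, 2), round(after, 2))
```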

Computer models must simulate a variety of conditions to match the historical records, factoring in not only the relationship between CO2 and temperature, but variations in precipitation, geography, and, most importantly, the source of the heat: the sun. Most of the historical temperature record (anomalous events aside) correlates strongly with changes in the earth's orbit around the sun. The overall amount of solar radiation reaching the earth changes with the geometry of that orbit. Milutin Milankovitch was among the first to describe these changes, in the early 20th century, although decades passed before the effects were completely understood. He found that by tracking the distance of the planet from the sun as it orbits, as well as the tilt and wobble of its axis along the way, one can account for temperature changes on time scales of tens of thousands to hundreds of thousands of years. Models based on the Milankovitch cycles and other natural feedbacks accurately describe the historical temperature record up until the past century, when we find anomalous warming that cannot be explained by natural phenomena. (For more information, please see David Archer's Global Warming: Understanding the Forecast, an informative text with accompanying online models.)
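
As a toy illustration (not an actual insolation calculation), the orbital story is often summarized as three quasi-periodic components; the periods below are the commonly quoted ones, while the amplitudes and phases are arbitrary.

```python
import numpy as np

# A toy illustration of how orbital (Milankovitch) cycles are often summarized:
# three quasi-periodic components combined into a single forcing curve.
# Periods are the commonly quoted ones; amplitudes and phases are arbitrary.

t = np.linspace(0, 800_000, 4001)  # years before present

eccentricity = 0.6 * np.sin(2 * np.pi * t / 100_000) + 0.3 * np.sin(2 * np.pi * t / 405_000)
obliquity    = 0.5 * np.sin(2 * np.pi * t / 41_000)
precession   = 0.4 * np.sin(2 * np.pi * t / 23_000)

toy_forcing = eccentricity + obliquity + precession

print(toy_forcing[:5])  # a crude stand-in for variations in solar radiation received
```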

Unlike the mysterious cause of the tragic events of the PETM, there are enough traces of evidence to suggest a specific cause for recent changes in temperature. In this case, rather than depending on isotopic ratios to reconstruct historical temperatures, we can look at direct measurements from thermometers, located not only on the earth's surface and underground, but also at various levels of the atmosphere, thanks to satellites and weather balloons. All of these records show a warming trend over recent decades virtually everywhere, except in the lower stratosphere, the layer just above the troposphere where most of the greenhouse gasses do their work. That layer has actually been cooling while the air trapped beneath it has been warming. This is the crucial evidence for pinpointing a specific cause: if the temperature rise were due to anything other than heat trapped by an excess of CO2 below, we would expect the stratosphere to be warming right along with the troposphere beneath it.

[Image: global atmospheric temperature records]

Historical records for stratospheric and tropospheric temperatures. (Compare this with the atmospheric profile diagram shown above.)

If models can accurately depict past events (that is, produce accurate hindcasts) and reproduce present conditions, they can be used to study the probabilities of different future conditions; in other words, to produce forecasts. If we find crucial evidence that a cause is present which is similar to causes of past events, we can assume the effects in the future will be similar to the effects in the past. In Carol Cleland's terms, if we find a smoking gun that has recently been fired, we can infer that victims of the shooting may soon be found. In this case, we, the consumers of fossil fuels who add excess CO2 to the atmosphere, stand accused of global warming, while temperature records and climate models provide more than enough crucial evidence for a solid conviction.

In 1988, climatologist James Hansen produced a model that incorporated anthropogenic warming and ran it for various scenarios through the year 2020. In order to do this, Hansen's model included many other atmospheric variables in addition to CO2 emissions from fossil fuel burning. For instance, Hansen knew that volcanic eruptions occur with semi-periodic frequency, averaging about one large eruption per decade. (He could not know when an eruption would occur, of course, for the reasons described above by Carol Cleland.) So, when running his models, he added a volcanic eruption in the middle of the 1990s. The actual eruption, Pinatubo, occurred in 1991. Hansen later sent a graph comparing his predictions with the observed record to the author and climate skeptic Michael Crichton, and it clearly exhibited how closely the model had tracked reality. The discrepancy in the volcano's timing, small as it was, remained the only obvious inaccuracy in the entire prediction.
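
To illustrate how different emission scenarios translate into different temperature trajectories, here is a toy one-box model driven by two hypothetical CO2 pathways; it is emphatically not Hansen's actual 1988 model, and its parameter values are rough placeholders.

```python
import numpy as np

# A toy one-box climate model (not Hansen's actual 1988 model) driven by two
# hypothetical CO2 pathways, to show how "scenarios" become different
# temperature trajectories. Parameter values are rough placeholders.

LAMBDA = 0.8   # K per (W m^-2), a commonly quoted sensitivity scale
TAU = 20.0     # years, a crude response time for the ocean mixed layer
C0 = 280.0     # pre-industrial CO2, ppm

def project(co2_ppm, dt=1.0):
    """Step a one-box model: dT/dt = (lambda * forcing - T) / tau."""
    temp = 0.0
    temps = []
    for c in co2_ppm:
        forcing = 5.35 * np.log(c / C0)   # standard simplified CO2 forcing formula
        temp += dt * (LAMBDA * forcing - temp) / TAU
        temps.append(temp)
    return np.array(temps)

years = np.arange(1988, 2021)
business_as_usual = 350.0 * 1.005 ** (years - 1988)          # hypothetical growth path
constant_concentration = np.full_like(years, 350.0, dtype=float)

print(project(business_as_usual)[-1], project(constant_concentration)[-1])
```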

[Image: Hansen's 1988 model projections]

James Hansen's predictions for several different scenarios based on emissions. A is a high-growth scenario, B is "business as usual," and C is a "greener" future. According to the caption, "Shaded area is an estimate of the global temperature during the peak of the current interglacial period (the Altithermal, peaking about 6,000 to 10,000 years ago, when we estimate that global temperature was in the lower part of the shaded area) and the prior interglacial period (the Eemian period, about 120,000 years ago, when we estimate that global temperature probably peaked near the upper part of the shaded area). The temperature zero point is the 1951-1980 mean."

As we have shown, climate science does not rely solely on the falsifiable methods of experimental science. Instead, it takes an interdisciplinary approach, incorporating many methods, both experimental and historical. These methods rest on a set of assumptions shared by many different types of scientists. According to philosopher of science Larry Laudan, the effectiveness of science lies not in methodology, but in the success of problems solved within an overarching research tradition which incorporates the sorts of assumptions used by climate scientists. As Laudan describes them, research traditions serve specific functions: "They indicate what assumptions can be regarded as uncontroversial 'background knowledge' to all scientists working in that tradition" and "establish rules for the collection of data and for the testing of theories." The research tradition within which climate science is currently conducted sets such rules for methodology and provides useful assumptions.

For instance, scientists use Ockham's razor, the idea that the simplest adequate explanation is the best, when they suggest that an increase in greenhouse gasses is more likely to be the cause of warming than anomalies in radiation coming from the sun; it would take a far more convoluted theory to explain the differences between the temperatures in the troposphere and the stratosphere. Climate scientists also make assumptions about the nature of time when they apply the principle of uniformity, which holds that processes happening on earth now operate the same way as they did in the distant past and will in the future. Principles like Ockham's razor and uniformity remain in use because they keep working; a better approach has yet to come along.

Theories which argue against anthropogenic causes of global warming are not as successful within Laudan's view, as they tend to create more methodological problems than they solve. Competing theories often operate outside of the research tradition, favoring outlandish causes for temperature rise over simpler models. In some cases, skeptics argue that observations disagree with the theory that warming is happening, but fail to provide evidence supporting this. In other cases, evidence is manipulated or omitted to show different conclusions. Without any supporting data, the skeptics can at best argue that observations are incomplete, citing empirical problems with science at large. In Laudan's view, this is not enough to replace the conclusions drawn by most climate scientists, which are more successful at problem-solving within the accepted research tradition.

Working within the current research tradition to explain conditions both past and present, climate scientists can create useful predictions and conclude that future events may resemble events in the past. In this vein, thousands of models have been constructed over the past decades to predict the future conditions of our climate, on both global and regional levels. From these, we can conclude that recent additions of CO2 to our atmosphere will lead to a rise in global temperatures of between 1.2°C and 4.1°C over the next century, which may have catastrophic effects. Many of the effects are already beginning to show: polar ice is thinning faster than expected, and temperature records are being broken with greater frequency. Climate science can offer useful predictions; can we afford not to listen to them?

This was my final term paper for my philosophy of science class in 2007. Image credits: Atmosphere profile image via the Department of Physics and Astronomy, University of Georgia, temperature record showing PETM via Global Warming Art, atmospheric temperature record charts via the Met Office Hadley Centre.

Super paper! My goal in the world is to search for some sort of synthesis as well ... very well put!

By Allan Evans (not verified) on 14 Jan 2008

Try using IPCC HadCrut3 data and the projections of the various IPCC reports. The forecasts are not even close to reality.

The man responsible for IPCC data, Dr Phil Jones, admits there has been no warming for the past 15 yrs.

IPCC lead author Trenberth says "it's a travesty" they can't explain the lack of warming.

Your own plot for stratosphere temps shows warming for the past 15 yrs, instead of the IPCC forecast cooling.

There is not a single piece of hard evidence for man-caused global warming. If there was, it would be in the latest IPCC report ... but there is NONE!

By Dr A Burns (not verified) on 19 Mar 2011

Dear Dr. Burns,

I am not really sure what to make of your claims. Could you be more specific about which temperature projection I should be comparing?

I don't know quite what you mean re: stratospheric temps. The graph presented only extends to 2007, and thus could not show the trend over the 15 years prior to your 2011 comment. I checked the UAH satellite record for March 1994 - March 2011. It shows slight stratospheric cooling, not warming. That cooling is not statistically significant; however, the last 15 years *pictured* do show a highly significant (P<10^-6) decline. I think the real question is what is happening with the 'tail' following the Pinatubo eruption. Approximating this as June 1993 - Present, I find a significant (P<0.01) decline, though I haven't tested this for robustness.
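
If you want to reproduce this kind of check yourself, the test is just an ordinary least-squares trend and its p-value; here is a rough sketch with synthetic data standing in for the UAH series (not the actual record):

```python
import numpy as np
from scipy import stats

# Sketch of a linear trend significance test. The data below are synthetic
# stand-ins (a hypothetical slight cooling plus noise), not the UAH record.

rng = np.random.default_rng(0)
months = np.arange(180)                                      # 15 years, monthly
temps = -0.002 * months + rng.normal(0, 0.3, months.size)    # toy anomaly series

result = stats.linregress(months, temps)
print(f"trend = {result.slope * 120:.3f} deg per decade, p = {result.pvalue:.3g}")
```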

Your comment re: Phil Jones is a bit of an oversimplification, again involving statistical significance. He said that there was warming, but that it was not significant at the 95% confidence level.
http://www.skepticalscience.com/Phil-Jones-says-no-global-warming-since…
There are other criticisms (including from Trenberth himself) of the quotemines taken from the stolen CRU emails.

As for evidence for AGW, have you *read* the AR4? Maybe you really do want a single piece of evidence, some keystone that decides things definitively. But I think the real strength of AGW as a theory is that it rests on a great number of pieces of evidence - a cooling stratosphere, nights warming faster than days, changing seasonal timing and migration of species, etc. We could imagine a handful of mistakes throughout this collection of evidence, but even if a few pieces were critically flawed, it wouldn't matter - there is still a huge weight of evidence even after discarding false positives.

Sorry; that should read (P<10^-6 for Sept 1992 - Sept 2007)

Great article! I've read it and probably need to think about it some and reread it. I thought it was interesting to discuss the asymmetry in the burden of proof for past vs. future events. On the other hand, I don't think it's out of the ordinary for theories to make statements about future states - how can a theory be falsifiable if it doesn't make predictions about future states?

You mention ocean acidification and the PETM, which are some of my favorite topics :) I actually have a series of articles which look at misinformation about ocean acidification. One of the issues I explore is how to justify extrapolating trends in the future. (Part II.5 in particular)

The series can be found here:
http://topologicoceans.wordpress.com/tag/john-everett/

take care!