This story on Bob Carter in the Age is a good one for playing
[Global Warming Skeptic
Bingo](http://scienceblogs.com/deltoid/2005/04/gwsbingo.php). Though I think I
should add a rule to the effect that if a numerical claim is wrong by
more than an order of magnitude you get a free square on the bingo
board. Look at what Carter claims:
>Carbon dioxide was a minor greenhouse gas, responsible for 3.6 per
>cent of the total greenhouse effect, [Carter] said. Of this, only 0.12 per
>cent, or 0.036 degrees Celsius, could be attributed to human activity.
But [calculations show](http://www.realclimate.org/index.php?p=142) that without
CO2 the greenhouse effect would be about 91% as strong; that is, CO2
accounts for roughly 9% of it.
Further, he implies that only 0.12/3.6=3% of the CO2 in the
atmosphere is due to human activity. But the concentration of
CO2 in the atmosphere has increased [from 280 ppm to 380
ppm](http://www.grida.no/climate/ipcc_tar/wg1/fig3-2.htm) and this
increase is [all due to human
activity](http://www.realclimate.org/index.php?p=160). So, correcting
Carter's numbers we have that 100/380=26% of the CO2 in the
atmosphere is anthropogenic, so 26% of 9%=2.4% of the greenhouse
effect, or 0.7 degrees Celsius, is man-made. Carter is wrong by a
factor of 20. Actually he's wrong by more than a factor of 20 since
his calculation assumes that the quantity of water vapour in the
atmosphere is fixed and this isn't true. As the globe warms there is
more water vapour in the atmosphere, and this further strengthens the
greenhouse effect.
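The arithmetic above can be checked in a few lines. This is a sketch using the post's round figures; the 30 C total greenhouse warming is the figure implied by Carter's own numbers, since he says 0.036 C is 0.12 per cent of the total:

```python
# Checking Carter's figure against the corrected calculation.

# Total greenhouse warming implied by Carter's own numbers:
# he says 0.036 C is 0.12% of the total.
total_greenhouse_C = 0.036 / 0.0012        # = 30 C

co2_share = 0.09                           # CO2's share of the greenhouse effect (~9%)
anthropogenic_share = (380 - 280) / 380    # ~26% of atmospheric CO2 is man-made

manmade_C = total_greenhouse_C * co2_share * anthropogenic_share
print(round(manmade_C, 2))                 # ~0.71 C

print(round(manmade_C / 0.036))            # Carter is off by a factor of ~20
```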
So how did something this inaccurate get into the Age?
Well, Carter gave a speech to the Victorian Farmers Federation, so the
reporter who wrote the story was their agricultural reporter rather
than their science reporter, who might have noticed that Carter was
spouting a load of rubbish.
Tim, when you say
>So, correcting Carter's numbers we have that 100/380=26% of the CO2 in the atmosphere is anthropogenic, so 26% of 9%=2.4% of the greenhouse effect, or 0.7 degrees Celsius, is man-made.
are you really saying that all of the 20th century warming can be attributed to human CO2 emissions? Either that, or you are claiming that there was more than 0.7 C of 20th century warming.
Jet, it is actually the case that nearly double the temperature increase observed during the 20th century is attributable to humans, because we have around 0.5 C left in the pipeline as the oceans are still warming up. This will flow through into global temperatures over the next 50-100 years, even if we cease adding to the anthropogenic greenhouse effect.
David, I will not argue that point, but the "20th century warming" is not taking that into account.
Tim, I'm afraid it may be time to expand your Bingo grid to allow for Carter's apparent claim (possibly distorted by the reporter, to be fair) that the reduction in average global temperature since 1998 somehow bolsters his argument that we are beginning a long-term cooling trend. I've seen this factoid a few other times, although it's clearly one that the more rational septics would tend to avoid.
Also, regarding the three prior posts, global dimming has been a major factor in keeping things a lot cooler than they otherwise would have been. There's an extensive discussion of this and related issues at www.realclimate.org.
What are the odds (or more importantly, which model predicted it) that with diminishing effects of global dimming and increase in the pace of CO2 output that there would be 7 years of either little warming or slight cooling? Makes a lot more sense when contrasted with the strong GCR record for that decade. Anthropogenic greenhouse gases may still account for less than 50% of total positive forcings instead of the 60-75% that is current "fact".
Jet, it sounds as if you're already sufficiently familiar with this stuff to know that no model makes any prediction for such a short period of time. Seven years is weather, not climate. In that context, the odds are actually pretty good that during an overall warming climate trend one can look in isolation at a period of time immediately following a record high year and find an apparent cooling trend. Similarly, one can look at a period of time up to and including a record year and find a short-term heating trend. I believe this is why running averages are generally considered more useful; the highs and lows get knocked out and the longer-term trend becomes apparent.
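The point about running averages can be illustrated with a toy series. These are synthetic numbers, purely for illustration: a steady warming trend with a single 1998-style record spike shows an apparent "cooling trend" if you only look at the years after the spike, while the running mean still ends higher than it starts.

```python
def running_mean(series, window=5):
    """Centred moving average; trims window//2 points at each end."""
    half = window // 2
    return [sum(series[i - half:i + half + 1]) / window
            for i in range(half, len(series) - half)]

# Toy annual "anomalies": steady warming plus a 1998-style record spike.
temps = [0.00, 0.02, 0.04, 0.06, 0.30, 0.10, 0.12, 0.14, 0.16, 0.18]

# Looking only at the years after the record, every year is "cooler":
# an apparent cooling trend.
assert all(t < temps[4] for t in temps[5:])

# The 5-year running mean still ends higher than it starts: the
# underlying warming trend is what survives the smoothing.
smoothed = running_mean(temps)
assert smoothed[-1] > smoothed[0]
```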
Regarding the <50% figure you cite, could you post or link to the full calculation from which it is derived?
All I could post would be papers, educational notes, and the IPCC paper, contemplating GCR and the correlation with Global Warming. No one has disproven a causal relationship for sure, and the correlation remains strong. The best argument I've heard against the theory is that GCR trends aren't exactly uniform, which probably has a lot to do with GCR not being constant, which makes sense given that the Universe is always moving or something is exploding. These last 7 years appear to coincide with a stronger than average decade for GCR's, only strengthening the correlation. So my <50% is just guesstimation using the ±5% delta in cloud cover that GCR's could account for.
On a side note, I think something usually lost in these layman GCR debates is that it isn't just Sun Spots vs GCRs. If the Sun's magnetic activity is taken into account, the correlation between climate and solar activity becomes even more apparent and more direct.
What IPCC paper? Please provide that link. My impression, based on http://www.realclimate.org/index.php?p=153#more-153, is that the GCR hypothesis has been debunked. The correlation with solar magnetism seems like even more of a reach, but I'd be interested in looking at any analysis linking such a correlation to global warming.
First, RealClimate's choice of graphs for GCR vs AA starts at 1950, a horrible choice of date that looks like blatant cherry picking. The graph should have included derived data from at least 1880 to show a better picture of the massive increase in solar activity (almost doubling) and the resultant drop in GCR proxies. Also, AA should have been replaced with sunspot counts plus magnetic activity to show a better correlation. And the Vostok ice core graph is so fuzzy that you can't really read it. But if you can find a good graph, you'll notice that CO2 levels lag behind temperature increases and are likely not the main forcing driving historic climate.
As for the IPCC paper, I'm referring to the 1999 draft, which merely states that GCR's quite possibly are a major forcing yet cannot be properly modelled, so the IPCC ignores them. The IPCC recognizes GCR's as probably important, acknowledges its inability to model them, and then ignores them.
But the main point, besides the 8,000-year high in solar activity, is that since 1900 sunspot counts have more than doubled, alongside the recently measured reduction in Earthshine (Palle). Palle's Earthshine study correlates even more strongly with solar activity, in that the recent 7 years of higher-than-average GCR data coincide with a reversal of falling Earthshine levels, and also with a slowing, or minor reversal, of global warming.
Certainly not proof, but only a fanatic would ignore all this based on RealClimate's cherry picking (1950 to 2000, my ass) and short-term climate observations (offered as evidence) that there is no mechanism for GCR's to interact with cloud cover.
Here is where the IPCC sums up the unknowns for indirect solar forcings.
Jet says that the IPCC TAR said
>that GCR's quite possibly are a major forcing
Having read the reference, I would suggest that "quite possibly" is a huge stretch; "might" is about as good as it gets.
The GCR/solar magnetic field stuff has three problems. The first is that there is no good mechanism; the originally proposed mechanisms were knocked down pretty conclusively by the data, although the jury is still slightly out on more recent ones. The second is overprediction: if you accept the various correlation arguments, the combined positive forcing from GCR/SMF plus GHGs and aerosols is much too high for the observed response. The third, well, like Schleswig-Holstein, I have forgotten (actually the argument has been around since ~1970, keeps popping up again driven by cosmic ray observers, and keeps getting knocked down; at some point folk lose interest, sort of like they did in flat earth arguments).
jet, you haven't read the Vostok papers regarding CO2's role in forcing. I can tell. Are you reading second-hand predigested pap from somewhere like GES or see-oh-too?
Anyway, I can also tell you haven't read any papers on GCR. Are you reading second-hand predigested pap from somewhere like GES or see-oh-too? Eli addresses it better than I can, but essentially once you read the actual papers on the subject the conclusions are all preliminary, and the authors say so.
There isn't enough there to hang your hat on, much less a tout.
You may want to scrutinize where you get your parroted arguments from, as their language shows in your comments...
"HTH" "second-hand predigested pap" "parroted arguments" "as their language shows in your comments"
Nice, nothing more convincing than a dickhead.
Since I don't have access to most of these papers, I rely on things like answers.com and Wikipedia, which source their info from the original papers, although sometimes I can get to the original. I mean, I've run some of my own simple regressions on the Vostok data, but that doesn't show me anything I can't see done much better on Wikipedia.
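A minimal version of that kind of regression can be sketched as a lagged-correlation check. The series below are synthetic stand-ins, not actual Vostok values, and the 2-step lag is built into the fake data purely to show the method recovering it:

```python
# Cross-correlation at different lags: at which offset does series B best
# match series A? (Synthetic stand-in data; Vostok values not included.)
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def best_lag(a, b, max_lag=5):
    """Lag (in samples) at which b, shifted forward, correlates best with a."""
    scores = {}
    for lag in range(max_lag + 1):
        scores[lag] = pearson(a[:len(a) - lag] if lag else a, b[lag:])
    return max(scores, key=scores.get)

# Fake "temperature", and a "CO2" series that is temperature delayed 2 steps.
temp = [math.sin(0.3 * t) for t in range(40)]
co2 = [temp[t - 2] if t >= 2 else 0.0 for t in range(40)]
print(best_lag(temp, co2))  # recovers the 2-step lag: prints 2
```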
1. I thought there were CERN laboratory results showing that GCR's can affect water droplet formation in clouds?
2. The implication is that the overprediction is found in one of the other forcings being overly attributed.
3. I had to look up the joke.
If you haven't read the papers, jet, how can you speak to them and what they say?
Yes, ions and ionizing radiation can form CCNs; the problem is getting enough of them where the clouds form. The first-order prediction is increased cloud formation at high latitudes, where ionizing radiation is larger. Compare with observations.