Easterbrook on Lancet study

Obviously anything Gregg Easterbrook writes about the Lancet study is going to be really stupid, and sure enough, he gives us this:

The latest silly estimate comes from a new study in the British medical journal Lancet, which absurdly estimates that since March 2003 exactly 654,965 Iraqis have died as a consequence of American action. The study uses extremely loose methods of estimation, including attributing about half its total to "unknown causes." The study also commits the logical offense of multiplying a series of estimates, then treating the result as precise. White House officials have dismissed the Lancet study, and they should. It's gibberish.

Here's what the study actually says:

We estimate that as of July, 2006, there have been 654,965 (392,979-942,636) excess Iraqi deaths as a consequence of the war, which corresponds to 2·5% of the population in the study area.

It does not say that the number is exactly 654,965 but gives a range. How could even someone as useless as Easterbrook get this wrong?

Nor does the study attribute half the deaths to "unknown causes". It breaks the numbers down in several different ways. One of them is whether a violent death was caused by the coalition or by other parties. In half of those cases the respondent did not know which side was responsible. This is not attributing half the deaths to "unknown causes".

Atrios adds:

Probably the stupidest person in professional punditry is Gregg Easterbrook. He's exhibit A for "too stupid to know he's stupid" and more than that he's too stupid to understand that there are people who know things that he doesn't, and more than that he's so stupid that he sets himself up as an authority about things he has absolutely zero comprehension of. It'd be comical except he's helping to make even more people as stupid as he is and what we don't need right now is even more stupid people.


Easterbrook is a strange bird. I find his comments about football amusing and perceptive. However, when he ventures outside that subject the veil of stupidity descends. Perhaps ESPN should give him a keyboard that gives a good shock when he ventures into science.

By natural cynic (not verified) on 25 Oct 2006 #permalink

How could even someone as useless as Easterbrook get this wrong?

Oo, Mr. Kotter! I know!

Easterbrook didn't read the study.

The authors of the study were a bit foolish to use so many digits of precision in the figures. They correctly state the mean value and the range, but given that these are estimates, two significant digits (i.e. 650k deaths with a range of 390k to 940k) would have sufficed and shut up some of the fools. It's a minor point and doesn't detract from the results of the study for anyone who actually understands stats, but it does provide some fodder for misleading lay people.
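For what it's worth, here's a minimal sketch (in Python, purely illustrative; the round_sig helper is mine, not anything from the study or the journal) of what rounding the reported figures to two significant digits would look like:

```python
from math import floor, log10

def round_sig(x, sig=2):
    """Round x to `sig` significant digits."""
    if x == 0:
        return 0
    return round(x, sig - int(floor(log10(abs(x)))) - 1)

for value in (654_965, 392_979, 942_636):
    print(f"{value:>7,} -> {round_sig(value):,}")
# 654,965 -> 650,000
# 392,979 -> 390,000
# 942,636 -> 940,000
```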

The authors of the study were a bit foolish to use so many digits of precision in the figures.

I've heard social sciences people (at least one over at Good Math, Bad Math) say that writing out the entire integer is a convention of the field, but I've no idea if that's true.

By Anton Mates (not verified) on 25 Oct 2006 #permalink

As a former social scientist - yes, the concept of "significant digits" doesn't really enter into things for most people. Probably not a good thing, but unlike in, say, chemistry, it isn't really a critical issue either.

This post raises a serious academic question: In this day and age, does it make more sense to omit the mean and only present the range when reporting statistics?

I vote yes.

Wow... I learnt about significant digits in grade 8. They may not enter into things for most social scientists, but the perception of many people (including most people with a math or science background) is that the presence of so many digits implies an associated degree of precision. Yes, it'd be great if journalists mentioned the mean and the range, but while we wait for that, a small but helpful step would be to use an appropriate number of significant digits. In any case, we're talking about epidemiologists in the Lancet, not social scientists.

The Johns Hopkins study was published in a medical journal, so it presents statistical data as it should be presented, with the mean and a standard confidence interval.

Presenting data in this manner has the extra (quite unintended) advantage of acting to weed out the nitwits who have no clue about statistics and immediately jump on the "precise" number.

Presumably, there was also a decimal part to the mean which would have really sent people like Easterbrook into the stratosphere.

Can you imagine what he would have said about 654,965.4 (or whatever) Iraqis dead? "There's no way you can have .4 people dead so the whole study is ludicrous."

I really wonder about the Brookings Institution which employs Easterbrook.

Do they have any standards?

The integer is presented because it is in general the MLE, that is, the value with the highest probability of being true (here, fractions of a death become problematic). The problem with reporting CIs is that you have people believing that any number within the range is equally likely, when in general they are not. The difference in probability weight between 654,964 and 654,965 may be small, but the difference between 392,979 and 654,965 is huge.
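To put a rough number on that last point, here is a minimal sketch assuming, purely for illustration, a symmetric normal sampling distribution (the study's actual interval is asymmetric, so this only shows the general shape of the argument):

```python
import math

def normal_pdf(z):
    """Standard normal density at z."""
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

# Density at the centre of a 95% interval vs. at its edge (z = 1.96).
centre = normal_pdf(0.0)
edge = normal_pdf(1.96)
print(round(centre / edge, 1))  # ~6.8: the central estimate carries far more weight
```

Even under this crude approximation, a value at the edge of the reported interval is several times less likely than the central estimate; the interval is a statement about uncertainty, not a claim that 392,979 is just as good a guess as 654,965.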