Iraq Body Count has published a defence against some of the criticism they have been receiving. The Lancet study implies that there are about five times as many Iraqi deaths as the IBC number. They do not accept this and so are arguing that the Lancet estimate is too high and is not corroborated by the ILCS:
Comparisons between the Lancet study and ILCS have been attempted in the past, one of the best-known being by British activist Milan Rai. His analysis concludes:
“If we crudely scale up the UNDP [IMIRA] figure to take account of the longer Lancet time period, we reach a figure (33,000) which is exactly the Lancet-derived figure of 33,000 violent deaths due to military action.”
This widely cited conclusion is wrong, for at least two reasons.
First, the correct Lancet figure for combat-related violence is nearer 39,000 than 33,000. The incorrect 33,000 figure was published by a blogger named Tim Lambert, and accepted uncritically by Rai. But data from the Lancet study itself shows that only a third of 57,600 violent deaths were due to criminal activity, leaving 38,400 combat-related violent deaths. A later re-analysis of Lancet data by the Small Arms Survey placed this figure at 39,000.
In fact, Rai did not uncritically accept my figure — he asked me to explain how it was derived and included the explanation immediately before the paragraph quoted by IBC:
Each death recorded in the Lancet study represents 3,000 deaths in Iraq in the period under consideration. Outside Fallujah, there were nine deaths caused by coalition forces and two by the insurgents. By simple multiplication, this results in 33,000 ‘war-related’ deaths according to the UNDP definition of this term, for the period March 2003-September 2004.
The only difference between my analysis and the Small Arms Survey one is that they included two violent deaths of “unknown origin” as well as the nine coalition- and the two insurgent-caused deaths. The Lancet study describes the breakdown like this:
Table 2 includes 12 violent deaths not attributed to coalition forces, including 11 men and one woman. Of these, two were attributed to anti-coalition forces, two were of unknown origin, seven were criminal murders, and one was from the previous regime during the invasion.
The ILCS question asked whether a death was “war-related”. I don’t think an ILCS respondent would have described a death as “war-related” if its cause was unknown.
Correction: In comments Josh Dougherty points out that on a radio show on 28 Oct 2005 Les Roberts gave some more details: “2 people died in firefights where it was unclear where the bullet came from”. Therefore, these two should be included in the war-related deaths so my revised estimate is 39,000. This doesn’t affect my conclusion — the ILCS still corroborates the Lancet study.
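The before-and-after arithmetic can be sketched in a few lines (the 3,000 multiplier is the one explained in the passage quoted above):

```python
# Each death recorded in the Lancet sample represents ~3,000 deaths in Iraq
# over the study period (from the explanation quoted above).
MULTIPLIER = 3000

coalition = 9   # deaths caused by coalition forces, outside Fallujah
insurgent = 2   # deaths attributed to anti-coalition forces
unknown = 2     # deaths in firefights where the bullet's origin was unclear

# My original figure counted only coalition- and insurgent-caused deaths.
original = (coalition + insurgent) * MULTIPLIER
print(original)  # 33000

# The Small Arms Survey (and my revised estimate) also count the
# two unknown-origin firefight deaths.
revised = (coalition + insurgent + unknown) * MULTIPLIER
print(revised)  # 39000
```

Either way, the scaled-up ILCS number is in the same range, which is the point of the comparison.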
The IBC defence continues:
Second, the correct ILCS figure is probably nearer 28,000 than 33,000. This is because the per-day death rate in the post-invasion period was much lower than during the invasion. Averaging across the whole period, as Rai does, gives an unrealistically high per-day rate for the post-invasion months over which the scaling-up was applied. ILCS does not provide its own time-distribution of deaths, but our own recalculation, which applies the Lancet time-distribution to ILCS, yields a scaled-up total of 28,165.
Unfortunately, their recalculation contains multiple errors:
Of the 14 violent conflict-related deaths reported by Lancet, five took place during the 42-day invasion phase of March-April 2003, and nine took place in the remaining 510 days covered by the study.
The Lancet study does not report the time distribution for the 14 violent deaths you get if you exclude murders. It reports the time distribution for the 21 violent deaths outside Fallujah. IBC seems to have made the unwarranted assumption that there were no murders during the invasion. They also assume that the death rate was constant in the post-invasion period, even though deaths increased in 2004. Let’s apply the time distribution correctly. Of the 21 violent deaths, 11 occurred before the ILCS was conducted, 6 happened in the months when the ILCS was being conducted, and 4 after the ILCS was finished. If we split the 6 evenly between before and after, we find that 14 of the 21 violent deaths would have been picked up by the ILCS. Using this to adjust the ILCS gives an estimate of 24,000×(21/14) = 36,000, which is higher than the 33,000 we used before.
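The correct time-distribution adjustment works out like this (24,000 is the ILCS war-related death estimate being scaled up):

```python
# Time distribution of the 21 violent deaths outside Fallujah in the
# Lancet sample, relative to the ILCS fieldwork period.
before_ilcs = 11   # occurred before the ILCS was conducted
during_ilcs = 6    # occurred while the ILCS was being conducted
after_ilcs = 4     # occurred after the ILCS was finished

total = before_ilcs + during_ilcs + after_ilcs   # 21 violent deaths

# Split the deaths during fieldwork evenly between covered and not covered.
covered = before_ilcs + during_ilcs / 2          # 14 of the 21 deaths

ilcs_estimate = 24_000                           # ILCS war-related deaths
scaled = ilcs_estimate * total / covered
print(round(scaled))  # 36000
```

So scaling the ILCS correctly gives 36,000, not the 28,165 that IBC's recalculation produced.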
The IBC continues:
When these two corrections are combined, it is revealed that the Lancet estimate remains some 10,000 (35%) above the scaled-up ILCS estimate.
Both of their corrections are erroneous, but even if they weren’t, the two numbers are still close to each other and the ILCS number still supports the Lancet estimate. Even if we reduced the 100,000 Lancet estimate by 10,000 it wouldn’t make much difference.
They then present the graph on the left to argue that
the ILCS data only allows for a one in a thousand chance that the true number lies within the upper half of the Lancet range (the area shaded in grey).
However, both confidence intervals are too narrow. All they have done is scale the original intervals for the Lancet and the ILCS estimates. But the Lancet number here is based on a smaller sample, so the confidence interval is wider. And the ILCS number has been scaled by a factor based on just 14 deaths in the Lancet study, so its interval will be much wider once the uncertainty of that scale factor is taken into account. (And I know that confidence intervals are not probability distributions, but I’ll give them a pass on this, since you could use Bayes’ theorem to get probability distributions that look like those above.)
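To get a feel for how much uncertainty a count of 14 deaths carries, here is a rough sketch using a normal approximation to the Poisson count. This is an illustrative assumption of mine, not a calculation from the Lancet study or from IBC:

```python
import math

deaths = 14                    # Lancet deaths the scale factor is based on

# Normal approximation to the Poisson: standard error = sqrt(count).
se = math.sqrt(deaths)
low = deaths - 1.96 * se
high = deaths + 1.96 * se
print(f"approximate 95% interval on the count: ({low:.1f}, {high:.1f})")
# The count could plausibly be anywhere from about 7 to about 21, so any
# scale factor built on it is uncertain by very roughly 50% either way.
```

An interval that wide on the scale factor alone would stretch the scaled-up ILCS interval far beyond what the graph shows.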
This is a disappointing effort by the IBC. I believe that they should have sought advice from some experts in epidemiology.