Researchers Who Rushed Into Print a Study of Iraqi Civilian Deaths Now Wonder Why It Was Ignored

By LILA GUTERMAN
When more than 200,000 people died in a tsunami caused by an Asian earthquake in December, the immediate reaction in the United States was an outpouring of grief and philanthropy, prompted by extensive coverage in the news media.
Two months earlier, the reaction in the United States to news of another large-scale human tragedy was much quieter. In late October, a study was published in The Lancet, a prestigious British medical journal, concluding that about 100,000 civilians had been killed in Iraq since it was invaded by a United States-led coalition in March 2003. On the eve of a contentious presidential election -- fought in part over U.S. policy on Iraq -- many American newspapers and television news programs ignored the study or buried reports about it far from the top headlines.
The paper, written by researchers at the Johns Hopkins University, Columbia University, and Baghdad's Al-Mustansiriya University, was based on a door-to-door survey in September of nearly 8,000 people in 33 randomly selected locations in Iraq. It was dangerous work, and the team of researchers was lucky to emerge from the survey unharmed.
The paper that they published carried some caveats. For instance, the researchers admitted that many of the dead might have been combatants. They also acknowledged that the true number of deaths could fall anywhere within a range of 8,000 to 194,000, a function of the researchers' having extrapolated their survey to a country of 25 million.
But the statistics do point to a number in the middle of that range. And the raw numbers upon which the researchers' extrapolation was based are undeniable: Since the invasion, the No. 1 cause of death among households surveyed was violence. The risk of death due to violence had increased 58-fold since before the war. And more than half of the people who had died from violence and its aftermath since the invasion began were women and children.
Neither the Defense Department nor the State Department responded to the paper, nor would they comment when contacted by The Chronicle. American news-media outlets largely published only short articles, noting how much higher the Lancet estimate was than previous estimates. Some pundits called the results politicized and worthless.
Les F. Roberts, a research associate at Hopkins and the lead author of the paper, was shocked by the muted or dismissive reception. He had expected the public response to his paper to be "moral outrage."
On its merits, the study should have received more prominent play. Public-health professionals have uniformly praised the paper for its correct methods and notable results.
"Les has used, and consistently uses, the best possible methodology," says Bradley A. Woodruff, a medical epidemiologist at the U.S. Centers for Disease Control and Prevention.
Indeed, the United Nations and the State Department have cited mortality numbers compiled by Mr. Roberts on previous conflicts as fact -- and have acted on those results.
What went wrong this time? Perhaps the rush by researchers and The Lancet to put the study in front of American voters before the election accomplished precisely the opposite result, drowning out a valuable study in the clamor of the presidential campaign.
A Risky Proposition
Mr. Roberts has studied mortality caused by war since 1992, having done surveys in locations including Bosnia, Congo, and Rwanda. His three surveys in Congo for the International Rescue Committee, a nongovernmental humanitarian organization, in which he used methods akin to those of his Iraq study, received a great deal of attention. "Tony Blair and Colin Powell have quoted those results time and time again without any question as to the precision or validity," he says.
Mr. Roberts's first survey in Congo, in 2000, estimated that 1.7 million people had died over 22 months of armed conflict. The response was dramatic. Within a month, the U.N. Security Council passed a resolution that all foreign armies must leave Congo, and later that year, the United Nations called for $140-million in aid to that country, more than doubling its previous annual request. Later, citing the study, the State Department announced a pledge of an additional $10-million for emergency programs in Congo.
About a year ago, Mr. Roberts decided to study mortality in Iraq. He connected with a colleague at Columbia, Richard Garfield, a professor of nursing who had done research in Iraq since the mid-1990s. Mr. Garfield knew Riyadh Lafta, a mortality researcher at Al-Mustansiriya University, who recruited interviewers to do the door-to-door survey.
"He had to ask many people before he could find five interviewers willing to work in a study that involved an American," Mr. Roberts says.
Mr. Roberts planned to travel to Iraq last spring. After an American hostage was beheaded on video, however, Dr. Lafta told him that the danger was too great. "I was going to go in June, but in June it was even worse," Mr. Roberts says. Finally, in late August, with a fall teaching commitment looming, he decided to go.
On September 1, Mr. Roberts sneaked into Iraq from Jordan, lying on the back seat of a sport-utility vehicle. He trained the interviewers, who tried out their questions in a relatively safe neighborhood of Baghdad before embarking on the study.
The researchers visited 30 homes in each of 33 neighborhoods in Iraq. They selected the communities to be surveyed using a random process adjusted so that more-populous areas were more likely to be picked, giving each person in Iraq an equal chance of being interviewed. Within each community, a spot was chosen at random, and the interviewers visited the 30 households nearest to that point.
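The selection scheme described here is a standard probability-proportional-to-size cluster design: regions are drawn with probability weighted by their populations, so every individual has the same chance of ending up in the sample. A minimal sketch in Python, using invented region names and population figures (none of these numbers come from the study):

```python
import random
from collections import Counter

random.seed(0)  # fixed seed so the illustration is reproducible

# Hypothetical regions and populations -- not the study's actual data.
regions = {"RegionA": 5_000_000, "RegionB": 2_000_000, "RegionC": 500_000}

# Draw 33 clusters, weighting each draw by population so that every
# individual, regardless of region, has an equal chance of selection.
clusters = random.choices(
    population=list(regions),
    weights=list(regions.values()),
    k=33,
)

# Larger regions tend to receive proportionally more of the 33 clusters.
print(Counter(clusters))
```

Within each selected cluster, the survey then proceeds as the article describes: a random starting point, and the 30 nearest households interviewed.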
At each house, the interviewers asked for the age and sex of everyone living there currently and on January 1, 2002. The interviewers asked about deaths since the first day of 2002 and recorded the day, cause, and circumstances, so that they could compare the time just before the 2003 invasion with the period since then.
In each neighborhood, in at least the first two households where an adult's death had occurred, the interviewers ended by asking for death certificates. They received confirmation of deaths in 63 of the 78 houses where they asked.
Mr. Garfield says the high proportion of death certificates assuaged his concern that lying might be widespread. In unstable countries, where records of deaths aren't always thorough, ascertaining lies or simply faulty memories becomes difficult.
At first Mr. Roberts accompanied the Iraqi researchers. To mask his identity, he dyed his graying brown hair black, wore Iraqi clothing, and never spoke in public. But he was acutely aware of the danger his presence created for his colleagues; one interviewer refused to ride in a car with him.
On the eighth day, the interviewers ended up in Balad, a town north of Baghdad whose main street was dominated by a huge portrait of the radical Islamic cleric Moktada al-Sadr. "As fate would have it," Mr. Roberts says, "one of the first doors we knocked on was the governor's. There I am, I'm sitting in the car, and a police car rolls up, and my two interviewers get hauled away."
Mr. Roberts and his driver decided to wait. "I laid on my side and pretended to be asleep so no one would see my blue eyes," he says. After the interviewers had been gone for about 40 minutes, he says, "two little kids walked up to the car and in English said, 'Hello, Mister!'"
"It's just impossible for a Westerner to stay invisible in Iraq," he says.
After more than an hour, the two interviewers, who were physicians for the Iraqi Ministry of Health, managed to talk their way out of the situation. Mr. Roberts retreated to a hotel in Baghdad for the duration of his stay, getting daily reports from Dr. Lafta.
The researchers saved the most dangerous location for last. On September 20, Dr. Lafta went to violence-racked Fallujah with the only interviewer willing to travel there. The researchers had done a haunting bit of calculus before the journey. Given that the chance was high of an interviewer's or researcher's getting killed there, the study would be better served by getting the other data first.
The Fallujah data were chilling: 53 deaths had taken place in the study's 30 households there since the invasion commenced, on March 19, 2003. In the other 32 neighborhoods combined, the researchers had counted 89 deaths. While 21 of the deaths elsewhere were attributable to violence, in Fallujah 52 of the 53 deaths were due to violence.
The number of deaths in Fallujah was so much higher than in other locations that the researchers excluded the data from their overall estimate as a statistical outlier. Because of that, Mr. Roberts says, chances are good that the actual number of deaths caused by the invasion and occupation is higher than 100,000.
Mr. Roberts took a few days in Baghdad in late September to compile and analyze the data. He discovered that the risk of death was 2.5 times as high in the 18 months after the invasion as it was in the 15 months before it; the risk was still 1.5 times as high if he ignored the Fallujah data. Because he had found in many other wars that malnutrition and disease were the most frequent causes of civilian deaths, he was "shocked," he says, that violence had been the primary cause of death since the invasion.
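The risk ratios cited here come from comparing crude death rates per person-time in the pre- and post-invasion periods. A hedged sketch of that arithmetic, using invented death counts chosen only to reproduce a ratio of 2.5 (the study's real cluster data are not reproduced here):

```python
# Illustrative figures only -- not the study's actual counts.
people = 7_800          # roughly the number of individuals surveyed
deaths_pre, months_pre = 48, 15     # invented deaths over the ~15 pre-invasion months
deaths_post, months_post = 144, 18  # invented deaths over the ~18 post-invasion months

# Crude death rates in deaths per person-month.
rate_pre = deaths_pre / (people * months_pre)
rate_post = deaths_post / (people * months_post)

# The risk ratio compares the two rates; 2.5 means the risk of death
# after the invasion was 2.5 times the risk before it.
risk_ratio = rate_post / rate_pre
print(f"risk ratio: {risk_ratio:.2f}")  # -> risk ratio: 2.50
```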
"On the 25th of September my focus was about how to get out of the country," he recalls. "My second focus was to get this information out before the U.S. election." In little more than 30 days, the paper was published in The Lancet.
Mr. Roberts and his colleagues now believe that the speedy publication of that data created much of the public skepticism toward the study. He sent the manuscript to the medical journal on October 1, requesting that it be published that month. Mr. Roberts says the editors agreed to do so without asking him why.
Despite the sprint to publication, the paper did go through editing and peer review. In an accompanying editorial, Richard Horton, editor of The Lancet, wrote that the paper "has been extensively peer-reviewed, revised, edited, and fast-tracked to publication because of its importance to the evolving security situation in Iraq."
Dr. Horton declined repeated requests by The Chronicle for comment on the study and the decision to publish it before the U.S. presidential election. But three other major medical journals told The Chronicle that they, too, occasionally put papers of immediate importance on a fast track, and that the time from receipt to publication can be days or a few weeks.
Mr. Roberts calls the peer-review process that his paper underwent "rigorous." One of the peer reviewers told The Chronicle that he had had about a week to comment on the paper.
A Question of Timing
The timing of the paper's publication opened the study to charges of political propaganda. So did Mr. Roberts's admission to an Associated Press reporter on the day that the paper came out that he opposed the war. "That was the wrong answer," Mr. Roberts says now, "because some of the other study members hated Saddam and were in favor of the initial invasion."
Mr. Garfield, one of the co-authors, says he did not feel the same urgency about publishing before the U.S. election. "I was afraid that the importance of the topic would get lost among many other electoral issues," he says.
Mr. Garfield appears to have been correct.
The Lancet released the paper on October 29, the Friday before the election, when many reporters were busy with political coverage. That day, the Los Angeles Times and the Chicago Tribune each dedicated only about 400 words to the study and placed the articles inside their front sections, on Pages A4 and A11, respectively. (The news media in Europe gave the study much more play; many newspapers put articles about it on their front pages.)
In a short article about the study on Page A8, The New York Times noted that the Iraq Body Count, a project to tally civilian deaths reported in the news media, had put the maximum death toll at around 17,000. The new study, the article said, "is certain to generate intense controversy." But the Times has not published any further news articles about the paper.
The Washington Post, perhaps most damagingly to the study's reputation, quoted Marc E. Garlasco, a senior military analyst at Human Rights Watch, as saying, "These numbers seem to be inflated."
Mr. Garlasco says now that he had not read the paper at the time and calls his quote in the Post "really unfortunate." He says he told the reporter, "I haven't read it. I haven't seen it. I don't know anything about it, so I shouldn't comment on it." But, Mr. Garlasco continues, "like any good journalist, he got me to."
Mr. Garlasco says he misunderstood the reporter's description of the paper's results. He did not understand that the paper's estimate includes deaths caused not only directly by violence but also by its offshoots: chaos leading to lack of sanitation and medical care.
Online, the words flew. Some bloggers denounced the study. The online magazine Slate published an essay by its military columnist, Fred Kaplan, saying that the wide range of possible deaths, 8,000 to 194,000, is not an estimate. "It's a dartboard," he wrote.
The U.S. government had no comment at the time and remains silent about Iraqi civilian deaths. "The only thing we keep track of is casualties for U.S. troops and civilians," a Defense Department spokesman told The Chronicle.
Mr. Garfield now regrets the timing of the paper's release because he believes that it allowed people to dismiss the research. "The argument is an idiotic one of, 'You're playing politics, so then the data's not true,'" he says.
Such logic angers him. "Hey," he says. "This is valuable information. The fact that somebody wants to convince you of it -- how is that suddenly illegitimate? Why is that a reason to ignore it? If it's wrong, then ignore it. If it's dealing with deaths of people that don't count in the world, then ignore it. I don't think it's wrong, and I don't think Iraqi deaths don't count."
Mr. Roberts insists that his primary motive for rushing the paper to press was not political. He says he is glad the paper appeared before the election because he was concerned for his Iraqi colleagues' safety. Had the paper come out after the election, he argues, it would have looked like a cover-up. Dr. Lafta, he says, "would have been killed -- there is just no doubt."
Dr. Lafta, in an e-mail message to The Chronicle, disagrees: "My personal opinion is that this was an unjustified fear."
Mr. Roberts acknowledges that he also hoped to ignite a policy change or public response. "This was going to do more good in terms of changing policy if it came out in October than if it came out in November," he says. "But we never had any delusions that this might affect the U.S. election."
Reassessing the Evidence
The reception of the Iraqi mortality study by scientists has been far friendlier than by the news media.
Scientists say the size of the survey was adequate for extrapolation to the entire country. "That's a classical sample size," says Michael J. Toole, head of the Center for International Health at the Burnet Institute, an Australian research organization. Researchers typically conduct surveys in 30 neighborhoods, so the Iraq study's total of 33 strengthens its conclusions. "I just don't see any evidence of significant exaggeration," he says.
David R. Meddings, a medical officer with the Department of Injuries and Violence Prevention at the World Health Organization, says any such survey will have uncertainty because of extrapolation based on small numbers, and because of the possibility that people gave incorrect information about deaths in their households.
"I don't think the authors ignored that or understated" those factors, he says. "Those cautions I don't believe should be applied any more or any less stringently to a study that looks at a politically sensitive conflict than to a study that looks at a pill for heart disease."
The uncertainty accounts for the breadth of the study's 95-percent confidence interval: the researchers can be 95 percent confident that the number of deaths in Iraq resulting from military activities falls between 8,000 and 194,000.
Critics like the Slate writer seized on that range, says Dr. Woodruff, the government epidemiologist. "They thought, 'Well, it's just as likely to be 18,000 as 100,000.' That's not true at all," he says. "The further you get away from 100,000, the probability that the number is true gets much smaller."
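Dr. Woodruff's point, that values near the center of a confidence interval are far more plausible than values near its edges, can be illustrated with a normal sampling distribution. This is only a schematic: it assumes a symmetric normal curve centered on the roughly 100,000 point estimate, whereas the study's actual interval is not symmetric.

```python
import math

def normal_pdf(x, mean, sd):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Schematic only: center the estimate at 100,000 and let the 95% interval
# span roughly 8,000 to 194,000, a half-width of about 93,000.
mean = 100_000
sd = 93_000 / 1.96  # the half-width of a 95% normal interval is 1.96 * sd

# The density near the point estimate far exceeds the density near the
# edge of the interval -- 18,000 is not "just as likely" as 100,000.
print(normal_pdf(100_000, mean, sd) > normal_pdf(18_000, mean, sd))  # -> True
```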
The gap between the Lancet estimate and that of Iraq Body Count does not trouble scientists contacted by The Chronicle. John Sloboda, a professor of psychology at the University of Keele, in England, and a co-founder of Iraq Body Count, says his team's efforts will lead to a count smaller than the true number because not every death is reported in the news media.
Dr. Woodruff says, "Les [Roberts] has the most valid estimate."
Dr. Toole agrees: "If anything, the deaths may have been higher [than the study's estimate] because what they are unable to do is survey families where everyone has died."
Robin M. Coupland, a medical adviser on weapons and armed violence in the legal division of the International Committee of the Red Cross, has only one concern: Mr. Roberts's team did not document how many people were wounded.
"In every recorded context where conventional explosive weapons have been used in armed contact," Dr. Coupland says, "there's usually two or more people wounded per person killed. The question that glares out from that article is, Where are all the 200,000 wounded?"
Mr. Roberts says his team did not ask about injuries because of the difficulty of defining both what constitutes an injury and whether the injury stemmed directly or indirectly from violence. "If someone is running from fighting and they cut their foot, is that a war wound?" he asks.
Burden of Proof
Despite the muted public response, public-health professionals are glad that the study brought to light the human toll of the Iraq war and continuing occupation. Both the study and the Iraq Body Count, says Mr. Sloboda, are "shoestring attempts by private citizens" to do work he says the government ought to be doing.
Mr. Garlasco, of Human Rights Watch, is mystified that the Defense Department is not publicly interested in such studies. "Civilian casualties can be a bellwether for the actual conduct of the war-fighting," says Mr. Garlasco, who was an intelligence officer at the Pentagon until 2003. "They're using all these precision weapons, so one would expect that if you're striving to minimize casualties, you'd have very low casualties. In Iraq we've seen the exact opposite, so one has to wonder why."
Besides, he says, counting civilian deaths could actually be useful for the Pentagon's public image. "I truly believe when the U.S. military says we're not there to kill civilians, it's absolutely true," he says. "The problem is, though, there are many people who don't accept their reasoning. The only way they'll change their minds is if the U.S. military shows they take civilian casualties seriously enough that they quantify them and attempt to minimize casualties in the future."
In the Lancet article, Mr. Roberts and his colleagues write, "It seems difficult to understand how a military force could monitor the extent to which civilians are protected without systematically doing body counts or at least looking at the kinds of casualties they induce."
Dr. Coupland says, "The number of noncombatant deaths and injuries would speak to the legality of the nature of the hostilities."
That's why surveys like the Lancet one are important, says the World Health Organization's Dr. Meddings, even if the immediate response is hesitant: "If you can put accurate information out, it shifts the burden of proof onto militaries to substantiate why what they're doing is worth this humanitarian cost."
At the end of the day, Mr. Roberts worries that his study may play little part in that crucial debate. Although he blames the American news media for being embedded not only with the military but also with the military point of view, he also partly blames himself for the lack of public response.
"Maybe we the scientists have mismanaged this information," he says. "We had a message that was of interest to most Americans. We had a message that was extremely robust scientifically. And we failed to get it out into society where they could use it."
© Chronicle of Higher Education