Lott's responses to Michelle Malkin's op-ed are in a fixed-width font, while my comments on his responses are in italics like this. Lott's responses were downloaded on 25 April 2005.
Below is Malkin's op-ed with commentary by me (my comments are indented and in italics; the numbered responses starting at the bottom of the page correspond to the numbers in the supporting document). (Note that two other discussions on this issue have been posted since February 2003: a general discussion of the two other polls asking about brandishing that were done over the previous two decades, and a response to other attacks; both are available at the bottom of the page found here.) Despite being sent this information several times, she has not responded to any of these points. Steve Malzberg and Karen Hunter, co-hosts of a morning drive time show on WWRL (1600 AM) in New York, offered to let Malkin discuss these claims with me on the air, but she was unwilling to participate. It is disappointing that she will make allegations in print and on radio shows, but that she is unwilling to defend these assertions when I am present.
The other Lott controversy
Michelle Malkin
For those few of us in the mainstream media who openly support Second Amendment rights, research scholar John Lott has been -- or rather, had been -- an absolute godsend.
Armed with top-notch credentials (including stints at Stanford, Rice, UCLA, Wharton, Cornell, the University of Chicago and Yale), Lott took on the entrenched anti-gun bias of the ivory tower with seemingly meticulous scholarship. His best-selling 1998 book, "More Guns, Less Crime," provided analysis of FBI crime data that showed a groundbreaking correlation between concealed-weapons laws and reduced violent crime rates.
I met Lott briefly after a seminar at the University of Washington in Seattle several years ago and was deeply impressed by his intellectual rigor. Lott responded directly and extensively to critics' arguments. He made his data accessible to many other researchers.
But as he prepares to release a new book, "Bias Against Guns," next month, Lott must grapple with an emerging controversy---brought to the public eye by the blogosphere---that goes to the heart of his academic integrity.
The most disturbing charge, first raised by retired University of California, Santa Barbara professor Otis Dudley Duncan and pursued by Australian computer programmer Tim Lambert, is that Lott fabricated a study claiming that 98 percent of defensive gun uses involved mere brandishing, as opposed to shooting.
When Lott cited the statistic peripherally on page three of his book, he attributed it to "national surveys." In the second edition, he changed the citation to "a national survey that I conducted." He has also incorrectly attributed the figure to newspaper polls and Florida State University criminologist Gary Kleck.
1) The reference to the survey involves one number in one sentence in my book. Compared to the 98 percent number, there was an earlier survey by Kleck that found 92 percent of defensive gun uses involved brandishing and warning shots, and because the survey was asking people about events that occurred over a long period of time it is likely that it over emphasized more dramatic responses. (My number that is directly comparable to the 92 percent estimate is about 99 percent.) My point in the book was that defensive gun use rarely involves the more "newsworthy" events where the attacker is killed, and either survey would have made the general point.
Lott repeated the 98% figure many, many times and based arguments on it. Kleck's survey did not find that 92% of DGUs involved brandishing and warning shots. Kleck's estimate for that figure is 84% (compared with an absurd 99.5%, or 27.9 out of the 28 uses Lott claims from his 1997 survey). But why doesn't Lott compare his 98% number with Kleck's estimate for brandishing, which was 76%? His claim that it is likely that Kleck's survey "over emphasized more dramatic responses" is false: the brandishing number that you obtain from Kleck's survey does not change if you consider only those defensive uses within the previous year. The difference between 2% of defenders firing (Lott) and 24% firing (Kleck) is a factor of 12 and does make a huge difference to Lott's point about how often "newsworthy" defensive uses occur.
I never attributed my survey results to Kleck. What happened was that Dave Kopel from the Independence Institute took an op-ed that I had in the Rocky Mountain News and edited it for his web site. In the editing he added the incorrect reference to Kleck. (Statements from Kopel and others are provided in the supporting documents.) The two pieces are identical except for the reference to Kleck. As to the claim that I attributed the number to newspaper polls, that claim involves a misreading of two different sentences in an op-ed. As to using the plural, that was an error. Given the years that have passed since I wrote the sentence, I cannot remember exactly what I had in mind, but the most plausible explanation is that I was describing what findings had been generated by the polls; in other words, I was thinking of them as a collective body of research. I had been planning to include more of a discussion of the survey in the book, just as I have in my book that came out early this year, but I had a hard disk crash (see response (2)) and I lost part of the book along with the data.
The specific attribution to Kleck was probably added by Kopel. Discussion is here. If Lott was thinking of the findings of the polls as a collective body of research, then he was, contrary to his claims here, attributing the 98% to the polls. And the polls, taken as a collective body of research, show no such thing.
More importantly, the survey results that I used were biased against the claim that I was making. The relevant discussions in both of my books focus on media bias, and the point was that the lack of coverage of defensive gun uses is understandable if most uses simply involve brandishing, where no one is harmed, no shots are fired, no dead bodies are on the ground, and no crime is actually committed. If others believe that the actual rate of brandishing is lower and I had used the results of Kleck, it becomes MORE difficult to explain the lack of news coverage of defensive gun uses. The two short discussions that I have on this issue in my two books thus choose results that are BIASED AGAINST the overall point that I am making, that the media is biased against guns.
Actually, the claim that Lott was making in More Guns, Less Crime was that "underreporting of defensive gun use is large". The extraordinarily high brandishing rates that Lott claims his surveys found are biased TOWARDS this claim. The lack of news coverage can be explained if defensive gun use is not as frequent as Lott claims and as the National Crime Victimization Survey indicates.
Last fall, Northwestern University law professor James Lindgren volunteered to investigate the claimed existence of Lott's 1997 telephone survey of 2,424 people. "I thought it would be exceedingly simple to establish" that the research had been done, Lindgren wrote in his report.
Unfortunately, Malkin fails to mention that Lindgren is not an unbiased observer, since I had written a journal article in the Journal of Law & Politics critiquing some of his work months before he "volunteered to investigate" these claims.
Lott fails to mention that it was Lott who chose to call Lindgren and take up his offer to investigate the case. If he felt that Lindgren was biased why choose him to conduct the investigation? Nor was Lott's article particularly critical of Lindgren's work---Lott wrote: "Table 1 uses the set of control variables employed by Lindgren and generally confirms his results."
It was not simple. Lott claims to have lost all of his data due to a computer crash.
2) As to the "claim" that I lost my data in a computer crash on July 3, 1997, I have offered Malkin the statements from nine academics (statements attached), four of whom I was co-authoring papers with at the time and who remember quite vividly also losing the data that we had on projects. David Mustard at the University of Georgia spent considerable time during 1997 helping me replace gun crime data. Other academics worked with me to replace data on our other projects. Just so it is clear, this computer crash basically cost me all my data on all my projects up to that point in time, including all the data and word files for my book, More Guns, Less Crime, and numerous papers that were under review at journals. The next couple of years were hell trying to replace things, and the data for this survey, which ended up being one sentence in the book, was not of particular importance. However, all the data was replaced, including not only the county-level and state-level data but also the survey data, when the survey was redone.
Proving that he lost some data in a computer crash does not prove that he lost his survey data in that crash. Notice that he is able to find nine people to testify that there was a crash, but not one to testify that he conducted a survey. Doing another survey does not replace the data in the original survey.
He financed the survey himself and kept no financial records.

Unlike many academics, I have never asked for government support for my research. Nothing different or unusual was done in this case. While we still have the tax forms that we filed that show we made large expenditures on research assistants that year, my wife keeps our financial documents for the three years required by the IRS. I have checked with the bank that we had an account with, but they only keep records five years back. Since wild claims have been made about the costs of the survey, some notion of its scope would be useful. The survey was structured so that over 90 percent of those questioned would only have to answer three short questions and those were usually completed in under 30 seconds. Less than one percent of those surveyed would actually answer as many as seven questions and even in that case the survey only took about two minutes. The appendix in The Bias Against Guns provides a description of the survey when it was replicated.
Lott's tax returns do not show expenditures on research assistants. There are expenses listed under "legal and professional services", which might possibly include payments to research assistants, but Lott originally told Lindgren that he did not pay the students who conducted the survey and only changed his story when he started claiming that his tax returns supported his story about conducting a survey.
In any case, such a short survey is pointless---it does not provide any information that has not already been obtained in other surveys. Lott has not explained what research goal this survey was supposed to serve.
And the new survey did not replicate the results he claimed he got from the 1997 survey. The new survey gave a brandishing number of 90%, but Lott falsely claimed the result was 95%.
He has forgotten the names of the students who allegedly helped with the survey and who supposedly dialed thousands of survey respondents long-distance from their own dorm rooms using survey software Lott can't identify or produce.
I have hired lots of student RAs over the years. In the couple of years since I have been at AEI, I have had around 25 people work for me on various projects. While I can usually reconstruct who has worked for me, it requires that I have that material written down. The information on these students was lost in the hard disk crash, and given that I had lost data for other projects, such as three revise-and-resubmits that I had at the Journal of Political Economy, recovering it was not a particularly high priority.
I don't have the original CD of telephone numbers from across the country that was used for the survey, but I have kept one that I obtained later in 1997 when I was considering redoing the survey, and I still have that available.
Assuming the survey data was lost in a computer crash, it is still remarkable that Lott could not produce a single, contemporaneous scrap of paper proving the survey's existence, such as the research protocol or survey instrument.

3) I have moved three times (Chicago to Yale to Pennsylvania to AEI) as well as changed offices at Chicago and Yale since the summer of 1997. I had essentially kept the material on the hard disk and that was lost. Yet, besides the statements from the academics who can verify the hard disk crash, I do have statements from David Mustard, whom I had talked to numerous times during 1996 about doing the survey with me and who remembers us talking about the survey after it was completed. He is "fairly confident" that those conversations took place during 1997. John Whitley and Geoff Huck also have some recollections. Russell Roberts, now a professor at George Mason, was someone else that I talked to about the survey, but he simply can't remember one way or the other. I didn't talk to people other than co-authors about the survey or the research that I was doing on guns generally. This is because of the often great hostility to my gun work and also because I didn't want to give those who disliked me a heads-up on what I was doing. I did have the questions from the survey and they were reused in the replicated survey in 2002.
Of the people mentioned, only David Mustard recalls Lott telling him that he conducted a survey, and he isn't sure that he was told until 1999, when Lott first claimed that the 98% came from his own, never-before-mentioned survey. Lott could not recall the questions from his survey when Lindgren asked him about them in September 2002.
After Lindgren's report was published, a Minnesota gun rights activist named David Gross came forward, claiming he was surveyed in 1997. Some have said that Gross's account proves that the survey was done. I think skepticism is warranted.

4) David Gross is the only person who Malkin mentions and she doubts his statements. Gross, a former city prosecutor, does have strong feelings on guns, but that is one reason why he remembers talking to me about the survey when I gave a talk in Minnesota a couple of years after the survey. There was no other gun survey on the questions that I asked during 1997. And another survey that was given close in time, during the beginning of 1996, was dramatically different from mine (e.g., the 1996 survey was done by a polling firm (not by students), was very long with at least 32 open ended questions (not something that could be done in a few minutes), involved Harvard (not Chicago), did not ask about brandishing, etc.). What Gross remembers indicates that it could only have been my survey.
Gross was one of the prime movers behind the Minnesota concealed carry law, devoting four years and $1M in lost income towards getting it passed. It is too much of a coincidence for someone with this much of a motive for preserving Lott's credibility to have been randomly chosen in a national survey.
Malkin also very selectively quotes Lindgren. Lindgren told the Washington Times that, "I interviewed [Mr. Gross] at length and found him credible." Mr. Gross has also responded to later statements made by Lindgren.
Actually, it is Lott who is very selectively quoting Lindgren, whose current position is that he has "substantial doubts whether John Lott ever did the supposed 1997 study".
Lott now admits he used a fake persona, "Mary Rosh," to post voluminous defenses of his work over the Internet.

When asked about the similarities between my writings and those posted under this Internet chat room pseudonym this past January, I did admit it immediately. (The only other evidence that existed when I admitted that I was using the pseudonym was that the person posting was from southeastern Pennsylvania.) I had originally used my own name in chat rooms but switched after receiving threatening and obnoxious telephone calls from other Internet posters. While the fictitious name was from an e-mail account we had set up for our children based on their names (see later discussion), on a couple of occasions I used the female persona implied by the name in the chat rooms to try to get people to think about how people who are smaller and weaker physically can defend themselves. Virtually all the postings were on factual issues involving guns and the empirical debates surrounding them. All that information was completely accurate.
Lott used "Mary Rosh" in online discussion groups (principally Usenet), not in chat rooms. Chat rooms are places where real-time conversations occur, unlike discussion groups where messages are left for later response. Lott only admitted to the deception when Julian Sanchez discovered that Mary Rosh and John Lott had the same IP number. Before Mary Rosh was active, Lott made some postings to Usenet between 3 June 1998 and 14 July 1998. All the responses were polite. In one of his postings Lott complains about getting threatening phone calls, but not about phone calls from other Internet posters:
"You ought to see what happens to my telephone calls when someone like a Charles Schumer or Josh Sugarmann or Sara Brady makes this charge. I get lots of threatening telephone calls and letters. These calls don't bother me, though they do greatly upset my wife."
Nor is it true that Rosh's facts about guns were accurate. For an example of a blatantly false Rosh claim, see here.
"Rosh" gushed that Lott was "the best professor that I ever had."For a couple of the posts I posed as a former graduate student so as to answer some questions about discussions about that time in my life. For the Ph.D. classes that I have taught I have gotten perfect teaching evaluation scores.
Rosh's "best professor" claim was not part of an answer to a question about Lott's time at Wharton. Rather, it was part of an attempt to praise Lott's objectivity. You can see the entire context here.
She/he also penned an effusive review of "More Guns, Less Crime" on Amazon.com: "It was very interesting reading and Lott writes very well." (Lott claims that one of his sons posted the review in "Rosh's" name.)

The e-mail account was set up by my wife for my four sons (Maxim, Ryan, Roger, and Sherwin in birth order) and involves the first two letters of each of their names in order of their birth. Maxim wrote several reviews on Amazon.com using that e-mail account and signed in using maryrosh@aol.com, not "Mary Rosh." His postings included not only a review of my book, but also reviews of computer games such as Caesar III.
The language in the review of "More Guns, Less Crime" is the same as that used by Lott and completely different from that in the review of "Caesar III". It is likely that the review of "More Guns, Less Crime" was written by John Lott while the review of "Caesar III" was written by Maxim Lott. Lott wrote many other anonymous reviews at Amazon, some signed with the first names of his children Maxim and Sherwin.
For whatever it is worth, a recent glitch at Amazon.com revealed that it is quite a common practice for authors to write positive anonymous reviews of their own books. The New York Times story on this revelation was actually quite sympathetic, which contrasts with the attack that the New York Times made on me when it also incorrectly claimed that I had written the review of my book.
If it is a common practice for authors to anonymously review their own books, then that makes it more likely that Lott did so, but he is still denying it. The New York Times cartoon he is complaining about is here---judge for yourself whether it is unfair.
Just last week, "Rosh" complained on a blog comment board: "Critics such as Lambert and Lindgren ought to slink away and hide."
By itself, there is nothing wrong with using a pseudonym. But Lott's invention of Mary Rosh to praise his own research and blast other scholars is beyond creepy. And it shows his extensive willingness to deceive to protect and promote his work.
It would have been helpful if Malkin had actually read the text of what I wrote.
Malkin has accurately reported Rosh's comments. Here is where Rosh attacked Lindgren. Here is an example of Rosh praising Lott's research and here is an example of Rosh attacking other scholars.
Some Second Amendment activists believe there is an anti-gun conspiracy to discredit Lott as "payback" for the fall of Michael Bellesiles, the disgraced former Emory University professor who engaged in rampant research fraud to bolster his anti-gun book, "Arming America." But it wasn't an anti-gun zealot who unmasked Rosh/Lott. It was Internet blogger Julian Sanchez, a staffer at the libertarian Cato Institute, which staunchly defends the Second Amendment. And it was the conservative Washington Times that first reported last week on the survey dispute in the mainstream press.

The January 23rd story in the Washington Times could not accurately be described as a negative story. Professor Dan Polsby is quoted as saying that I was "vindicated." Even Lindgren, a critic, is characterized by the Times as believing that "the question appears to have been at least partially resolved ..." and he did say that David Gross was a credible witness.
The Washington Times piece was a whitewash, but the claims that Lott is "payback" for Bellesiles are obvious nonsense.
In an interview Monday, Lott stressed that his new defensive gun use survey (whose results will be published in the new book) will show similar results to the lost survey. But the existence of the new survey does not lay to rest the still lingering doubts about the old survey's existence.

She never asked me any questions about the old survey.
The media coverage of the 1997 survey data dispute, Lott told me, is "a much-ado about nothing."

This quote is totally taken out of context. Some people had accused me of violating federal regulations regarding federal approval for human experiments while I was at Chicago. Malkin's telephone call focused on that claim, and that is what my quote referred to.
Lott can't even keep his stories straight from one paragraph to the next. Lott did not obtain IRB approval for the survey he claimed to have done in 1997. If Malkin asked about that, she was asking about the old survey, but Lott denied this in the paragraph before.
I wish I could agree.

I spent years replacing the data lost in the hard disk crash. The county-level crime data was replaced and given out to academics at dozens of universities so that they could replicate every single regression in More Guns, Less Crime. I have also made the data for my other book, The Bias Against Guns, available at http://www.johnlott.org/cgi-bin/login.cgi. The data for my other research has also been made available. The survey was also replicated and obtained similar results to the first survey, and the new data has been made available since the beginning of the year.
The new data is indeed available. That lets anyone check to see that the new survey did not obtain similar results.
When asked, I have even made my data available before the research was published. I don't think that there are any academics who have a better record than I have in making my data available to other researchers. For an example of just one of my recent critics who has refused to share his data, see here.
Lott neglects to mention that the data Ian Ayres refused to share was proprietary data on Lojack sales. Companies usually only allow researchers to use such data if they agree to keep it confidential. Lott would have been told that he would have to get the data from Lojack himself. Lott imposes the same condition on the data you can obtain from his website---to get the data you have to agree not to share it. Apparently Lojack didn't think that Lott could be trusted with their data, but that is hardly Ayres' fault. Lott also neglects to mention that all the data for Ayres' paper that is critical of Lott is available on Ayres' web site.
I have provided Malkin with the information noted here, but she has never replied to e-mails that I have sent her.
Funny, Lott has not replied to my emails, but he has falsely claimed that he had offered several times to provide data for me and that I had not replied.
<< Since wild claims have been made about the costs of the survey, some notion of its scope would be useful. The survey was structured so that over 90 percent of those questioned would only have to answer three short questions and those were usually completed in under 30 seconds. Less than one percent of those surveyed would actually answer as many as seven questions and even in that case the survey only took about two minutes. The appendix in The Bias Against Guns provides a description of the survey when it was replicated last year.>>
This strikes me as the strongest evidence yet of fraud.
How is it possible to construct a representative national survey using only three questions?
How do you collect the necessary demographic data to normalise your sample for demographic variables such as age and gender?
His 2002 survey had one question for age, one for race, and the interviewer only asked about gender if unsure from the voice. The questions are here.
So it is possible, though such a survey seems pretty pointless.
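For what it's worth, the kind of demographic normalisation being asked about is typically done after the fact by post-stratification weighting rather than by asking extra questions: each respondent's cell (e.g. age group by race) gets a weight equal to its population share divided by its sample share. A minimal sketch, with made-up respondent counts and population shares chosen purely for illustration:

```python
from collections import Counter

# Hypothetical sample: each respondent's (age group, race) cell.
respondents = [
    ("18-34", "white"), ("18-34", "white"), ("35-54", "white"),
    ("35-54", "black"), ("55+", "white"), ("55+", "white"),
]

# Hypothetical population shares for the same cells (e.g. census figures).
population_share = {
    ("18-34", "white"): 0.25, ("35-54", "white"): 0.25,
    ("35-54", "black"): 0.10, ("55+", "white"): 0.40,
}

counts = Counter(respondents)
n = len(respondents)

# Post-stratification weight for a cell = population share / sample share,
# so over-sampled cells are down-weighted and under-sampled cells up-weighted.
weights = {cell: population_share[cell] / (counts[cell] / n) for cell in counts}

# A weighted estimate of, say, the proportion answering "yes":
answers = [1, 0, 1, 1, 0, 1]  # made-up responses, aligned with respondents
w = [weights[cell] for cell in respondents]
weighted_yes = sum(wi * a for wi, a in zip(w, answers)) / sum(w)
```

With only three demographic questions this gives coarse cells, so the adjustment is crude, but it is enough to reweight a short survey against census totals.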
This is over a year old.
Who gives a shit.
The Malkin op-ed piece is new. My comment is a year old and somehow got attached to this thread rather than the thread to which it was originally posted. Tim's the computer programmer, he may be able to explain how on Earth that happened.
Lott added stuff yet again to his reply to Malkin last week, so I updated this post. I moved it to the top of the page because it gives you a good idea of his defence on the survey fabrication case.
That was some good work on this Tim. I was also impressed with your round-up on the "owning a gun makes you 2.6 times more likely to die of a gunshot" study.
Didn't realize you were a fellow coder, though I should have suspected from your logical turn of mind. I mostly work in embedded database languages myself.