John R. Lott, Jr.
American Enterprise Institute
[Critical Commentary by Tim Lambert
This is a copy of the original document by Lott, downloaded from Lott’s web site here on March 21, 2003. My comments appear in italics like this.]
Guns make it easier for bad things to happen, but they also make it easier for people to stop crime and prevent bad things from happening. The important question, the one that ultimately concerns everyone, is the net effect: whether on net guns save lives or cost lives, and whether they increase or decrease the violent crime that threatens so many people.
A basic issue is how to measure these different effects. Two approaches are available to a researcher: running regressions on already existing data or conducting surveys to expand the available data set. Both approaches have their own strengths and weaknesses.
Surveys have filled many important gaps in our knowledge. For example, the most commonly discussed crime data come from crimes reported to police departments. Yet we know that most crimes are not actually reported to the police. For many violent crimes, rape in particular, the gap between what is reported to police and what actually occurs appears to be quite large. The only way to get a handle on the size of that gap is to survey people directly.
Similar issues arise with gun use. Crimes committed with guns that result in murder are very accurately reported, though other violent crimes committed with guns are not as well recorded. But an even greater problem exists on the other side of the ledger, since police make no attempt to count reported instances of defensive gun use. The only partial attempt by the government to quantify the consequences of defensive gun use involves the number of "justifiable homicides," and unfortunately a large number of jurisdictions do not even report that number. With surveys by Hemenway, Azrael, and Miller in 1996 and 1999 indicating that people use guns defensively about 2 million times per year on average [Not true—see comment on footnote],1 even a large multiple of recorded justifiable homicides implies that attackers are killed in fewer than 1 in 1,000 defensive gun uses.2 Thus the number of justifiable homicides provided by the U.S. Department of Justice misses virtually all of the picture of defensive gun uses. Surveys are basically the only way to get information on the rate of defensive gun use as well as how guns are used in those cases.
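The arithmetic behind the "fewer than 1 in 1,000" claim can be checked directly from the figures cited here and in footnote 2. This is only a sketch of that back-of-the-envelope calculation; the numbers are the ones quoted in the text, not new data:

```python
# Back-of-the-envelope check of the "fewer than 1 in 1,000" claim,
# using only figures quoted in the text and footnote 2.
defensive_uses_per_year = 2_000_000     # survey-based estimate cited in the text
recorded_justifiable_homicides = 225    # average of the 1996 and 1999 UCR counts
underreporting_multiple = 8             # generous allowance for unreported cases

implied_homicides = recorded_justifiable_homicides * underreporting_multiple
rate_per_use = implied_homicides / defensive_uses_per_year
print(f"attackers killed in about {rate_per_use:.4%} of defensive uses")
# 225 * 8 = 1,800, and 1,800 / 2,000,000 = 0.09%, i.e. fewer than 1 in 1,000
```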
But as to the other aspects of defensive gun use, the picture is much fuzzier. People may simply brandish a gun, fire a warning shot, or fire it at the attacker. Just as with the total number of defensive gun uses, surveys are basically the only way to gather information on what happens when people choose to use a gun to try to defend themselves.
However, while surveys may be the only way to gather this information, they are by no means perfect. For instance, people might feel intimidated or embarrassed by many questions (e.g., few people who used a gun improperly could be expected to admit to doing so). There is also the issue of bias. People’s policy preferences may color how they answer a pollster’s questions. Detailed questions such as whether the presence of a gun actually saved their life or whether it really prevented a criminal from attacking may introduce an element of judgment or bias.
Other serious concerns arise with survey data. Does a criminal who is thwarted from committing one particular crime merely substitute another victim or another type of crime? Or might this general deterrence raise the costs of these undesirable activities enough that some criminals stop committing crimes altogether? Survey data have simply not been able to answer these questions.
The most common way to answer these last questions about the impact on crime rates is to examine how either changes in gun laws or the presence of guns affect crime rates, accidental gun deaths, or suicide rates. This usually involves a large sample with information on the crime or death rates being studied, as well as other factors that may affect them, across many places over many years. Regressions fill in important gaps that surveys cannot address.
Differences in Survey Results
There have been basically two different types of random surveys used to estimate the rate of defensive gun use. The National Crime Victimization Survey, conducted by the U.S. Bureau of Justice Statistics, typically estimates defensive gun use at only a little over one hundred thousand times per year. But other surveys generally produce much higher estimates, with the National Study of Private Ownership of Firearms implying as many as 4.7 million defensive gun uses in 1994, though the authors view an estimate of 1.5 million as more reasonable.3 Other recent estimates have ranged from 2 million to 2.5 million. [Not true—see comment on footnote] 4 The 1997 survey estimates that I made implied about 2.1 million defensive gun uses, well within the range implied by these other similar surveys. My other estimate, for 2002, yet to be released in my forthcoming book, The Bias Against Guns, implies a slightly higher number, but the results are not "statistically significantly" different.5
The huge difference between the NCVS estimates and the rest of the surveys likely arises because the NCVS only allows people to volunteer whether they have used a gun defensively after they have answered that they were indeed "victimized" by a crime. Obviously, those who used a gun to successfully prevent an attack may not view themselves as having been victims. [This is not true. The NCVS asks questions like “Did anyone try to rob you?” It does not just count completed crimes.] Others disagree with this explanation and claim that the difference is due to the wording of questions regarding how victims respond to crime.6
Many concerns exist about so-called "false positives," where people report a defensive gun use even though none has occurred. They may lie for political or other reasons. Those who hate guns do not have quite the same option to bias the results downward. Part of this concern assumes that the respondent anticipates why the questions are being asked. In addition, as McDowall et al. note, "the possibility of fabricated incidents requires an interaction between question content and question order."7
One way that I have tried to deal with this possible problem of false positives in my surveys is by asking: "During the last year, were you ever threatened with physical violence or harmed by another person or were you present when someone else faced such a situation?" A second question inquiring into the number of times that someone has been threatened at least puts limits on how much respondents can exaggerate defensive gun uses before they are asked how they responded to these physical threats of violence.
[This is a reasonable approach to the possible problem raised by McDowell et al in their 2000 paper. However, how is it possible to use a 2000 paper to design the questions for a 1997 survey?]
Outside of the NCVS, it is surprising how few surveys have broken down exactly how guns are used defensively. Since 1990, there have been only four such surveys, two of them influenced greatly by Gary Kleck.8 While Cook and Ludwig wrote up the results of the 1994 National Study of Private Ownership of Firearms, arguing that no meaningful results could be obtained from it, the survey itself was designed and overseen by Gary Kleck.9 Similarly, there is the 1993 survey by Gary Kleck and Marc Gertz. The other two surveys were conducted by me.
In place of professional pollsters and the numerous questions that were designed to check for inconsistencies in respondents’ answers, my surveys used students and were extremely short, with a maximum of seven questions (see appendix). The difference in the number of survey questions was very large. For example, in describing the time devoted to his surveys, Gary Kleck has said: "So even if you had two people working on it, it would still take 500 evenings to do 2,000 interviews." Yet my survey was set up precisely to minimize this time. Ninety percent of those questioned would simply answer "no" to the first question and then have only two very short demographic questions to answer before being finished. For those respondents who answered "no" to the first question, students would usually finish the interview in about 30 seconds.10 Only about one percent of the respondents were asked all seven questions, which include the short demographic questions. When we redid the survey on a smaller sample in 2002, we completed it within 8 evenings. There are obvious benefits from having a longer survey, but I wanted to get a rough idea of the magnitude of the effects involved.11
While we obtained similar results in terms of the defensive gun use rate, it is quite plausible that the 76 percent brandishing rate found by Kleck and Gertz and the 73 percent rate found in the National Survey of Private Ownership of Firearms differ from the over-95-percent rate that I have found because of these other differences.12 [It is not at all plausible that having a shorter survey could cause marked differences in the brandishing rate.] There are also differences in the time period discussed (they asked respondents about incidents over the preceding five years [The brandishing rate in Kleck’s survey is the same if just incidents over the previous year are considered] while I asked about incidents only over the preceding year), and the surveys were also given in different years (his in the early 1990s and mine three to eight years after his last one). Yet the difference between the National Survey of Private Ownership of Firearms and the 95 percent estimate of brandishings that I obtained in my 2002 survey [The unweighted brandishing percentage from his 2002 survey is 92%, the weighted percentage is 91%. Calculations are here.] is significant at the 10.4 percent level for a one-tailed t-test, which is just slightly short of being considered statistically significant under normal standards.13 [A t-test is not appropriate here and a one-tailed test is certainly not correct. Lott also seems to have used an incorrect number for the sample size (13 defensive uses by 7 people does not mean that the sample size is 13).] Combining my two surveys together (with their 98 and 95 percent estimates) does produce a statistically significantly different result from the National Survey of Private Ownership of Firearms, with a single-tailed t-statistic of 2.4.14 [“Single-tailed t-statistic” makes no sense. Given a particular t value you can construct a one- or two-tailed p value, but the phrase “one-tailed” makes no sense when applied to a t statistic. Furthermore, it makes no sense to combine the 2002 survey with the non-existent data from the 1997 survey. And since he uses unweighted data in the combination, he has a 38/41 = 93% brandishing number anyway.]
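The commentary above disputes the use of a one-tailed t-test here. For comparison, a textbook way to compare two sample proportions is a pooled two-proportion z-test. The sketch below uses illustrative counts only: 12 brandishings out of 13 gun uses from the 2002 survey table in the appendix, and 33 of 45 as a reconstruction of the NSPOF's 73 percent (an assumption, since the raw NSPOF counts are not given here). With counts this small, an exact test such as Fisher's would be more appropriate in any case:

```python
from math import sqrt, erf

def two_prop_z(success_a, n_a, success_b, n_b):
    """Pooled two-proportion z-test; returns z and the two-tailed p-value."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)           # pooled proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))   # pooled std. error
    z = (p_a - p_b) / se
    # Two-tailed p-value from the standard normal CDF.
    p_two_tailed = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_two_tailed

# Illustrative counts, as described in the lead-in above.
z, p = two_prop_z(12, 13, 33, 45)
print(f"z = {z:.2f}, two-tailed p = {p:.3f}")
```

On these assumed counts the difference falls well short of conventional significance, though the normal approximation itself is strained at this sample size.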
Kleck and Gertz’s estimates rise to 92 percent when brandishing and warning shots are added together. [Not true. Adding warning shots gives 84%.] To look at it differently, if the question is whether the attacker is directly shot at, the surveys agree that this happens fewer than 10 percent of the time. [In fact, Kleck’s survey indicates that this number is 16%, while Lott has claimed that this number is 0.5%]
The sample sizes for the number of defensive gun uses in these surveys are small, and it becomes difficult to produce precise estimates for subsamples of subsamples. The National Survey of Private Ownership of Firearms had 45 defensive gun uses against humans during a year (19 when cases viewed by Cook and Ludwig as doubtful were excluded). [The sample size for Kleck’s survey is over 200. NCVS 1992-2001 has a sample size of almost 300.] For my work, only by combining both of my samples together would it have been possible to get a sample of over 40 observations.15 Often with such small subsamples it is only possible to view the evidence as giving our best guess about what the "preponderance of the evidence" shows. Indeed, that is one of the reasons that the one sentence that mentioned this result in my book stated "If a national survey that I conducted is correct . . . ."16 [Lott still has not explained why he repeatedly attributed his 98% number to other surveys.]
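One way to see how little precision such subsample sizes allow is to put a confidence interval around a proportion estimated from a handful of cases. This sketch uses the Wilson score interval, which behaves better than the normal approximation at small n, applied for illustration to the 12-brandishings-out-of-13-gun-uses figure from the 2002 survey table in the appendix:

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Illustrative: 12 of 13 gun-use cases involved brandishing (2002 survey table).
lo, hi = wilson_ci(12, 13)
print(f"95% CI for the brandishing rate: {lo:.0%} to {hi:.0%}")
```

The resulting interval spans roughly from the high 60s to nearly 100 percent, which is consistent with the text's caution about "subsamples of subsamples."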
With the exception of the National Crime Victimization Survey, the surveys provide surprisingly consistent evidence on the annual rate of defensive gun use. Yet, despite general interest in the topic of guns and a fair number of surveys that examine defensive gun use generally, surprisingly few surveys have attempted to break down how people use their guns when they defend themselves from threats of violent crime. Obviously more work can be done on this topic. The surveys and methods are available for others to replicate and to examine why these finer breakdowns of survey results appear to produce somewhat different results.
Appendix 1: Survey on Defensive Gun Use
Below is the survey that was used to identify the rate of defensive gun use.
Hello, my name is _______, and I am a student at ________ working on a very brief survey on crime. The survey should take about one minute. Could I please ask you a few questions?
1) During the last year, were you ever threatened with physical violence or harmed by another person or were you present when someone else faced such a situation? (Threats do not have to be spoken threats. Includes physically menacing. Attacks include an assault, robbery or rape.)
a) Yes
b) No
c) Uncertain
d) Declined to answer
(Just ask people "YES" or "NO." If they answer "NO" or "Decline to answer," go directly to demographic questions. If people are "Uncertain" or say "YES," proceed with question 2.)
2) How many times did these threats of violence or crimes occur? _____
3) Which of the following best describes how you responded to the threat(s) or crime(s)? Pick the one from the following list that best describes your behavior, or that of the person you were with, for each case faced.
a) behaved passively
b) used your fists
c) ran away
d) screamed or called for help
e) used a gun
f) used a knife
g) used mace
h) used a baseball bat or club
(Rotate these answers (a) through (h), and place a number from 0 upward for each option. Stop going through the list if they volunteer answer(s) that account for the number of threats that they faced.)
4) (This is only asked if the respondent answers "e" (a gun) to question 3.) If a gun was used, did you or the other person you were with:
a) brandish it
b) fire a warning shot
c) fire at the attacker
d) injure the attacker
e) kill the attacker
(Again, place a number from 0 upward as appropriate for each option. Rotate the answers to this question.)
5) Were you or the person you were with harmed by the attack(s)?
a) Yes
b) No
c) Refused to answer
(We obviously have the area code for location, write down sex from the voice if possible, otherwise ask.)
Two demographic questions asked of all participants.
What is your race? black, white, Hispanic, Asian, Other.
What is your age by decade? 20s, 30s, 40s, so on.
Raw unweighted data from 2002 survey

| Question | Response | Count |
|---|---|---|
| 1: Physically threatened with violence? | Yes | 114 |
| 1: Physically threatened with violence? | No | 901 |
| 2: How many times? | | 467 cases, involving 114 people |
| 4: How did you use gun? | Brandish | 12 |
| 4: How did you use gun? | Other | 1 |
| 5: Were you or the person you were with harmed by the attack(s)? | Yes (total number of times) | 30 |
| 5: Were you or the person you were with harmed by the attack(s)? | Yes (number of people saying they were ever harmed) | 29 |
Raw unweighted data from 2002 survey: Question 3: Which of the following best describes how you responded to the threat(s) or crime(s)?

| Response | Total number of cases | Total number of people in category |
|---|---|---|
| behaved passively | 41 | 33 |
| used your fists | 18 | 18 |
| ran away | 18 | 18 |
| screamed or called for help | 27 | 20 |
| used a gun | 13 | 7 |
| used a knife | 2 | 1 |
| used mace | 1 | 1 |
| used a baseball bat or club | 1 | 1 |
| other | 177 | 30 |
Raw unweighted data from 2002 survey

Question for surveyor: Were you or the person you were with harmed by the attack(s)?

| No serious concern | Some concern | Serious concern |
|---|---|---|
| | | |
The survey was conducted over eight evenings (Mondays through Thursdays) during November and December 2002. When a surveyor received no answer, a busy signal, or an answering machine, they called that telephone number two more times on different days, for a total of three attempts to reach the party.
For all defensive gun uses, the surveyors were debriefed that night or the following morning about the call. All the respondents in these cases had volunteered extensive details of what happened with the defensive gun use. None of the defensive gun uses recorded involved defensive gun uses by police. A number of calls (from the surveyor’s end) were randomly monitored as they were made. Two of our surveyors had previous experience conducting telephone surveys, and they went through the survey with the other surveyors before everyone started. As a result of callbacks, over 60 percent of telephone numbers produced completed interviews. Finally, at least two respondents for each surveyor who had indicated that they had felt threatened over the last year were called back to double-check their answers. In almost all cases it was possible to reach the respondent, and in all cases the answers matched what had been previously recorded.
The telephone numbers were collected from a program called Select Phone Pro Version 2.4, made by infoUSA. The telephone numbers were selected randomly by state so that the percentage of telephone numbers from each state reflected that state’s share of the national population. Data for all fifty states were used.
I assume that there were 206.99 million adults 20 years of age or older in December 2002 (the census estimated that there were 288.68 million Americans at that time and the 2000 Census indicated that 71.7 percent are over 19 years of age). 1.125 percent of the people in the weighted sample used guns defensively. 1,015 people were surveyed, and the estimates were weighted by race (black, white, other) and gender. [Lott earlier claimed that his 1997 survey was weighted by age, race, gender and state. ]
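The extrapolation described in this paragraph is simple arithmetic, and the figures quoted can be checked against one another directly. This sketch reproduces that calculation using only the numbers stated in the text:

```python
# Reproducing the extrapolation above, using only figures from the text.
total_population_millions = 288.68   # Census estimate, December 2002
share_age_20_plus = 0.717            # 2000 Census share over 19 years of age

adults_millions = total_population_millions * share_age_20_plus
print(f"adults 20+: {adults_millions:.2f} million")   # ~207 million, as in the text

defensive_use_rate = 0.01125         # 1.125% of the weighted sample
implied_uses_millions = adults_millions * defensive_use_rate
print(f"implied defensive gun uses: {implied_uses_millions:.2f} million per year")
```

The result, about 2.33 million uses per year, is consistent with the text's earlier statement that the 2002 survey implied a number slightly higher than the 1997 estimate of 2.1 million.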
The time of James Knowles, Jill Mitchell, Carl Westine, Susan Follett, Matt Trager, Arnaud Bonraisin, Andrei Zlate, and Sandra Long, in conducting this survey was greatly appreciated.
1 D. Hemenway, D. Azrael, and M. Miller, "Gun Use in the United States: Results from two national surveys," Injury Prevention (2000): 266, Table 3. The table shows that across those two samples about 1 percent of adults used guns defensively. [The tables clearly indicate that they show defensive uses over the previous five years, implying an estimate of 400,000 defensive uses. The authors also clearly state that this would be an overestimate due to telescoping and false positives.]
2 The recorded number of justifiable homicides in 1996 was 261 and in 1999 was 188, for an average of 225. The data are from the Uniform Crime Reports. Even if the true number were eight times higher than the reported number, the rate at which attackers are killed in defensive gun uses would still be fewer than 1 per 1,000 defensive uses.
3 See Philip J. Cook and Jens Ludwig, "Defensive Gun Uses: New Evidence from a National Survey," Journal of Quantitative Criminology, vol. 14 (1998): 111-131. Cook and Ludwig however dismiss this estimate as being implausibly high and claim that survey data simply can’t estimate the number of defensive uses.
4 For the 2 million estimate see D. Hemenway, D. Azrael, and M. Miller, "Gun Use in the United States: Results from two national surveys," Injury Prevention (2000): 263-7. These authors estimate that on average 1 percent of adults in 1996 and 1999 used a gun defensively. [The one percent is for defensive uses over the previous five years, implying an estimate of 400,000 defensive uses. The authors also clearly state that this would be an overestimate due to telescoping and false positives.] For the 2.5 million estimate see Tomislav Kovandzic, Gary Kleck, and Marc Gertz, "Defensive Gun Use: Vengeful vigilante imagery versus reality: results from the National Self-Defense Survey," Journal of Criminal Justice, Vol. 26 (1998): 251-8. This survey was conducted during 1993.
5 The 2002 survey is 42 percent as large as the original survey (1,015 versus 2,424). Information on obtaining the data for the 2002 survey is provided in Appendix 2.
6 David McDowall, Colin Loftin, and Stanley Presser, "Measuring Civilian Defensive Firearm Use: A methodological experiment," Journal of Quantitative Criminology, Vol. 16 (2000): 16-17.
7 Ibid., p. 17.
8 Gary Kleck and Marc Gertz list three surveys during the 1970s that examined the rate at which guns were fired in defensive gun uses. One survey in 1982 broke down this percentage for Ohio, and one survey in 1989 by Time/CNN asked only about the percent who fired a gun and not about other types of defensive gun use. Thus, prior to the 1990s, one has to go back to the early 1980s to obtain a non-NCVS survey that examines the rate of defensive gun uses that involve brandishing. Gary Kleck and Marc Gertz, "Defensive Gun Use: Vengeful vigilante imagery versus reality: results from the National Self-Defense Survey," Journal of Criminal Justice, Vol. 26 (1998): Table 1. [Lott neglects to mention that these surveys give a range of 34% to 67% for how often the gun is fired.]
9 Because the paper was written by people different from the person who designed and oversaw the survey, Cook and Ludwig’s paper has a schizophrenic quality to it: the first part sets out the survey and the second part explains why the survey makes no sense.
10 The 30 second estimate is based upon comments from James Knowles and Jill Mitchell.
11 The students who participated in the 2002 survey are as follows: James Knowles, Jill Mitchell, Carl Westine, Susan Follett, Matt Trager, Arnaud Bonraisin, Andrei Zlate, and Sandra Long. James Knowles tells me that on average there were four and a fraction students doing the survey on any given night. Sometimes the telephone calling also did not start on time, and thus the total time spent that night was cut short.
12 In his results for the one-year reporting period, Kleck is said to have simply assumed the same percentage rate as for the five-year period. See Otis Dudley Duncan, "Gun Use Surveys: In Numbers We Trust," The Criminologist, Vol. 25, No. 1 (Jan/Feb 2000): 1-7. Cook and Ludwig briefly mention that the survey data constructed by Kleck contained information on the issue of brandishing, but they do not discuss any results. (Philip J. Cook and Jens Ludwig, "Defensive Gun Uses: New Evidence from a National Survey," Journal of Quantitative Criminology, vol. 14 (1998): 120.)
13 This calculation uses the unweighted numbers reported in Cook and Ludwig’s Table 3. For a two-tailed t-test these results are not statistically significantly different.
14 This is based upon the unweighted samples for the 1997 and 2002 surveys, because the data for the 1997 survey were lost in a hard disk crash (see appendices 3 and 4). My recollection is that of 28 defensive gun use cases, two involved something other than brandishing.
15 Unfortunately, the original 1997 sample was lost in a computer hard disk crash. For evidence of this see the appendix. [The appendix contains no such evidence.]
16 See John R. Lott, Jr., More Guns, Less Crime, University of Chicago Press, second edition 2000, p. 3. The first edition had just made the same general conditional statement regarding "national surveys," but part of the book that would have explained this point further was lost in the computer crash as noted in the previous footnote. [National surveys do not indicate that the brandishing number is anywhere near 98%. No possible further explanation could turn this false statement into a true one.] As it is, just the one sentence was left in the book. For some discussion on this see Appendix 6 here.[There is no appendix 6.]