

Deterrence Wars


   Nothing induces hysteria in the anti-death-penalty crowd as much as the new generation of studies confirming that the death penalty does, indeed, have a deterrent effect and save innocent lives when it is actually enforced. There is good reason for the reaction. Deterrence is the strongest argument for the death penalty in terms of persuading people who are otherwise on the fence. As a result, the studies and the people who do them have been the subject of attacks. Some of the criticisms are bona fide scholarly debate. Some are just bald-faced lies, as noted here.

   One of the most widely cited criticisms is Donohue & Wolfers, Uses and Abuses of Empirical Evidence in the Death Penalty Debate, 58 Stan. L. Rev. 791 (2005). (This article is not included in our list of deterrence studies from peer-reviewed journals, because it is not in a peer-reviewed journal. See note 56, below.) Authors of two of the studies that Donohue and Wolfers criticize have already posted working-paper versions of replies, which appear in the working-paper section of our deterrence study list.

   On Monday, Hashem Dezhbakhsh and Paul Rubin of Emory University posted a draft of their reply, titled From the “Econometrics of Capital Punishment” to the “Capital Punishment” of Econometrics: On the Use and Abuse of Sensitivity Analysis. To call it blistering would be an understatement. I have copied the summary from the end of the paper after the jump.
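   For readers unfamiliar with the technique at the center of this fight, a sensitivity (or robustness) analysis re-estimates a study's key regression under many alternative specifications to see whether the headline result survives. Here is a minimal sketch of the idea in Python, using simulated data and made-up variable names (murder_rate, executions, and so on); it is not anyone's actual model or data, just an illustration of the procedure in dispute.

# Illustrative sensitivity analysis: simulated data, hypothetical variables.
import itertools

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500  # hypothetical state-year observations

# Simulated panel: a murder rate, an execution count, and candidate controls.
df = pd.DataFrame({
    "executions": rng.poisson(1.0, n),
    "unemployment": rng.normal(6, 2, n),
    "poverty": rng.normal(13, 3, n),
    "police_per_cap": rng.normal(2.5, 0.5, n),
})
df["murder_rate"] = (
    5 - 0.3 * df["executions"] + 0.2 * df["unemployment"] + rng.normal(0, 2, n)
)

controls = ["unemployment", "poverty", "police_per_cap"]
results = []

# Re-estimate the same regression under every subset of controls and record
# the coefficient on executions each time.
for k in range(len(controls) + 1):
    for subset in itertools.combinations(controls, k):
        formula = "murder_rate ~ executions"
        if subset:
            formula += " + " + " + ".join(subset)
        fit = smf.ols(formula, data=df).fit()
        results.append({
            "controls": ", ".join(subset) or "(none)",
            "coef_executions": fit.params["executions"],
            "p_value": fit.pvalues["executions"],
        })

# Report the estimate from EVERY specification, not a hand-picked subset.
print(pd.DataFrame(results).to_string(index=False))

The point to notice is the last line: an even-handed sensitivity analysis reports the key coefficient from every specification it runs. That reporting norm is precisely what the reply, quoted below, accuses Donohue and Wolfers of violating.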

----------------------------------------------------------

Summary: Donohue and Wolfers examine the robustness of a few of the recent studies on the deterrent effect of capital punishment. Based on their results, they claim that “there is little evidence to convince believers in the deterrent hypothesis.”[52] The obituary of “the deterrence findings” that they write rests on a selective sample, flawed analysis, biased reporting, and violation of the guidelines for objective sensitivity analysis (see Section II). For example, of the 21 studies that examine the deterrence hypothesis, they test only four for robustness.[53] In fact, six of their nine tables relate to only two studies.[54] It is not clear whether such limited scope is the outcome of sample selectivity or reporting selectivity. In either case, the narrow scope of Donohue and Wolfers’ inquiry does not warrant so grandiose a claim. More importantly, their sensitivity analysis of the two of our studies that they cover extensively is replete with statistical errors, biased reporting, concealed results, and misleading conclusions.

Some of these errors could easily have been avoided had they followed normal scientific procedure and sought comments from the authors concerned. Instead, they sent us their paper only when it was about to go to print.[55] More importantly, they chose to publish their econometric study in a student-refereed law review rather than a peer-refereed economics journal. Law students have no particular expertise in econometrics with which to identify the aforementioned errors. Obviously, the flaws in the paper could have been detected by expert refereeing had they submitted it to a peer-reviewed outlet.[56] We did ask the Stanford Law Review for the right to reply, but were not granted it.

Finally, most of Donohue and Wolfers’ results are, ironically, obtained through extensive data mining, a practice that sensitivity analysis is supposed to guard against. Moreover, contrary to the inclusivity norms of sensitivity analysis, they report some estimates of the key parameters while remaining silent about others. Their selectivity strongly favors the anti-deterrence position. We believe that Donohue and Wolfers would have benefited from listening to their own words:

“…potential dangers [are] awaiting those who might wish to short-circuit the full process of scientific inquiry and validation…”[57]

May our collective wisdom advance science and enlighten policy.

Footnotes:

52. Donohue and Wolfers (2005), page 844.

53. See the introduction of this paper for a list of these 21 studies. Donohue and Wolfers have been aware of all but perhaps one of them (Fagan, Zimring, and Geller, 2006); all have been cited in Dezhbakhsh and Shepherd (2006) or in Shepherd’s congressional testimony, which Donohue and Wolfers cite in footnote 11 of their 2006 study. The four studies they examine are Dezhbakhsh, Rubin, and Shepherd (2003), Dezhbakhsh and Shepherd (2006), Katz, Levitt, and Shustorovich (2003), and Mocan and Gittings (2003). They also briefly touch on three other studies, by Cloninger and Marchesini (2001 and 2005) and Zimmerman (2004), without a detailed sensitivity analysis.

54. Six of the nine tables reporting their sensitivity checks are devoted to two studies, Dezhbakhsh and Shepherd (2006) and Dezhbakhsh, Rubin, and Shepherd (2003).

55. Donohue and Wolfers e-mailed us their paper on Friday night, December 2, 2005, asking for input by Monday, December 5, 2005, the date the Stanford Law Review had given them to make further revisions, as they stated in their e-mail. Despite various prior e-mails in which they sought our help with data and computer code, they never communicated the exact purpose of their study or their findings to us until that Friday evening. Moreover, the paper was already in Stanford Law Review format, indicating that it had been prepared much earlier.

56. We find it quite amusing that Wolfers, in an interview with Tanner (2007), refers to the deterrence findings as “flimsy [results] that appeared in second-tier journals.” The studies he refers to went through a rigorous peer-review process, often blind review, and appeared in peer-reviewed economics journals, while his own finding with Donohue, which he uses to call the deterrence studies flimsy, has never been peer reviewed and appears only in a student-edited and student-refereed journal.

57. Donohue and Wolfers (2005), page 845.

