

Release Decisions By Computer

Eric Siddall of the L.A. Association of Deputy District Attorneys has this post.

As stories emerge about the Arnold Foundation's "algorithm" pretrial release tool, we should be disturbed by the results.  As covered in a previous blog post, use of the tool has been linked to two murders and the wholesale release of dangerous felons.
 
However, a Wired story raises even more questions about the Arnold Foundation algorithm.   It turns out the tool was given to San Francisco for free, but with conditions that bar the disclosure of "any information about the use of the Tool, including any information about the development, operation and presentation of the Tool."
There is something to be said for having decisions made according to a formula rather than the subjective judgment of a human decision-maker.  In terms of practical effects, the formula may predict dangerousness better than a seat-of-the-pants judgment.  In terms of fairness, a formula avoids the problem of different judges making different decisions on the same facts.  A formula can also reduce bias problems, if done correctly.  Those were the reasons behind the Sentencing Reform Act of 1984 and the originally mandatory guidelines under that law.  A computer algorithm is essentially just a sophisticated formula.
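To make that concrete, here is a minimal sketch, in Python, of the kind of formula at issue.  The factors, weights, and cutoff are invented for illustration; they are not the Arnold Foundation's actual tool, whose contents are exactly what is being kept from scrutiny.

    # A hypothetical pretrial risk formula -- illustrative only; the
    # real tools' factors and weights are proprietary.
    WEIGHTS = {
        "prior_failures_to_appear": 2.0,
        "prior_violent_convictions": 3.0,
        "pending_charge_at_arrest": 1.5,
    }
    RELEASE_CUTOFF = 4.0  # scores below this recommend release

    def risk_score(defendant):
        # A weighted sum: an "algorithm" is, at bottom, a formula like
        # this plus the data used to choose the weights.
        return sum(w * defendant.get(factor, 0)
                   for factor, w in WEIGHTS.items())

    def recommend_release(defendant):
        return risk_score(defendant) < RELEASE_CUTOFF

    # One prior failure to appear plus a pending charge scores 3.5, so
    # this hypothetical formula would recommend release:
    print(recommend_release({"prior_failures_to_appear": 1,
                             "pending_charge_at_arrest": 1}))

Everything that matters -- which factors count, how heavily, and where the cutoff sits -- is a policy choice embedded in the code, which is why the code must be open to inspection.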
If formulas are going to be used, though, they must be transparent and open to public scrutiny.  ADDA is worried about the impact on public safety of releasing people who should not be released.  The concern works the other way, too.  We need to be concerned about jailing people who don't need to be jailed.

We must also keep in mind that utility and justice are not always congruent.  Some factors should not be used in criminal justice decisions even if they do correlate with future dangerousness.  See my post on last February's Supreme Court decision in Buck v. Davis.  Criminal justice consequences should depend on a person's voluntary choices, not factors beyond his control.  An algorithm and its supporting data need to be open to scrutiny to ensure that decisions are just in addition to being useful, or at least that the proper balance is struck.  This is a policy question that cannot be left to technocrats.

There is an inherent problem with using algorithms developed by private entities.  They regard the algorithms and supporting data as proprietary information.  But they can't be proprietary if they are going to receive the needed scrutiny.  Some of my economic libertarian friends in the FedSoc reflexively assert that private is always better than government.  No, not always.  Here is a good case for the government to spend its own bucks and develop its own algorithm in the public domain.

2 Comments

Other legitimate arguments against computer-based release decisions aside, this strikes me as important.
The Wired article references an earlier ProPublica article, which alleged that a different computer-based predictive release-decision suite (not the Arnold Foundation's) was racially biased. ProPublica offered some statistics that ostensibly demonstrated this bias but, more accurately, demonstrated the software's very poor record at predicting re-offending (or non-re-offending).
Surprisingly, the ProPublica article included the actual questionnaire that decisions were based upon; I'll link to it below. First, the questionnaire is in large part completed by the arrestee, and the data remains unverified in any way. So an arrestee may be completely dishonest when answering the questions.
Appearing nowhere among the questions, and not considered in any fashion, is the race of the subject. So, for ProPublica to actually show some likelihood that the software is racially biased, they would have to point to something in the questionnaire that carries some implicit racial bias. That evidence is lacking.
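To illustrate what such implicit bias would have to look like, here is a toy sketch with invented numbers: a race-blind formula whose input (here, recorded prior arrests) differs across groups for reasons unrelated to actual conduct. Demonstrating a correlation of this kind is the showing ProPublica would need to make.

    # Toy sketch of an "implicit proxy": the formula never sees race,
    # but if answers to some question differ by group for reasons
    # unrelated to future conduct, scores will differ by group too.
    # All numbers below are invented for illustration.
    import random

    random.seed(0)

    def score(prior_recorded_arrests):
        # Race-blind formula: depends only on the answer given.
        return 2.0 * prior_recorded_arrests

    def mean_score(arrest_rate, n=10_000):
        # Identical true conduct in both groups; only the chance that
        # conduct produced a recorded arrest differs (a hypothetical
        # enforcement disparity).
        total = 0.0
        for _ in range(n):
            offenses = 2  # same underlying conduct in both groups
            recorded = sum(random.random() < arrest_rate
                           for _ in range(offenses))
            total += score(recorded)
        return total / n

    print(mean_score(0.30))  # group A: lower recorded-arrest rate
    print(mean_score(0.50))  # group B: higher recorded-arrest rate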
And I would add another concern: judging the success or failure of software used to predict re-offending considers only follow-up arrests. That ignores the fact that the vast majority of crimes go unsolved, and that law enforcement can, as we see now, suffer periodic (and sometimes geographic) lapses in crime-fighting. That any particular defendant is not caught re-offending while on pretrial release may say more about the finesse of the defendant or the initiative of the cops than about the factual question of whether he or she has re-offended.
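A back-of-the-envelope sketch of that gap, again with invented rates (30% true re-offending; 25% of offenses producing an arrest):

    # Sketch of the evaluation problem: validation counts re-ARRESTS,
    # not re-OFFENSES.  All rates below are invented for illustration.
    import random

    random.seed(1)
    N = 100_000
    TRUE_REOFFEND_RATE = 0.30  # assumed fraction who actually re-offend
    CLEARANCE_RATE = 0.25      # assumed fraction of offenses yielding arrest

    reoffended = [random.random() < TRUE_REOFFEND_RATE for _ in range(N)]
    rearrested = [r and random.random() < CLEARANCE_RATE
                  for r in reoffended]

    print(f"true re-offense rate:    {sum(reoffended) / N:.3f}")
    print(f"observed re-arrest rate: {sum(rearrested) / N:.3f}")
    # The tool is scored against the second number, which also moves
    # whenever local clearance rates move -- independent of anything
    # about the defendant.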
A link to the questionnaire: see if you spot any racial bias in the questions. Do you see any questions that are answered by the defendant and that may provide a false basis for a decision? https://www.documentcloud.org/documents/2702103-Sample-Risk-Assessment-COMPAS-CORE.html

