Algorithm Influences Prison Sentence
“Eric Loomis was sentenced to six years in prison and five years’ extended supervision on charges associated with a drive-by shooting in La Crosse, Wisconsin. The judge rejected Loomis’s plea deal, citing (among other factors) the high score that Loomis had received from the computerized COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) risk-assessment system.
Loomis’s lawyers appealed his sentence on the grounds that his due process had been violated, as he had no insight into how the algorithm derived his score. As it turns out, neither did the judge. And the creator of COMPAS — Northpointe Inc. — refused to provide that information, claiming that it was proprietary. The Wisconsin Supreme Court upheld the lower court’s ruling against Loomis, reasoning that the COMPAS score was just one of many factors the judge used to determine the sentence. In June 2017, the US Supreme Court declined to hear the case, after previously inviting the acting solicitor general of the United States to file an amicus brief.
Data-driven decision-making focused on predicting the likelihood of some future behavior is not new — just ask parents who pay for their teenager’s auto insurance, or a person with poor credit who applies for a loan. What is relatively new, however, is the increasingly opaque reasoning these models perform as a consequence of the growing use of sophisticated statistical machine learning. Research has shown that hidden bias can be inadvertently (or intentionally) coded into an algorithm. Illegal bias can also result from the selection of data fed to the model. An additional question in the Loomis case is whether gender was considered in the algorithm’s score, a factor that is unconstitutional to consider at sentencing. A final complicating fact is that profit-driven companies are neither required nor motivated to reveal any of this information.
State v. Loomis helped raise public awareness about the use of “black box” algorithms in the criminal justice system. This, in turn, has helped to stimulate new research into the development of “white box” algorithms that increase the transparency and understandability of criminal-prediction models for nontechnical people.”
Computer algorithms such as the COMPAS risk-assessment system can influence the sentencing of convicted defendants in criminal cases.
Angwin, Julia, Jeff Larson, Surya Mattu, and Lauren Kirchner. “Machine Bias.” ProPublica, May 23, 2016. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.
Eric L. Loomis v. State of Wisconsin, 2015AP157-CR (Supreme Court of Wisconsin, October 12, 2016).
Harvard Law Review. “State v. Loomis: Wisconsin Supreme Court Requires Warning Before Use of Algorithmic Risk Assessments in Sentencing.” Vol. 130 (March 10, 2017): 1530–37.
Liptak, Adam. “Sent to Prison by a Software Program’s Secret Algorithms.” New York Times online, May 1, 2017. https://www.nytimes.com/2017/05/01/us/politics/sent-to-prison-by-a-software-programs-secret-algorithms.html.
Pasquale, Frank. “Secret Algorithms Threaten the Rule of Law.” MIT Technology Review, June 1, 2017. https://www.technologyreview.com/s/608011/secret-algorithms-threaten-the-rule-of-law/.
State of Wisconsin v. Eric L. Loomis, 2015AP157-CR (Wisconsin Court of Appeals District IV, September 17, 2015). https://www.wicourts.gov/ca/cert/DisplayDocument.pdf?content=pdf&seqNo=149036.