Chapter 70. Toward Algorithmic Humility

Marc Faddoul

Defendant #3172 is an unmarried 22-year-old female. She previously served two months in prison for marijuana trafficking and has just been arrested for engaging in a violent public altercation with her partner. Is the defendant going to commit a violent crime in the three-month period before the trial? To answer such a question, many American jurisdictions use algorithmic systems known as pretrial risk assessment tools. Let’s consider one of the most common of these tools, the Public Safety Assessment (PSA).1

When the PSA sees a high risk, it raises a red flag, which automatically sends the defendant into detention, without further consideration from the judge to challenge the machine’s prediction. The stakes are high, as pretrial detention often comes with devastating consequences for the job and housing security of defendants, including those who are later proven innocent at trial. Tragically, 97% of these life-wrecking algorithmic red flags are in fact false alarms.2 In other words, only 3% of flagged defendants would actually have committed a violent crime had they been released; the other 97% were detained unnecessarily. This is strikingly poor performance, but it is not entirely surprising.
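
To see why, consider a rough back-of-the-envelope calculation. The Python sketch below is illustrative only: the base rate, sensitivity, and false positive rate are assumed values, not published PSA figures. It shows how, when the event being predicted is rare, even a tool with reasonable-looking error rates produces mostly false alarms.

# A minimal illustrative sketch with assumed numbers -- not PSA data.
base_rate = 0.02       # assumed: 2% of released defendants would commit a violent crime pretrial
sensitivity = 0.50     # assumed: the tool flags half of the would-be offenders
false_pos_rate = 0.30  # assumed: the tool also flags 30% of the non-offenders

# Bayes' rule: P(would offend | flagged)
p_flagged = sensitivity * base_rate + false_pos_rate * (1 - base_rate)
precision = (sensitivity * base_rate) / p_flagged

print(f"True alarms among flags:  {precision:.1%}")       # roughly 3%
print(f"False alarms among flags: {1 - precision:.1%}")   # roughly 97%

The point is not these particular numbers, which are made up, but the structure of the problem: because violent crime before trial is rare, the low base rate dominates, and most flags will be false alarms even for a tool that looks respectable on sensitivity and specificity.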

Foreseeing a crime in the near future is hard, and machines are not oracles. ...
