Racial Bias in Criminal Risk Scores Is Mathematically Inevitable [PSMag.com]

The racial bias that ProPublica found in a formula used by courts and parole boards to forecast future criminal behavior arises inevitably from the test’s design, according to new research.

The findings were described in scholarly papers published or circulated over the past several months. Taken together, they represent the most far-reaching critique to date of the fairness of algorithms that seek to provide an objective measure of the likelihood a defendant will commit further crimes.

Increasingly, criminal justice officials are using similar risk prediction equations to inform their decisions about bail, sentencing, and early release.

The researchers found that the formula, and others like it, have been written in a way that guarantees black defendants will be incorrectly identified as future criminals more often than their white counterparts.
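The arithmetic behind that finding can be sketched with a standard identity linking a score's precision, its detection rate, and a group's underlying rearrest rate. The short Python example below is illustrative only, not the researchers' code; its inputs (a precision of 60 percent, a detection rate of 70 percent, and two hypothetical base rates) are made up for the sake of the calculation. It shows that when the same equally predictive score is applied to two groups with different rearrest rates, the group with the higher rate necessarily ends up with the higher false positive rate.

```python
# A minimal numeric sketch (not the researchers' code) of why an equally
# predictive risk score yields unequal false positive rates when groups
# have different underlying rearrest rates.
# Assumed quantities: base_rate = fraction of a group that reoffends,
# ppv = precision of a "high risk" label, tpr = share of reoffenders flagged.

def false_positive_rate(base_rate: float, ppv: float, tpr: float) -> float:
    """False positive rate implied by a score with the given precision (PPV)
    and detection rate (TPR) applied to a group with the given base rate."""
    return (base_rate / (1 - base_rate)) * ((1 - ppv) / ppv) * tpr

# Hypothetical numbers for illustration only: the same score quality
# (precision 0.6, detection rate 0.7) applied to two groups whose
# rearrest base rates differ.
for group, base_rate in [("group A", 0.5), ("group B", 0.3)]:
    fpr = false_positive_rate(base_rate, ppv=0.6, tpr=0.7)
    print(f"{group}: base rate {base_rate:.0%} -> false positive rate {fpr:.1%}")
```

With these assumed inputs, the group with a 50 percent rearrest rate gets a false positive rate of about 47 percent, while the group with a 30 percent rate gets 20 percent, even though the score is equally accurate for both.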

The studies, by four groups of scholars working independently, suggest the possibility that the widely used algorithms could be revised to reduce the number of black defendants who are unfairly categorized, without sacrificing the ability to predict future crimes.



[For more of this story, written by Julia Angwin and Jeff Larson, go to https://psmag.com/racial-bias-...3d898b37e#.xb9z0e4fo]
