
Artificial Intelligence Will Soon Be Responsible For Reducing Implicit Bias In The San Francisco DA’s Office [Witness LA]

 
by Taylor Walker, Witness LA, June 14, 2019

On July 1, San Francisco District Attorney George Gascón will launch a new artificial intelligence tool meant to eradicate potential racial bias in prosecutors’ charging decisions via a “race-blind charging system.”

The first-of-its-kind algorithmic tool, created by the Stanford Computational Policy Lab, will also be offered free to any other prosecutors' offices that wish to take part.

“Lady justice is depicted wearing a blindfold to signify impartiality of the law, but it is blindingly clear that the criminal justice system remains biased when it comes to race,” said District Attorney George Gascón. “This technology will reduce the threat that implicit bias poses to the purity of decisions which have serious ramifications for the accused, and that will help make our system of justice more fair and just.”

In recent years, implicit (or unconscious) bias has become more widely acknowledged as an issue that tangibly impacts policing and all other critical stages of the U.S. criminal justice system.

Increasingly, police officials, prosecutors, and public defenders are implementing implicit bias training within their offices, with the hope of reducing racial inequity within the justice system.

San Francisco's plan takes it a step further by bringing in tech that, with luck, will make it far more difficult for prosecutors to make decisions based on those subconscious biases.
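The article does not detail how the tool works internally; reporting on the Stanford Computational Policy Lab project generally describes it as redacting race-indicating details (such as names, neighborhoods, and physical descriptors) from police incident reports before the initial charging review. The sketch below is a minimal, hypothetical illustration of that kind of redaction pass; the categories, patterns, and function names are assumptions made for illustration, not the lab's actual implementation.

```python
import re

# Hypothetical sketch: strip race-proxy details from an incident narrative
# before it reaches the charging attorney. The categories and patterns here
# are illustrative placeholders, not the Stanford Computational Policy Lab's code.
REDACTION_PATTERNS = {
    "RACE_TERM": r"\b(?:white|black|hispanic|latino|asian)\b",
    "HAIR_EYE": r"\b(?:brown|black|blond|blonde|blue|green)\s+(?:hair|eyes)\b",
    "NEIGHBORHOOD": r"\b(?:Bayview|Tenderloin|Mission District)\b",  # placeholder names
}

def redact_narrative(text: str) -> str:
    """Replace race-indicating details with category placeholders."""
    for label, pattern in REDACTION_PATTERNS.items():
        text = re.sub(pattern, f"[{label}]", text, flags=re.IGNORECASE)
    return text

if __name__ == "__main__":
    report = ("Officers contacted a Black male with brown eyes "
              "near the Mission District and detained him for questioning.")
    print(redact_narrative(report))
    # -> "Officers contacted a [RACE_TERM] male with [HAIR_EYE]
    #     near the [NEIGHBORHOOD] and detained him for questioning."
```

One design point worth noting, as the project has generally been described: a system like this does not make the charging decision itself. It only removes proxies for race from what the reviewing attorney initially sees; the prosecutor still decides whether to charge, and the full report remains available for later stages of the case.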
