A prosecutor’s decision to charge or dismiss a criminal case is a particularly high-stakes choice. There is concern, however, that these judgments may suffer from explicit or implicit racial bias, as with many other decisions in the criminal justice system. To reduce potential bias in charging decisions, we designed a system that algorithmically redacts race-related information from free-text case narratives. In a first-of-its-kind initiative, we deployed this system at a large American district attorney’s office to help prosecutors make race-obscured charging decisions, where it was used to review many incoming felony cases. We report on the design, efficacy, and impact of our tool for aiding equitable decision-making. We demonstrate that our redaction algorithm accurately obscures race-related information, making it difficult for a human reviewer to guess the race of a suspect while preserving other information from the case narrative. In the jurisdiction we study, we found little evidence of disparate treatment in charging decisions even prior to deployment of our intervention. Thus, as expected, our tool did not substantially alter charging rates. Nevertheless, our study demonstrates the feasibility of race-obscured charging, and more generally highlights the promise of algorithms to bolster equitable decision-making in the criminal justice system.
Chohlas-Wood, Alex, Joe Nudell, Keniel Yao, Zhiyuan (Jerry) Lin, Julian Nyarko, and Sharad Goel. "Blind Justice: Algorithmically Masking Race in Charging Decisions." Proceedings of the 2021 AAAI/ACM Conference on AI, Ethics, and Society. Association for Computing Machinery, July 2021, 35-45.
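To make the core idea concrete, here is a minimal sketch of what race-related redaction of a free-text narrative could look like. This is not the deployed system's code: the term lists, patterns, and placeholder are illustrative assumptions, and the actual tool described in the paper uses far richer detection of both explicit race descriptors and race-correlated proxies (e.g., names and physical descriptions).

```python
import re

# Illustrative pattern lists (assumptions for this sketch, not the
# deployed system's actual lists):
# - explicit race/ethnicity descriptors
RACE_TERMS = r"\b(white|black|hispanic|latino|latina|asian)\b"
# - proxy attributes that can signal race, such as hairstyle
PROXY_TERMS = r"\b(dreadlocks|cornrows)\b"
# - personal names following an honorific, another common proxy
NAME_PATTERN = r"\b(?:Mr|Ms|Mrs)\.\s+[A-Z][a-z]+"


def redact_narrative(text: str, placeholder: str = "[REDACTED]") -> str:
    """Replace race-related tokens with a neutral placeholder,
    leaving the rest of the case narrative intact."""
    for pattern in (NAME_PATTERN, PROXY_TERMS, RACE_TERMS):
        text = re.sub(pattern, placeholder, text, flags=re.IGNORECASE)
    return text


narrative = "The suspect, a white male with dreadlocks, fled on foot."
print(redact_narrative(narrative))
```

A pattern-based approach like this is easy to audit but brittle; the paper's contribution is an algorithm robust enough that human reviewers could no longer reliably guess a suspect's race from the redacted narrative.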