Accuracy and Fairness for Juvenile Justice Risk Assessments
Richard Berk
Journal of Empirical Legal Studies, 2019, vol. 16, issue 1, 175-194
Abstract:
Risk assessment algorithms used in criminal justice settings are often said to introduce “bias.” But such charges can conflate an algorithm's performance with bias in the data used to train the algorithm and with bias in the actions undertaken with an algorithm's output. In this article, the algorithms themselves are the focus. Tradeoffs between different kinds of fairness, and between fairness and accuracy, are illustrated using an algorithmic application to juvenile justice data. Given potential bias in training data, can risk assessment algorithms improve fairness and, if so, with what consequences for accuracy? Although statisticians and computer scientists can document the tradeoffs, they cannot provide technical solutions that satisfy all fairness and accuracy objectives. In the end, it falls to stakeholders to do the required balancing using legal and legislative procedures, just as it always has.
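The sketch below is not from the article; it is a minimal illustration, using synthetic data and a hypothetical binary risk classifier applied to two groups, of one common way the fairness/accuracy tradeoff the abstract describes is quantified: comparing overall accuracy against the gap in false positive rates across groups as the score threshold changes. All variable names, base rates, and thresholds are assumptions chosen only for illustration.

```python
# Illustrative only: synthetic data and a toy risk score, not the article's method.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: two groups with different base rates of the outcome.
n = 1000
group = rng.integers(0, 2, size=n)                 # group membership: 0 or 1
reoffend = rng.binomial(1, 0.3 + 0.1 * group)      # outcome, base rate differs by group
score = np.clip(0.4 * reoffend + 0.1 * group + rng.normal(0.3, 0.2, n), 0, 1)

def metrics(threshold):
    """Overall accuracy and the absolute gap in false positive rates between groups."""
    pred = (score >= threshold).astype(int)
    accuracy = (pred == reoffend).mean()
    fpr = []
    for g in (0, 1):
        negatives = (group == g) & (reoffend == 0)
        fpr.append(pred[negatives].mean())          # false positive rate within group g
    return accuracy, abs(fpr[0] - fpr[1])

for t in (0.3, 0.5, 0.7):
    acc, fpr_gap = metrics(t)
    print(f"threshold={t:.1f}  accuracy={acc:.3f}  FPR gap={fpr_gap:.3f}")
```

Varying the threshold (or setting different thresholds per group) can shrink the false positive rate gap at some cost in overall accuracy; deciding how much accuracy to trade for which notion of fairness is the balancing the abstract assigns to stakeholders rather than to technicians.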
Date: 2019
DOI: https://doi.org/10.1111/jels.12206
Persistent link: https://EconPapers.repec.org/RePEc:wly:empleg:v:16:y:2019:i:1:p:175-194