Accuracy and Fairness for Juvenile Justice Risk Assessments

Richard A. Berk

Risk assessment algorithms used in criminal justice settings are often said to introduce “bias”. But such charges can conflate an algorithm’s performance with bias in the data used to train the algorithm and with bias in the actions undertaken with an algorithm’s output. In this paper, algorithms themselves are the focus. Tradeoffs between different kinds of fairness and between fairness and accuracy are illustrated using an algorithmic application to juvenile justice data. Given potential bias in training data, can risk assessment algorithms improve fairness, and if so, with what consequences for accuracy? Although statisticians and computer scientists can document the tradeoffs, they cannot provide technical solutions that satisfy all fairness and accuracy objectives. In the end, it falls to stakeholders to do the required balancing using legal and legislative procedures, just as it always has.
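
To make the kind of tradeoff described above concrete, here is a minimal simulation sketch (not from the paper; the groups, risk distributions, and thresholds are invented for illustration). With a well-calibrated risk score and base rates that differ across two groups, the accuracy-maximizing common threshold yields unequal false positive rates; moving one group's threshold to narrow that gap pulls away from the accuracy-maximizing cutoff.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
group = rng.integers(0, 2, n)
# Hypothetical group-specific risk distributions with different base rates.
risk = np.where(group == 1, rng.beta(3, 4, n), rng.beta(2, 6, n))
y = rng.binomial(1, risk)   # observed outcome (e.g., re-arrest)
score = risk                # a perfectly calibrated risk score

def evaluate(t0, t1):
    """Classify with per-group thresholds; return overall accuracy and per-group FPR."""
    pred = (score >= np.where(group == 1, t1, t0)).astype(int)
    acc = (pred == y).mean()
    fpr = [((pred == 1) & (y == 0) & (group == g)).mean()
           / ((y == 0) & (group == g)).mean() for g in (0, 1)]
    return round(acc, 3), [round(f, 3) for f in fpr]

# A common threshold of 0.5 maximizes expected accuracy for a calibrated score,
# but the higher-base-rate group ends up with the higher false positive rate.
print("common thresholds (0.5, 0.5): ", evaluate(0.5, 0.5))
# Raising that group's threshold narrows the FPR gap at some cost in accuracy.
print("group-specific (0.5, 0.62):   ", evaluate(0.5, 0.62))
```

The numbers here are arbitrary; the point is only that, once base rates differ, one cannot equalize error rates across groups and keep the accuracy-optimal decision rule at the same time.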

PDF: Berk_FairJuvy_1.2.2018.pdf