Fair Prediction with Disparate Impact: A Study of Bias in Recidivism Prediction Instruments

Big Data. 2017 Jun;5(2):153-163. doi: 10.1089/big.2016.0047.

Abstract

Recidivism prediction instruments (RPIs) provide decision-makers with an assessment of the likelihood that a criminal defendant will reoffend in the future. Although such instruments are gaining popularity across the country, their use attracts tremendous controversy. Much of the controversy concerns potential discriminatory bias in the risk assessments that are produced. This article discusses several fairness criteria that have recently been applied to assess the fairness of RPIs. We demonstrate that these criteria cannot all be simultaneously satisfied when recidivism prevalence differs across groups. We then show how disparate impact can arise when an RPI fails to satisfy the criterion of error rate balance.
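The incompatibility the abstract describes can be illustrated numerically. The sketch below (not taken from the article; the groups and rates are hypothetical) uses the standard identity relating false-positive rate, prevalence, positive predictive value (PPV), and false-negative rate, FPR = p/(1-p) · (1-PPV)/PPV · (1-FNR), to show that if two groups share the same PPV and FNR but differ in recidivism prevalence, their FPRs are forced apart:

```python
def fpr_from(prevalence, ppv, fnr):
    """False-positive rate implied by prevalence p, PPV, and FNR:
    FPR = p/(1-p) * (1-PPV)/PPV * (1-FNR).

    Derived from PPV = TP/(TP+FP) with TP = p*(1-FNR) and
    FP = (1-p)*FPR at the population level.
    """
    return prevalence / (1 - prevalence) * (1 - ppv) / ppv * (1 - fnr)

# Two hypothetical groups: identical PPV (0.6) and FNR (0.3),
# but different base rates of reoffending.
fpr_high = fpr_from(prevalence=0.5, ppv=0.6, fnr=0.3)
fpr_low = fpr_from(prevalence=0.3, ppv=0.6, fnr=0.3)

# The higher-prevalence group necessarily has a higher FPR,
# so error rate balance fails despite equal predictive values.
print(fpr_high > fpr_low)  # → True
```

Because prevalence enters the identity multiplicatively, holding PPV and FNR fixed while varying the base rate always moves the FPR, which is the arithmetic core of the impossibility result.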

Keywords: bias; disparate impact; fair machine learning; recidivism prediction; risk assessment.

MeSH terms

  • Decision Making*
  • Empirical Research
  • Humans
  • Models, Theoretical
  • Risk Assessment