Digital Commons @ Georgia Law, Scholarly Works, Faculty Scholarship, 1-1-2019
Sandra Mayson, Assistant Professor of Law, University of Georgia School of Law
University of Georgia School of Law Research Paper Series, Paper No. 2018-35
Repository Citation: Sandra G. Mayson, Bias In, Bias Out, 128 Yale L.J. 2218 (2019), available at https://digitalcommons.law.uga.edu/fac_artchop/1293
SANDRA G. MAYSON

Bias In, Bias Out

abstract. Police, prosecutors, judges, and other criminal justice actors increasingly use algorithmic risk assessment to estimate the likelihood that a person will commit future crime. As many scholars have noted, these algorithms tend to have disparate racial impacts. In response, critics advocate three strategies of resistance: (1) the exclusion of input factors that correlate closely with race; (2) adjustments to algorithmic design to equalize predictions across racial lines; and (3) rejection of algorithmic methods altogether. This Article's central claim is that these strategies are at best superficial and at worst counterproductive because the source of racial inequality in risk assessment lies neither in the input data, nor in a particular algorithm, nor in algorithmic methodology per se. The deep problem is the nature of prediction itself. All prediction looks to the past to make guesses about future events.