Programming a "Fairer" System: Assessing Bias in Enterprise AI Products (B)
This case is part of the Giving Voice to Values (GVV) curriculum. To see other material in the GVV curriculum, please visit http://store.darden.virginia.edu/giving-voice-to-values.

In this case, Timothy Brennan is the founder and CEO of technology company Northpointe, Inc. (Northpointe), and the creator of its flagship software program, COMPAS, an artificially intelligent software tool for US court systems that predicts a defendant's likelihood of reoffending and informs bail, parole, probation, and sentencing decisions. Brennan originally created COMPAS to standardize decision-making within the criminal justice system and to reduce the likelihood of human error or bias affecting court rulings. However, years after COMPAS's public release and widespread adoption in US courts, an investigative journalism report claims that COMPAS is more likely to mislabel Black defendants as higher risk, and White defendants as lower risk, of recidivism. Complicating the matter, any coding adjustments Northpointe might make to uncover or address the bias-causing programming could reduce the software's performance or reveal sensitive operational information to competitors. In the A case, Brennan's challenge is to organize a response that investigates bias within the COMPAS software while still protecting the complexity and intellectual property of the product. In this B case, students read a synopsis of Brennan's actual response and review its implications for Northpointe and the US criminal justice system. They are encouraged to consider how Brennan could have responded more creatively and constructively.