Analysis of racial bias in Northpointe's COMPAS algorithm
This study will evaluate the effectiveness and validity of an algorithm that is widely used in the criminal justice system. In 2000, Northpointe introduced a risk assessment algorithm called Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) in an attempt to streamline the pretrial and sentencing stages of the process. COMPAS computes risk scores for defendants, and judges use these scores when deciding jail terms, sentencing, and probation. Some argue that such algorithms are more accurate and less biased than human decision-makers, but COMPAS has been widely criticized for perpetuating the systemic racial bias already present in the criminal justice system. This analysis will test the logical bases of the algorithm and its predictive ability; the results cast doubt on the usefulness of COMPAS if fairness is the goal of the criminal justice system.

A logistic regression analysis will show that the two most important variables for predicting recidivism are a defendant's age and number of prior offenses. A linear model using only these two predictors will be shown to yield better predictive accuracy than COMPAS. The analysis will also show that COMPAS scores were unevenly distributed across racial groups. In general, African Americans were more likely than Caucasians to be given higher scores regardless of their recidivism rates, and, among defendants assigned lower scores, African Americans were more likely than Caucasians not to recidivate.
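The two-predictor logistic regression described above can be sketched as follows. This is a minimal illustration on synthetic data, not the study's actual analysis: the variable names, coefficients, and data-generating process are assumptions chosen only to mimic the stated relationship (younger defendants with more priors recidivating more often); the real analysis would be run on actual defendant records.

```python
# Sketch of a two-predictor logistic regression for recidivism.
# All data here is synthetic and illustrative, not real COMPAS data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
age = rng.integers(18, 70, size=n)        # hypothetical defendant ages
priors = rng.poisson(2.0, size=n)         # hypothetical prior-offense counts

# Synthetic ground truth: probability of recidivism decreases with age
# and increases with prior offenses (signs chosen to mirror the text).
logit = -0.06 * (age - 35) + 0.35 * priors - 0.5
p = 1.0 / (1.0 + np.exp(-logit))
recid = rng.binomial(1, p)

X = np.column_stack([age, priors])
X_train, X_test, y_train, y_test = train_test_split(
    X, recid, test_size=0.3, random_state=0)

# Fit the simple two-variable model and check held-out accuracy.
model = LogisticRegression().fit(X_train, y_train)
acc = model.score(X_test, y_test)
print(f"held-out accuracy: {acc:.3f}")
print(f"age coefficient:    {model.coef_[0][0]:.3f}")
print(f"priors coefficient: {model.coef_[0][1]:.3f}")
```

On data generated this way, the fitted coefficients recover the assumed signs (negative for age, positive for priors), which is the shape of the relationship the analysis attributes to the real data.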