From the President: The Siren Song of Objectivity: Risk Assessment Tools and Racial Disparity

Promoted as a method to combat inherent bias in the criminal justice system, risk assessment instruments are being adopted around the country at the pretrial and parole phases. While advocates claim that risk assessment algorithms can reduce the impact of bias in the criminal justice system, opponents caution otherwise.


In 2013, both 20-year-old Dylan Fugett and 21-year-old Bernard Parker lived in Broward County, Florida.1 Both had previous run-ins with the law: police charged Fugett in 2010 with felony attempted burglary and Parker in 2011 with misdemeanor resisting arrest without violence.2 Their similar paths continued when, in January 2013, police arrested Parker for felony drug possession with intent to sell.3 A month later, police picked up Fugett for felony cocaine possession and two misdemeanors for possession of marijuana and drug paraphernalia.4 But after both spent a night in jail, the similarities ended. Using a risk assessment algorithm, Broward County scored Fugett a 3 (low risk) and Parker a 10 (high risk). The only discernible difference found by the ProPublica researchers who studied Florida's risk assessment tool was the men's race: Fugett is white and Parker is Black.5

Promoted as a method to combat inherent bias in the criminal justice system, risk assessment instruments (RAIs) like the one used in Broward County are quickly being adopted around the country at both the pretrial and parole phases. In its most basic pretrial form, an RAI is a series of questions designed to predict the likelihood of a client returning for a future court date and/or re-offending. Theoretically, RAIs help the criminal justice system fulfill its 14th Amendment due process mandate by reducing overly burdensome bail requirements and lengthy periods of pretrial detention. Indeed, supporters point to several examples of RAIs moving the system closer to a presumption of release while improving court appearance rates.6 Before New Jersey switched from money bail to risk assessment in early 2017, 75 percent of the population in New Jersey county jails was awaiting a court date.7 Twelve percent were there because they could not afford bail of less than $2,000.8 The Public Safety Assessment, developed by the Laura and John Arnold Foundation and implemented statewide through New Jersey's Administrative Office of the Courts, was built from a database of 1.5 million cases drawn from 300 jurisdictions across the nation and focuses on three outcomes: (1) failure to appear in court, (2) commission of a new crime and (3) commission of a new violent crime.9 It arrives at a probability for these outcomes through an analysis of the following factors (a simplified scoring sketch follows the list):

  • the person’s age at the current arrest;
  • whether the current offense is violent;
  • pending charges at the time of the offense;
  • prior misdemeanor convictions;
  • prior felony convictions;
  • whether those prior convictions were for violent crimes;
  • prior failures to appear within the past two years;
  • prior failures to appear older than two years; and
  • prior incarceration sentences.10 
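
To make the mechanics concrete, the following is a minimal sketch of how a point-based pretrial tool might combine factors like these into a score. The weights, caps and 1-6 scale are hypothetical choices made purely for illustration; they are not the actual Public Safety Assessment formula, which this column does not reproduce.

```python
# Hypothetical sketch of a point-based pretrial risk score built from
# PSA-style factors. All weights, caps and cutoffs below are invented for
# illustration; they are NOT the actual Public Safety Assessment formula.

from dataclasses import dataclass


@dataclass
class Defendant:
    age_at_arrest: int
    current_offense_violent: bool
    pending_charge_at_offense: bool
    prior_misdemeanors: int
    prior_felonies: int
    prior_violent_convictions: int
    ftas_last_two_years: int        # failures to appear within the past two years
    ftas_older_than_two_years: int  # failures to appear older than two years
    prior_incarcerations: int


def failure_to_appear_points(d: Defendant) -> int:
    """Toy point total for the risk of missing a court date (hypothetical weights)."""
    points = 0
    points += 1 if d.pending_charge_at_offense else 0
    points += min(d.prior_misdemeanors + d.prior_felonies, 2)  # cap points from convictions
    points += min(2 * d.ftas_last_two_years, 4)                # recent failures weigh more
    points += min(d.ftas_older_than_two_years, 1)
    return points


def scaled_score(points: int, max_points: int = 8) -> int:
    """Map raw points onto a 1-6 scale, the kind of range many pretrial tools report."""
    return 1 + round(5 * min(points, max_points) / max_points)


if __name__ == "__main__":
    client = Defendant(age_at_arrest=22, current_offense_violent=False,
                       pending_charge_at_offense=True, prior_misdemeanors=1,
                       prior_felonies=0, prior_violent_convictions=0,
                       ftas_last_two_years=1, ftas_older_than_two_years=0,
                       prior_incarcerations=0)
    raw = failure_to_appear_points(client)
    print(f"raw points: {raw}, scaled 1-6 score: {scaled_score(raw)}")
```

The point of the sketch is that the number a judge ultimately sees is several steps removed from the underlying answers, which is part of what gives these tools their appearance of objectivity.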

Coupled with a presumption of release, the new program produced 3,382 cases in its first month, only three of which resulted in bail being set.11 In a state where 1 in 8 inmates was awaiting trial behind bars for inability to afford $2,500 bail, the New Jersey program merits continued scrutiny and consideration.12

Not all outcomes show the promise of New Jersey, however. Baltimore implemented the most recent version of its risk assessment tool in 2010.13 Two years later, a report by the Justice Policy Institute found lackluster results: 57 percent of the population in the Baltimore City jail were there because they were not offered bail on one or more of their charges.14 Advocates for Baltimore's risk assessment tool claim that questions posed to clients about drug use can be helpful in putting them in touch with social services. Accounts of how such data is gathered, however, suggest that this seemingly beneficial purpose is having a negative effect. Clients report that, without a lawyer present, they had no idea that those same answers could factor into their risk scores and thus affect their bail and detention.15

On the other side of the ledger are those who are offered bail. A study by the Maryland Office of the Public Defender found that, for those who manage to raise the funds, the price is steep. Between 2011 and 2015, more than $250 million was paid in non-refundable corporate bond premiums across Maryland’s 18 district court jurisdictions.16 Making matters worse, bail is most often extracted from those least able to pay it. Of the top 15 zip codes ranked by premium payout, every one had a poverty rate above the state average.17 

Problems like these that accompany money bail are well-documented and finally receiving the attention they deserve. RAIs are billed as one way to break the reliance on money bail and make pretrial decision-making more objective. For clients from historically disadvantaged groups – including communities of color – such seemingly objective factors would appear to be welcome guidance for their cases. The Juvenile Detention Alternatives Initiative (JDAI), an effort by the Annie E. Casey Foundation, for example, uses data collection and risk assessment alongside other core strategies – objective admissions criteria, detention alternatives and expedited case processing – in an effort to reduce the reliance on incarceration for youth.18 Since starting more than 20 years ago, JDAI sites (more than 250 counties across the country, representing 29 percent of the total youth population in the United States) have reduced the average daily population of detained youth of color by 40 percent.19 Nevertheless, Black youth are still over five times as likely as their white peers to be detained or committed, and Latino youth remain 65 percent more likely than their white peers to be detained.20

While advocates claim that risk assessment algorithms can reduce the impact of bias in the criminal justice system, opponents caution otherwise. The same bias that can impact a judge's or prosecutor's view of a client can also infiltrate the creation of an algorithm. And much like predictive policing, the appearance of objectivity in a scientific tool can make hidden bias even harder to combat. In its 2016 study, ProPublica examined the risk scores of 7,000 people arrested in Broward County, Florida, between 2013 and 2014.21 The study found that the COMPAS score (the Correctional Offender Management Profiling for Alternative Sanctions tool developed by Northpointe) was inaccurate in predicting the likelihood of future violent crime, to the tune of an 80 percent failure rate.22 More troubling, however, was how the tool failed. It was worse for Black clients (a sketch of the underlying error-rate arithmetic follows the list):

  • Black defendants were often predicted to be at a higher risk of recidivism than they were. Black defendants who did not recidivate over a two-year period were nearly twice as likely to be misclassified as higher risk compared to their white counterparts (45 percent versus 23 percent).
  • White defendants were often predicted to be less risky than they were. White defendants who re-offended within the next two years were mistakenly labeled low risk almost twice as often as Black re-offenders (48 percent versus 28 percent).
  • Even when controlling for prior crimes, future recidivism, age, and gender, Black defendants were 45 percent more likely to be assigned higher risk scores than white defendants.23
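
The disparity described above is a difference in error rates – who gets wrongly flagged and who gets wrongly cleared – rather than in overall accuracy, a distinction at the heart of the dispute discussed below. The following minimal sketch shows how such group-wise false positive and false negative rates are computed from labeled outcomes; the dozen records in it are fabricated for illustration and are not drawn from ProPublica's data.

```python
# Illustrative calculation of group-wise error rates for a binary risk label.
# The records below are fabricated for demonstration; they are not ProPublica's data.

from collections import defaultdict

# Each record: (group, predicted_high_risk, actually_reoffended_within_two_years)
records = [
    ("group A", True,  False), ("group A", True,  True),  ("group A", False, False),
    ("group A", True,  False), ("group A", False, True),  ("group A", True,  True),
    ("group B", False, False), ("group B", False, True),  ("group B", True,  True),
    ("group B", False, False), ("group B", False, True),  ("group B", True,  False),
]

counts = defaultdict(lambda: {"fp": 0, "fn": 0, "neg": 0, "pos": 0})
for group, predicted_high, reoffended in records:
    c = counts[group]
    if reoffended:
        c["pos"] += 1
        if not predicted_high:
            c["fn"] += 1          # labeled low risk but did reoffend
    else:
        c["neg"] += 1
        if predicted_high:
            c["fp"] += 1          # labeled high risk but did not reoffend

for group, c in counts.items():
    fpr = c["fp"] / c["neg"]      # share of non-recidivists flagged high risk
    fnr = c["fn"] / c["pos"]      # share of recidivists labeled low risk
    print(f"{group}: false positive rate {fpr:.0%}, false negative rate {fnr:.0%}")
```

With these invented records, group A's non-recidivists are flagged high risk twice as often as group B's (67 percent versus 33 percent), while group B's recidivists are twice as likely to be labeled low risk – the same shape of disparity ProPublica reported.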

COMPAS is one of the more widely used tools in the country, but the method Northpointe uses to weight the factors in its survey is proprietary, making it hard to discern whether and how race influences a person's score. Even if questions about education level or whether a parent was sent to prison are not outwardly racial in nature, critics contend that they are in fact stand-ins for race. In Risk as a Proxy for Race, Bernard Harcourt describes how the risk assessment trend of narrowing the number of factors considered, along with a focus on prior criminal history, has had a particularly profound impact on African Americans.24 In effect, by pulling data from a system that already disproportionately incarcerates African Americans, risk assessment tools amplify and solidify that imbalance.

Northpointe responded to ProPublica's critique by pointing to a different measure of fairness: COMPAS scores are similarly accurate for Black and white defendants (about 60 percent).25 Stated otherwise, a Black person with a COMPAS score of 7 and a white person with a COMPAS score of 7 both have close to a 60 percent chance of reoffending.26 ProPublica, however, viewed fairness differently, pointing to the statistics above showing that Black defendants who did not later reoffend were far more likely to have been scored as high risk. In a follow-up to the dispute, several independent academics evaluated both claims and came to the telling conclusion that, when two groups reoffend at different underlying rates, both versions of fairness cannot mathematically be satisfied at the same time.27 Under Northpointe's version, however, Black defendants classified as high risk who never go on to reoffend are still subject to heightened bail and scrutiny by the system.
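
The impossibility the academics identified can be seen with simple arithmetic. In the sketch below, two hypothetical groups are scored with identical precision among those flagged high risk – Northpointe's notion of fairness – yet, because their underlying re-offense rates differ, their false positive and false negative rates diverge sharply. The numbers are invented for illustration and are not COMPAS figures.

```python
# Arithmetic illustration (hypothetical numbers, not COMPAS data) that a score
# calibrated equally for two groups with different re-offense rates cannot also
# equalize false positive and false negative rates.

def error_rates(n_people, n_reoffend, n_flagged, flagged_precision):
    """Return (false positive rate, false negative rate) for one group.

    flagged_precision: share of people flagged 'high risk' who do reoffend.
    """
    true_pos = round(n_flagged * flagged_precision)
    false_pos = n_flagged - true_pos
    false_neg = n_reoffend - true_pos
    fpr = false_pos / (n_people - n_reoffend)   # non-recidivists wrongly flagged
    fnr = false_neg / n_reoffend                # recidivists wrongly cleared
    return fpr, fnr

# Both groups get the same 60 percent precision among those flagged high risk,
# but their underlying re-offense rates differ (50 percent versus 30 percent).
group_a = error_rates(n_people=1000, n_reoffend=500, n_flagged=500, flagged_precision=0.6)
group_b = error_rates(n_people=1000, n_reoffend=300, n_flagged=200, flagged_precision=0.6)

print(f"group A: FPR {group_a[0]:.0%}, FNR {group_a[1]:.0%}")   # FPR 40%, FNR 40%
print(f"group B: FPR {group_b[0]:.0%}, FNR {group_b[1]:.0%}")   # FPR 11%, FNR 60%
```

Run as written, the sketch prints a 40 percent false positive rate for the higher-base-rate group against roughly 11 percent for the other, even though both groups' high-risk labels are correct 60 percent of the time.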

Beyond the pretrial stage of a case, RAIs are also used at sentencing. In State v. Loomis, Eric Loomis challenged the use of a COMPAS assessment at the sentencing stage.28 Because the inner workings of the COMPAS assessment are considered a trade secret – and unavailable for scrutiny – Mr. Loomis asserted a due process claim, arguing that its use violated both his right to an individualized sentence and his right to a sentence based on accurate information.29 Ruling against Mr. Loomis, the court relied on the fact that COMPAS pulls publicly available data, finding that the factors going into an assessment remained available for challenge. Furthermore, Justice Bradley of the Wisconsin Supreme Court distinguished between a sentence based solely on a risk assessment score and one that merely takes the score into consideration. In describing the appropriate process for the latter, Justice Bradley also prescribed several warnings to accompany the use of a risk assessment.30 Although one of the prescribed warnings is an acknowledgement of studies questioning whether the scores disproportionately classify minority defendants as higher risk, critics contend that such warnings, when weighed against the gloss of objectivity provided by data, will amount to little true scrutiny of these tools.31

As I wrote in an earlier column,32 arguably the most influential officials in a case are the prosecutors who decide whether someone arrested will face charges, what those charges will be and the punishment they will face. Given the tremendous discretion exercised by prosecutors and what we know about implicit bias, real and substantive efforts to hold them accountable and check their authority are good policy. But if those efforts include a risk assessment model that simply repackages racial disparity into a seemingly objective score, we are worse off than when we started.

Conversely, defense attorneys are best-positioned to push back against the indiscriminate use of RAIs. We must take every opportunity to remind judges, policymakers and society at large that these tools are, at best, merely stop-gap measures and, at worst, a disingenuous approach to eliminating disparity. The real road to overhauling and reforming the system necessitates the determined will to change our society at a broader, more fundamental level. The half measure of RAIs will, ultimately, not repair the racism tearing at our system.

Notes

  1. Julia Angwin et al., What Algorithmic Injustice Looks Like in Real Life, ProPublica, May 25, 2016, https://www.propublica.org/article/what-algorithmic-injustice-looks-like-in-real-life.
  2. Id. 
  3. Id. 
  4. Id. 
  5. Id. 
  6. Press Release, Laura and John Arnold Foundation, New Data: Pretrial Risk Assessment Tool Works to Reduce Crime, Increase Court Appearances, Aug. 8, 2016, http://www.arnoldfoundation.org/new-data-pretrial-risk-assessment-tool-works-reduce-crime-increase-court-appearances.
  7. Joel Rose, New Jersey Banking on Shift from Bail Money to Risk Assessment, NPR, Dec. 27, 2016, http://www.npr.org/2016/12/27/507049538/new-jersey-banking-on-shift-from-bail-money-to-risk-assessment.
  8. Id. 
  9. Ephrat Livni, In the US, Some Criminal Court Judges Now Use Algorithms to Guide Decisions on Bail, Quartz, Feb. 28, 2017, https://qz.com/920196/criminal-court-judges-in-new-jersey-now-use-algorithms-to-guide-decisions-on-bail.
  10. Issie Lapowsky, One State's Bail Reform Exposes the Promise and Pitfalls of Tech-Driven Justice, Wired, Sept. 5, 2017, https://www.wired.com/story/bail-reform-tech-justice.
  11. Lisa Foderaro, New Jersey Alters Its Bail System and Upends Legal Landscape, N.Y. Times, Feb. 6, 2017, https://www.nytimes.com/2017/02/06/nyregion/new-jersey-bail-system.html; see also Michaelangelo Conte, Bail Reform Assuming Nearly All Defendants Be Released Takes Effect, NJ.com, Jan. 2, 2017, http://www.nj.com/hudson/index.ssf/2017/01/bail_reform_assuming_nearly_all_defendants_be_rele.html.
  12. Id. 
  13. George Joseph, Justice by Algorithm, Citylab, Dec. 8, 2016, https://www.citylab.com/equity/2016/12/justice-by-algorithm/505514.
  14. Justice Policy Institute, Bailing on Baltimore: Voices from the Front Lines of the Justice System (2012), http://www.justicepolicy.org/uploads/justicepolicy/documents/bailingonbaltimore-final.pdf.
  15. Joseph, supra note 13.
  16. Maryland Office of the Public Defender, The High Cost of Bail: How Maryland’s Reliance on Money Bail Jails the Poor and Costs the Community Millions, Nov. 2016, http://www.opd.state.md.us/Portals/0/Downloads/High%20Cost%20of%20Bail.pdf.
  17. Id. 
  18. The Annie E. Casey Foundation, Juvenile Detention Alternatives Initiative, http://www.aecf.org/work/juvenile-justice/jdai.
  19. Richard A. Mendel, The Annie E. Casey Foundation, Juvenile Detention Alternatives Initiative: Progress Report 2014, http://www.aecf.org/m/resourcedoc/aecf-2014JDAIProgressReport-2014.pdf#page=20.
  20. The Sentencing Project, Black Disparities in Youth Incarceration, Sept. 12, 2017, http://www.sentencingproject.org/publications/black-disparities-youth-incarceration; and The Sentencing Project, Latino Disparities in Youth Incarceration, Oct. 21, 2017, http://www.sentencingproject.org/publications/latino-disparities-youth-incarceration.
  21. Julia Angwin et al., Machine Bias, ProPublica, May 23, 2016, https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.
  22. Id. 
  23. Jeff Larson et al., How We Analyzed the COMPAS Recidivism Algorithm, ProPublica, May 23, 2016, https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm.
  24. Bernard E. Harcourt, Risk as a Proxy for Race, John M. Olin Program in Law and Economics Working Paper No. 535, 2010, http://chicagounbound.uchicago.edu/law_and_economics/433.
  25. Julia Angwin & Jeff Larson, Bias in Criminal Risk Scores Is Mathematically Inevitable, Researchers Say, ProPublica, Dec. 30, 2016, https://www.propublica.org/article/bias-in-criminal-risk-scores-is-mathematically-inevitable-researchers-say.
  26. Sam Corbett-Davies et al., A Computer Program Used for Bail and Sentencing Decisions Was Labeled Biased Against Blacks. It’s Actually Not That Clear, Wash. Post, Oct. 17, 2016, https://www.washingtonpost.com/news/monkey-cage/wp/2016/10/17/can-an-algorithm-be-racist-our-analysis-is-more-cautious-than-propublicas/?utm_term=.7bc3156f7812.
  27. Id. 
  28. State v. Loomis, 881 N.W.2d 749 (Wis. 2016).
  29. State v. Loomis: Wisconsin Supreme Court Requires Warning Before Use of Algorithmic Risk Assessments in Sentencing, 130 Harv. L. Rev. 1530, Mar. 10, 2017, https://harvardlawreview.org/2017/03/state-v-loomis.
  30. Id. 
  31. Id. 
  32. Rick Jones, See No Evil: Prosecution and Unchecked Discretion, The Champion, January/February 2018, at 5.

About the Author

Rick Jones is the executive director and a founding member of the Neighborhood Defender Service of Harlem, which has gained national and international recognition for its early-entry, holistic, client-centered, community-based, team-defense approach to public defense. He teaches the criminal defense externship and a trial practice course at Columbia Law School, serves on the faculty of the National Criminal Defense College in Macon, Georgia, and is a member of the board of the International Legal Foundation.

Rick Jones
Neighborhood Defender Service of Harlem
New York, NY
212-876-5500
www.ndsny.org
rjones@ndsny.org