
Algorithms Were Supposed to Fix the Bail System. They Haven't


If you are booked into jail in New Jersey, a judge will decide whether to hold you until trial or set you free. One factor the judge must weigh: the result from an algorithm called PSA that estimates how likely you are to skip court or commit another crime.

New Jersey adopted algorithmic risk assessment in 2014 at the urging, in part, of the nonprofit Pretrial Justice Institute. The influential Baltimore organization has for years advocated use of algorithms in place of cash bail, helping them spread to most states in the nation.

Then, earlier this month, PJI suddenly reversed itself. In a statement posted online, the group said risk-assessment tools like those it previously promoted have no place in pretrial justice because they perpetuate racial inequities.

“We saw in jurisdictions that use the tools and saw jail populations decrease that they were not able to see disparities decrease, and in some cases they saw disparities increase,” says Tenille Patterson, an executive partner at PJI.

Asked to name a state where risk-assessment tools didn’t work out, she pointed to New Jersey. State figures released last year show that jail populations fell by nearly half after the changes took effect in 2017, eliminating cash bail and introducing the PSA algorithm. But the demographics of defendants stuck in jail stayed largely the same: about 50 percent black and 30 percent white.

Pete McAleer, a spokesperson for New Jersey’s Administrative Office of the Courts, said PSA alone could not be expected to eliminate centuries-old inequities and that the state’s reforms had prevented many black and Hispanic defendants from being detained. State officials are seeking ways to eliminate remaining disparities in the state’s justice system, he said.

PJI’s switch from advocate to opponent of risk-assessment algorithms reflects growing concerns about the role of algorithms in criminal justice and other arenas.

In an open letter last July, 27 prominent academics suggested pretrial risk assessments be abandoned. The researchers said the tools are often built on data that reflects racial and ethnic disparities in policing, charging, and judicial decisions. “These problems cannot be resolved with technical fixes,” the letter said.

Last month, concerns about racial bias prompted Ohio’s Supreme Court to delete from a list of proposed bail reforms a recommendation that the state adopt risk-assessment tools. The court’s recommendations will become law automatically this summer unless both chambers of the state legislature block them. In December, a Massachusetts commission rejected risk-assessment tools in that state’s report on bail reform, and cited potential racial bias. California voters will decide in November whether to repeal a new state law that would eliminate cash bail and require use of risk assessments.

Beyond bail systems, critics have cautioned about blindly trusting algorithms in areas as diverse as facial recognition, where many algorithms have higher error rates for darker skin; health care, where researchers found evidence a widely used care-management system pushed black patients to the back of the line; and online recommendations, accused of amplifying conspiracy theories and hate speech.


Risk-assessment algorithms like PSA have spread across the US as part of efforts to scrap or minimize use of cash bail to decide who gets to go home while awaiting a court date. Criminal justice reformers view bail as unfair to people from poorer, minority communities, who often are jailed for only minor charges. Rather than leave such decisions to a single judge, who might introduce personal biases, combining human judgment with mathematical formulas based on statistics from past cases was thought to be fairer. A recent report from the Philadelphia nonprofit the Media Mobilizing Project and the Oakland nonprofit MediaJustice found that such tools are used in at least parts of 46 states and the District of Columbia.

PSA is among the most widely used algorithms. It was developed by the Arnold Foundation, a philanthropic organization now called Arnold Ventures that has given grants to PJI totaling more than $500,000 since 2017.

“Many states and jurisdictions have adopted these tools, and PJI and the Arnold Foundation have been promoting it,” says Ben Winters, a researcher on data and algorithms in criminal justice at the Electronic Privacy Information Center. “Now there’s a turn towards skepticism and pulling back.”

Last week, Advancing Pretrial Policy and Research, which is funded by Arnold Ventures and distributes PSA, posted a response to PJI’s change of position, saying tools like PSA cannot alone fix pretrial justice. In a statement to WIRED, APPR’s codirectors, Madeline Carter and Alison Shames, said PSA is “one strategy in a comprehensive approach to achieving fair, just, and effective pretrial justice.”

Most risk-assessment tools are relatively simple, producing scores from a handful of data points about a person and their case. PSA uses nine factors, including a person’s age, prior convictions, and pending charges.
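To make the mechanics concrete, here is a minimal sketch of how a point-based pretrial risk score of this kind can work. The factor names loosely echo public descriptions of PSA’s inputs, but the weights, point values, and 1–6 scale below are invented for illustration; they are not the actual PSA scoring rules.

```python
# Illustrative point-based pretrial risk score.
# Factor names loosely follow public descriptions of PSA-style inputs;
# the weights and scale are hypothetical, NOT the actual PSA rules.

from dataclasses import dataclass


@dataclass
class Defendant:
    age: int
    pending_charge: bool            # another charge pending at time of arrest
    prior_misdemeanor: bool
    prior_felony: bool
    prior_violent_convictions: int
    prior_failures_to_appear: int
    prior_incarceration: bool


def raw_score(d: Defendant) -> int:
    """Sum weighted points across the risk factors (hypothetical weights)."""
    points = 0
    if d.age < 23:
        points += 2
    if d.pending_charge:
        points += 3
    if d.prior_misdemeanor or d.prior_felony:
        points += 1
    points += min(d.prior_violent_convictions, 2) * 2
    points += min(d.prior_failures_to_appear, 2) * 2
    if d.prior_incarceration:
        points += 1
    return points


def scaled_score(d: Defendant) -> int:
    """Map the raw point total onto a 1-6 scale, the kind of summary a judge sees."""
    return min(1 + raw_score(d) // 2, 6)


if __name__ == "__main__":
    example = Defendant(age=21, pending_charge=True, prior_misdemeanor=True,
                        prior_felony=False, prior_violent_convictions=0,
                        prior_failures_to_appear=1, prior_incarceration=False)
    print(scaled_score(example))  # prints 5 on the illustrative 1-6 scale
```

Even in a toy version like this, the critics’ point is visible: none of the inputs is race, yet factors such as prior convictions and past failures to appear are themselves products of earlier policing and court decisions, so the score can reproduce existing disparities.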

Evidence of how such tools perform in practice is relatively scarce, but recent studies show the results can be surprising and uneven. A study of Virginia, which adopted risk assessment in 2002, found that judges overrode the algorithm’s recommendations most of the time, and that racial disparities increased among circuits that used risk assessment the most. A study of Kentucky, another early adopter, found that after risk-assessment tools were introduced, white defendants were offered no-bail release much more often than black defendants.

Concern has also grown about the bias baked into the statistics underlying risk-scoring algorithms, which stem from the realities of American policing and criminal justice. The issue broke into the mainstream in 2016, when a ProPublica investigation described how an algorithm called COMPAS, used in Broward County, Florida, overestimated the risk of recidivism by black defendants.

Patterson of PJI says the group changed its view of algorithms in pretrial justice in part because since 2018 it has placed more emphasis on racial justice and has begun listening more to grassroots organizations. “We heard people in these communities saying these are tools of harm,” Patterson says.

Divorcing bail reform from risk-assessment algorithms may not be easy. Julian Adler, director of policy and research at the Center for Court Innovation, a nonprofit that has worked with New York State and Arnold Ventures, says the two have been closely entwined—in part thanks to the work of PJI. “It raises a lot of questions about what’s to come,” he says.

PJI says it has already begun talking with officials in places that took its previous advice about how they might follow the nonprofit in turning their back on risk-assessment algorithms. Alternatives include laying out simple, clear rules that send most defendants home and guide judges on what can justify holding someone in jail. PJI expects to publish updated, algorithm-free guidance on pretrial reform next month—likely adding momentum to the turn against risk-assessment algorithms.
