The Justice Department created an algorithm to measure a person's risk of committing a new crime after leaving prison. But even after multiple tweaks, the tool is leading to racial disparities.

Transcript

STEVE INSKEEP, HOST:

Thousands of people are leaving federal prison this month because of a law called the First Step Act. President Trump signed this bipartisan measure back in 2018; it's designed in part to reduce the federal prison population. Leaders of both parties agreed that too many Americans are in prison. The Justice Department is using computers to determine who gets a shot at early release. But the algorithm appears to give biased results, treating people of different races differently.

NPR's Carrie Johnson joins us now.

Carrie, good morning.

CARRIE JOHNSON, BYLINE: Good morning, Steve.

INSKEEP: What are you finding here?

JOHNSON: Well, there are persistent racial disparities and other problems with how the First Step Act is working. Remember, this was supposed to create a way for people to leave prison early if they take life skills classes to help...

INSKEEP: Yeah.

JOHNSON: ...Prepare for their release. The key is that they have to be considered a low or a minimum risk of a return to crime to be eligible for those programs. And the law says the prison system should decide that central question based on a new algorithm called Pattern.

Here's how David Patton, the top federal public defender in New York, described the issue to Congress.

(SOUNDBITE OF ARCHIVED RECORDING)

DAVID PATTON: That score that people receive will directly impact how much time they spend in prison. It is vital.

JOHNSON: These kinds of risk tools are common in the criminal justice systems of many states. But Pattern marks the first time the federal government has used an algorithm with stakes this high.

Jim Felman is a lawyer in Tampa, Fla. He's been following the First Step Act for years now.

JIM FELMAN: The significance of this risk assessment tool - it divides all federal prisoners, essentially, into two groups - people who can get credit for doing this programming and get out early and people who can't.

JOHNSON: The implementation has been rocky. The Justice Department finished the first version of Pattern in a rush because of a tight deadline from Congress. The DOJ said that tool suffered from math errors and human errors, so it made some tweaks. About 14,000 men and women in federal prison still wound up in the wrong risk categories. And there were big disparities for people of color.

Aamra Ahmad is senior policy counsel at the ACLU.

AAMRA AHMAD: From the beginning, civil rights groups cautioned Congress and the Justice Department that use of a risk assessment tool to make these determinations would lead to racial disparities.

JOHNSON: Authorities have corrected some of the sloppy mistakes, but those racial disparities persist. Ahmad says they're clear in the Justice Department's own data, released before Christmas.

AHMAD: The Justice Department found that only 7% of Black people in the sample were classified as minimum-level risk, compared to 21% of white people. This indicator alone should give the Department of Justice great pause in moving forward.

JOHNSON: Pattern overpredicted the risk that Black, Hispanic and Asian people in prison would commit new crimes or violate rules, but it underpredicted the risk for some inmates of color when it came to possible return to violent crime.
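To make the over- and underprediction concrete: one standard way to check a risk tool for this kind of bias is to compare, within each demographic group, the average risk the tool predicted against the rate at which the outcome actually occurred. The Python sketch below illustrates the idea on invented records; it is not Pattern's methodology and these are not DOJ data.

```python
# Hypothetical illustration of a calibration-by-group check. The records
# are invented for the sketch; they are not real predictions or outcomes.

from collections import defaultdict

# Each record: (group, predicted probability of recidivism, actually recidivated)
records = [
    ("A", 0.60, False), ("A", 0.55, False), ("A", 0.70, True),
    ("B", 0.30, True),  ("B", 0.25, False), ("B", 0.40, True),
]

# group -> [sum of predictions, count of actual outcomes, number of people]
totals = defaultdict(lambda: [0.0, 0, 0])
for group, predicted, outcome in records:
    t = totals[group]
    t[0] += predicted
    t[1] += int(outcome)
    t[2] += 1

for group, (pred_sum, outcomes, n) in totals.items():
    mean_pred = pred_sum / n
    observed = outcomes / n
    gap = mean_pred - observed  # positive = overprediction, negative = underprediction
    print(f"group {group}: predicted {mean_pred:.2f}, observed {observed:.2f}, gap {gap:+.2f}")
```

A positive gap means the tool overpredicted risk for that group; a negative gap means it underpredicted, mirroring the two failure modes described above.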

Patricia Richman works on national policy issues for the Federal Public and Community Defenders.

PATRICIA RICHMAN: When you use a term like risk assessment tool, it has this patina of science. It sounds highly technical. But it's not. A risk assessment tool is just a series of policy decisions.

JOHNSON: Policy decisions like what should count and how much - take criminal history. That can be a problem because law enforcement has a history of overpolicing some communities of color. Then there is education level and whether someone paid restitution to their victims. Those factors can intersect with race and ethnicity, too.
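Richman's point, that a risk tool is a bundle of policy decisions, can be seen in miniature in a point-based score. The sketch below is hypothetical: the factors, weights, and cut-off are invented for illustration and are not the actual Pattern formula. Every constant in it is a policy choice someone made.

```python
# A minimal sketch of the kind of point-based instrument described here.
# Factors, weights, and the cut-off are hypothetical; this is NOT Pattern.

WEIGHTS = {
    "prior_convictions": 4,        # policy choice: how heavily does criminal history count?
    "age_under_30": 3,             # policy choice: does youth add points?
    "no_high_school_diploma": 2,   # education level can intersect with race
    "restitution_unpaid": 2,       # so can the ability to pay restitution
}

MINIMUM_RISK_CUTOFF = 5  # policy choice: where to draw the line

def risk_score(person: dict) -> int:
    """Sum the weights for every factor that applies to this person."""
    return sum(w for factor, w in WEIGHTS.items() if person.get(factor))

def category(person: dict) -> str:
    """Map the score to a category using the chosen cut-off."""
    return "minimum" if risk_score(person) <= MINIMUM_RISK_CUTOFF else "higher"

person = {"prior_convictions": True, "no_high_school_diploma": True}
print(risk_score(person), category(person))  # 6, "higher" under these choices
```

Change any weight or move the cut-off and the same person can land in a different category, which is exactly why critics call these choices policy rather than science.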

Melissa Hamilton is a professor of law and criminal justice at the University of Surrey. Hamilton studies risk assessments. She says she's glad to see the Justice Department has tweaked the algorithm, but she says there's still a ways to go to make it work.

MELISSA HAMILTON: The legislation, I think, came from a good place for most of the congresspersons who voted for it. It's just - the rule of unintended consequences is not really realizing the impediments it was going to have.

JOHNSON: The Justice Department says it's aware of the problems with Pattern, and it's working with experts to make the risk assessment tool more fair and more accurate. Another overhaul of the tool is underway. Then Justice has to reevaluate the 14,000 people in prison it says got lumped into the wrong category.

Sasha Costanza-Chock is director of research and design for the Algorithmic Justice League, which studies the social implications of artificial intelligence.

SASHA COSTANZA-CHOCK: This is just one example of the ways that harmful AI systems are being rolled out in everything from, you know, the criminal legal system to employment decisions to who gets access to housing and social benefits.

JOHNSON: Costanza-Chock says the burden is on the Justice Department to prove the Pattern tool doesn't have racist and sexist outcomes.

COSTANZA-CHOCK: Especially when the systems are high risk and affect people's liberty - we need much clearer and stronger oversight.

JOHNSON: Jim Felman is the Florida lawyer working with the American Bar Association to monitor the First Step Act. He worries the tool is already putting many prisoners of color at a disadvantage.

FELMAN: We will start to see more prisoners get out early. My concern is that the color of their skin will not be reflective of fairness.

JOHNSON: Aamra Ahmad from the ACLU says she's seen enough.

AHMAD: There are no technical fixes to these problems that could make Pattern and similar tools safe and fair to use. We would urge the Justice Department to suspend the use of Pattern until it can adequately address these concerns.

JOHNSON: Melissa Hamilton, who studies risk assessment, says Pattern may be worth saving. Consider the alternative, Hamilton says - decisions made by people who have all kinds of biases.

HAMILTON: So that's the unfortunate thing - is it's better than gut instinct of very flawed humans that we all are. And can we improve it more than marginally? And that's what we're working on.

INSKEEP: NPR's justice correspondent Carrie Johnson is still with us. And, Carrie, what is the Justice Department saying about what you found?

JOHNSON: DOJ didn't want to talk on tape about this, but they sent a written statement. Attorney General Merrick Garland has directed people in the department to try to address some racial bias in this tool and also to make the process more transparent. One option on the table, Steve, is to adjust the cut-off points between these risk categories. It's technical but important. It would allow more prisoners to take programs, earn credits for release and eventually get released early. Justice says it wants to keep public safety in mind here, too. It's highly technical stuff. It could take a while to finish. And remember, there are still 14,000 men and women in prison who need to be reevaluated 'cause they got lumped into the wrong category in the first place. Even if DOJ moves ahead, it's not clear they could find a way to eliminate all the racial bias here. That's why some advocates want to see Justice and Congress just drop this algorithm altogether.
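For readers curious what adjusting the cut-off points means mechanically: the underlying scores don't change, only the boundaries that map a score to a risk category do. The sketch below uses invented scores and cut-offs; it is not DOJ's actual scale.

```python
# Hypothetical sketch of the cut-off adjustment described above. Raising the
# boundaries reclassifies some people into lower categories, which can make
# them newly eligible for early-release programming. All numbers are invented.

scores = [12, 18, 25, 31, 40, 47, 55]

def classify(score: int, cutoffs: dict) -> str:
    """Map a numeric score to a risk category using the given boundaries."""
    if score <= cutoffs["minimum"]:
        return "minimum"
    if score <= cutoffs["low"]:
        return "low"
    return "medium/high"

old = {"minimum": 15, "low": 30}
new = {"minimum": 20, "low": 35}  # raising the boundaries...

for s in scores:
    before, after = classify(s, old), classify(s, new)
    if before != after:
        # ...moves these people into a lower category under the new cut-offs
        print(f"score {s}: {before} -> {after}")
```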

INSKEEP: NPR national justice correspondent Carrie Johnson - Carrie, thanks as always for your reporting.

JOHNSON: Thank you.

(SOUNDBITE OF MAKAYA MCCRAVEN'S “INNER FIGHT”)

Transcript provided by NPR, Copyright NPR.