BOISE, Idaho — Two Treasure Valley lawmakers from opposite sides of the aisle are teaming up on a bill they say will keep defendants from being unfairly labeled by a computer algorithm as likely future criminals.
House Bill 118, introduced last week by Rep. Greg Chaney, R-Caldwell, and Sen. Cherie Buckner-Webb, D-Boise, takes aim at the pretrial risk-assessment algorithms used in Idaho courtrooms to give judges an idea of how likely a person is to commit more crimes in the future.
A defendant’s risk level – whether low, medium, or high – can affect everything from how high their bond is set to whether they receive probation or prison time, and even whether they could be a candidate for early release.
But a 2016 investigation by ProPublica found that some computer programs used to determine those scores can have a racial bias, routinely scoring minority defendants as a higher risk to re-offend than white defendants.
The report, which looked at a sample of 7,000 people arrested in Florida and risk-assessed by Northpointe’s COMPAS computer algorithm, found that black defendants were incorrectly flagged as future criminals almost twice as often as whites, and that white defendants were mislabeled as low-risk – someone unlikely to commit another crime – more often than black defendants.
Even when the study controlled for other factors, isolating the effect of race from age, gender, criminal history, and recidivism, black defendants were still 45 percent more likely to be predicted to commit any crime in the future, and 77 percent more likely to be rated as a risk to commit a future violent crime.
The risk-assessment algorithms are not wrong every time. But when they are, the designation of who will go on to commit more crimes can mean defendants are unjustly handed lengthy sentences on the weight of what they might do in the future.
Rep. Chaney told KTVB Thursday that’s the issue House Bill 118 is seeking to address.
“There are basically computers out there telling judges how likely it is that you'll commit a crime,” he said. “It's all very much like the movie Minority Report, where they're punishing people for crimes they haven’t committed yet.”
In addition, Chaney said, there is little recourse for a defendant who disputes his or her assigned risk level, or wants to find out why they received a certain score.
“The frightening thing is, a lot of these computerized predictors are being put out by vendors in other states who refuse to turn over just how it is they got to the score that they did,” he said.
The bill co-sponsored by Chaney and Buckner-Webb calls for any algorithm used to conduct pretrial risk assessment to be formally tested and shown to be free of bias, and for the program to be transparent, with any documents or information relating to its scoring open to the public.
Chaney stressed that the bill will not do away with criminal risk assessment altogether or mandate that all criminal cases be approached the same way.
A defendant with ten prior convictions, for example, should “absolutely” be treated differently than a defendant with no criminal history, he said.
“But we don't need a computer to do that, and we don't need a secretive program to do that,” he said.
An individual’s criminal past is not the only factor taken into account by some of the algorithms: The program studied by ProPublica included seemingly unrelated questions, including ones about the defendant’s income level, whether his or her parents had separated, and whether friends or relatives had been arrested.
Chaney said he’s not surprised the bill is receiving bipartisan support in the Idaho Legislature.
“It’s a point at which those on the right and those on the left can find some pretty favorable common ground,” he said. “Nobody wants to see racial bias baked into our system and we need to make sure we have a criminal justice system that’s even-handed.”
The bill is expected to get a hearing in the House Judiciary, Rules and Administration Committee next week.