For many years, criminal courts throughout the United States have struggled with a critically important issue: how do you fairly determine whether a particular defendant is a good risk for bail, probation, or a light prison sentence? Many jurisdictions today turn to predictive risk assessment software to help guide such decisions. There’s a catch, however — the algorithms that translate input data into risk assessment scores are proprietary software, and neither courts nor lawyers know just how such scores are generated.
Obvious issues of fairness in sentencing and due process lurk behind the use of proprietary software in bail and sentencing decisions, but for the moment, they will remain matters for academic debate. On June 26, the U.S. Supreme Court denied a petition for further review in Loomis v. Wisconsin, where a man who pleaded guilty to lesser charges of fleeing an officer and operating a vehicle without authorization received a six-year sentence for his role behind the wheel in a drive-by shooting that resulted in no injuries.
The prosecution alleged that Loomis drove the car during the shooting, while Loomis maintained that he drove the vehicle only afterward. Either way, the shooting charge was dropped as part of the plea negotiation, but all five of the original charges, including weapons offenses, remained in the record at the time of sentencing. Loomis was also identified as a high-risk offender by Northpointe Inc.’s COMPAS risk assessment software.
Loomis unsuccessfully challenged his sentence all the way to the Wisconsin Supreme Court, which upheld the trial court’s decision even though neither the sentencing judge, the prosecution, nor the defense knew how the proprietary risk assessment software processed the answers to a 137-question survey completed by corrections officers and by Loomis himself.
The basis of the appeal was denial of due process: because the risk assessment tool was proprietary, it could not be examined or challenged for scientific validity. While the Wisconsin Supreme Court recognized that a criminal defendant has the right to be sentenced on the basis of accurate information, it also observed that the same sentence would have been imposed without the risk scores. Even so, both the prosecution and the sentencing court referred repeatedly to the COMPAS risk assessment during the proceedings.
Wisconsin’s high court implicitly found that the sentencing judge correctly followed the instructions that accompanied the risk assessment tool, including this caveat: “It is very important to remember that risk scores are not intended to determine the severity of the sentence or whether an offender should be incarcerated.”
The sentencing judge pointed to several factors that supported a heavy sentence apart from the high-risk scores that the COMPAS tool assigned Loomis. The state supreme court therefore held that because other information in the record was sufficient to support the sentence, the use of the proprietary software to characterize the defendant’s risk was not a violation of his due process rights.
Eric Loomis may not have been the most sympathetic felon to challenge the use of proprietary risk assessment software in sentencing decisions. He was a registered sex offender with a long rap sheet, including four arrests while on probation. The broader point, however, is that closely guarded algorithms can determine the fate of criminal defendants, and that issue continues to generate vigorous legal and public policy debate around the country.
In Virginia, one version of nonproprietary risk assessment software has been found to reduce incarceration at no cost to public safety, with fewer defendants being sent to prison after conviction. Elsewhere, a controversial study of the COMPAS software in Broward County, Florida concluded that it understated the reoffense risk of white defendants while overstating that of African-Americans, even though no racial data is used in generating the risk assessment scores. Subsequent research has challenged the findings and methods of the Florida study.
As the dependence of American society on Big Data continues to deepen, the use of proprietary software in court decisions concerning bail, incarceration, or release can be expected to increase. What remains in question is the willingness of policymakers and courts to make sure that such decisions are made without excessive reliance on computer systems that lack transparency or external validation.