AI Is Now Deciding Bail for Suspected Criminals
February 8, 2018 · 4 minute read

In our previous news segments, we've discussed how artificial intelligence (AI) is affecting healthcare, education, and finance. Now we can add the law to that list. AI-powered algorithms are helping judges in courtrooms across the country make bail decisions.
A Need for Reform
Cash bail, a policy that has been part of U.S. courts since their inception, has long been the subject of heavy scrutiny. Created to ensure that defendants actually show up for trial, it has gained a reputation for letting the wealthy walk free while those who cannot afford to pay large sums are forced to remain in jail.
Studies have shown that cash bail also worsens racial disparities, so it's easy to see why a bipartisan reform effort is trying to leverage AI development to improve the system. Usually, the decision to release a defendant on bail is left to the judge presiding over the case. But algorithms can sift through legal data faster than any human and "connect the dots" to surface conclusions that inform the judge's decision.
A Paradigm Shift in Legal Proceedings
The most popular algorithm aiding judges in pre-trial proceedings right now is the Public Safety Assessment score, developed in Houston by the Laura and John Arnold Foundation. Currently, it’s being used in Alaska, Arizona, and Kentucky, among other states.
New Jersey has fully embraced algorithmic assessment, revamping its pre-trial court system around it. From the moment police take a suspect's fingerprints, the data is transmitted to Pretrial Services, a new division that manages the defendant's data processing.
AI is primarily used in pre-trial proceedings to predict the likelihood that a defendant will either skip their court date or commit another crime. Defendants receive a score for each of these two risks on a scale of one to six; the lower the score, the more eligible the defendant is for release until their court date. If released, the defendant receives text alerts reminding them of their court date.
The algorithm weighs nine factors, including previous convictions and age. It does not consider employment history, place of residence, arrest history, race, or gender. Ideally, this reduces bias and lets people stay out of jail when they pose no real threat to the public.
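To make the mechanics concrete, here is a minimal sketch of how a pre-trial risk score of this kind could be computed. The factor names and point values below are hypothetical illustrations, not the Arnold Foundation's actual formula; the article only tells us that nine factors feed scores on a one-to-six scale and that attributes like race and gender are excluded from the inputs.

```python
# Hypothetical sketch of a pre-trial risk score like the one described above.
# Factor names and point values are invented for illustration; the real
# Public Safety Assessment formula is not given in the article.

from dataclasses import dataclass

@dataclass
class Defendant:
    age_at_arrest: int              # in a fuller sketch, age would feed the
                                    # separate "new crime" risk score
    prior_convictions: int
    prior_failures_to_appear: int   # hypothetical factor
    pending_charge: bool            # hypothetical factor
    # Note: race, gender, employment, and residence never appear as inputs.

def failure_to_appear_score(d: Defendant) -> int:
    """Return a 1-6 risk score; lower means more eligible for release."""
    points = 0
    points += min(d.prior_failures_to_appear, 2) * 2   # hypothetical weighting
    points += 1 if d.pending_charge else 0
    points += 1 if d.prior_convictions > 0 else 0
    # Map raw points onto the 1-6 scale used by the assessment.
    return min(1 + points, 6)

d = Defendant(age_at_arrest=24, prior_convictions=0,
              prior_failures_to_appear=1, pending_charge=False)
print(failure_to_appear_score(d))   # -> 3
```

Note that excluded attributes simply never appear as inputs. The concern raised by critics below is that correlated factors can still act as proxies for them, which is harder to spot than an explicit input.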
Guilty by Association
The Public Safety Assessment is not to be confused with COMPAS, a controversial commercially available system that aids judges in determining prison sentences for convicted offenders. Investigations suggest that COMPAS's decision-making exhibits racial bias, and others argue that its inclusion of gender as a factor is unconstitutional.
Of course, the Arnold Foundation's algorithm doesn't need any association with COMPAS to attract its fair share of critics. Some doubters fear that judges will simply defer to the score rather than exercise their own judgment. Others worry that the tool won't just displace judges' discretion in this part of the legal system, but will ironically introduce biases that are even more difficult to recognize.
Kristian Hammond, a computer scientist at Northwestern University, thinks the solution comes back to the design of the algorithm itself: "[We should] refuse to build boxes that give you answers. What judges really need are boxes that give you answers and explanations and ask you if there's anything you want to change."
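As a rough illustration of what Hammond is describing, a tool could return the per-factor contributions alongside the score, so the judge can see why the number is what it is and question any individual input. This sketch reuses the same hypothetical factors and weights as above; nothing here is the actual assessment's logic.

```python
# Hypothetical "score plus explanation" output, per Hammond's suggestion.
# Factors and weights are invented for illustration.

def explain_score(prior_ftas: int, pending_charge: bool,
                  prior_convictions: int) -> dict:
    """Return a 1-6 score together with its per-factor breakdown."""
    contributions = {
        "prior failures to appear": min(prior_ftas, 2) * 2,
        "pending charge": int(pending_charge),
        "prior convictions": int(prior_convictions > 0),
    }
    return {"score": min(1 + sum(contributions.values()), 6),
            "explanation": contributions}

print(explain_score(prior_ftas=1, pending_charge=False, prior_convictions=0))
# {'score': 3, 'explanation': {'prior failures to appear': 2,
#  'pending charge': 0, 'prior convictions': 0}}
```

The point of the breakdown is that a judge who disagrees with one input, say, a disputed prior failure to appear, can see exactly how much it moved the score instead of confronting an opaque number.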