


How artificial intelligence is transforming the criminal justice system

Many of us are familiar with the idea that artificial intelligence systems are regularly making benign decisions, like recommendations on Netflix or Amazon. What about decisions that have a significant impact on someone's life, though? Should AI systems be used in life-changing situations, like criminal sentencing?


In fact, it already is, and has been for several years. It's about time we look at this technology and consider: what are the risks and what are the benefits of AI in criminal sentencing? Which problems does AI reinforce, and which does it alleviate?

These questions came to light for me as part of my research work, where I was contributing to an AI system in this area. What I discovered was both shocking and fascinating.

One of the most important considerations in judicial decisions, from setting or denying bail to sentencing, is "risk of recidivism": the likelihood of reoffending. Before algorithms were used to make these decisions (a practice that has grown over the last 30 years), assessments of recidivism and flight risk were left to the subjective judgment and gut instincts of individual judges.

All human decision-making is susceptible to bias, and therefore, despite the best of intentions, these judgments were no exception.

How has the introduction of AI impacted this process? To understand this, first let's look at the "traditional" process, without AI.

The story before AI

Imagine that you have been arrested and that you are suspected of being involved in an armed robbery. You are innocent, but you bear a resemblance to the suspect, live in the area, and your car matches witness descriptions of a vehicle involved.


After arrest and booking, you'll appear in front of a judge who will determine the conditions of your bail. Traditionally, the judge might consider relevant information such as whether or not you're believed to be a flight risk, as well as the severity and nature of the crime. However, human aspects come into play as well. If it's early in the morning, or just after a scheduled break, the judge is more likely to rule in your favor.

Let's imagine it's just before lunch and this unconscious inequity befalls you. The judge sets your bail at an amount you can't afford, and you will spend the time between now and your trial in jail. This is a common scenario: 82% of defendants who spend pre-trial time in jail are there because they can't afford to post bail. Here's one of the crucial issues of criminal justice: although you haven't been found guilty of a crime, you have found yourself caught up in the criminal justice system, and the impacts on your life are already manifesting themselves.

The average time between arrest and conviction is six months. It's fairly likely that you will lose your job. It's also possible you could lose custody of your children, or get behind on bills, damaging your credit rating; you could even lose your home. This impact is especially devastating for people who are already struggling to make ends meet.

Further, although you're innocent, you've now spent half a year locked up with violent and other high-risk individuals. By learning how to survive on the inside, you now have a substantially higher risk of reoffending on the outside.

Another possible outcome

How might your story be different if the court you were in was using AI?

Let's reset to the beginning of your story. You've been arrested, and you're at the bail hearing. Based on factors such as prior convictions and arrests, as well as your age and the current pending charge, the judge receives a report from the AI software showing you're a low flight risk. What are some of the benefits and drawbacks of this change in the story?
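To make that report a little more concrete before weighing the trade-offs, here is a deliberately simplified sketch of how such a score might be computed from the kinds of factors mentioned above. It is a toy illustration only: the field names, weights, and thresholds are all hypothetical and do not reflect any real vendor's model.

```python
from dataclasses import dataclass

@dataclass
class Defendant:
    age: int
    prior_convictions: int
    prior_arrests: int
    charge_severity: int  # hypothetical scale: 1 (minor) to 5 (serious violent felony)

def flight_risk_score(d: Defendant) -> float:
    """Toy risk score on a 0-10 scale; every weight here is invented for illustration."""
    score = 0.0
    score += 1.5 * d.prior_convictions
    score += 0.5 * d.prior_arrests
    score += 0.5 * d.charge_severity
    if d.age < 25:  # youth is typically weighted upwards in real tools
        score += 2.0
    return min(score, 10.0)

def risk_band(score: float) -> str:
    """Translate the number into the kind of label a judge would see in the report."""
    if score < 3:
        return "low"
    if score < 7:
        return "medium"
    return "high"

# The defendant in our story: no prior record, one serious pending charge.
you = Defendant(age=34, prior_convictions=0, prior_arrests=0, charge_severity=4)
score = flight_risk_score(you)
print(f"score={score:.1f}, flight risk: {risk_band(score)}")  # score=2.0, flight risk: low
```

Real tools fit statistical models to historical case data rather than using hand-set weights like these, but the inputs are of this general kind, and that is exactly where the benefits and drawbacks below come in.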

First, some human bias may be removed. While it is still the judge who makes the final decision, having this AI-driven report as a baseline means the outcome is less likely to be affected by when your bail hearing occurred or how the judge was feeling that day. Additionally, algorithms that remove factors like gender, race, and geography may take out some of the implicit bias that pervades human decisions.

There鈥檚 also likely an efficiency gain. Having the initial report from the AI could help to speed up the process. Software processes information much more quickly than a human deliberating on the matter. Therefore, judges can hear more cases, which means people could spend less time waiting in jail for bail hearings, and taxpayer money would be spent more efficiently.

However, it is necessary to look at the issue of bias more deeply. The AI system may well have eliminated this specific judge's implicit biases, but it also incorporated the systemic biases embedded in the data it was trained on. Many people instinctively think of computers as objective computing machines, like calculators that always give you a logical result. AI systems are anything but, and research has shown that they learn whatever patterns, fair or unfair, are present in their training data. Therefore, they will reproduce the decisions of the past, biases and all.
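As a hedged illustration of that point, the sketch below builds entirely synthetic "historical" records in which one neighborhood was policed more heavily, then applies a trivial risk rule that never sees race, gender, or neighborhood at all. Because prior arrests carry the historical bias, the disparity flows straight through to the model's output. Every number and group label here is invented.

```python
import random

random.seed(0)

# Synthetic "historical" data, invented purely for illustration.
# Both groups reoffend at the same underlying rate, but group A's
# neighborhood was policed more heavily, so its members accumulate
# more prior arrests for the same behavior.
def make_history(n=20_000):
    rows = []
    for _ in range(n):
        group = random.choice(["A", "B"])
        reoffended = random.random() < 0.30             # identical base rate
        policing_factor = 2.0 if group == "A" else 1.0  # historical over-policing
        prior_arrests = int(random.expovariate(1.0 / policing_factor))
        rows.append((group, prior_arrests, reoffended))
    return rows

history = make_history()

# A minimal "risk model": flag anyone with 2 or more prior arrests as high
# risk. The model never sees the group label, race, or gender at all.
def high_risk(prior_arrests):
    return prior_arrests >= 2

def rate(rows, group, field):
    members = [r for r in rows if r[0] == group]
    return sum(field(r) for r in members) / len(members)

for g in ("A", "B"):
    reoffence = rate(history, g, lambda r: r[2])
    flagged = rate(history, g, lambda r: high_risk(r[1]))
    print(f"group {g}: reoffence rate {reoffence:.2f}, flagged high risk {flagged:.2f}")

# Same underlying behavior, very different flag rates: the bias baked into
# the arrest history flows straight through to the model's output.
```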

This is not an intended outcome of AI, but rather an inherent, immovable feature of the technology itself, and this is something which is not generally well-understood outside of specific technology communities.

It can be argued that over time, as the accuracy of the AI increases, it may minimize the bias to such a degree that it is statistically insignificant. However, there are a number of complications to that argument. First, this is not something which is easy to measure, and systemic bias has already been identified and studied in existing AI systems in practice.
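To give a sense of why measurement is hard, here is a minimal sketch of one common check: comparing false positive rates between two groups, that is, how often people who did not go on to reoffend were nonetheless labeled high risk. The counts are made up purely for illustration; the point is that a tool can look accurate overall while its errors fall very unevenly.

```python
# Illustrative false positive rate check, with invented audit counts.
# A "false positive" here is someone labeled high risk who did not reoffend.

def false_positive_rate(labeled_high_risk_no_reoffence: int,
                        total_no_reoffence: int) -> float:
    return labeled_high_risk_no_reoffence / total_no_reoffence

# Hypothetical audit counts for two groups of 1,000 people who did not reoffend.
fpr_group_a = false_positive_rate(450, 1000)
fpr_group_b = false_positive_rate(230, 1000)

print(f"false positive rate, group A: {fpr_group_a:.0%}")
print(f"false positive rate, group B: {fpr_group_b:.0%}")
# A gap like this can persist even when the tool's overall accuracy looks the
# same for both groups, which is one reason "is the model biased?" has no
# single, easy-to-measure answer.
```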



Further, the companies that make AI systems for criminal justice are for-profit businesses that regard their algorithms as trade secrets. People outside these companies are generally unable to review the code.

Given all of this, how do you think the story of your arrest might end? Will you be saved from a judge's personal biases? Will you fall victim to the embedded bias in the AI system? The truth is, either outcome is possible.

What happens next?

How can we take advantage of the benefits of AI in judicial decision-making while ensuring that we aren't codifying and further embedding bias into an already flawed system?

To start, there should be more thorough research into bias in AI and the impacts of using AI in criminal justice. There are researchers and organizations who are already tackling this challenge. Further, developers should look more deeply at who is participating in the creation of the software, to ensure that multiple perspectives and viewpoints are being considered and included. This is a challenge not only in the development of AI but in the technology industry as a whole.

What would inclusive development look like? Companies could workshop with people representing different races, genders, geographies, and socioeconomic statuses, as well as individuals who have been arrested, imprisoned, and felt the impact of bias in the criminal justice system.

Another important change is having a more transparent development process. The lack of transparency in these tools has already been challenged in court. However, the state court's initial decision was upheld, because the algorithm was viewed as only being part of the decision-making process.

There are also factors that agencies could consider when adopting AI software written by a third-party vendor. They could avoid opaque "black box" software, and ask critical questions of the company producing it. Within the scope of the AI itself, they could ask how, and on what data, the model is trained. Beyond the scope of the AI, they could choose to engage organizations that are independent of the prison industry and do not benefit from high incarceration rates.

Processes could be put in place for oversight, monitoring, and accountability of the systems to ensure their integrity over time. Oversight and integrity should also extend to the features upon which the AI is trained. Finally, education and sharing knowledge might be the most accessible way for an individual to make an impact.
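On the monitoring point, here is a minimal sketch of what one recurring oversight check might look like in code. The group names, scores, and threshold are all hypothetical: the idea is simply that a reviewing body could compare score distributions across groups each period and trigger a human review when the gap drifts past an agreed limit.

```python
from statistics import mean

ALERT_THRESHOLD = 0.5  # hypothetical maximum acceptable gap in average score (0-10 scale)

def disparity_alert(scores_by_group):
    """Compare average risk scores across groups for one review period."""
    averages = {group: mean(scores) for group, scores in scores_by_group.items()}
    gap = max(averages.values()) - min(averages.values())
    print(f"average scores this period: {averages}, gap={gap:.2f}")
    return gap > ALERT_THRESHOLD

# Invented scores for one review period.
this_period = {
    "group A": [3.1, 4.2, 5.0, 6.3, 2.8],
    "group B": [2.0, 3.5, 2.9, 4.1, 3.0],
}

if disparity_alert(this_period):
    print("ALERT: score disparity exceeds the agreed threshold; trigger a human review")
```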

It's time to acknowledge that these systems exist today and are affecting people's lives. Parts of the tech community have started this conversation, but it's a discussion that needs to extend through and beyond technologists. From there, we have the power to shape the future directions of these systems, and even to ask whether they are systems we should be developing at all, through research, education, advocacy, and policy. We have to understand the trade-offs in order to help policy-makers and organizations work for the betterment of systems of justice as AI develops.

The impacts of AI are being felt, and will continue to be felt, in people's lives across society, and we have an important role to play in shaping what happens next.

