
Courts and police departments are turning to AI to reduce bias, but some argue it'll make the problem worse

AI is being used to predict crime and send people to jail, but it could be just as biased as humans

We all know humans are imperfect. We're subject to biases and stereotypes, and when these come into play in the criminal justice system, the most disadvantaged communities end up suffering. It's easy to imagine that there's a better way, that one day we'll find a tool that can make neutral, dispassionate decisions about policing and punishment.

Some think that day has already arrived.

Around the country, police departments and courtrooms are turning to artificial intelligence algorithms to help them decide everything from where to deploy police officers to whether to release defendants on bail.

Supporters believe the technology will lead to more objective decisions and, ultimately, safer communities. Others, however, say the data fed into these algorithms is encoded with human bias, meaning the technology will simply reinforce historical disparities.

Learn more about the ways communities, police officers and judges across the U.S. are using these algorithms to make decisions about public safety and people's lives.