Infographic: The social impact of crime-stopping AI


Artificial intelligence is being applied to nearly everything in business and in our day-to-day lives, with the hope that machines can streamline our routines and make them more efficient. But what happens when a machine learning system is taught something it shouldn't be, like racism, sexism, or classism? Unfortunately, programmers are discovering that this happens often: our subconscious biases find their way into the software we build and can cause some serious unintended effects. One area where the impact is being felt is crime-stopping AI.

Currently, artificial intelligence is being used to predict and prevent crime. The reasoning goes like this: crime breeds more crime, spreading like a virus and clustering in certain areas, so data about known crimes is fed into an algorithm whose output tells police where to concentrate their patrols. It sounds reasonable, but there are a few problems with this model.
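To see why critics worry about this model, here is a toy sketch of the self-reinforcing loop it can create. All of the numbers, neighborhood names, and rates below are hypothetical, chosen only to illustrate the mechanism: the model flags whichever area has the most reported crime, extra patrols there generate extra reports, and the initial skew widens.

```python
# Hypothetical illustration of the predictive-policing feedback loop.
# Assume true crime rates are equal everywhere; "C" starts with slightly
# more reports only because it is already policed more heavily.
reports = {"A": 50, "B": 50, "C": 60}

for year in range(3):
    # The model flags the neighborhood with the most reports as the hotspot.
    hotspot = max(reports, key=reports.get)
    # Extra patrols there surface ~20 additional reports (assumed rate).
    reports[hotspot] += 20

print(reports)  # {'A': 50, 'B': 50, 'C': 120}
```

After three rounds, the original 10-report gap has grown to 70, even though nothing about the underlying crime rates changed; the algorithm simply amplified where police were already looking.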

First, not all crimes are reported. Artificial intelligence algorithms depend on historical data to make predictions, and if that data is incomplete or inaccurate, so are the conclusions drawn from it.

Second, artificial intelligence has a demonstrated history of bias. African-Americans are more likely to have their images included in police databases because of the over-policing of Black communities, yet the algorithms have repeatedly shown that they can't accurately identify people of color. In one study of Amazon's facial recognition software, 28 members of Congress were incorrectly matched with mugshots. What's more, 39% of those incorrect matches were people of color, even though people of color make up only 20% of Congress.

Unfortunately, this technology is already being deployed in crime-fighting efforts around the world. License plate readers, facial recognition systems, ShotSpotter, and more are being used in communities to find and identify suspects. Last year, Huntington Park, California deployed a 500-pound autonomous robot police officer that patrols the streets and notifies police when it spots “blacklisted” individuals in the community.

Learn more about crime-fighting AI and its unintended consequences below.

Crime-Stopping AI
Source: Best MSW Programs

What do you think of crime-stopping AI? Let us know in the comments below, or on Twitter or Facebook. You can also comment on our MeWe page by joining the MeWe social network.

In some of our articles and especially in our reviews, you will find Amazon or other affiliate links. As Amazon Associates, we earn from qualifying purchases. Any other purchases you make through these links often result in a small amount being earned for the site and/or our writers. Techaeris often covers brand press releases. Doing this does not constitute an endorsement of any product or service by Techaeris. We provide the press release information for our audience to be informed and make their own decision on a purchase or not. Only our reviews are an endorsement or lack thereof. For more information, you can read our full disclaimer.

Last Updated on February 3, 2021.

