Artificial intelligence-inspired policing technology and techniques like facial recognition software and digital surveillance continue to find traction and champions among law enforcement agencies, but at what cost to the public?
Some cities, like Wilmington, North Carolina, have even adopted AI-driven policing, where technology like ShotSpotter identifies gunshots and their locations. The software also recommends a “next best action” to patrol officers based on their current location, police data on past crime records, the time of day, and housing and population density.
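To make that idea concrete, the sketch below shows one way such a “next best action” recommendation could be scored from inputs like those. It is purely illustrative: the zone features, weights, and scoring function are assumptions made for this example and do not reflect ShotSpotter’s actual, proprietary model.

```python
from dataclasses import dataclass

@dataclass
class PatrolZone:
    name: str
    distance_km: float         # distance from the officer's current location
    past_incidents: int        # incidents recorded in police data for this zone
    population_density: float  # residents per square km

def zone_score(zone: PatrolZone, hour_of_day: int) -> float:
    """Toy scoring function: weight past incidents and density,
    penalize travel distance, and boost late-night hours.
    All weights are arbitrary placeholders for illustration."""
    night_factor = 1.5 if hour_of_day >= 22 or hour_of_day <= 4 else 1.0
    return night_factor * (0.6 * zone.past_incidents
                           + 0.4 * zone.population_density / 1000) - 0.8 * zone.distance_km

def next_best_zone(zones: list[PatrolZone], hour_of_day: int) -> PatrolZone:
    """Return the highest-scoring zone, i.e. the recommended 'next best action'."""
    return max(zones, key=lambda z: zone_score(z, hour_of_day))

if __name__ == "__main__":
    zones = [
        PatrolZone("Downtown", distance_km=1.2, past_incidents=14, population_density=4200),
        PatrolZone("Riverside", distance_km=3.5, past_incidents=22, population_density=1800),
        PatrolZone("Northside", distance_km=0.8, past_incidents=5, population_density=2600),
    ]
    print(next_best_zone(zones, hour_of_day=23).name)
```

Even a toy version like this makes the ethical stakes visible: the recommendation is only as fair as the historical crime data and weights fed into it.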
Renée Cummings, data activist in residence at the University of Virginia’s School of Data Science, warns that the rules of citizenship are changing with the development of AI-inspired policing technologies. She explains, “If the rules are changing, then the public needs to have a voice and has the right to provide input on where we need to go with these technologies as well as demand solutions that are accountable, explainable and ethical.”
As artificial intelligence finds its way into more technology-based policing solutions, Cummings’ research questions the ethical use of technology to collect and track citizen data, aiming to hold agencies more accountable and to give citizens greater transparency.
“Law enforcement, national security, and defense agencies are spending a lot of money on surveillance tools with little oversight as to their impact on communities and an individual’s right to privacy,” Cummings said. “We’re creating a tool that would give citizens the ability to see how these powerful tools are used and how they impact our lives.”
Cummings and a team of data science graduate students are developing an algorithmic tool to evaluate the impact of AI-inspired law enforcement technologies. Their goal is to create an “algorithmic force score” that would eventually power an application tracking the technologies currently deployed by law enforcement agencies, organized by force and zip code.
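The article does not describe how the score is computed. As a rough sketch of the concept only, the example below assigns hypothetical weights to categories of surveillance technology and sums them per zip code; the categories, weights, and data layout are all assumptions for illustration, not the team’s actual model.

```python
from collections import defaultdict

# Hypothetical weights for how intrusive each technology category is.
# These categories and values are illustrative only, not the team's model.
TECH_WEIGHTS = {
    "facial_recognition": 5.0,
    "gunshot_detection": 3.0,
    "license_plate_readers": 2.5,
    "predictive_policing": 4.0,
    "drone_surveillance": 3.5,
}

def force_scores_by_zip(deployments: list[dict]) -> dict[str, float]:
    """Aggregate a simple 'algorithmic force score' per zip code.

    `deployments` is a list of records like
    {"zip": "22903", "agency": "Example PD", "technology": "facial_recognition"}.
    """
    scores: dict[str, float] = defaultdict(float)
    for record in deployments:
        scores[record["zip"]] += TECH_WEIGHTS.get(record["technology"], 1.0)
    return dict(scores)

if __name__ == "__main__":
    sample = [
        {"zip": "22903", "agency": "Example PD", "technology": "facial_recognition"},
        {"zip": "22903", "agency": "Example PD", "technology": "gunshot_detection"},
        {"zip": "22902", "agency": "Example Sheriff", "technology": "license_plate_readers"},
    ]
    print(force_scores_by_zip(sample))  # e.g. {'22903': 8.0, '22902': 2.5}
```

A score of this kind is what would let an application surface, at a glance, how heavily surveilled one neighborhood is relative to another.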
Sarah Adams and Claire Setser, both students in the online M.S. in Data Science program, said they chose the project because they wanted to put their data science skills to work for the public good. Cummings praised their effort. “The algorithmic foundation was created with tremendous effort by Sarah and Claire, who went through massive amounts of existing data to create an algorithmic force model.”
Adams said she wanted to work on a capstone project that contributed to and supported the ongoing efforts toward increasing police accountability and citizen activism. “Our cohort chose our capstone projects at the beginning of 2021, which was less than one year after the loss of George Floyd and our country had been in civil unrest for quite some time. I was inspired by Renée Cummings’ energy and passion for data ethics and its application in criminology.”
Setser agreed. “I was attracted to this capstone project because of the possibility to enact and help push for real change. Citizens have a right to understand the technologies that are used to police them and surveil their lives every day. The problem is that this information is not readily available, so the idea of creating a tool to educate the public and encourage dialogue was of great interest to me.”
Students in the M.S. in Data Science program are required to complete a capstone project sponsored by corporate, government and non-profit organizations. Students collaborate closely with sponsors and faculty across disciplines to tackle applied problems and generate data-driven solutions. Capstone projects range in scope and focus, and past projects have explored health disparities, consumer behavior, election forecasting, disease diagnosis, mental health, credit card fraud and climate change.
“The capstone project was a valuable opportunity to combine and implement almost all of the skills and knowledge that we gained throughout the program,” Setser said. “It’s an opportunity to experience the data pipeline from beginning to end while providing your sponsor a better understanding of the data. This is incredibly rewarding.”
The project’s next stage is to fine-tune and test the tool, and Cummings and her team hope to collaborate with UVA and the wider Charlottesville community. “What makes this so exciting is that we’re creating something brand new and adding new insights into emerging technology. Sarah and Claire have been amazing, delivering something extraordinary in such a short space of time. It really speaks to their expertise, determination, and commitment toward AI for the public and social good.”
Cummings joined the School of Data Science in 2020 as its first data activist in residence. She is a criminologist, criminal psychologist, therapeutic jurisprudence specialist, AI ethicist and AI strategist. Her research places her on the frontline of artificial intelligence for social good, justice-oriented AI design, and social justice in AI policy and governance. She is the founder of Urban AI and a community scholar at Columbia University.
September 16, 2021