Report Claims Predictive Policing Targets Marginalized Communities

WASHINGTON, DC – Law enforcement’s reliance on predictive policing strategies is drawing concern over the technology’s potential to reinforce racial biases and disproportionately target already marginalized communities, according to a recent report from the National Academies Press.

Researchers and advocacy organizations have pointed to systemic flaws in the algorithms, arguing that predictive policing is not the objective tool it claims to be but rather an extension of long-standing policing inequities, the National Academies Press report said.

According to a study conducted by the AI Now Institute at NYU, predictive policing tools often rely on historical crime data that has already been skewed by decades of racially biased policing practices.

“Predictive policing does not predict crime,” the institute’s report states. “It predicts policing, reinforcing patterns of law enforcement that disproportionately target Black and Brown communities.”
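The feedback loop the institute describes can be made concrete with a toy simulation. The sketch below is purely illustrative (the neighborhood counts, patrol-allocation rule, and observation model are invented assumptions, not any vendor’s actual algorithm), but it shows how allocating patrols in proportion to recorded incidents can lock in a historical skew even when the underlying crime rates of two neighborhoods are identical.

```python
# Toy model of the feedback loop critics describe: patrols go where past
# recorded incidents are highest, and more patrols mean more incidents get
# recorded there. All numbers below are hypothetical.

TRUE_CRIME_RATE = [0.10, 0.10]   # two neighborhoods, identical underlying crime
recorded = [50.0, 100.0]         # historical counts skewed by past over-policing

for year in range(10):
    total = sum(recorded)
    # allocate 100 patrols in proportion to *recorded* (not actual) crime
    patrols = [100 * r / total for r in recorded]
    # recorded incidents scale with patrol presence, not true crime alone
    for i in range(2):
        recorded[i] += patrols[i] * TRUE_CRIME_RATE[i]
    print(f"year {year}: patrols = {[round(p) for p in patrols]}")
```

Even in this oversimplified model, the two-to-one disparity baked into the historical data never washes out; the forecasts simply reproduce past enforcement patterns.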

The Electronic Frontier Foundation (EFF), a nonprofit that defends civil liberties in the digital world, has also scrutinized the validity and reliability of predictive policing algorithms.

EFF warns these tools “exacerbate over-policing in communities that have historically been subject to aggressive law enforcement,” citing previous cases where such systems have led to heightened surveillance and excessive force in predominantly minority communities.

A 2023 investigation by The Markup revealed that PredPol, one of the most widely deployed predictive policing programs, sent officers to predominantly Black neighborhoods at twice the rate of predominantly white neighborhoods, despite comparable crime rates.

The Markup’s analysis found “PredPol’s algorithm disproportionately directs police resources towards areas with a higher concentration of previous arrests, rather than areas with the highest likelihood of new crime.”
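That finding is consistent with how PredPol’s creators have described the system in their academic publications: a self-exciting point process model adapted from earthquake aftershock forecasting, in which each recorded event temporarily raises the predicted intensity of future events nearby. The sketch below is a simplified, hypothetical version of that idea (the parameter values, event data, and function name are illustrative, not PredPol’s actual code), but it shows why areas with more recorded arrests mechanically receive higher forecasts.

```python
import math

def predicted_intensity(event_times, now, mu=0.1, alpha=0.5, beta=1.0):
    """Toy self-exciting (Hawkes-style) intensity, a simplified stand-in
    for the aftershock-style model described in PredPol's papers. Each
    past event adds an exponentially decaying bump to the forecast."""
    return mu + sum(alpha * math.exp(-beta * (now - t))
                    for t in event_times if t <= now)

# Two areas share the same background rate (mu); they differ only in how
# many past events police recorded there. The area with the longer arrest
# history gets the higher forecast, and therefore more patrols.
heavily_policed = [8.0, 8.5, 9.0, 9.2, 9.6, 9.8]  # times of recorded events
lightly_policed = [8.5, 9.5]

print(predicted_intensity(heavily_policed, now=10.0))  # higher forecast
print(predicted_intensity(lightly_policed, now=10.0))  # lower forecast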

Civil rights groups have also raised concerns about the secrecy surrounding predictive policing programs.

The American Civil Liberties Union (ACLU) has criticized law enforcement agencies for failing to disclose how these algorithms actually work, noting, “Without transparency, there is no way to hold these systems accountable. People have the right to know how police departments are making decisions that impact their safety and civil liberties.”

The National Institute of Justice, a research agency within the U.S. Department of Justice, has stated that predictive policing “must be implemented with caution to avoid reinforcing systemic biases.”

Critics go further, arguing that even with safeguards, the fundamental premise of these tools remains flawed.

More and more cities are now reevaluating their use of predictive policing technology, the National Academies Press report said, citing the Los Angeles Police Department’s 2022 discontinuation of its controversial LASER program, which used data analysis to identify so-called “chronic offenders.”

The program was shut down after reports from UCLA’s Million Dollar Hoods project found it disproportionately targeted Black and Latino individuals while producing no measurable reduction in crime rates, according to the National Academies Press.

Similar concerns have emerged in Chicago, where the Strategic Subject List (SSL) sought to predict the individuals most likely to be involved in gun violence, the National Academies Press reported, citing a 2019 RAND Corporation study that found the SSL was ineffective and disproportionately flagged Black and Latino residents.

The SSL was ultimately discontinued by Chicago law enforcement following backlash and legal challenges from civil rights organizations, the National Academies Press report said.

“Predictive policing gives the illusion of scientific objectivity, but in reality, it automates discrimination,” said Kate Crawford, an AI ethics researcher and co-founder of the AI Now Institute. “It’s time for law enforcement agencies to rethink their approach to crime prevention and prioritize solutions that do not rely on flawed technology.”

Despite mounting criticism, predictive policing continues to be used by law enforcement agencies across the country, the National Academies Press reported. Advocacy groups and researchers warn that without firmer oversight and accountability, continued use of these technologies will only deepen existing inequalities within the criminal justice system.

Policymakers and researchers alike suggest alternative approaches that emphasize community-based policing and restorative justice initiatives rather than data-driven enforcement programs, said National Academies Press.

“We need to stop assuming technology is a neutral solution,” said Rashida Richardson, a law and technology researcher quoted in the National Academies Press report. “Real crime prevention comes from investing in education, healthcare, and economic opportunity—not predictive algorithms that perpetuate racial disparities.”
