Pittsburgh's Algorithmic Policing Program Predicting Hot Spots For Criminal Activity Suspended For "Potential Racial Bias"
06/29/2020

Earlier: Pittsburgh Anchor Wendy Bell Fired For Inflammatory (I.E. True) Post About Black Mass Shooting

Because reality has a well-known racial bias, mechanisms put in place to make the city safer for law-abiding citizens of all races must be abandoned.

Immediately.

Pittsburgh suspends policing program that used algorithms to predict crime ‘hot spots,’ Pittsburgh Post-Gazette, June 23, 2020

Citing concerns about potential racial bias, the city of Pittsburgh has suspended an algorithmic policing program that predicted “hot spots” for criminal activity, and instead will focus any new data-driven programs on deploying social services, according to Mayor Bill Peduto.

The Carnegie Mellon University-developed tool aimed to rely on data sources to predict where crime would occur and then dispatch patrols to those areas. The city began piloting the program in 2017.

The city has “no plans to restart it at this time,” Mr. Peduto wrote in a letter last week to the Pittsburgh Task Force on Public Algorithms, hosted by the University of Pittsburgh’s Institute for Cyber Law, Policy and Security.

The exact date on which city public safety officials halted the program, and the extent to which they had previously used it, are unclear.

Public safety officials referred all questions to the mayor’s office.

In a letter to the Peduto administration, the task force commended the decision to pause the program, and criticized the lack of transparency and public engagement around the use of the algorithm.

“We understand that the decision to adopt the tool in the first instance may have been based on a demonstrated lack of racial bias and promising findings of impact that have not been shared with the public,” the June 17 letter read. “However, without transparency and public consultation, this program will not and should not achieve meaningful legitimacy.”

Mr. Peduto pointed the task force to his administration’s new Office of Community Health and Safety as well as the creation of a Community Task Force on Police Reform.

“ ‘Hot Spots’ may benefit from the aid of a social worker, service provider or outreach team, not traditional policing,” the mayor wrote.

The project was a partnership between CMU and the Pittsburgh Bureau of Police, the Pittsburgh Department of Innovation and Performance, and the Pittsburgh Department of Public Safety, according to a statement from CMU’s Metro21: Smart Cities Institute.

“The goal of the project was to reduce serious violent crime in Pittsburgh through prevention without increasing arrests by predicting locations — not individuals — at heightened risk of violent crime. Police patrol activity was then directed to those locations. The project was run citywide and included all Pittsburgh neighborhoods. The project concluded in December 2019, and we are no longer sharing data with the police,” CMU’s statement said.

The model “did not use racial, demographic or socioeconomic data. Nor did it use data on individual persons. The model only used crime offense data for crimes with victims and 911 calls for service. Using this information, CMU researchers identified chronic hot spots based on the number of serious crimes committed in an area using data from previous years,” according to the CMU statement.
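As described, this is a simple count-based model: tally serious victim crimes per location over previous years and flag the chronically high-count areas. A minimal sketch of that idea in Python follows; the grid size, offense set, and record format are assumptions made for illustration, not CMU’s actual parameters or code.

```python
from collections import Counter

# Hypothetical record format: (year, latitude, longitude, offense_type).
# Per CMU, only victim-crime offenses and 911 calls feed the model.
SERIOUS_OFFENSES = {"homicide", "robbery", "aggravated_assault"}  # assumed set
CELL_SIZE = 0.005  # grid cell of roughly 500 m in degrees; an assumed value

def to_cell(lat, lon, size=CELL_SIZE):
    """Snap a coordinate onto a coarse grid cell."""
    return (round(lat / size), round(lon / size))

def chronic_hot_spots(records, top_k=20):
    """Rank grid cells by serious-crime counts from previous years."""
    counts = Counter(
        to_cell(lat, lon)
        for (_year, lat, lon, offense) in records
        if offense in SERIOUS_OFFENSES
    )
    return counts.most_common(top_k)

if __name__ == "__main__":
    # Toy data only: (year, lat, lon, offense)
    sample = [
        (2018, 40.4406, -79.9959, "robbery"),
        (2018, 40.4407, -79.9958, "homicide"),
        (2019, 40.4612, -79.9250, "theft"),  # not in the serious set; ignored
    ]
    print(chronic_hot_spots(sample, top_k=5))
```

Note that nothing in a sketch like this consumes racial, demographic, or socioeconomic inputs; any racial pattern in its output comes entirely from where the underlying crimes occurred.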

Where diversity (and a paucity of white people) was found, the city’s algorithmic policing program predicting “hot spots” for criminal activity quickly established racial patterns those in power wished to conceal. It is these patterns that explain why Pittsburgh has such high rates of residential segregation: White people want to live in communities devoid of crime, so they seek out residential areas with an abundance of whites. Where whiteness is abundant, social capital flourishes; Pittsburgh’s algorithmic policing program dared to showcase that communities lacking in whiteness were also where crime was most rampant.

Thus, it had to go immediately.

If there’s hope, it lies not in the proles, but in the successful implementation of algorithmic policing programs predicting crime nationwide.

Data never lies the way social scientists do when they coin new words and phrases to excuse away black criminality.

[Comment at Unz.com]
