Anorak News | Predictive Policing perpetuates racism and makes us all suspects


by | 2nd, November 2018

Predpol is, according to its website, “the market leader in predictive policing”. Predpol collects data and uses it to show police where future offences will take place. Crime is contagious, the thinking goes – the same offenders target the same people in the same area. Pump in the bald facts for ostensibly objective analysis and an efficient police service. “PredPol is currently being used to help protect one out of every 33 people in the United States,” says the company. Really? The facts are unclear. But predictive policing is here in the UK.

Predictive policing has many fans. Jeff Brantingham, an anthropology professor at the University of California, Los Angeles, who helped to develop the Predpol algorithm, says: “If you are victimized today the risk that you’ll be a victim again goes way up.” Andrew Guthrie Ferguson, a law teacher at the University of the District of Columbia, warns that “under current Fourth Amendment doctrine predictive policing will have a significant effect on reasonable suspicion analysis”. Lindsey Barrett agrees: “These algorithms have the potential to increase accuracy and efficiency, but they also threaten to dilute the reasonable suspicion standard and increase unintentional discrimination in a way that existing law is ill-equipped to prevent.” It’s not the coppers who are racist; it’s the robot.

If past data is the barometer of future crime, how trustworthy is that data? For instance, if police spend more time in, say, black neighbourhoods nicking people for weed possession, will they just repeat past patterns and mistakes? Can Predpol tell us where most white-collar crime takes place and prevent it?

…civil liberties groups and racial justice organizations are wary. They argue that predictive policing perpetuates racial prejudice in a dangerous new way, by shrouding it in the legitimacy accorded by science. Crime prediction models rely on flawed statistics that reflect the inherent bias in the criminal justice system, they contend—the same type of bias that makes black men more likely to get shot dead by the police than white men. Privacy is another key concern. In Chicago, Illinois, one scientist has helped the police department generate a list of individuals deemed likely to perpetrate or be victims of violent crime in the near future; those people are then told they’re considered at risk, even if they have done nothing wrong.

Cory Doctorow took a look:

An anonymous security researcher recently contacted me with what may be a list of Predpol’s customers. This researcher had seen that Predpol assigns an easy-to-guess subdomain to each Predpol customer, based on the customer city’s name.

This researcher wrote a script that combined the name of every US city and town with Predpol’s domain and checked whether each resulting subdomain existed. The full list of cities that had Predpol domains is both short and confusing:

Predpol itself was tight-lipped in the extreme: they initially ignored all press requests, then sent a terse “neither confirm nor deny” response to my questions about this list. They wouldn’t even confirm whether the login forms at these domains were secure, despite repeated warnings from me that I would be making them public, requesting that they ensure that these forms require strong logins and passwords to avoid exposing sensitive policing data.

Robocop’s watching you. What can go wrong?

