UK use of predictive policing is racist and should be banned, says Amnesty

British policing’s use of algorithms and data to predict where crime will happen is racist and picks on the poor, a report from Amnesty International says.

The human rights group says predictive policing tools, used by most police forces in the UK, are so unfair, dangerous and discriminatory that they should be banned.

Amnesty says the data driving the predictive systems, and the assumptions they rely on, come from established “racist” police practices such as stop and search, where most stops find no wrongdoing and which disproportionately targets Black people. That biased data, it argues, in turn corrupts the cutting-edge predictive crime systems billed as part of the future of battling crime.

Police say predictive policing helps cut crime, allowing officers and resources to be deployed where they are most needed.

Predictive policing involves computer programmes that use historical data and algorithmic models to estimate where crimes are most likely to happen. Once the stuff of dystopian fiction, as in Steven Spielberg’s film Minority Report, it is an increasingly popular tool for law enforcement.
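In broad terms, many such “hotspot” systems score geographic areas by how many incidents police have recorded there recently, then direct patrols to the highest-scoring areas. The Python sketch below is a deliberately simplified, hypothetical illustration of that idea; the grid cells, incident data and decay rate are all invented for this example and do not represent any force’s actual system.

```python
# Illustrative toy only: a hypothetical hotspot scorer of the kind
# described above. All grid cells, incidents and parameters are invented.
from collections import defaultdict

# Hypothetical recorded incidents: (grid_cell, days_ago)
incidents = [("cell_A", 2), ("cell_A", 10), ("cell_B", 1), ("cell_C", 30)]

DECAY = 0.9  # assumed rate: recent incidents count for more than old ones

scores = defaultdict(float)
for cell, days_ago in incidents:
    scores[cell] += DECAY ** days_ago  # exponential time decay

# Rank cells by score; a real system would allocate patrols from a list like this
hotspots = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
print(hotspots)
```

Because the scores are built entirely from where police have previously recorded incidents, any bias in where officers historically looked for crime feeds straight back into where the system sends them next, which is the feedback loop Amnesty’s report describes.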