According to Amnesty International, almost three-quarters of British police forces are using technology to predict future crimes – with disastrous results
Just when you thought Britain couldn’t get more dystopian, police forces are “super-charging racism” with crime-predicting technology, according to a new report by Amnesty International.
The technology involved might be closer to an Excel spreadsheet than Minority Report, but the effects are significant. “The use of predictive policing tools violates human rights. The evidence that this technology keeps us safe just isn’t there, the evidence that it violates our fundamental rights is clear as day,” says Sacha Deshmukh, chief executive at Amnesty International UK.
The report found that almost three-quarters of UK police forces are using predictive policing, which can be split into two main types. First, there's geographic crime prediction, which tries to forecast where crimes are likely to occur and involves profiling specific places as crime 'hotspots'. These systems work by feeding historic crime data into computer algorithms, but because the police have always disproportionately targeted areas with higher populations of Black and racialised people, the data is biased from the outset. This creates a feedback loop where the racist policing of the past is used to justify the racist policing of the present. As the National Police Chiefs' Council acknowledged in 2024, Black people are twice as likely to be arrested, three times as likely to be subject to police use of force, and four times as likely to be stopped and searched as white people. The use of location-based predictive policing contributes to these longstanding disparities.
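To make that feedback loop concrete, here is a deliberately crude sketch (an invented simulation, not any force's actual software; every number and name in it is an assumption): two areas have the same underlying crime rate, but one starts with more recorded crime because it was historically over-policed, and because patrols follow the records while the records follow the patrols, the gap only widens.

    # Illustrative only: a toy model of the hotspot feedback loop.
    # Two areas with the SAME true crime rate, but area A starts with
    # more recorded crime because it was historically over-policed.
    import random

    random.seed(0)

    TRUE_RATE = 0.3                    # identical underlying crime rate in both areas
    recorded = {"A": 50, "B": 10}      # historic records already skewed towards A

    for year in range(10):
        # The 'predictive' step: patrol the area with the most recorded crime
        hotspot = max(recorded, key=recorded.get)
        for area in recorded:
            presence = 1.0 if area == hotspot else 0.2  # far fewer officers elsewhere
            # Crime is only RECORDED if an officer is there to see it
            incidents = sum(
                1 for _ in range(100)
                if random.random() < TRUE_RATE * presence
            )
            recorded[area] += incidents

    print(recorded)  # area A's 'lead' grows every year despite equal true rates

Run over ten simulated years, area A's recorded total pulls further ahead every year, even though nothing about the areas themselves differs.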
Second, there’s individual crime prediction, which is about trying to predict who will commit crimes in the future. This involves individuals being placed on secret databases and given a ‘risk score’ of low, medium or high. Depending on the score, they may be subject to greater monitoring and surveillance by the police, including an increased likelihood of being stopped and searched. These ‘risk scores’ can also be shared with other government agencies including the Crown Prosecution Service, the Department for Work and Pensions and local authorities, which may influence sentencing decisions and access to essential services like housing and benefits.
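The report doesn't publish any force's actual scoring criteria, so purely as an illustration of the shape of such a system (every field, weight and threshold below is invented), the logic amounts to a handful of weighted flags bucketed into a label:

    # Hypothetical sketch of an individual risk-scoring scheme.
    # Real systems' criteria and weights are secret; these are invented.
    def risk_score(person: dict) -> str:
        score = 0
        score += 2 * person.get("prior_offences", 0)           # even decades-old offences count
        score += 3 if person.get("associates_flagged") else 0  # 'seen with' a flagged person
        score += 1 if person.get("stopped_and_searched") else 0
        # Vague, subjective inputs carry real weight in the output
        if score >= 5:
            return "high"
        if score >= 2:
            return "medium"
        return "low"

    # Someone with no convictions at all can still come out 'medium'
    print(risk_score({"associates_flagged": True, "stopped_and_searched": True}))

Note that the person in the example has never been convicted of anything, yet ends up scored off the back of who they are seen with, which is exactly the kind of guilt-by-association the report's interviewees describe.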
“Effectively, what it does is allow the police to punish people using it, for as long as they want to,” John Pegram of Bristol Copwatch said in the Amnesty report. “They can say, OK, it doesn’t matter if you offended 13 or 14 years ago for something, you’re known to us for this, and therefore we’re going to assign a score to you. So it’s risk scoring, it’s profiling, often racist profiling.”
The criteria for ending up on one of these systems can be extremely vague. Under a system in Manchester known as ‘Operation XCalibre’, young people can be deemed “active within a gang context” if they are “somebody who socialises and is seen to socialise or frequently be with members of a gang”. As Amnesty argues, this is a clear infringement of people’s right to freedom of association, particularly because “gang” is defined so broadly in the first place. This kind of surveillance is restricting young people’s ability to take part in cultural life. XCalibre officers have admitted to monitoring music videos on the internet as part of their surveillance of gang activity, and Greater Manchester Police (GMP) has banned people from attending community events like Parklife, Eid and Mega Mela (a South Asian festival) over alleged gang affiliations.
In 2022, GMP sent out letters to young people banning them from attending Caribbean Carnival over a set of unspecified and extremely vague allegations, including being “perceived by others to be associated to a street gang.” According to Zara Manoehoetoe of the Northern Police Monitoring Project, who is quoted in the report, this is a form of pre-emptive punishment: “Nobody’s committed a crime, there’s no evidence or intention that they’re attending carnival to commit a crime, but the police have pre-empted that it’s highly likely, and therefore already punished that person, and it’s going to [...] ban them from entry.” This targeting of Black cultural events recalls the Metropolitan Police’s controversial Form 696 (now scrapped), which required event organisers to provide police with details about the genre of music to be played and the ethnic background of the people attending.
Amnesty describes predictive policing as a form of “mass surveillance” which violates the presumption of innocence and people’s rights to privacy, freedom of expression and freedom of association. “These technologies have consequences. The future they are creating is one where technology decides that our neighbours are criminals, purely based on the colour of their skin or their socio-economic background,” said Deshmukh.
The organisation is calling for the UK government to prohibit the use of these technologies and for greater transparency about how they are being used, as well as accountability to the people and communities affected.