West Midlands Police, which is leading the project, will produce a prototype by the end of March 2019. The London Metropolitan Police, Greater Manchester Police, and five other forces will also be involved in the National Data Analytics Solution (NDAS), which pairs AI with statistics to try to assess the likelihood of someone committing, or falling victim to, gun and knife crime or modern slavery. If it is a success, it will be rolled out throughout the UK.
Iain Donnelly, who is heading up the project, told New Scientist that it will not be used to arrest people before they have actually done anything wrong. Instead, he says, the NDAS could identify people with a history of mental illness who might be predisposed to violent crime and link them up with social services or health workers in their area.
However, there are plenty of reasons to be sceptical about the project, especially given the databases the police have recently come under fire for. One route for gathering information will be via stop and search, which statistics show is on the rise and disproportionately targets black people. Left unchecked, the NDAS may become just another way of monitoring black people for crimes they have not yet committed.
It recently came to light that the Metropolitan Police has been practising predictive policing covertly for years. Its controversial "Gang Matrix" was started after the 2011 riots. Amnesty International revealed that almost 4,000 (mostly black) people are on the database, flagged as potential candidates for gang violence because of the music they listen to, things the police have seen them post on social media, or people they have interacted with. As with the NDAS, the Gang Matrix can be shared with other services, including housing associations, job centres and schools, effectively tarnishing an individual's name when it comes to these vital lifelines.
Sandra Wachter of the Oxford Internet Institute said the difficulty with these methods lies in establishing whether the police's suspicion of an individual is well founded enough to warrant surveillance. "How would I know that this actually makes the right decision? That's something that is very hard to measure," she said.
In an ideal world, such a system would offer you some counselling; at worst, it puts a target on your back before you have actually done anything. Should there be lots of shadowy databases monitoring people who are not criminals and subjecting them to potentially prejudicial treatment? Probably not.