In Philip K. Dick's 1956 short story "The Minority Report", adapted for cinema by Steven Spielberg in 2002, John Anderton (played by Tom Cruise in the film) is a police officer who uses a system capable of accurately predicting crimes. Everything starts to go wrong when the system that foresees illegal actions projects a murder that Anderton himself will commit. The advancement of artificial intelligence (AI) in recent years makes it possible to recognize patterns on such a vast and detailed scale that authorities in the United Kingdom, among others around the world, wanted to use it to create something similar to what we saw in Minority Report. And, as on screen, the experiment did not work in real life, due to ethical issues.
Known as Most Serious Violence (MSV), the system is part of the UK's National Data Analytics Solution (NDAS) project, funded by the government with an investment of around $13 million over the past two years. MSV was created to predict whether people would commit their first violent crime with a gun or knife in that period, based on a "criminal score".
To arrive at this score, historical data were collected on 2.4 million people from the West Midlands database and 1.1 million from West Yorkshire, both counties in England. The records included crimes, custody records, intelligence reports and other information from the Police National Computer network. The higher the score, the greater the likelihood that an individual would commit new illegal activities.
The MSV identified "more than 20" indicators believed to be useful in assessing how risky a person's future behavior could be. These include age, days since the first crime, connections to other people in the data used, the severity of those offenses and the maximum number of "knife" mentions in linked intelligence reports; location and ethnicity data were not included. Many of these factors, according to the project's presentation, were weighted to give more importance to the most recent data.
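The scoring mechanism described above (a set of weighted indicators, with more weight given to recent data) can be sketched roughly as follows. The feature names, weights and exponential recency decay here are illustrative assumptions; the actual MSV model was never published.

```python
# Hypothetical sketch of a weighted "criminal score". All feature names,
# weights and the decay scheme are made up for illustration; they are NOT
# the real MSV parameters.
from datetime import date

def risk_score(features: dict, weights: dict, last_updated: date,
               today: date, half_life_days: float = 365.0) -> float:
    """Weighted sum of indicator values, discounted by data age so that
    more recent records count for more, as the project described."""
    raw = sum(weights[name] * value for name, value in features.items())
    age_days = (today - last_updated).days
    recency = 0.5 ** (age_days / half_life_days)  # halves every half-life
    return raw * recency

# Example individual with hypothetical indicator values
score = risk_score(
    features={"age": 24, "days_since_first_offence": 400,
              "network_links": 3, "knife_mentions": 2},
    weights={"age": -0.01, "days_since_first_offence": -0.001,
             "network_links": 0.3, "knife_mentions": 0.5},
    last_updated=date(2020, 1, 1), today=date(2020, 7, 1),
)
```

The raw weighted sum here is 1.26, and six months of data age discounts it to roughly 0.89; the same record would score lower still if the underlying data were older.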
It looked beautiful in theory, but in practice …
Although there are no complete details on how this "criminal score" was assigned to each individual, the initial NDAS simulations showed promising results. Of every 100 people with a high score, 54 would go on to commit crimes with guns or knives in the West Midlands. The rate was even higher in West Yorkshire, at 74%. But when MSV started operating in real life earlier this year, the team found a serious error that brought those numbers down.
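The "54 out of every 100" figure is, in effect, a precision measure: of the people the system flagged as high risk, the fraction who actually went on to offend. A minimal sketch of that calculation, using the published figures (the helper function is a generic illustration, not NDAS code):

```python
def precision(true_positives: int, flagged_total: int) -> float:
    """Share of flagged individuals who actually went on to offend."""
    return true_positives / flagged_total

# Initial simulation figures: 54 of every 100 high scorers in the
# West Midlands, 74 of every 100 in West Yorkshire.
wm = precision(54, 100)
wy = precision(74, 100)
```

Note what this metric does not capture: it says nothing about offenders the system failed to flag, which is why a single headline percentage can overstate how useful such a system is in practice.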
There are no detailed explanations of the criteria behind this score, but MSV was rejected by the West Midlands Police Ethics Committee, which is responsible for examining the work of NDAS as well as the police force's own technical developments. "A coding error was found in the definition of the training data set, which made the current MSV problem statement unviable," says an NDAS summary published in March. A spokesman for the project said the error was a data-ingestion problem discovered during development. No more specific information was released.
For serious gun or knife violence, the precision of the system dropped to somewhere between 14% and 19% for West Midlands Police and between 9% and 18% for West Yorkshire. After a review, NDAS found that its system performed better when all the initial criteria were removed. In other words, the original performance was exaggerated: even at best, the system achieves a precision of 25% to 38% for West Midlands Police and 36% to 51% for West Yorkshire.
What lesson do we learn from this?
All the agencies involved agreed that MSV's lack of precision made it unviable, and the project was discontinued. However, the research can still be used in other ways. On a similar front, officials have been using machine learning to detect modern slavery, the movement of firearms and various types of organized crime. The difference is that, instead of treating these projects as systems that act as "judge", they would fall under so-called "augmented intelligence": complementary tools for investigation and decision-making in reports of serious violence.
What's more, this experiment shows that other platforms currently in use also need revision, such as the UK Home Office's own automated visa assessment system, which used a person's nationality as a key factor in immigration decisions. After allegations that it carried traces of structural racism, it was disabled earlier this week.