Facial recognition system ruled unlawful by UK court

A UK court has ruled that South Wales Police's use of automatic facial recognition was unlawful, ending a legal battle that had been running since mid-2018. After the case was initially dismissed, the appeal judges have now found in favor of the challenge, stating that the use of the technology does not comply with basic privacy and data protection rules, lacks clear rules governing its use, and breaches human rights law.

The lawsuit was filed by Liberty, a British civil liberties organization, on behalf of Edward Bridges, a scholar and activist on privacy and individual freedom issues. In the initial action, he claimed that his face was improperly captured and processed on two occasions, in December 2017 and March 2018, by a system called AFR Locate, which South Wales Police used at public events.

According to official information from the force, the automated facial recognition tool was used together with watchlists of 480 to 800 individuals, depending on the event. The goal was to identify, among the crowd, people wanted by the courts, escaped prisoners, persons of interest from a national security standpoint, or missing persons. The faces of those present were captured and processed by the technology, which compared them against the database and alerted officers to possible matches.

The lawsuit brought by Liberty and Bridges centers precisely on this type of automatic identification, which can produce false positives. The suit also raised concerns about the capture and processing of the faces of people not on the watchlist, something that would conflict with privacy laws and the right to be forgotten in force in the United Kingdom, which at the time was still part of the European Union, where such rules also apply.

The case, however, was dismissed in September, with two judges finding the use of the technology lawful. Liberty appealed and obtained a favorable decision last week, with the court upholding three of the five grounds raised by the group and recommending that the use of AFR Locate be suspended until those issues are resolved.

The judges agreed that South Wales Police had not been precise about the situations in which the facial recognition system could be used or about who could be included in the watchlist. The court also pointed to inadequate systems for protecting the collected data and the absence of safeguards against biases related to race and skin color in the processing of the captured faces.

In a statement, force officials acknowledged the decision and said they would not appeal. According to the statement, AFR Locate has been used on about 50 occasions as a trial and has since been set aside in favor of a new system, developed and operated by NEC, in which human officers remain fully involved in public identification and surveillance. The new technology is also the subject of legal action by Liberty and other privacy groups.

Speaking to the press, Bridges said he believes the court's decision will serve as a precedent for further challenges to the threats to individual freedom posed by systems such as AFR Locate. In his view, platforms of this kind are used without citizens' consent, generate false positives, and can place individuals on suspect lists because of racial and gender bias.

While saying they will not appeal the grounds that were rejected, Bridges and Liberty also criticized the judges' suggestion that studies be carried out weighing the potential risks to individual freedom against a possible increase in security through the overt identification of suspects. The magistrates said they understood that the benefit to society would be greater, leaving the door open to the continued use of facial recognition technologies.