
Met Police use of facial recognition could be unlawful, finds report

The first independent report into the use of facial recognition technology by the police has found a number of shortcomings in the trials that would be unlikely to withstand legal scrutiny

Use of live facial recognition (LFR) by the Metropolitan Police could be held unlawful if challenged in court, the first independent report into the application of the technology by a UK police force has found.

The report by the Human Rights, Big Data & Technology Project, led by the University of Essex, raises a number of concerns around the procedures, practices and human rights compliance during the LFR trials.

The report was produced by Peter Fussey, a criminologist specialising in surveillance and society, and Daragh Murray, a lecturer in international human rights law with a focus on counter-terrorism.

To compile the report, the researchers were granted access to the final six of the 10 trials run by the Metropolitan Police between June 2018 and February 2019.

They joined officers on location in the LFR control rooms, engaged with officers responding on the ground, and attended briefing and de-briefing sessions, as well as planning meetings.

Across the six trials evaluated, the LFR technology made 42 matches. The researchers could be absolutely confident about only eight of those matches (fewer than one in five), given the unclear criteria set for people on the “watchlist” and “significant ambiguity” over the categories of individuals the LFR trials sought to identify.
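For context on what a “match” means here, the sketch below is a minimal toy illustration of how LFR systems typically flag faces against a watchlist: a captured face is reduced to a numerical embedding and compared with each watchlist embedding, and an alert is raised when similarity exceeds a threshold. This is not the Met’s actual system; the names, the 128-dimension embeddings and the threshold value are all assumptions made for illustration.

```python
# A minimal, illustrative toy of threshold-based watchlist matching.
# NOT the Met's system: names, embedding size and threshold are assumptions.
import numpy as np

rng = np.random.default_rng(seed=0)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical watchlist: each entry maps a name to a face embedding.
# Real systems derive embeddings from a face-recognition network;
# random vectors stand in for them here.
watchlist = {f"person_{i}": rng.normal(size=128) for i in range(10)}

# Simulate a camera capture of person_3: their embedding plus noise,
# standing in for pose, lighting and image-quality variation.
captured_face = watchlist["person_3"] + rng.normal(scale=0.5, size=128)

# The system raises an alert when similarity exceeds a tuned threshold.
# The threshold sets the trade-off between false alerts and missed matches.
THRESHOLD = 0.30

for name, embedding in watchlist.items():
    score = cosine_similarity(captured_face, embedding)
    if score > THRESHOLD:
        # In a deployment, each alert like this is what an officer
        # must then verify against the person stopped in the street.
        print(f"ALERT: possible match with {name} (score {score:.2f})")
```

Every alert the threshold lets through is only a candidate match; the gap between the 42 alerts and the eight verifiable ones in the trials lies in exactly this human verification step.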

Such issues meant that, for example, police continued to stop individuals even after their cases had already been addressed.

According to the report, key issues in the police trials included insufficient planning and conceptual groundwork before testing began. The researchers also observed a number of operational blunders, including inconsistencies in the process officers used to verify a match made by the technology, inconsistencies in how the police engaged with the individuals involved, and difficulties in defining and obtaining the consent of those affected.

Consent, public legitimacy and trust were among the concerns the researchers raised about mixing research trials with operational deployments. For example, the report highlighted the difference between individuals consenting to participate in research and consenting to the use of the technology in police operations.

The trials treated facial recognition in much the same way as traditional CCTV imagery, the report observed. This approach failed to take into account factors such as the intrusive nature of LFR and its use of biometric processing, meaning that no detailed assessment of the technology's impact was carried out.

There is no explicit legal authorisation for the use of the technology in domestic law, the report noted. It added that the implicit legal authorisation claimed by the Met – coupled with the absence of publicly available, clear online guidance – is unlikely to satisfy requirements established by human rights law, if challenged in court.

The Met chose not to provide feedback to the researchers on the report’s findings.

“The report demonstrates a need to reform how certain issues regarding the trialling or incorporation of new technology and policing practices are approached,” Fussey said.

“It underlines the need to effectively incorporate human rights considerations into all stages of the Metropolitan Police’s decision-making processes,” he added. “It also highlights a need for meaningful leadership on these issues at a national level.”

The annual report from the Biometrics Commissioner urged the government to introduce legislation to regulate the use of biometric technologies by police forces, as risks around privacy and loss of public confidence in the technology increase.
