Human rights group Liberty plans to appeal a ruling by the High Court in Cardiff that the use of automatic facial recognition (AFR) technology by South Wales Police is lawful, yet privacy-infringing.
The case, brought by Liberty on behalf of its client, Cardiff resident Ed Bridges, is the first of its kind to legally challenge the police use of the mass surveillance tool in the UK.
Despite ruling that it “does entail infringement” of Bridges’ Article 8 privacy rights, the two presiding judges decided that South Wales Police’s use of AFR had “struck a fair balance and was not disproportionate,” making its deployment justified.
As such, South Wales Police can continue to use the technology.
The legal claim was originally brought forward following Bridges’ concerns that he had been tracked on two occasions: once during a peaceful anti-arms protest and again while out shopping in Cardiff.
The judgement detailed that approximately half a million faces, although not necessarily half a million individuals, may have been scanned across 50-plus AFR deployments since 2017.
“South Wales Police has been using facial recognition indiscriminately against thousands of innocent people, without our knowledge or consent,” said Bridges in response to the verdict.
“This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance.”
Right to appeal
Bridges and Liberty have confirmed they will appeal the decision, and will continue to campaign for an outright ban on police use of the technology.
According to Liberty lawyer Megan Goulding: “This disappointing judgment does not reflect the very serious threat that facial recognition poses to our rights and freedoms.
“Facial recognition is a highly intrusive surveillance technology that allows the police to monitor and track us all. It is time that the government recognised the danger this dystopian technology presents to our democratic values and banned its use.”
Despite their ruling, the judges said “the future development of AFR technology is likely to require periodic re-evaluation of the sufficiency of the legal regime,” leaving the door open for further conflict over use of the technology.
“I recognise that the use of AI and face-matching technologies around the world is of great interest and, at times, concern,” said South Wales chief constable Matt Jukes.
“I’m pleased that the court has recognised the responsibility that South Wales Police has shown in our programme. With the benefit of this judgment, we will continue to explore how to ensure the ongoing fairness and transparency of our approach.”
Data protection watchdogs
However, there has been a groundswell of concern in recent months, with data protection watchdogs and civil liberties groups expressing doubt over the legality of police AFR.
In March, for example, the UK’s Science and Technology Committee was warned by the Information and Biometrics Commissioners that AFR should not be deployed by UK law enforcement until concerns about its effectiveness were resolved.
This was followed by a report based on research led by the University of Essex, which raised a number of concerns around the procedures, practices and human rights compliance during the Metropolitan Police’s AFR trials.
The Information Commissioner’s Office has also highlighted that any police force, or indeed private organisation, using live facial recognition technology is processing personal data and needs to pay attention to data protection laws.
When asked whether the High Court ruling would change how the Metropolitan Police deployed AFR, a Force spokesperson told Computer Weekly: “The implications of this ruling for the MPS will now be carefully considered before a decision is taken on any future use of live facial recognition technology.
“We believe the public would expect policing to use all available and proportionate means to catch violent offenders and it is right that we trial emerging technologies that could help us do so,” the spokesperson added.
Continuing controversies: Watch lists and data sharing with private companies
One remaining point of contention is around how “watch lists” (databases of custody images the technology uses to identify people in crowds) are compiled.
Speaking at the London Assembly’s Police and Crime Committee on 14 May 2019, Assembly Member Sian Berry asked Met police commissioner Cressida Dick whether protestor images captured on the streets could potentially be used to compile the AFR watch lists.
Speaking again at the committee on 4 September, Berry told Deputy Commissioner Steve House that “she expressed some surprise that I would even ask this, but she did promise to write with more information and I’ve not had any reply to that yet”.
In response, House said the Met would take note and apologised for the delay. “My view is that facial recognition should be deployed on the streets of London, so long as it is done within a legal framework,” he said.
However, questions put to London mayor Sadiq Khan about AFR and private companies during the mayor’s Question Time show that the Met have been sharing images related to facial recognition with King’s Cross Central Limited Partnership, despite previously denying it.
The group was recently revealed to have been using the technology to track tens of thousands of people across its development site in the King’s Cross area.
When originally asked whether the Met worked with private sector organisations, the mayor stated on 29 August that this was not the case and that there were no plans for this kind of partnership.
The mayor later corrected this statement on 4 September: “The MPS has just now brought it to my attention that the original information they provided MOPAC with was incorrect and they have in fact shared images related to facial recognition.
“As a matter of urgency I have asked for a report from the MPS on this concerning development and on their wider data-sharing arrangements, including what information has been shared and with whom.”
According to a spokesperson for the Met, “The MPS has not shared any images with the King’s Cross Estate, or any related company, for facial recognition purposes since March 2018.”
As of 2019, only three forces have used the technology – the Metropolitan Police, South Wales Police and Leicestershire Police – and its deployment has largely been funded by the Home Office.
Most of the technology is provided by Japan’s NEC Corporation, which has received £198,000 for software and £23,800 for hardware from the Met alone, according to a recent report in The Independent.
“Although the High Court has resoundingly dismissed this claim against South Wales Police, it will not be the end to legal challenges to the developing use of Live Facial Recognition systems,” said Jon Baines, data protection advisor at law firm Mishcon de Reya.
“As the technology develops, the potential for intrusive misuse increases – and it is important to note that police powers to use this sort of surveillance are much wider than the powers available to other organisations.”