Equality watchdog calls on police to stop using facial recognition

Equality and Human Rights Commission says use of automatic facial recognition and predictive algorithms by police is discriminatory, stifles freedom of expression and lacks a proper legislative framework

The UK’s equality watchdog has called for the suspension of automatic facial recognition (AFR) and predictive algorithms in policing until its impact has been independently scrutinised and laws governing its application improved.

In evidence submitted to the United Nations (UN) as part of its work to monitor the state of civil and political rights in Great Britain since 2015, the Equality and Human Rights Commission (EHRC) noted the absence of a firm legal basis for the use of AFR, as well as concerns over its discriminatory impact.

“The legal framework authorising and regulating the use of AFR is insufficient – it is based solely on common law powers and has no express statutory basis,” said the submission.

“Evidence indicates that many AFR algorithms disproportionately misidentify Black people and women, and therefore operate in a potentially discriminatory manner.”

The submission cited a series of Freedom of Information (FoI) requests submitted by civil liberties NGO Big Brother Watch, which showed that, since 2016, the Metropolitan Police Service’s (MPS) AFR surveillance has been 96% inaccurate.

It added that 14 UK police forces are already using or planning to use predictive policing technologies, which include algorithms that analyse data to identify patterns.

“Such technologies also raise concerns, including that predictive policing replicates and magnifies patterns of discrimination in policing, while lending legitimacy to biased processes,” it said.

“A reliance on ‘big data’ encompassing large amounts of personal information may also infringe upon privacy rights and result in self-censorship, with a consequent chilling effect on freedom of expression and association.”

In a statement, EHRC chief executive Rebecca Hilsenrath said it was essential that the law keep pace with technological development to prevent rights being infringed and damaging patterns of discrimination being reinforced.

“The law is clearly on the back foot with invasive AFR and predictive policing technologies. It is essential that their use is suspended until robust, independent impact assessments and consultations can be carried out, so that we know exactly how this technology is being used and are reassured that our rights are being respected,” she said.

In January 2020, the Network for Police Monitoring (Netpol) – which monitors and resists policing that is excessive, discriminatory or threatens civil liberties – told Computer Weekly there was a strong possibility that AFR technology would be used against protesters.

“I don’t think there are any doubts [the MPS] would use facial recognition at protests whatsoever – there’s been no restraint on other forms of surveillance, and the routine filming of demonstrations is now something that happens at even the smallest of protests,” said Netpol coordinator Kevin Blowe at the time.

“If you know you’re being constantly scanned for participation in a protest, and you have no idea whether you’re appearing on the data set they’re using, then the chances are that you’re much less likely to attend that demonstration too.”

In the UK, AFR has already been deployed against peaceful arms fair protesters by South Wales Police.

Blowe added that those who seek to cover their faces could unwittingly attract more attention from the police, who may assume they are ‘troublemakers’ if only a small number choose to do so.

“The real challenge for the police, however, is if thousands of people turn up for a protest wearing a mask – if live facial recognition finally makes wearing one normal. We plan to actively encourage the campaigners we work with to do so,” he said.

Previous calls to halt AFR

The EHRC submission is the latest in a long line of calls for a moratorium on police use of AFR and predictive technologies.

For example, in May 2018, the Science and Technology Committee published a report which said: “Facial recognition technology should not be deployed, beyond the current pilots, until the current concerns over the technology’s effectiveness and potential bias have been fully resolved.”

This position was reiterated in March 2019, when the biometrics commissioner and the deputy information commissioner told the committee’s MPs that UK police should not deploy AFR until issues with the technology are resolved.

Following a 17-month investigation, the Information Commissioner’s Office (ICO) published a report in October 2019 that called on the government to introduce a statutory and binding code of practice on the deployment of AFR.

“The absence of a statutory code of practice and national guidelines contributes to inconsistent practice, increases the risk of compliance failures, and undermines confidence in the use of the technology,” it said.

Despite these calls, on 24 January 2020 the MPS announced its decision to roll out AFR operationally for the first time, without any dedicated legislative framework in place.

MPS commissioner Cressida Dick would later call for a legislative framework for emerging technologies in policing, while simultaneously defending the decision to use AFR technology operationally without one.

“The only people who benefit from us not using [technology] lawfully and proportionately are the criminals, the rapists, the terrorists, and all those who want to harm you, your family and friends,” she said at the time, before claiming that there was a “very strong” legal basis for the use of AFR.

Dick made the remarks at the Royal United Services Institute (Rusi) in late January 2020, during the launch of the security think tank’s latest report on police algorithms.

The report found that new national guidelines were needed “as a matter of urgency” to ensure police algorithms are deployed legally and ethically.
