Equality watchdog calls on police to stop using facial recognition
Equality and Human Rights Commission says the use of automatic facial recognition and predictive algorithms by police is discriminatory, stifles freedom of expression and lacks a proper legislative framework
The UK’s equality watchdog has called for the suspension of automatic facial recognition (AFR) and predictive algorithms in policing until their impact has been independently scrutinised and the laws governing their application improved.
In evidence submitted to the United Nations (UN) as part of its work to monitor the state of civil and political rights in Great Britain since 2015, the Equality and Human Rights Commission (EHRC) noted the absence of a firm legal basis for the use of AFR, as well as concerns over its discriminatory impact.
“The legal framework authorising and regulating the use of AFR is insufficient – it is based solely on common law powers and has no express statutory basis,” said the submission.
“Evidence indicates that many AFR algorithms disproportionately misidentify Black people and women, and therefore operate in a potentially discriminatory manner.”
It added that 14 UK police forces are already using or planning to use predictive policing technologies, which include using algorithms to analyse data and identify patterns.
“Such technologies also raise concerns, including that predictive policing replicates and magnifies patterns of discrimination in policing, while lending legitimacy to biased processes,” it said.
“A reliance on ‘big data’ encompassing large amounts of personal information may also infringe upon privacy rights and result in self-censorship, with a consequent chilling effect on freedom of expression and association.”
In a statement, EHRC chief executive Rebecca Hilsenrath said it was essential that the law kept pace with technological development to prevent rights being infringed and damaging patterns of discrimination being reinforced.
“The law is clearly on the back foot with invasive AFR and predictive policing technologies. It is essential that their use is suspended until robust, independent impact assessments and consultations can be carried out, so that we know exactly how this technology is being used and are reassured that our rights are being respected,” she said.
If you know you’re being constantly scanned for participation in a protest, then the chances are that you’re much less likely to attend that demonstration
Kevin Blowe, Netpol
“I don’t think there are any doubts [the Metropolitan Police Service (MPS)] would use facial recognition at protests whatsoever – there’s been no restraint on other forms of surveillance, and the routine filming of demonstrations is now something that happens at even the smallest of protests,” said Netpol coordinator Kevin Blowe at the time.
“If you know you’re being constantly scanned for participation in a protest, and you have no idea whether you’re appearing on the data set they’re using, then the chances are that you’re much less likely to attend that demonstration too.”
In the UK, South Wales Police has already deployed AFR against peaceful arms fair protesters.
Blowe added that those who seek to cover their faces could unwittingly attract more attention from the police, who may assume they are ‘troublemakers’ if only a small number make that choice.
“The real challenge for the police, however, is if thousands of people turn up for a protest wearing a mask – if live facial recognition finally makes wearing one normal. We plan to actively encourage the campaigners we work with to do so,” he said.
Previous calls to halt AFR
The EHRC submission is the latest in a long line of calls for a moratorium on police use of AFR and predictive technologies.
For example, in May 2018, the Science and Technology Committee published a report which said: “Facial recognition technology should not be deployed, beyond the current pilots, until the current concerns over the technology’s effectiveness and potential bias have been fully resolved.”
“The absence of a statutory code of practice and national guidelines contributes to inconsistent practice, increases the risk of compliance failures, and undermines confidence in the use of the technology,” it said.
MPS commissioner Cressida Dick would later call for a legislative framework for emerging technologies in the police, while simultaneously defending the decision to use AFR technology operationally without it.
“The only people who benefit from us not using [technology] lawfully and proportionately are the criminals, the rapists, the terrorists, and all those who want to harm you, your family and friends,” she said at the time, before claiming that there was a “very strong” legal basis for the use of AFR.
Dick made the remarks at the Royal United Services Institute (Rusi) in late January 2020, during the launch of the security think tank’s latest report on police algorithms.
UK policing bodies have laid out their top five digital priorities for the decade ahead, which include boosting collaboration between public and private sector actors, as well as building common frameworks to dictate how new technologies are used by police.