Newham Council rejects use of live facial-recognition tech by police

Live facial-recognition technology should not be used by police in Newham until biometric and anti-discrimination safeguards are in place, according to a motion passed unanimously by the council, but the Met Police and the Home Office have indicated they will not suspend its use

Newham Council is officially opposed to the police’s use of live facial-recognition (LFR) technology after voting unanimously in favour of a motion to suspend its use across the London borough.

Proposed on 16 January 2023 by Areeq Chowdhury, the Labour councillor for Canning Town North, the motion mandates that Newham Council’s chief executive must now write to the Home Office, the Mayor of London, and the Metropolitan Police Service (MPS) to make its opposition to LFR technology clear.

It further mandates that the executive must request a suspension of LFR’s use throughout the borough, at least until sufficient biometric regulations and anti-discrimination safeguards are in place.

“The use of live facial recognition surveillance has attracted worldwide condemnation for its intrusive nature, unreliability, and its potential to exacerbate racist outcomes in policing,” said a copy of the motion shared with Computer Weekly.

“Major human rights organisations (including Amnesty International, Liberty, and Human Rights Watch) have called for bans to be imposed on the technology. The continued use of this form of surveillance risks creating tension between the police and the public, as well as undermining the privacy and human rights of Newham’s residents.”

The potential of LFR to “exacerbate racist outcomes in policing” is particularly acute in Newham, which is the most ethnically diverse of all local authorities in England and Wales, according to the most recent UK census data.

Speaking ahead of the vote, Chowdhury warned fellow councillors that “dystopian futures do not occur overnight” and are instead built piecemeal, with each change “disguised as a minor technical adjustment to our existing way of life” until “the rights we once took for granted are now impossible to exercise”.

He added that unlike CCTV, facial-recognition technology “takes measurements” rather than pictures, and likened use of the technology to “fingerprinting without consent” due to the way it can intricately map a person’s face.

“It’s also a form of surveillance which mainstream technology companies such as Microsoft and IBM have sought to distance themselves from,” said Chowdhury. “London is one of the few places in the world to deploy live facial-recognition technology.

“When trialled outside Westfield shopping centre in 2018, Newham became a global birthplace of this new, dystopian form of surveillance. Its first operational deployment was also in Newham.”

In June 2020, in the wake of the police murder of George Floyd, tech giants Amazon, Microsoft and IBM agreed to halt sales of their respective facial-recognition technologies to US law enforcement agencies, with IBM agreeing to cease any further research or development of the technology as well.

Chowdhury added: “As councillors our role isn’t to just be a middle entity for managing case work, it’s to be political representatives for the people of Newham, taking an interest in all things that happen within the borough, no matter how complex, and regardless of whether it sits outside our statutory duties.”

While Newham Council does not itself have the power to halt LFR deployments in the borough, Chowdhury added that he hopes the motion will increase pressure on the government to introduce a national moratorium on police use of the technology.

Initial responses

Responding to the motion, as well as questions from Computer Weekly about whether it intends to suspend the use of LFR by police in Newham given the lack of consent from the council, the Home Office said the technology plays “a crucial role in helping the police tackle serious offences including knife crime, rape, child sexual exploitation and terrorism”.

“It is right that we back and empower the police to use it, but we are clear it must be done in a fair and proportionate way. This is why they must follow the College of Policing guidance, which among other things sets out that it is imperative to assess the accuracy and performance of this technology across different demographics,” it added.

When asked if it was able to provide any evidence that LFR had led to arrests for the serious offences it listed, the Home Office said the MPS would be best placed to answer as operational leads on the technology. 

The MPS confirmed that no arrests have been made for those reasons, adding it deploys LFR “based on a specific intelligence case and with a focus on locating those people who pose a serious risk to the public” but who are difficult to find.

“As part of the authorisation process and before any deployment, a specific community impact assessment is completed by the local Basic Command Unit [BCU],” said a spokesperson.

“This assessment involves speaking to a wide number of local groups so that policing is informed of those views and can take those into consideration before any decision to deploy is made. The MPS has commissioned an independent Facial Recognition Technology in Law Enforcement Equitability Study. The results of this study will be published once it has been completed.”

A spokesperson for the Mayor of London said: “New technology has a role in keeping Londoners safe, but it’s equally important that the Met Police are proportionate in the way it is deployed and are transparent about where, when and how it is used in order to retain the trust of all Londoners. City Hall will continue to monitor the use of facial-recognition technology as part of their role in holding the Met to account.”

A range of official and independent voices have repeatedly called for new legal frameworks to govern law enforcement’s use of biometrics – including a House of Lords inquiry into police use of advanced algorithmic technologies; the UK’s former biometrics commissioner, Paul Wiles; an independent legal review by Matthew Ryder QC; the UK’s Equalities and Human Rights Commission; and the House of Commons Science and Technology Committee, which called for a moratorium on LFR as far back as July 2019.

But the government maintains there is “already a comprehensive framework” in place.

In January 2022, for example, then-policing minister Kit Malthouse said there is already a strong framework in place, adding that any new policing tech should be tested in court, rather than legislated for, on the basis that new laws would “stifle innovation”.

Despite long-standing concerns, however, the MPS ramped up its use of LFR throughout 2022, scanning the biometric information of some 144,366 people over six deployments between January and July. This resulted in just eight arrests, including for drug possession, assault of an emergency worker, failure to appear in court, and an undisclosed traffic violation.

Based on the gulf between the number of people scanned and the number of arrests made, as well as the answers provided to Computer Weekly by the MPS about its deployments, civil society groups, lawyers and politicians condemned the force’s approach to LFR as fundamentally flawed and “irresponsible”.

London Assembly member Caroline Russell, who is leader of the Greens and sits on the Police Committee, told Computer Weekly at the time that there needs to be certainty “all the proper safeguards are in place before the technology is deployed”, adding that “it’s irresponsible to be using it when there are such widely known and worrying flaws in the way that it works”.
