
Met Police to double facial recognition use amid budget cuts
The UK’s largest police force is massively expanding its use of live facial recognition technology as it prepares to lose 1,700 officers and staff
The Metropolitan Police will more than double its number of live facial recognition (LFR) deployments to help offset the loss of 1,400 officers and 300 staff amid budget cuts.
Detailing its restructuring plans – which also include bulking up the force’s protest-focused “public order crime” team and putting more officers on the beat – the Met said LFR will now be deployed up to 10 times a week across five days, up from the current rate of four deployments over two days.
While the restructuring announcement noted 90 additional officers would be deployed to six “high-crime” zones – including Brixton, Kingston, Ealing, Finsbury Park, Southwark and Spitalfields – it is unclear whether these areas would also see a greater number of LFR deployments.
The initiative follows the force warning in April 2025 that it faces a £260m budget shortfall for the coming year.
Met commissioner Mark Rowley defended the move, saying the technology is used responsibly and only deployed to look for serious offenders.
“We routinely put it out there and capture multiple serious offenders in one go, many of whom have committed serious offences against women or children, or people who are wanted for armed robbery,” he said.
“It’s a fantastic piece of technology. It’s very responsibly used, and that’s why most of the public support it.”
On the restructuring in general, Rowley added: “While our budget has decreased in real terms, we are using this additional [£32m] funding from City Hall and Home Office productively to support our mission to take a targeted approach to tackling volume crime and bolster our specialist tactics to disrupt the criminal gangs who fuel anti-social behaviour, robbery and theft.”
Campaign group Liberty’s policy and campaigns officer Charlie Whelton said increasing LFR use was “incredibly concerning” given the lack of regulation for the technology.
“Any tech which has the potential to infringe on our rights in the way scanning and identifying millions of people does needs to have robust safeguards around its use, including ensuring that proper independent oversight is in place,” he said.
“The government must legislate now to regulate this technology, protect people’s rights, and make sure that the law on facial recognition does not get outpaced by the use.”
In July 2025, home secretary Yvette Cooper confirmed for the first time that the UK government will seek to regulate police facial recognition by creating “a proper, clear governance framework”, citing police reluctance to deploy systems without adequate rules in place. However, she declined to say if any new framework will be statutory.
Ongoing concerns
While the Met maintains that its deployments are intelligence-led and focus exclusively on locating individuals wanted for serious crimes, senior officers previously admitted to a Lords committee in December 2023 that the force selects images for its watchlist based on crime categories attached to people’s photos, rather than a context-specific assessment of the threat presented by a given individual.
This includes those wanted for non-serious crimes such as shoplifting or traffic offences.
Academic Karen Yeung, an interdisciplinary professorial fellow in law, ethics and informatics at Birmingham Law School, challenged the proportionality and necessity of this approach during the same Lords session, claiming the coercive power of the state means police must be able to justify each addition to their watchlists based on the specific circumstances involved, rather than including people wholesale via “crime types”.
Critics have also raised concerns about the Met’s disproportionate use of LFR, in terms of watchlist sizes, faces scanned, and impacts on certain communities.
Civil liberties group Big Brother Watch, for example, has repeatedly highlighted how the size of the Met’s LFR watchlist – which now routinely exceeds 15,000 faces – indicates the deployments are not intelligence-led or targeted.
Commenting in the wake of a February 2022 LFR deployment in Westminster, where the watchlist contained 9,756 images, Big Brother Watch director Silkie Carlo told Computer Weekly, “That’s not a targeted and specified deployment because of a pressing need – it’s a catch net.”
According to data gathered by Green Party London Assembly member Zoë Garbett, over half of the Met’s 180 LFR deployments that took place during 2024 were also in areas where the proportion of Black residents is higher than the city’s average, including Lewisham and Haringey.
While Black people comprise 13.5% of London’s total population, the proportion is much higher in the Met’s deployment areas, with Black people making up 36% of the Haringey population, 34% of the Lewisham population, and 40.1% of the Croydon population, where the Met is also planning to deploy permanent LFR cameras.
Garbett added that while nearly two million people in total had their faces scanned across the Met’s 2024 deployments, only 804 arrests were made – a rate of just 0.04%.
The Met said in July 2025 that since the start of 2024, more than 1,000 arrests have been made using LFR, 773 of which led to the individual being charged or cautioned.
Similarly, while the Met claims its use of the technology is supported by the majority of the public, there have been instances where it has deployed LFR despite public opposition.
In December 2024, for example, Computer Weekly revealed that, contrary to the force’s claim that its LFR deployments in Lewisham are supported by the majority of residents and local councillors, there was minimal direct consultation with residents, while councillors continued to express clear concerns about the technology.
“What people support is safer streets and improved equity and community cohesion,” Green Lewisham councillor Hau-Yu Tam told Computer Weekly at the time. “They don’t necessarily support live facial recognition, which they’re not given the full rundown of, or they’re given very misleading information about.”
In January 2023, Newham Council also unanimously passed a motion to suspend the use of LFR throughout the borough until biometric and anti-discrimination safeguards are in place.
While the motion highlighted the potential of LFR to “exacerbate racist outcomes in policing” – particularly in Newham, the most ethnically diverse of all local authorities in England and Wales – both the Met and the Home Office said that they would press forward with the deployments anyway.
Read more about facial recognition technology
- UK biometric surveillance exists in ‘legal grey area’: The rapid proliferation of ‘biometric mass surveillance technologies’ throughout the UK’s public and private sectors is taking place without legal certainty or adequate safeguards for the public.
- Essex Police discloses ‘incoherent’ facial recognition assessment: An equality impact assessment of Essex Police live facial recognition deployments is plagued by inconsistencies and poor methodology, undermining the force’s claim that its use of the technology will not be discriminatory.
- Scottish police hold almost no data on facial recognition use: It is currently impossible to assess Police Scotland’s use of retrospective facial recognition for efficacy and fairness because the force does not collect meaningful information that would enable a proper evaluation.