Met claims success for permanent facial recognition in Croydon

Met Police boasts that its permanent deployment of live facial recognition cameras in Croydon has led to more than 100 arrests and prompted a double-digit reduction in local crime

The Met Police has announced that its deployment of permanent live facial recognition (LFR) cameras in Croydon has led to 103 arrests, with the force claiming it has reduced crime in the local area by 12%.

Beginning in October 2025, the Met fixed 15 LFR-enabled cameras to street furniture in Croydon, claiming they would only be activated when officers are present and conducting an operation in the area.

The Met’s announcement comes just a week ahead of a judicial review against its use of LFR, which will assess whether it has been using the technology lawfully. The legal challenge was launched by anti-knife campaigner Shaun Thompson after he was wrongly identified as a suspect by the force’s LFR system, alongside privacy campaigners at Big Brother Watch.

While LFR is typically deployed by the force in an overt manner, with specially equipped cameras mounted atop a visibly marked police van to scan and compare people’s unique facial features against watchlists in real time, this marks the Met’s first fixed deployment of cameras that can be monitored by officers remotely.

In a press release, the Met claimed that running deployments without a van has increased the efficiency of its LFR operations, with an arrest being made on average every 34 minutes when in use, while also reducing the average time to locate wanted individuals by more than 50% when compared with van-based deployments.

Of those arrested, the force added, a third were held for offences related to violence against women and girls, such as strangulation and sexual assault, with other arrests made for recall to prison, burglary and possession of an offensive weapon.

“The increase in LFR deployments across crime hotspots in London is driven by its proven impact and success – with more than 1,700 dangerous offenders taken off London’s streets since the start of 2024, including those wanted for rape and child abuse,” said Lindsey Chiswick, the Met and national lead for LFR.

“This is why we are trialling a new and innovative pilot in Croydon,” she said. “It allows us to explore a different way of using facial recognition by operating it remotely and more efficiently. The amount of arrests we have made in just 13 deployments shows the technology is already making an impact and helping to make Croydon safer. Public support remains strong, with 85% of Londoners backing the use of LFR to keep them safe.”

The Met added that its pilot deployment of permanent LFR cameras will undergo an evaluation in the coming months to assess its effectiveness, but that there are currently no plans to expand its permanent deployment to other sites in London.

The Met also said it will continue to run engagement sessions with Croydon residents and councillors to explain how LFR works, outline the intelligence-led approach behind deployments, and set out the safeguards in place to protect privacy and rights.

However, in April 2025, in the wake of the Met’s initial announcement, local councillors complained that the decision to permanently install facial recognition cameras had been taken without any community engagement with local residents.

While the Met has further claimed that Croydon was selected for the permanent LFR deployment due to “its status as a crime hotspot”, local councillors also highlighted a pattern of racial bias in the force’s choice of deployment locations.

“The Met’s decision to roll out facial recognition in areas of London with higher Black populations reinforces the troubling assumption that certain communities … are more likely to be criminals,” said Green Party London Assembly member Zoë Garbett at the time, adding that while nearly two million people in total had their faces scanned across the Met’s 2024 deployments, only 804 arrests were made – a rate of just 0.04%.

The Met Police’s roll-out of LFR in other boroughs has similarly taken place with little to no community engagement, and in some areas has occurred despite notable political opposition from local authorities.

Executive mayor of Croydon Jason Perry said in the Met’s press release, however, that the arrest figures show “that this pioneering technology is helping to make our streets safer”.

Broken windows in the panopticon

Perry added: “I look forward to continuing to work with the Met Police to tackle crime, as part of our zero-tolerance approach to fixing the ‘broken windows’, restoring pride in our borough and making Croydon a safer place for all our residents.”

Under the “broken windows” theory of policing, first posited by US criminologists James Wilson and George Kelling in the early 1980s, leaving even minor disorder unchecked (such as graffiti, antisocial behaviour or vandalism) encourages people to engage in more serious crimes.

While advocates of this approach therefore argue for the proactive, zero-tolerance policing of minor infractions as a way of instilling order and deterring more serious criminal conduct, critics argue it encourages aggressive or confrontational policing practices that disproportionately target poor and minoritised communities, ultimately breeding resentment against authorities.

In a recent interview with former prime minister Tony Blair, current UK home secretary Shabana Mahmood described her ambition to use technologies like artificial intelligence (AI) and LFR to achieve Jeremy Bentham’s vision of a “panopticon”, referring to his proposed prison design that would allow a single, unseen guard to silently observe every prisoner at once.

Typically used today as a metaphor for authoritarian control, the panopticon rests on the idea that instilling a perpetual sense of being watched among inmates would make them behave as authorities wanted.

“When I was in justice, my ultimate vision for that part of the criminal justice system was to achieve, by means of AI and technology, what Jeremy Bentham tried to do with his panopticon,” Mahmood told Blair. “That is that the eyes of the state can be on you at all times.”

LFR consultation on legal framework

In December 2025, the Home Office launched a 10-week consultation on the use of LFR by UK police, allowing interested parties and members of the public to share their views on how the controversial technology should be regulated.

While the use of LFR by police – beginning with the Met’s deployment at Notting Hill Carnival in August 2016 – has ramped up massively in recent years, there has so far been minimal public debate or consultation, with the Home Office claiming for years that a “comprehensive” legal framework is already in place.

However, the Home Office said in late 2025 that although a “patchwork” legal framework for police facial recognition exists (including for the increasing use of the retrospective and “operator-initiated” versions of the technology), it does not give police themselves the confidence to “use it at significantly greater scale … nor does it consistently give the public the confidence that it will be used responsibly”.

It added that the current rules governing police LFR use are “complicated and difficult to understand”, and that an ordinary member of the public would be required to read four pieces of legislation, police national guidance documents and a range of detailed legal or data protection documents from individual forces to fully understand the basis for LFR use on their high streets.

There have been repeated calls from both Parliament and civil society over many years for the police’s use of facial recognition to be regulated.

This includes three separate inquiries by the Justice and Home Affairs Committee into shoplifting, police algorithms and police facial recognition; two of the UK’s former biometrics commissioners, Paul Wiles and Fraser Sampson; an independent legal review by Matthew Ryder QC; the UK’s Equality and Human Rights Commission; and the House of Commons Science and Technology Committee, which called for a moratorium on live facial recognition as far back as July 2019.

More recently, the Ada Lovelace Institute published a report in May 2025 that said the UK’s patchwork approach to regulating biometric surveillance technologies is “inadequate”, placing fundamental rights at risk and ultimately undermining public trust.

In August 2025, after being granted permission to intervene in the judicial review of the Met’s LFR use, the UK’s equality watchdog said the force is using the technology unlawfully, citing the need for its deployments to be necessary, proportionate and respectful of human rights.