The first deployment took place at London Bridge railway station on 11 February 2026, with the force committing to publish the dates and locations of all future BTP LFR operations online before they take place.
Chief superintendent Chris Casey, BTP’s senior officer overseeing the project, said: “The initiative follows a significant amount of research and planning, and forms part of BTP’s commitment to using innovative technology to make the railways a hostile place for individuals wanted for serious criminal offences.”
A BTP press release added that people who are not included on a watchlist cannot be identified, and made assurances about how information will be handled: “People who prefer not to enter the recognition zone will have alternative routes available and images of anyone not on the authorised database will be deleted immediately and permanently.”
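Taken at face value, that assurance describes the standard LFR pipeline: each face detected in the camera feed is converted into a biometric template, compared against the watchlist, and discarded on the spot if no match is found. The sketch below is a generic, hypothetical illustration of that flow – the threshold value and function names are invented, and BTP has not published implementation details.

```python
import numpy as np

ALERT_THRESHOLD = 0.6  # hypothetical similarity threshold

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face templates."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(template: np.ndarray,
                            watchlist: list[np.ndarray]) -> int | None:
    """Return the index of a watchlist match, or None.

    When None is returned the caller retains nothing: the template
    simply goes out of scope, mirroring the assurance that images of
    people not on the watchlist are deleted immediately.
    """
    for idx, entry in enumerate(watchlist):
        if similarity(template, entry) >= ALERT_THRESHOLD:
            return idx
    return None  # no match: nothing stored, no identification possible
```

The privacy property being claimed is that non-matching templates are never written to storage; only watchlist alerts persist.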
The decision to deploy LFR at major transport hubs was made while the Home Office’s 10-week consultation on LFR – which ended on 12 February 2026 – was still ongoing. The consultation allowed interested parties and members of the public to share their views on how the controversial technology should be regulated.
It also follows home secretary Shabana Mahmood’s late January 2026 announcement of sweeping police reforms, which will see the largest ever roll-out of facial recognition technology in the UK.
Human rights group Liberty, which won the first legal challenge against police use of the tech in August 2020, previously urged the government to halt the expansion of police LFR while the consultation is taking place.
In December 2025, the Home Office said: “Although a patchwork legal framework for police facial recognition exists, it does not give police themselves the confidence to use it at a significantly greater scale … nor does it consistently give the public the confidence that it will be used responsibly.”
Responding to the BTP’s announcement, Green London Assembly member Zoë Garbett told Computer Weekly: “With the government’s consultation only just closed, pressing ahead with expansion makes a mockery of the process. What’s the point of asking for public views if deployment continues regardless?”
On 10 February 2026, Garbett dismissed the police claim that LFR is a “precise” tool, highlighting how the watchlists used have grown larger with almost every successive deployment.
Garbett also called for new safeguards that she believes would protect Londoners from “escalating” biometric surveillance.

She issued four key policy recommendations in a report published the same day, including calls for the Met Police to immediately stop using LFR technology and to publish the full financial and operational costs of its deployments.
“Live facial recognition technology subjects everyone to constant surveillance, which goes against the democratic principle that people should not be monitored unless there is suspicion of wrongdoing,” she said.
Computer Weekly contacted the British Transport Police about its decision to deploy the technology before the Home Office’s consultation had finished, as well as about the concerns raised in Garbett’s report.
“A response to the consultation is due to be published within 12 weeks of the closing date, and we will of course engage with whatever the findings may be,” said a BTP spokesperson.
They also pointed to comments made by BTP chief superintendent Casey, who stressed that the force is committed to using LFR ethically and in line with privacy safeguards: “Deployments will comply with all relevant legal and regulatory standards, and oversight will include internal governance and external engagement with ethics and independent advisory groups.
“When the pilot is complete, we’ll conduct a full assessment to review outcomes, identify lessons learned and inform future planning. I encourage anyone who encounters our use of LFR when the trial begins to engage with us so we can make sure that we’re using it in the best way and helping to make our railways as safe as possible.”
Questionable legal basis
In August 2020, the Court of Appeal found that South Wales Police had been deploying LFR unlawfully, on the grounds that there were insufficient constraints on the force’s discretion over where LFR could be used and who could be placed on a watchlist.
Under the European Union’s Artificial Intelligence Act, law enforcement use of live facial recognition in publicly accessible spaces is generally prohibited, with exceptions limited to circumstances such as preventing an imminent terror attack.
Even then, safeguards apply, such as a clear legal basis in national law and judicial authorisation.
Ahead of the High Court hearing of a judicial review into the Met’s use of the technology, Big Brother Watch director and co-claimant Silkie Carlo said: “We are totally out of step with the rest of Europe on live facial recognition. This is an opportunity for the court to uphold our democratic rights and instigate much-needed safeguards against intrusive AI-driven surveillance.”

The hearing was the first legal challenge in Europe brought by someone misidentified by LFR technology, which fellow claimant Shaun Thompson, a 39-year-old Black man, described as “stop and search on steroids”.
In the hearing, Thompson and Carlo’s lawyers argued that the Met’s policy on where it can deploy LFR “confers far too broad a discretion on individual officers” who can designate areas as “crime hotspots” based on “operational experience as to future criminality, which is opaque and entirely subjective”.
The Met, on the other hand, argued that because officers’ discretion around LFR deployments is not unconstrained, the case is not a legality issue, asserting that “so long as the court is satisfied there is not unfettered discretion on the constable deciding where to locate LFR, [there] is not a maintainable legality challenge”.
The Met added: “Because there are no parts of the policy that allow unfettered discretion for an officer to add whomever he or she wants to a watchlist or place the LFR camera wherever he or she wishes … there is no maintainable attack on the policy as lacking the quality of law.”
Research led by Garbett has shown that over half of all LFR deployments in 2024 took place in areas with higher-than-average Black populations, including Thornton Heath, Croydon (40% Black residents), Northumberland Park, Haringey (36%), and Deptford High Street, Lewisham (34%).
Overall, LFR is disproportionately used in areas with higher populations of Black, Asian and Mixed ethnic groups.
A 2019 study by the US National Institute of Standards and Technology (NIST), which compared 189 different algorithms, found they were between 10 and 100 times more likely to misidentify Black and Asian faces than white faces. Black women had the highest rate of false positive matches.
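A worked example (with invented rates, not figures from the NIST study) shows why a 10- to 100-fold disparity matters at railway-station scale: even a tiny baseline false positive rate multiplies into dozens of wrongful flags a day for the worst-affected groups.

```python
# Hypothetical illustration of how a 10x-100x disparity in false
# positive rates (FPR) scales with footfall. All numbers are invented
# for the example; they are not figures from the 2019 study.

DAILY_FACES_SCANNED = 100_000   # assumed footfall past an LFR camera
BASELINE_FPR = 1 / 100_000      # assumed rate for the best-served group

for multiplier in (1, 10, 100):  # the disparity range reported in 2019
    expected_flags = DAILY_FACES_SCANNED * BASELINE_FPR * multiplier
    print(f"FPR x{multiplier:>3}: ~{expected_flags:.0f} wrongful flag(s) per day")
```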
In January 2023, Newham Council unanimously passed a motion to suspend the use of LFR throughout the borough until biometric and anti-discrimination safeguards are in place.
The motion highlighted LFR’s potential to exacerbate racist outcomes in policing, given that Newham is the most ethnically diverse of all local authorities in England and Wales.
In April 2023, testing by the National Physical Laboratory found that the facial recognition algorithms used by the Met and South Wales Police – which will now be adopted by BTP – showed “no statistical significance between demographic performance” if certain threshold settings are used.
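The settings in question centre on the match threshold. A minimal simulation (illustrative only; the score distributions are invented and bear no relation to the NEC algorithm the forces actually use) shows the underlying trade-off: raising the threshold suppresses false positives, at the price of missing more genuine matches.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented similarity-score distributions for illustration: genuine
# pairs (same person) score higher on average than impostor pairs
# (different people), but the two distributions overlap.
genuine = rng.normal(loc=0.75, scale=0.10, size=100_000)
impostor = rng.normal(loc=0.40, scale=0.10, size=100_000)

for threshold in (0.50, 0.60, 0.70):
    false_positive_rate = float(np.mean(impostor >= threshold))
    false_negative_rate = float(np.mean(genuine < threshold))
    print(f"threshold {threshold:.2f}: "
          f"FPR {false_positive_rate:.2%}, FNR {false_negative_rate:.2%}")
```

The laboratory finding is therefore conditional on the operating point: a force that lowers the threshold to generate more alerts changes the error profile, including any demographic skew, that was tested.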
However, critics said at the time that despite improvements in the algorithm’s accuracy, LFR can still entrench patterns of discrimination, as surveillance has historically tended to be used against communities of colour.
Jake Hurfurt, investigations lead at Big Brother Watch, has previously said: “It’s not just bias in the technology, it’s how it’s used. The watchlist is disproportionately targeted towards ethnic minorities, so you get different outcomes.”
While police repeatedly claim that LFR is used solely to track down serious and violent offenders, watchlists regularly contain images of people wanted for drug, shoplifting or traffic offences, none of which meets the legal definition of a serious crime.
A 2025 paper by academics Karen Yeung and Wenlong Li highlighted “unresolved questions” about the legality of watchlist composition, specifically the “significance and seriousness” of the underlying offence used to justify a person’s inclusion, and the “legitimacy of the reason why that person is ‘wanted’ by the police” in the first place.