Home Office announces sweeping police technology plans
The Home Office plans to ramp up its deployment of artificial intelligence and facial-recognition technologies under wide-ranging reforms to UK policing.
The Home Office has outlined plans for the massive roll-out of artificial intelligence (AI) and facial-recognition technologies as part of sweeping reforms to the UK’s “broken” policing system.
Announced 26 January 2026 by home secretary Shabana Mahmood, the reforms will see the Home Office place substantial investment into the deployment of AI and facial recognition throughout UK policing, while establishing a new National Police Service (NPS) to streamline the fragmented, 43-force model the UK currently operates under.
The government has said the new service will also subsume a range of existing central bodies, such as the National Crime Agency (NCA) and Counter Terrorism Policing, and will play a critical role in coordinating, adopting and standardising the use of data-driven technologies.
According to a whitepaper published by the government on the reforms, deployments of AI and facial recognition vary markedly across forces, with each force making its own decisions and investments, meaning “policing is radically under-utilising technology and data”.
It added that the current fragmentation of data and technology infrastructure – which is plagued by ageing systems, manual processes and poor data quality – is slowing down investigations and leaving police unable to keep pace with the increasing rate of digitally enabled crime, which the Home Office said now accounts for nine of every 10 crimes.
“Criminals are operating in increasingly sophisticated ways. However, some police forces are still fighting crime with analogue methods,” said Mahmood. “We will roll out state-of-the-art tech to get more officers on the streets and put rapists and murderers behind bars.”
By also applying AI to some of the biggest administrative burdens facing police – including disclosure, analysis of CCTV footage, production of case files, crime recording and classification, and translating or transcribing documents – the Home Office claims it will free up six million policing hours each year.
“To meet this moment policing needs national leadership in how we develop and deploy technology, greater consistency in the recording, sharing and analysis of data, and a culture of responsible innovation so that successful local initiatives can be rolled out at scale,” said the whitepaper.
“A reformed system is an essential step in unlocking the potential of technology, data and AI in policing… By delivering police digital, data and technology infrastructure in a coherent and strategic manner at the national level for the first time, the NPS will ensure that officers and staff have access to the best available technology and insights. Ultimately, this will deliver smarter operational policing and save officer time, helping them focus on tackling crime and keeping the public safe.”
Under the reform proposals, the Home Office will increase the number of live facial-recognition (LFR) vans available to police from 10 to 50; set up a new National Centre for AI in Policing (to be known as Police.AI) to build, test and assure AI models for policing contexts; and invest £115m over three years to help identify, test and scale new AI technologies in policing.
Through Police.AI – which is expected to be up and running by spring 2026 – the department will create a registry of the AI being deployed by UK police, which will outline the steps they have taken to ensure the reliability of tools prior to their operational use. The new body will also help to roll out successful projects nationally, such as AI chatbots being trialled by some forces to triage non-urgent online queries.
Further investments being made into data and technology include £26m for the development and delivery of a national facial-recognition system, and another £11.6m on LFR capabilities.
The announcement of the policing reforms follows a judicial review hearing that challenged the lawfulness of the Metropolitan Police’s LFR use, and comes amid an ongoing consultation launched by the Home Office in December 2025 about a new legal framework for the technology.
In a recent interview with former prime minister Tony Blair, Mahmood described her ambition to use technologies such as AI and LFR to achieve Jeremy Bentham’s vision of a “panopticon”, referring to his proposed prison design that would allow a single, unseen guard to silently observe every prisoner at once.
Typically invoked today as a metaphor for authoritarian control, the panopticon rests on the idea that inmates who feel perpetually watched will behave as the authorities want.
“When I was in justice, my ultimate vision for that part of the criminal justice system was to achieve, by means of AI and technology, what Jeremy Bentham tried to do with his panopticon,” Mahmood told Blair. “That is that the eyes of the state can be on you at all times.”
Responding to the policing reforms announced, Ruth Ehrlich, the interim director of external relations at campaign group Liberty, said: “Rolling out powerful surveillance tools while a consultation is still under way undermines public trust and shows disregard for our fundamental rights.”
She added that attempts by police forces to use AI and facial recognition have so far been “plagued by failure”.
“We have seen what happens when facial-recognition technology is rolled out without clear safeguards: children are wrongly placed on watchlists, and Black people are put at greater risk of being wrongly identified.”
Conservative MP David Davis also highlighted “significant error rates” in the use of digital facial ID and AI, telling the House of Commons on the day of the announcement that rolling out these technologies in a law enforcement context could risk “miscarriages of justice”, adding: “Innocent people fear this, particularly after the Post Office scandal, which showed that courts believe computers rather than people.”
While the use of LFR by police – beginning with the Met’s deployment at Notting Hill Carnival in August 2016 – has already ramped up massively in recent years, there has so far been minimal public debate or consultation, with the Home Office claiming for years that there is already a “comprehensive” legal framework in place.
The department has said that although a “patchwork” legal framework for police facial recognition exists (including for the increasing use of the retrospective and “operator-initiated” versions of the technology), it does not give police themselves the confidence to “use it at significantly greater scale… nor does it consistently give the public the confidence that it will be used responsibly”.
When launching its consultation on a new framework for the tech, the Home Office added that the current rules governing police LFR use are “complicated and difficult to understand”, and that an ordinary member of the public would be required to read four pieces of legislation, police national guidance documents and a range of detailed legal or data protection documents from individual forces to fully understand the basis for LFR use on their high streets.
Read more about police technology
- Microsoft hides key data flow information in plain sight: Microsoft’s own documentation confirms that data hosted in its hyperscale cloud architecture routinely traverses the globe, but the tech giant is actively obfuscating this vital information from its UK law enforcement customers.
- Met claims success for permanent facial recognition in Croydon: Met Police boasts that its permanent deployment of live facial recognition cameras in Croydon has led to more than 100 arrests and prompted a double-digit reduction in local crime, ahead of an upcoming judicial review assessing the technology’s lawfulness.
- UK MoJ crime prediction algorithms raise serious concerns: The Ministry of Justice is using one algorithm to predict people’s risk of reoffending and another to predict who will commit murder, but critics say the profiling in these systems raises ‘serious concerns’ over racism, classism and data inaccuracies.
