Spring Budget risks funding legally questionable police tech

Open legal questions around how UK police are using facial recognition and cloud technology could undermine the £230m investment committed in the Spring Budget to “time and money-saving technology” for police

Chancellor Jeremy Hunt has committed £230m to police forces so they can pilot or roll out productivity-boosting technologies, but open questions around the legality of how certain systems are already being used could undermine further investment.

In his Spring Budget speech, Hunt said police officers currently waste around eight hours a week on unnecessary admin tasks, and that the money will therefore go towards a range of “time and money-saving technology”.

This will include further investment in live facial recognition (LFR), automation and artificial intelligence (AI), and the use of drones as potential first responders. The funds will also be used to set up a new Centre for Police Productivity to support forces’ greater use of data and AI, as well as to help maximise their productivity.

Pre-briefings of the government’s technology plans to journalists revealed that automated redaction technologies would be a priority, so that personal information can be removed from documents or irrelevant faces can be blurred out from body-worn video footage.
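
As a rough illustration of the kind of technique such tools rely on, the sketch below shows a minimal rule-based text redactor in Python. The patterns and placeholder labels are invented for the example rather than taken from any Home Office or supplier system, and production tools typically pair rules like these with trained named-entity recognition models:

```python
import re

# Illustrative patterns only -- real redaction products combine rules
# like these with machine learning models to catch names and addresses.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "UK_PHONE": re.compile(r"\b(?:0|\+44\s?)\d{4}\s?\d{6}\b"),
    "NI_NUMBER": re.compile(r"\b[A-Z]{2}\d{6}[A-D]\b"),  # simplified format
}

def redact(text: str) -> str:
    """Replace every match of each pattern with a labelled placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED:{label}]", text)
    return text

print(redact("Contact DC Smith on 07700 900123 or smith@example.police.uk"))
# -> Contact DC Smith on [REDACTED:UK_PHONE] or [REDACTED:EMAIL]
```

Even in this toy form, the approach shows why accuracy is the crux: any pattern that fails to match leaves personal data exposed in the shared document.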

Hunt also committed to providing a further £75m to the roll-out of Violence Reduction Units and hot spot policing tactics, the latter of which largely revolves around the use of data to target police resources and activities to areas where crime is most concentrated.

Computer Weekly contacted the Home Office for further details of the funding and what it will be spent on. A spokesperson said the Home Office is working with policing partners to allocate the funding, and that further information on specific fund allocations will be set out in due course.

However, lingering concerns around the legality of how UK police are deploying cloud infrastructure and AI-powered facial recognition could undermine the effectiveness of the investment.

In the case of facial recognition, there have been repeated calls for new biometric-focused legislation from a wide range of actors due to a lack of clear rules controlling its use, while the UK data regulator has yet to confirm how police use of US-based cloud infrastructure can be legal, following multiple issues raised by data protection experts and other regulators around how these systems handle people’s data.

Migrating police systems over to public cloud infrastructure was highlighted as a key technological enabler by the Police Digital Service (PDS) and the National Police Technology Council (NPTC) in their joint National policing digital strategy 2020-2030, which set the goal to have 80% of police technology in these systems by the end of the decade.

Given this priority, as well as the computing power and storage required to effectively use AI, data protection experts told Computer Weekly that many of the new AI tools being deployed will be hosted on this US-based cloud infrastructure, opening them up to potential legal compliance challenges as well.

Computer Weekly asked the Home Office if it believed the investment in police tech could be undermined by the legal issues around their deployments, but received no response on this point.

Facial recognition

In March 2022, for example, following a 10-month investigation into the use of AI and algorithmic technologies by UK police – including facial recognition and various crime “prediction” tools – the Lords Justice and Home Affairs Committee (JHAC) found that forces are deploying a range of advanced tech without a thorough examination of their efficacy or outcomes.

It added that UK police are essentially “making it up as they go along”, and described the situation as “a new Wild West” characterised by a lack of strategy, accountability and transparency from the top down.

Following a short follow-up investigation, this time looking exclusively at the use of facial recognition, the JHAC found in January 2024 that UK police are expanding their use of LFR technology without proper scrutiny or accountability, despite lacking a clear legal basis for their deployments.

“Does the use of LFR have a basis in law? Is it actually legal? It is essential that the public trusts LFR and how it is used,” said then-JHAC chair Baroness Hamwee. “It is fundamental that the legal basis is clear. Current regulation is not sufficient. Oversight is inadequate.

“Technology is developing so fast that regulation must be future-proofed. Police forces may soon be able to link LFR cameras to trawl large populations, such as Greater London, and not just specific localities. We are an outlier as a democratic state in the speed at which we are applying this technology. We question why there is such disparity between the approach in England and Wales and other democratic states in the regulation of LFR.”

Commenting on the fresh police tech funding, the JHAC’s new chair, Lord Foster, said: “While we don’t yet know the full details of the proposals, we accept that new technologies may well provide valuable tools to help police forces.

“However, our inquiry into one such technology, live facial recognition, showed a lack of clear standards and regulation for its use. We expect the government to respond shortly. But, as police forces increasingly rely on technology, we will want assurance that there will be proper scrutiny and accountability of their use.”

Some critics have also questioned the lawfulness of facial recognition as a policing tool based on its questionable proportionality and necessity, arguing that the scanning of tens of thousands of faces every time the tech is deployed would likely not pass this legal test, particularly when other, less intrusive methods are already available to police.

New legal frameworks

Both Parliament and civil society have repeatedly called for new legal frameworks to govern law enforcement’s use of biometrics – including the UK’s former biometrics commissioner, Paul Wiles; an independent legal review by Matthew Ryder QC; the UK’s Equality and Human Rights Commission; and the House of Commons Science and Technology Committee, which called for a moratorium on LFR as far back as July 2019.

In an exclusive interview with Computer Weekly, the outgoing biometrics and surveillance camera commissioner for England and Wales, Fraser Sampson, also highlighted a number of issues with how UK police had approached deploying their facial recognition capabilities, and warned that the future oversight of police tech is at risk as a result of the government’s proposed data reforms.

In October 2019, the ICO also published an opinion which said that, while new legislation was not necessary, there was a need for more clarity around how existing law applies to LFR, which should come in the form of a statutory and binding code of practice.

“Such a code should provide greater clarity about proportionality considerations, given the privacy intrusion that arises as a result of the use of LFR, for example, facial matching at scale,” it said.

“Without this, we are likely to continue to see inconsistency across police forces and other law enforcement organisations in terms of necessity and proportionality determinations relating to the processing of personal data. Such inconsistency, when left unchecked, will undermine public confidence in its use and lead to the law becoming less clear and predictable in the public’s mind.”

Responding to concerns raised about LFR, a Home Office spokesperson said: “Facial recognition, including live facial recognition, is a powerful tool that has a sound legal basis, confirmed by the courts. It has already helped the police to catch a large number of serious criminals, including for murder and sexual offences.

“The police can only use facial recognition for a policing purpose, where necessary, proportionate and fair, in line with data protection, equality and human rights laws.”

The JHAC has previously said it expects the government to respond to its findings on facial recognition on 26 March 2024.

Hyperscale public cloud infrastructure

Aside from facial recognition, there are also ongoing data protection concerns about the use of US-based hyperscale public cloud systems by UK police forces, and whether such systems can comply with the UK’s stringent law enforcement-specific data protection rules that place strict requirements on when and how data can be transferred overseas.

The issues with the cloud infrastructure therefore largely stem from the potential for US government access via the Cloud Act, which effectively gives the US government access to any data, stored anywhere, by US corporations in the cloud; the use of generic contracts that do not take into account the police-specific data protection rules; and the risk of sensitive law enforcement data being transferred overseas to a jurisdiction with demonstrably lower data protection standards.

Since Computer Weekly revealed in December 2020 that dozens of UK police forces were unlawfully processing more than a million people’s data in Microsoft 365, data protection experts and police tech regulators have questioned various aspects of how hyperscale public cloud infrastructure has been deployed by UK police, arguing that such deployments are currently unable to comply with the strict law enforcement-specific rules laid out in Part 3 of the Data Protection Act (DPA) 2018.

At the start of April 2023, Computer Weekly then revealed the Scottish government’s Digital Evidence Sharing Capability (DESC) service – contracted to body-worn video provider Axon for delivery and hosted on Microsoft Azure – was being piloted by Police Scotland despite a police watchdog raising concerns about how the use of Azure “would not be legal” because of the above issues.

Computer Weekly also revealed that suppliers Microsoft and Axon, as well as the ICO, were all aware of these issues before processing in DESC began. The risks identified extend to every cloud system used for a law enforcement purpose in the UK, as they are governed by the same data protection rules.

Responding to subsequent concerns raised by Scottish biometrics commissioner (SBC) Brian Plastow, information commissioner John Edwards initially told him in December 2023 that his office was likely to green-light these police cloud deployments because of an information-sharing agreement with the US government, which he suggested would take precedence over domestic UK laws.

The regulator backed down from this position after a letter detailing their meeting was published online by Plastow, and later clarified to Computer Weekly that UK police can legally use cloud services that send sensitive law enforcement data overseas with “appropriate protections” in place. However, it declined to specify what these protections are.

In the wake of the Budget announcement, Plastow confirmed to Computer Weekly that he has still not received a copy of the ICO’s legal advice on DESC’s compatibility with UK data protection law.

“This links to the broader point about not investing in technologies until it has been established that they are legal,” he said.

While funding for Police Scotland is largely a devolved matter for the Scottish Parliament, meaning the £230m announced only applies to police tech in England and Wales, Plastow added that he shares the concerns of the JHAC and “endorse[s] their call for proper independent oversight and scrutiny over the ethical and effectiveness considerations relative to biometric enabled surveillance technologies used in policing throughout the UK”.

Computer Weekly contacted the ICO about when it will be publishing its legal advice on police use of cloud.

An ICO spokesperson said: “The ICO considers that, under the Data Protection Act 2018, law enforcement agencies may use cloud services that process data outside the UK where appropriate protections are in place.

“We are actively considering the DESC proposals and are working with the relevant partners in that regard,” they said. “We continue to provide advice to police and law enforcement agencies on using new technologies in a way that complies with data protection law. We will be providing guidance in due course on the general use of cloud services, and we will consider further support that law enforcement agencies may require.”

Since Computer Weekly first reported on data protection issues with police cloud in December 2020, the use of US cloud providers has expanded throughout the criminal justice sector.

This includes the integration of the Ident1 fingerprint database with Amazon Web Services (AWS) under the Police Digital Service (PDS) Xchange cloud platform; HM Courts and Tribunals Service’s cloud video platform, which is partly hosted on Azure and processes biometric information in the form of audio and video recordings of court proceedings; and its Common Platform, a separate cloud-based system that allows various criminal justice sector professionals to access and manage case information.

Commenting on the increasing prevalence of hyperscale public cloud infrastructure in UK policing, SoftIron chief operating officer Jason Van der Schyff said that while criminal justice bodies should be using technology to make “archaic and cumbersome administration” more efficient and effective, key legislation designed to protect people’s data cannot be neglected “in the thrill of expediency”.

“The real issue here might be that instead of fostering a UK-domiciled, owned and operated, industry of cloud service providers, the HMG has let UKCloud fail and squashed the potential for smaller UK companies to compete for the provision of cloud services by signing up to anti-competitive wholesale agreements with those US-headquartered hyperscalers,” he said. “It’s time HMG spent more time innovating with great British companies than wasting taxpayers’ dollars on shiny and trendy hyperscalers.”

Computer Weekly contacted the Home Office about the various issues around police deployments of US-based hyperscale cloud services, but received no response on any of these points.

Artificial intelligence and algorithms

Speaking with Computer Weekly, Nicky Stewart, former head of ICT at the Cabinet Office, said that apart from the data protection infringements under the DPA 2018, which will only grow as police forces further consolidate on cloud infrastructure like Azure, wider questions need to be asked about how AI tools are integrated with these systems.

“Does this mean that police forces will have choice in selecting appropriate AI for their needs, or will the proprietary nature of Azure – coupled with Microsoft’s tendency to offer commercial favour to its own products over rival products (as per software licensing) or potentially ‘partner’ products – start to consolidate the nascent AI market on Microsoft?” she said.

“Will this additional funding be used strategically or not? If it isn’t, the nascent AI market could coalesce on Microsoft, which is dangerous, as no one company should be allowed to dominate this unknown market at such an early stage.”

Stewart added that a coalescing of police AI around US firms could mean UK companies lose out, and would also put police at greater risk of legal action, given the infrastructure’s conflict with law enforcement data protection rules.

She also questioned the role of US cloud providers in decision-making around AI deployments, given their control of the infrastructure these tools will sit on: “Because it will all be powered by cloud, who will make the decisions? People with a grip of the bigger picture, or techies and cloud engineers?”

Owen Sayers – an independent security consultant and enterprise architect with over 20 years’ experience in delivering national policing systems – added that while promises of automation and reducing police time via cloud-based AI applications will resonate with an uninformed public, there are serious legal implications of rolling out more AI in a law enforcement context.

While this partly stems from the fact that the vast majority of AI or automation tools being adopted by UK police will need to be hosted on hyperscale public cloud infrastructure, which comes with its own data protection issues, Sayers said there are also questions about the extent to which police will use the tech to make automated decisions about people that can seriously affect their lives. 

“Automated redaction of personal data struggles when put against the Section 49 rights for a data subject against automated ‘significant decision’-making,” he said. Under the Act, a significant decision is any decision that “produces an adverse legal effect, or significantly affects the data subject”.

“Section 49 and the controls under Section 50 make policing’s reliance on automation largely pointless anyway, since a data subject has to be directly informed of such processing on a case-by-case basis (quite an overhead), and can demand the processing is done again without the automation if they so choose – and many will if the outcome isn’t to their liking.”
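
To make that overhead concrete, here is a minimal sketch of the case-by-case bookkeeping Sections 49 and 50 imply. The record structure and field names are invented for illustration and are not drawn from any real police system:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AutomatedDecision:
    subject_id: str
    outcome: str
    significant: bool  # adverse legal effect, or significantly affects the subject
    made_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    human_review_requested: bool = False

def compliance_actions(decision: AutomatedDecision) -> list[str]:
    """List the per-case steps a Section 49/50-style regime would require."""
    actions = []
    if decision.significant:
        # The data subject must be told, case by case, that a qualifying
        # decision about them was taken solely by automated means.
        actions.append(f"notify subject {decision.subject_id} of automated decision")
        if decision.human_review_requested:
            # The subject can require the decision to be retaken without
            # automation -- effectively a full manual re-run of the work.
            actions.append("queue case for non-automated re-decision")
    return actions

print(compliance_actions(AutomatedDecision("S-1042", "redaction applied",
                                           significant=True,
                                           human_review_requested=True)))
```

Every automated “significant decision” generates notification work, and any subject who objects triggers a manual re-run – the overhead Sayers argues erodes the promised time savings.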

Sayers added that to legally use the automation and AI promised by Hunt in the Budget, Parliament would need to create new legislation.

“Pressing on without that being in place is to throw more good and limited public money into the gaping maw of police and justice public hyper cloud in the full knowledge that it will definitely result in illegal processing, increasing UK policing’s already rampant data protection lawbreaking activity in the process.

“That would be to the material detriment and not the benefit of the UK public – and is something the next government will need to look at urgently.”

Computer Weekly contacted the Home Office about AI deployments on cloud infrastructure – including about the overheads associated with automated decision-making in a policing context, the associated data protection concerns, and how it’s preventing the market for AI tools from being dominated by a handful of cloud infrastructure providers – but received no response on these points.

Case study: Bedfordshire Police auto-redaction

Given the Budget’s emphasis on rolling out automated redaction technologies to police, Computer Weekly looked at the specific example of how Bedfordshire Police and its suppliers are working to ensure the force’s AI-powered, cloud-based redaction tool is used legally in the absence of ICO guidance.

Known as DocDefender, the system is built to identify and redact non-relevant personal information from case files being shared with UK prosecutors.

Created by software provider Riven and hosted on Amazon Web Services (AWS) hyperscale public cloud infrastructure, the tool is intended to improve the force’s data protection compliance and help officers make significant time savings.

In the Police productivity review from November 2023, for example, the use of DocDefender was said to offer time savings of between 80% and 92%. “Examples included the redaction of a phone download (578 pages equivalent) in 20 minutes (previously this would have taken a couple of days), and the redaction of a 350,000-cell spreadsheet in 30 minutes (this would previously have taken four hours),” it said.

Given the lack of clarity over the legality of law enforcement processing in public hyperscale cloud systems, Computer Weekly contacted Bedfordshire Police, Riven and AWS about how they are collectively approaching and managing the system’s deployment.

While Bedfordshire Police itself, as the data controller, did not directly respond to many points, both AWS and Riven explained how they use localised UK data storage and end-to-end encryption to protect the data, as well as clarifying that no data is stored in the cloud after the initial processing for redaction is complete.

“It is also worth clarifying that the process of redaction means each document only sits on the servers for a few hours rather than being stored,” said a spokesperson for Bedfordshire. “This technology actually allows us to further safeguard personal details by improving our ability to effectively redact long and complex documents.”
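
For a sense of what the localisation and encryption controls described by AWS and Riven typically look like in code, the sketch below pins storage to AWS’s London region, requires KMS-managed server-side encryption, and deletes each object once redaction is complete. It is a sketch under assumptions: the bucket name, key alias and object lifecycle are invented, and this is not Riven’s actual implementation:

```python
import boto3

REGION = "eu-west-2"  # AWS's London region, keeping data at rest in the UK
s3 = boto3.client("s3", region_name=REGION)

def upload_for_redaction(bucket: str, key: str, document: bytes) -> None:
    """Store a document with KMS server-side encryption in the UK region."""
    s3.put_object(
        Bucket=bucket,
        Key=key,
        Body=document,
        ServerSideEncryption="aws:kms",
        SSEKMSKeyId="alias/police-redaction",  # hypothetical key alias
    )

def delete_after_processing(bucket: str, key: str) -> None:
    """Remove the object once the redacted output has been returned."""
    s3.delete_object(Bucket=bucket, Key=key)
```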

However, while there may be no police data stored or processed within AWS’s US servers, the fact that the redaction processing takes place in its cloud environment could still open the data up to a number of data protection risks.

This includes the fact that AWS’s infrastructure is subject to the provisions of the US Cloud Act – which effectively gives the US government access to any data, stored anywhere, by US corporations in the cloud. Such access requests may also be accompanied by a gag order, meaning US government access can take place without the knowledge of the data controllers or contracting authorities (in this case, UK police).

This means that, regardless of where the data is physically stored or processed, it can be accessed by AWS, which in turn puts it in reach of US authorities.

Responding to Computer Weekly’s questions, an AWS spokesperson said the suggestion that the US government can access any data held by US-headquartered cloud providers, regardless of where it is physically stored and without the knowledge of AWS’s customers, is inaccurate.

They clarified that the Act provides a mechanism that allows law enforcement to go to a US court during the course of a criminal investigation to request data from service providers, and that to make a formal request for data, law enforcement agencies must first meet the legal standards for a warrant issued by a US court.

They also highlighted AWS’s transparency reports, adding that no US government data requests to AWS have resulted in the disclosure of enterprise or government content data stored outside the US.

As for the claim that the data is protected by its encryption in transit and at rest, Part 3 of the DPA 2018 makes no mention of encryption in its “security of processing” clauses, meaning encryption is only explicitly recognised as an effective safeguard in relation to non-law enforcement data processing activities.

This is reflected in a DPIA conducted for Police Scotland’s cloud-based digital evidence sharing system, in which the Scottish Police Authority wrote: “Encryption is not mentioned as a mitigating measure in Part 3… [and has therefore] not been applied to the risk.”

It’s worth noting there are currently no mature technologies in production use that enable processing of encrypted text data (fully homomorphic encryption remains impractical at this scale), so the data must first be decrypted for the processing to occur “in the clear”. This means the data is not encrypted while it is being processed in the cloud system.
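
A short sketch using the Python cryptography library’s Fernet scheme (an illustrative stand-in for whatever encryption the suppliers actually use) makes the point: however strong the protection at rest and in transit, the document exists as plaintext in the processing host’s memory while it is worked on:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()
fernet = Fernet(key)

def redact_encrypted(token: bytes) -> bytes:
    """Decrypt, process in the clear, then re-encrypt the result."""
    plaintext = fernet.decrypt(token)  # the data stops being encrypted here
    # While this function runs, the document is plaintext in memory on
    # whichever host executes it -- in this scenario, the cloud provider's.
    redacted = plaintext.replace(b"Jane Doe", b"[REDACTED]")
    return fernet.encrypt(redacted)

token = fernet.encrypt(b"Witness statement of Jane Doe")
result = redact_encrypted(token)
print(fernet.decrypt(result))  # b'Witness statement of [REDACTED]'
```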

Computer Weekly asked AWS if it would like to clarify how, in this context, encryption can provide an appropriate safeguard for law enforcement data.

A spokesperson said AWS does not access or use customer data for any purpose without its customers’ agreement, and that encryption (including the management of the encryption keys) is a key technical supplementary measure described in European data regulators’ guidance.

It added that encrypted content is rendered useless without the applicable decryption keys, and that the company provides advanced tools and encryption services to protect its customers’ data both in transit and at rest. However, it did not comment on the need for encrypted data to be processed “in the clear” (i.e. unencrypted).

Computer Weekly also contacted both Riven and Bedfordshire Police about the encryption claim. While it received no direct response from the police on this point, Riven said that without a substantive claim, there is nothing to comment on.

It told Computer Weekly that many of the claims about the processing of law enforcement data in the cloud revolve around hypothetical situations and have no evidence behind them.
