
Police Scotland use cloud for biometric data despite clear risks

Police Scotland confirms it has stored significant volumes of biometric data on a cloud-based digital evidence sharing system despite major ongoing data protection concerns, bringing into question the effectiveness of the current regulatory approach and the overall legality of using hyperscale public cloud technologies in a policing context

Police Scotland has “uploaded significant volumes of images” to its cloud-based digital evidence-sharing system despite major ongoing data protection concerns, according to the force’s response to a formal information notice from the Scottish biometrics commissioner.

At the start of April 2023, Computer Weekly revealed the Scottish government’s Digital Evidence Sharing Capability (DESC) service – contracted to body-worn video provider Axon for delivery and hosted on Microsoft Azure – was being piloted despite the police watchdog raising concerns about how the use of Azure “would not be legal”.

According to a Data Protection Impact Assessment (DPIA) by the Scottish Police Authority (SPA) – which notes the system will be processing genetic and biometric information – the system presents several risks to data subjects’ rights.

These include the potential for US government access via the Cloud Act, which effectively gives the US government access to any data held in the cloud by US corporations, wherever it is stored; Microsoft’s use of generic, rather than specific, contracts; and Axon’s inability to comply with contractual clauses around data sovereignty.

In the wake of Computer Weekly’s coverage, Scottish biometrics commissioner Brian Plastow served Police Scotland (the lead data controller for the system) with a formal information notice on 22 April 2023, requiring the force to demonstrate that its use of the system is compliant with Part Three of the Data Protection Act 2018 (DPA 18), which contains the UK’s law enforcement-specific data protection rules.

Plastow specifically asked whether biometric data transfers have taken place, what types have been transferred, in what volumes, and which country the data is being hosted in.

Police Scotland response

While Police Scotland’s response to Plastow has not been publicly disclosed, he confirmed in correspondence with Computer Weekly that the force “uploaded significant image volumes to DESC during this pilot”, which specifically included stills and CCTV images.

He also confirmed with the Scottish Police Authority Forensic Services that no DNA profiles or fingerprints have been uploaded, pending clarification on whether the UK Information Commissioner is satisfied that doing so would not conflict with UK data protection law.

In his official response to Police Scotland, Plastow notes that he was provided with assurances that “data is encrypted by the DESC solution prior to being hosted on a Microsoft Azure UK datacentre.”

Plastow also highlighted that, despite these assurances: “The US Cloud Act allows access to electronic data stored overseas by ‘US headquartered’ electronic communication service providers and ‘remote computing service providers’.”

He further highlighted the fact that US headquartered service providers “will need to ensure that no Scottish UK data, or onward transferred EU data, is accessed by US authorities without valid authorisation and implementation of” the UK’s data protection rules.

However, it should be noted that encryption is only considered a relevant and effective safeguard in relation to non-law enforcement processing covered by the UK’s implementation of the General Data Protection Regulation (GDPR); it is not an effective safeguard under Part Three, because Part Three does not allow for ‘supplementary measures’ that would enable data to be sent to jurisdictions with demonstrably lower data protection standards, such as the US.

In any case, the SPA also explicitly notes in its DPIA that the encryption keys are held by Axon, meaning “they would be able to decrypt and provide the data, potentially without our knowledge or consent, where compelled by US authorities to do so”.
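To illustrate why the SPA treats key custody, rather than encryption itself, as the decisive issue, the minimal sketch below (a hypothetical example using the open source Python cryptography library, not the actual DESC implementation) shows that whichever party holds the symmetric key can decrypt the data, wherever the ciphertext happens to be hosted:

# Illustrative sketch only, not the DESC design: it demonstrates that
# possession of the encryption key, not the location of the ciphertext,
# determines who can read the data.
# Requires the third-party 'cryptography' package (pip install cryptography).
from cryptography.fernet import Fernet

# The party that generates and retains this key controls decryption.
# In the scenario the SPA describes, that party is the processor,
# not the police data controller.
key = Fernet.generate_key()
cipher = Fernet(key)

evidence = b"body-worn video segment (placeholder bytes)"
ciphertext = cipher.encrypt(evidence)  # this is what sits in the cloud

# Wherever the ciphertext is hosted, anyone holding 'key' can recover the
# plaintext, which is why encryption alone offers little protection if the
# key holder can be compelled to decrypt.
assert Fernet(key).decrypt(ciphertext) == evidence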

Following the confirmation from Police Scotland that it uploaded significant volumes of biometric information during the DESC pilot, Plastow confirmed his office will be formally assessing the force’s compliance with Scotland’s statutory Code of Practice on the use of biometric data in winter 2023. A report detailing his findings will be laid before the Scottish Parliament in spring 2024.

However, the commissioner had already set out his intention to conduct assurance reviews in his plan published in December 2021, meaning this would have gone ahead regardless of whether issues with the DESC system had been raised.

Computer Weekly contacted Police Scotland about various aspects of the story, including why DNA and fingerprint data was deemed too sensitive for the system, but other biometric information was not; why it considers encryption to be an effective safeguard in this instance; and why it decided to press forward with the DESC pilot despite major data protection concerns being highlighted by both the SPA and ICO.

“Police Scotland continues to work closely with all relevant partners to identify, assess and mitigate any risks relating to data sovereignty, where required. Further risk assessments and mitigation will be kept under ongoing scrutiny,” said a Police Scotland spokesperson.

“All digital evidence on the DESC system is held securely. Access to the information is fully audited and monitored, and only accessible to approved personnel. Processes are in place to ensure any data risks are quickly identified, assessed and mitigated.

“We take the management and security of data seriously. We are working with our criminal justice partners to ensure robust, effective and secure processes are in place to support the development of the system and will continue to engage with the biometrics commissioner, the Information Commissioner’s Office and relevant partners.” 

The ICO’s ‘conversational’ approach

Plastow also noted that while Police Scotland has been consulting with the ICO about its processing and storage activities in the DESC system – which is required to take place before any high-risk processing commences – there is still no formal guidance from the ICO about putting law enforcement data in cloud infrastructure.

The ICO has since confirmed to Computer Weekly that it has never given formal regulatory approval for use of such systems by UK law enforcement bodies, despite being aware of issues due to its ongoing conversations with the data controllers involved.

In the SPA’s correspondence with the ICO released under the Freedom of Information Act (FOI), for example, the regulator largely agreed with its assessments of the risks. On the international transfer requirements, it noted that technical support provided from the US by either Axon or Microsoft would constitute an international data transfer, as would a US government request for data made via the Cloud Act.

“These transfers would be unlikely to meet the conditions for a compliant transfer,” said the ICO. “To avoid a potential infringement of data protection law, we strongly recommend ensuring that personal data remains in the UK by seeking out UK-based tech support.”

In an email summarising its meetings with Police Scotland, the ICO further noted that the DESC pilot would begin on 24 January 2023 and would involve live personal data, that “there will be international transfers involved in the provision of technical services”, and that Police Scotland is “assured as the controller” that it is meeting all the law enforcement data protection obligations.

However, it added: “If you have a remaining residual high risk in your DPIA that cannot be mitigated, prior consultation with the ICO is required under section 65 DPA 2018. You cannot go ahead with the processing until you have consulted us.”

Despite this warning, the Scottish policing bodies involved in the DESC system elected to go ahead with the pilot without a formal consultation with the ICO.

In June 2022, the ICO set out its “revised approach” to public sector enforcement, with the aim of protecting public bodies from having to make large pay-outs for data protection breaches when fines could disrupt public services.

“In practice, this will mean an increased use of the ICO’s wider powers, including warnings, reprimands and enforcement notices, with fines only issued in the most serious cases,” it said.

Responding to a follow-up FOI request from Computer Weekly, the ICO later confirmed that while it has obtained legal advice on the use of cloud infrastructure for law enforcement data, the matter is ongoing and it has not yet come to a formal position. The advice itself was withheld, however, as it is subject to legal professional privilege.

Personal data has been processed in hundreds of cases in likely breach of the act whilst the ICO has done nothing
Owen Sayers, independent security consultant

This is when the ICO also confirmed it has “never given formal regulatory approval for the use of these systems” in a law enforcement context.

Owen Sayers, an independent security consultant and enterprise architect with over 20 years’ experience in delivering national policing systems, said: “While the ICO appears to remain doggedly committed to its approach of having ‘conversations’, rather than formal interviews and assessments, the effectiveness of this must be questioned given that during this type of informal engagement, Police Scotland elected to go into a live pilot whilst not adhering to the guidance of the ICO that they should refer the project for formal consultation due to the presence of multiple high risks.

“As a result, personal data has been processed in hundreds of cases in likely breach of the act whilst the ICO has done nothing. In fact, it’s fallen to other commissioners to initiate their own enquiries in lieu of the ICO using their own powers to protect the public.”

Sayers added that if the ICO took action against a government supplier, rather than a public sector body, that action would be disclosable and could therefore drive actual improvement by affecting the supplier’s ability to successfully bid for other work.

Fraser Sampson, the biometrics commissioner of England and Wales, similarly noted it was “interesting” that his Scottish counterpart was the one to intervene over the DESC system, as “presumably the ICO could have intervened in the same way that the biometrics commissioner has and demanded the same statutory provision of information”.

Sampson further questioned “why one regulatory body has found it necessary to do so, and seems to have done so quite effectively”, while the other has not.

In April 2023, Sampson told Computer Weekly that the issues around using hyperscale cloud in a policing context will only increase in importance as more and more data (particularly citizen-captured data) is fed into these systems, and that the use of such technologies is now a UK-wide policing issue that goes far beyond his narrow biometrics remit.

Computer Weekly contacted the ICO about various aspects of the story, including every claim made against it and why it allowed the pilot to go ahead without prior consultation despite being fully aware of the risks.

“The ICO is actively considering these issues and engaging with relevant authorities,” said an ICO spokesperson. “Our approach to taking regulatory action is set out in our regulatory action policy and our engagement on these issues to date has been guided by these principles. We expect all controllers to manage risks appropriately and to comply with the law.”

Microsoft Azure ‘high risks’

While the ICO is yet to come to a formal position on the legality of UK law enforcement organisations using hyperscale public cloud systems, the SPA’s DPIA has already identified several “high-risk” issues with Microsoft Azure which bring into question its suitability to process UK policing data.

These include the fact that Microsoft’s standard data processing addendum is drafted primarily to apply to processing covered by the GDPR rather than Part Three processing (the specific law enforcement requirements); that the contract between Axon and Microsoft does not contain the “granular level of detail” required to satisfy either the GDPR or Part Three; and that Microsoft’s use of generic terms and conditions means the DPA’s section 59 requirement for a specific contract detailing the nature of the processing cannot be met.

It also identified that while the Microsoft addendum states data is held in the UK, it simultaneously states that data may be transferred to or processed in the US, or any other country in which Microsoft or its processors operate, opening the risk of it processing UK policing data outside of the UK “without any visibility or control over this processing for the controllers”.

“Microsoft can and do move data internationally as they see fit and at their sole discretion – the false claim that you can maintain UK data sovereignty by virtue of them having a few UK datacentres needs to be understood by regulators and exposed as the lie it is when police CTOs fall back on it,” said Sayers.

Emails from both Axon and Microsoft’s legal teams, which were shared with Computer Weekly, also revealed the issues flagged by the SPA were known and understood by the companies themselves for at least two years, during which time no action was taken to remedy the concerns raised. 

Both companies were asked at the time about the emails and what action they have since taken to resolve the issues raised. Axon said it works “closely with customers to ensure robust and effective safeguards are in place”, while Microsoft did not respond. Axon’s full response is detailed here.

Commenting on this aspect, Sampson added that while it may be the decision-making processes of public bodies that are the underlying issue, the ICO’s conversational approach means “private, profit-making organisations” are essentially being let off the hook as well.

“If a more lenient treatment, or even a form of exemption is extended to public law enforcement bodies, then the principal beneficiary of that will probably be their private contracting partner, so that’s who you’re extending your leniency towards,” he said, adding that citizens’ concerns are “probably greater” about the corporate retention, storage and sharing of biometric data than they are about the police doing it alone.

He further said the perception that private companies such as Microsoft or Axon are benefitting from the ICO’s decision to not take enforcement action against public sector bodies “will not support and bolster the strategic aim of increasing public trust and confidence”.

Sampson – who has just called for a review into public authorities’ use of Chinese state-owned Hikvision cameras at sensitive sites across the UK – added that while there is “greater preparedness” to look at the risks of government access and accept them as real in that case, people have a much harder time doing so when a household name like Microsoft is involved.

“This is not about it being headquartered in Hong Kong, Shanghai or anywhere else, because the same issues ought to apply, and the same demands for assurance ought to apply, irrespective of the brand and the country of origin,” he said.

“If people genuinely think that only the Chinese have got the ability to demand big datasets where it’s in the interest of their government to do so, then they haven’t really understood this area at all, because I can’t think of a single functioning nation that wouldn’t have given itself the ability to go through a large corporate organisation for information that it felt was relevant to its national interests.”

Scottish government clarification

Since Computer Weekly published its initial story about DESC in April 2023, the Scottish government, as the contracting authority, confirmed in an FOI response to Sayers that “there are no specific references in the contract to the Data Protection [Act] 2018 Part 3.”

It added: “However, the contract explicitly states at Clause 13.3 that: ‘The provisions of this clause 13 are without prejudice to any obligations and duties imposed directly on the service provider under the data protection laws and the service provider hereby agrees to comply with those obligations and duties’.”

Sayers said that, because the contract doesn’t specifically mention obligations under Part Three, it is “a bit tenuous” to claim the suppliers are fully compliant with the requirements, which include the need for explicit contractual provisions that reference Part Three.

While the Scottish government’s response to Sayers adds that there are separate contractual clauses in place to prevent data from being transferred overseas (a key Part Three requirement), this conflicts with the SPA DPIA, which points out that Microsoft’s own standard data processing addendum allows the opposite.

The Scottish government’s own contract with Microsoft also contradicts the claim that clauses prevent data from being transferred overseas, specifically noting that data will not be transferred “unless required to do so by… [a] law or regulatory body to which the service provider is subject”.

Sayers said the full wording of this clause appears to allow the supplier to send personal data outside of the UK, and to do so without prior disclosure to the Scottish government if the regulatory body instructs them not to do so. He added that this, in turn, suggests that a US-regulated supplier subject to US cloud provisions would be able to offshore data without breaching the contract terms.

Elsewhere in its response, the Scottish government, despite being the contracting authority, noted: “As part of the DESC partnership arrangements, the DESC partners who are controllers for this purpose have carried out due diligence in relation to sub-processors on behalf of the Scottish government.

“Since contract award there has been extensive engagement between the DESC partners and the supplier on the specific issues arising from Part 3 requirements.”

However, it does not say whether the Scottish government as the contract owner was involved in those discussions, or whether those discussions have in fact resulted in any contract changes.

“The Scottish government have thrown the DESC partners under the bus really by saying that they were consulted prior to contract award, and that since the contract award the same partners have been managing the supplier to effect the necessary legal processing outcomes, but [they] don’t go into any details,” said Sayers.

It looks very much like Scottish government want to distance themselves from their own fairly grievous errors and omissions when competing and awarding the contract
Owen Sayers, independent security consultant

“It looks very much like Scottish government want to distance themselves from their own fairly grievous errors and omissions when competing and awarding the contract, but it’s questionable if the DESC partners can now change the contract sufficiently to comply with all the DPA 2018 Part Three legal requirements without materially changing the contract and possibly invalidating the contract or bid processes that led to it.

“Others who unsuccessfully bid for this work will, I am sure, seek to raise claims if those changes go too far.”

Sayers further noted that section 59 of the DPA 18 requires a written contract specifically between the controllers and processors, in which the necessary guarantees and obligations of Part Three are clearly expressed.

“It is far from clear how the current DESC contract can thus create that relationship, and as such processing data under the contract is in itself a breach of Section 59,” he said.

“It may be a legally viable commercial contract, but it’s unlikely to be a valid contract in the context of the Data Protection Act requirements. None of the DESC controllers should be relying on it today to meet their controller obligations under Section 59.”

Computer Weekly contacted the Scottish government about claims regarding its contract with Axon and compliance with Part Three.  

“The Scottish government and DESC partners take the security and privacy of data extremely seriously. The Scottish government understand that data processing agreements under Part 3 of the Data Protection Act 2018 are in place between the relevant DESC partners and the supplier Axon,” said a Scottish government spokesperson.

“The DESC contract mandates the use of UK-based data storage and processing for data used for law enforcement purposes, which Axon has confirmed it will comply with. Data access is audited and monitored and available only to approved personnel such as police officers, prosecutors and defence agents. Processes are in place to ensure any data risks are quickly identified, assessed and mitigated.

“As the DESC Programme progresses, the Scottish government will continue to work closely with DESC partners to ensure data protection standards are upheld, including through ongoing engagement with Axon and the Information Commissioner’s Office.”

Ongoing concerns

Computer Weekly first raised these issues about UK police cloud deployments with the ICO in December 2020, when it revealed that the roll-out of Microsoft 365 to dozens of UK police forces may be unlawful, because almost all of them failed to conduct data protection checks before deployment and held no information on their cloud contracts.

This is despite police forces being obliged under Part Three to conduct data protection impact assessments and ensure that processors seek permission before transferring data internationally to a third country.

Police forces at the time generally claimed the international data transfer requirements were covered by Microsoft’s data protection addendum, and that data was being stored in Microsoft’s UK region.

Only one force, Kent, had completed its own DPIA at the time, with the rest relying on a “national DPIA” completed by the National Enabling Programme (NEP), which spearheaded the roll-out, despite that document itself saying that each individual force must assess and mitigate its own data protection risks.

Despite the NEP’s claims that it had been “consulting with the ICO throughout the life of the programme to ensure we are working within all existing legislation”, and that it had provided the ICO with a full copy of the DPIA, an ICO spokesperson confirmed: “We provided informal data protection advice on the National Enabling Programme, but a data protection impact assessment was not formally submitted for consultation with the commissioner.”

Since then, the use of US cloud providers has expanded throughout the criminal justice sector to include the integration of the Ident1 fingerprint database with Amazon Web Services (AWS) under the Police Digital Service’s (PDS) Xchange cloud platform; HM Courts and Tribunals Service’s cloud video platform, which is partly hosted on Azure and processes biometric information in the form of audio and video recordings of court proceedings; and its common platform, a separate cloud-based system that allows various criminal justice sector professionals to access and manage case information.

Next steps

Sampson said while this is now “solely a matter for the ICO”, the case of DESC and its regulatory oversight is an important example of “how things can fall through the cracks under the current [regulatory] arrangements, and how those cracks need to be addressed before the Data Protection and Digital Information [DPDI] bill becomes law next year, because this will be the entirety of the regulatory framework [for biometrics]”.

While the UK government’s upcoming DPDI bill largely keeps the Part Three processing requirements intact, Sampson said he is concerned about how it will change oversight of the police, in terms of both their public space surveillance activities and their use of biometric data.

This is because, as it stands, the DPDI would completely do away with the statutory surveillance camera code of practice (the only instrument available that directly regulates police surveillance in public spaces) and disperse regulatory responsibility over biometrics to different offices.

“The regulatory framework is of secondary importance to the citizen, as long as something happens in the right direction, and they can see that there is purposeful action taken,” he said, adding the issues around DESC are a “forerunner” for how the government’s intention to have biometrics overseen and regulated solely within the context of data protection will play out.

 If we are to encourage and build public trust and confidence in the police and police use of biometric surveillance, we’re going to have to do better
Fraser Sampson, biometrics commissioner of England and Wales

“It’s possibly a good opportunity then to test the rigour of those arrangements, and the extent to which police retention of facial and other images is merely a matter of data protection.”

Given the concerns raised over multiple years by Computer Weekly and others about UK police cloud deployments, and with Part Three now half a decade on from coming into force, Sampson said: “A lot of the big questions in this area are not only unanswered at the moment, but they’ve been unanswered for many, many years. If we are to encourage and build public trust and confidence in the police and police use of biometric surveillance, we’re going to have to do better than that.”

He added that one of the issues a reconstituted ICO will need to address post-DPDI is the appropriate use of encouragement, intervention and enforcement in driving accountability, saying: “It really doesn’t matter, the niceties of the legal framework under which nothing happens.” 

Giving the example of “regional fire control centres” that were set up during his time at West Yorkshire police, Sampson said they were purpose-built for fire and rescue services but never opened their doors due to being “an utter disaster in terms of project management”.

“Essentially, what we had there was the capability to do a number of things including store a great deal of sensitive operational data,” he said. “One of the solutions that West Yorkshire considered was making a very secure on-prem solution for other police forces to share their data, so they didn’t all have to replicate the on-prem costs and all the other difficulties with that and wouldn’t need to outsource it.”

Sampson also questioned why, with five or so of these facilities around the country, they could not be used to store policing data instead of in hyperscale public clouds: “We know it’s going to be hybrid [between cloud and on-prem] anyway in the future, so to what extent will there be a return to some on-prem arrangements, as we gather more and more data, and more and more of it is potentially sensitive and subject to hacking or interference?”

Read more about police technology

  • UK police double down on ‘improved’ facial recognition: The Met and South Wales Police have doubled down on their use of facial recognition technology after research found improved accuracy in their algorithms when using certain settings, but civil society groups maintain that the tech will still be used in a discriminatory fashion.
  • Overhaul of UK police tech needed to prevent abuse: Lords inquiry finds UK police are deploying artificial intelligence and algorithmic technologies without a thorough examination of their efficacy or outcomes, and are essentially ‘making it up as they go along’.
  • ICO under fire for taking limited action over serious data breaches: Lawyers and data protection experts have criticised the Information Commissioner’s Office (ICO) for limiting its enforcement action against Thames Valley Police (TVP) and the Ministry of Justice (MoJ), despite serious data protection failings that placed the lives of witnesses and prisoners at risk.
