
Police tech needs clear legal rules, says biometrics regulator

Police use of artificial intelligence and facial recognition needs to be controlled by strict rules and mechanisms to ensure public trust

Clear, comprehensive and coherent frameworks are needed to regulate police use of artificial intelligence (AI) and biometrics in the UK, the biometrics and surveillance camera commissioner has said.

As commissioner for the retention and use of biometrics, Fraser Sampson is responsible for overseeing police use of DNA and fingerprints in England, Wales and Northern Ireland, as well as for monitoring the use of surveillance cameras in public spaces under his role as surveillance camera commissioner.

In the first annual report covering his dual function, which was delivered to home secretary Suella Braverman in November 2022 and laid before Parliament on 9 February 2023, Sampson extensively covered the use of AI and facial recognition by UK police, which he noted sits at the intersection of his discrete but connected roles.

“The areas of biometrics and surveillance are becoming both increasingly important and increasingly inter-related,” said Sampson.

“I believe that many of the issues raised in my report show that we urgently need to wake up to the opportunities presented, and the threats posed, by the explosion of capability in AI-driven biometric surveillance. If we fail, we risk missing out on the potential benefits it can offer and exposing ourselves to the potential dangers it poses.”

He added that while biometric surveillance technologies such as live facial recognition (LFR) can “undoubtedly be intrusive to privacy and raise other human rights considerations”, there is “no question” they can also help combat serious crime and safeguard other fundamental rights, such as the right to life or freedom from degrading treatment.

“The extent to which the public will tolerate facial recognition and other emerging biometric surveillance technology will depend largely upon whether they believe there are mechanisms in place to make sure they’re used lawfully, responsibly and according to a set of clear principles that ensure their use is dictated by what society agrees is acceptable, and not simply by what technology makes possible,” he said.

To deal with the potential dangers associated with AI-driven biometrics, Sampson added that “we need a clear, comprehensive and coherent framework to ensure proper regulation and accountability”.

Both Parliament and civil society have repeatedly called for new legal frameworks to govern law enforcement’s use of biometrics – including a House of Lords inquiry into police use of advanced algorithmic technologies; the UK’s former biometrics commissioner, Paul Wiles; an independent legal review by Matthew Ryder QC; the UK’s Equalities and Human Rights Commission; and the House of Commons Science and Technology Committee, which called for a moratorium on LFR as far back as July 2019.

However, the government maintains there is “already a comprehensive framework” in place.

Responding to the report, policing minister Chris Philp wrote: “As you have said, the capabilities available to the police are ever expanding, and it is right that we support them to utilise the opportunities it presents, in a way which maintains public trust.”

Concerns around police AI and LFR

In the report itself, Sampson noted that the principal tension between proponents and opponents of LFR comes from the perceived lack of transparency and accountability around its use, and the absence of any express requirement for police forces to demonstrate why, and evidence how, their deployments are necessary and proportionate.

“It is over a decade since the government abandoned the concept of compulsory ID cards, yet we are morphing from a standard police surveillance model of humans looking for other humans to an automated, industrialised process (as some have characterised it, a move from line fishing to deep ocean trawling),” he wrote.

“In that context, we should recognise concerns that we may be stopped on our streets, in transport hubs, outside arenas or school grounds on the basis of AI-generated selection and required to prove our identity to the satisfaction of the examining officer or of the algorithm itself.”

Sampson added that he is particularly concerned about the potential for retrospective use of the technology to locate witnesses, as outlined in guidance published by the College of Policing in March 2022, which suggested that witnesses of crime, as well as victims, could be included in facial recognition watchlists.

He said that any instances where retrospective facial recognition might “legitimately make a significant contribution”, such as in the wake of terrorist attacks or natural disasters, are “mercifully rare and wholly exceptional”.

If police start using algorithms to track everyone said to have been present at a particular event, he said, “that is a new and somewhat sinister development”.

“As one charitable group described it, such a situation would mean us all becoming involuntary participants in a permanent police identity parade.”

Public-private partnerships

Sampson further noted that the vast majority of the UK’s biometric surveillance capability is privately owned, and can only be accessed under contractual arrangements between policing bodies and the private sector.

He said this “contextual transformation” would need to be reflected in any new regulation, which should adopt a holistic systems approach to cover every actor involved in the biometric surveillance ecosystem: “In a systemic setting, contamination of part contaminates the whole.”

Following a 10-month investigation into the use of advanced algorithmic technologies by UK police, including facial recognition and various crime “prediction” tools, the Lords Home Affairs and Justice Committee (HAJC) found that there was “much enthusiasm” about the use of AI systems from those in senior positions, but said “we did not detect a corresponding commitment to any thorough evaluation of their efficacy”.

The HAJC also noted a range of “dubious selling practices” stemming from a conflict of interest between police forces, which are obliged under the Public Sector Equality Duty (PSED) to consider how their policies and practices could be discriminatory, and private sector suppliers, which often want to protect their intellectual property and trade secrets.

In August 2020, for example, South Wales Police’s use of LFR technology was deemed unlawful by the Court of Appeal, in part because the force did not comply with its Public Sector Equality Duty.

It was noted in the judgement that the supplier in that case – Japanese biometrics firm NEC – refused to divulge details of its system to SWP, meaning the force could not fully assess the tech and its impacts.

“For reasons of commercial confidentiality, the manufacturer is not prepared to divulge the details so that it could be tested,” said the ruling. “That may be understandable, but in our view it does not enable a public authority to discharge its own, non-delegable, duty under section 149.”

Other major concerns

Aside from facial recognition and AI, Sampson’s report also discusses a range of other issues covered by his dual commissioner functions, including mission creep in the use of Automatic Number Plate Recognition (ANPR) and UK government plans to do away with his joint role.

On ANPR, which was originally introduced in 1976 and is the UK’s largest non-military database, Sampson said its use is almost completely unregulated and has no firm legal framework.

He said the system can now do far more than it was originally designed to do, such as capturing non-vehicular data and monitoring people, behaviour, associations, networks and habits, not just of the driver but of vehicle occupants too.

“I should add that I make no criticism of the use of ANPR by the police – quite the opposite in fact – but highlight the potential for mission creep which may have consequences for its original policing and law enforcement purposes,” he said. “There is a clear need for a structural underpinning to the system, which is currently missing, and exacerbated by the complexity of this not being a single, homogenous system.”

While this is the first combined report covering Sampson’s dual function as biometric and surveillance camera commissioner, it could also be the last.

In November 2021, the UK government proposed further amalgamating Sampson’s roles under the purview of the Information Commissioner’s Office (ICO), which Sampson himself described at the time as “ill-conceived”.

In his report, he added: “While I am somewhat relieved to see that the government has decided not to transfer all these functions to the ICO, the proposal in the Data Protection and Digital Information Bill simply deletes the Surveillance Camera Code and its attendant functions rather than making provision for their being taken on by the ICO as proposed in the consultation.”

A revised Surveillance Camera Code of Practice was approved by Parliament in January 2022 and specifically addresses the use of public space surveillance – including the use of facial recognition technology – by the police and local authorities.
