Outgoing police tech watchdog warns of declining oversight

The outgoing biometrics and surveillance camera commissioner for England and Wales discusses police deployment of powerful new surveillance technologies, and the declining state of oversight in this area

The future oversight of the police’s increasingly advanced biometric surveillance capabilities is at risk, and there are real dangers of the UK slipping into an “all-encompassing” surveillance state if concerns about these powerful technologies aren’t heeded, warns outgoing biometrics and surveillance camera commissioner Fraser Sampson.

In an exclusive interview with Computer Weekly, Sampson says that while he expected a high degree of polarisation from people around the police’s use of technology, he did not expect “the kind of disconnected approach the government has to this”.

He adds: “I’ve been shocked at how little the police and local authorities know about the equipment they’re using, where it is, what it can do, who they bought it from, why people might be concerned about it, how far it’s achieving what it was bought to do, and how comfortable people are with it doing more things in the future.”

On the benefits of facial-recognition and other AI-driven technologies to police, Sampson notes that they can help in a number of ways: for example, in catching people who use and contribute to websites dedicated to child sexual abuse material, finding missing or vulnerable people, and locating terror suspects.

Sampson has also previously gone on record stating that biometric capabilities can help the state fulfil its positive obligation to prevent citizens from suffering inhuman or degrading treatment. However, Sampson says the current approach taken by UK police to this tech could put those benefits beyond reach, because of how the technology has been deployed in other, more controversial settings.

Sampson also blasted the common refrain that “if you’ve done nothing wrong, you’ve got nothing to worry about”, saying: “You’ve completely missed the point if you even raise that as a defence.”

Appointed to the dual role in March 2021, Sampson has since been responsible for overseeing how police collect, retain and use a range of biometric material (including digital facial images), as well as encouraging their compliance with the surveillance camera code of practice.

During that time, Sampson has issued a number of warnings about the current state of police biometrics and surveillance. He has, for example, highlighted issues around the lack of a clear legal framework to regulate the police’s use of artificial intelligence (AI) and biometric material, as well as a lack of clarity about the scale and extent of public space surveillance.

Sampson has raised further concerns about the questionable legality of using hyperscale public cloud infrastructure to store and process law enforcement data and the police’s general ‘culture of retention’ around biometric data.

On the data retention issue in particular, Sampson has repeatedly highlighted the ongoing and unlawful retention of millions of custody images of people who were never charged with a crime.

In an appearance before Parliament’s Joint Committee on Human Rights (JCHR), for example, he noted that although the High Court ruled in 2012 that these images must be deleted, the Home Office (which owns most of the biometric databases used by UK police) says it can’t be done because the database they are held on has no bulk deletion capability.

Sampson has also been a long-time critic of the government’s data reform proposals, arguing that measures contained in its Data Protection and Digital Information (DPDI) Bill will see biometric oversight subsumed by the Investigatory Powers Commissioner while removing the government’s obligation to publish a Surveillance Camera Code of Practice. Both of these measures, he says, will render his roles obsolete.

While Sampson was originally appointed for a two-year period, he agreed to a short-term reappointment near the end of 2022 until the DPDI Bill had received royal assent. In August 2023, however, Sampson handed in his resignation, citing delays to the bill.

In a letter to then-home secretary Suella Braverman, Sampson says that changes to the Parliamentary timetable mean it is not expected to pass until spring 2024 at the earliest, and that he is therefore “unable to find a practical way in which I can continue to discharge the functions of these two roles”.

‘Trawling the digital ocean’

A major part of Sampson’s work over the past few years has revolved around monitoring police deployments of both live and retrospective facial-recognition technology.

On the operational benefits of facial-recognition technology, the Met Police’s director of intelligence Lindsey Chiswick previously told MPs that its use has already led to “a number of significant arrests”, including for conspiracy to supply class A drugs, assault on emergency workers, possession with the intent to supply class A drugs, grievous bodily harm, and being unlawfully at large having escaped from prison.

“Those are some of the examples that I have brought here today, but there is more benefit than just the number of arrests that the technology alerts police officers to carry out, there is much wider benefit. The King’s Coronation is one example of where deterrence was a benefit. You will have noticed that we publicised quite widely in advance that we were going to be there as part of that deterrence effect,” she said.

“If I recall my time up in Camden when I went to view one of the facial-recognition deployments, there was a wider benefit to the community in that area at the time. Actually, we got quite a lot of very positive reaction from shopkeepers and local people because of the impact it was having on crime in that area.”

Civil society groups, lawyers and politicians previously identified to Computer Weekly a number of concerns with the Met Police’s approach to the technology and its deployment.

They highlighted, for example, the unlawful retention of millions of custody images used to compile facial-recognition watchlists, problems around automation bias, and the false characterisation of its live facial-recognition (LFR) deployments as ‘trials’ despite their use in operational contexts to make arrests.

They also questioned its proportionality and necessity, as over the course of six deployments in the first half of 2022, only eight arrests were made despite scanning 144,366 people’s biometric information.

They further highlighted the social power dynamics surrounding the tech, arguing that even if it were 100% accurate 100% of the time, it would still be deployed against certain sections of society, such as ethnic minorities or political protestors.

Sampson also questions the ways facial recognition deployments have been approached by UK police, noting the thinness of the evidential basis around its effectiveness in tackling serious crimes, and further highlighting the risk of slipping into “all-encompassing” facial-recognition surveillance.

Sampson says that, on the one hand, critics argue UK police “never really seem to catch anyone significant using it, let alone very dangerous or high-harm offenders”; on the other, those in policing argue this is because it has been deployed so infrequently and on so few people that “we’re not going to have very spectacular results, so therefore, we’ve got to use it more to prove the case more”.

On the Home Office’s repeated claim that LFR is a valuable crime prevention tool capable of stopping terrorists, rapists and other violent offenders, Sampson says the overt nature of the deployments – police forces are required to publicly state when and where they are using it – means wanted people will simply avoid the area.

It is worth noting that the Home Office’s claim the tech can prevent particularly serious crimes such as rape and murder has not been borne out in actual arrests arising from police use of the technology.

“If you [as the police] are going to advertise where you’re using it and between what hours of the day, unless I’m wanted and have no idea that I’m wanted by the police – which is slightly odd – I’m just not going to go to those places,” he says.

“It’s the same as my concern for the ANPR [Automatic Number Plate Recognition] system – if you really want to defeat it, it is so easy. You’re not going to find my car because I’ll clone the plates or use stealth tape or some other obscurant tactics, so it means in the end you have a mass trawling of the digital ocean in the full knowledge that almost everything you net isn’t what you’re looking for.”

He adds that the argument then becomes about making the capability more covert to avoid this pitfall: “Then it becomes very sinister…you can’t just avoid one town, because it could be looking for you anywhere. The use case has made itself on that argument.”

A chilling effect

Sampson further challenges the technology’s crime prevention capabilities on the basis that authorities are largely relying on its chilling effect, rather than its actual effectiveness in identifying wanted individuals. He says the logic here is that people “might behave” if they know the police have a certain capability and might be using it.

Describing this as “heading to George Orwell territory”, Sampson adds: “It’s really challenging for the police then to find the evidence that it can work when used properly, without having to throw away all the safeguards to prove it, because once they’re gone, they’re gone.”

Sampson says that even in situations where the technology can be used effectively to, for example, find missing children or identify at-large terror suspects in crowds, there are real risks of that power being abused.

If intrusion is felt in a chilling effect where people are very uncomfortable, then you’ve got to question whether that is proportionate
Fraser Sampson, outgoing biometrics and surveillance camera commissioner

“If you’ve got a missing, vulnerable person, whether it’s child or elderly person…you’ve got a relatively narrow window within which catastrophic things can happen, and you need to find that one face among many,” he says.

“If you have that capability to switch that on across integrated camera networks within the search area, whether a town or a city, that’s pretty easy to do now technologically. I think you’d then be explaining [to the public in certain situations] why you hadn’t switched it on.

“But the challenge would be, how do we know you’ve only switched it on during those periods? And what are you doing with it afterwards?”

Pointing to the government’s plan to link the UK passport database with facial-recognition systems, Sampson further warns against doing something “just because you can”, again likening this to “trawling the entire digital ocean just in the hope that you scoop up somebody” in the dragnet.

“If that intrusion for them is felt in a chilling effect where people are very uncomfortable about it, then you’ve got to question whether that is proportionate,” he adds.

A fragmented landscape

For Sampson, the answer to such complexities around facial-recognition and other biometric-capturing technologies is to have “a very robust, very clear, intuitive, oversight accountability framework, so that when people have questions about this stuff, as they naturally will, they know where to go”.

However, the changes being ushered in under the DPDI Bill mean that an already patchy regulatory framework for biometrics and surveillance is “being further fragmented and broken up…to say we can leave it all to other existing bodies, I’ve not seen any evidence in support of that. In fact, all the evidence I have seen points in the other direction.”

Noting that his appointment to the dual role was a recognition of the growing convergence between new biometric capabilities and surveillance techniques, Sampson says the government’s data reforms will further fracture what is already a very fragmented regulatory landscape.

In an independent report Sampson commissioned on the implications of these data reforms on surveillance oversight, academics Pete Fussey and William Webster note that the DPDI Bill will weaken oversight of the police’s intrusive surveillance capabilities if enacted as is. 

“The possibilities for integrated surveillance technology, driven by AI and supported by the internet, create genuine public anxieties over civic freedoms…In current form, the bill will delete several surveillance oversight activities and mechanisms that are set out in legislation and arise from the fulfilment of statutory duties placed on commissioners,” it said.

Noting that surveillance oversight has historically been “overburdened and under-resourced”, the report added: “The bill contains no provision for continuing the work of driving up standards for the development, procurement, adoption and use of surveillance cameras, a programme of work widely applauded across police, practitioner and industry communities.”

According to one of the report’s interviewees, the bill also makes no provision for the absorption of Sampson’s roles by the Information Commissioner’s Office (ICO), and “just deals with extinction”.

Sampson is not alone in calling for clear legal frameworks to govern police use of biometrics, with both Parliament and civil society groups making repeated calls for new regulation.

This includes a House of Lords inquiry into police use of advanced algorithmic technologies; an independent legal review by Matthew Ryder QC; the UK’s Equality and Human Rights Commission; the former biometrics commissioner for England and Wales, Paul Wiles; and the House of Commons Science and Technology Committee, which called for a moratorium on LFR as far back as July 2019.

However, the government maintains that there is “already a comprehensive framework” in place.

New legal frameworks needed

Commenting on the various calls for biometric regulation that have been made by disparate voices, and whether those at the helm of decision-making in this area are heeding such calls, Sampson says: “I don’t think that they’re necessarily getting it.”

He adds that while senior policing officials and local authorities are “starting to get it”, they generally look to central government “for a steer on this, and the government clearly don’t get it”.

Sampson further adds that “there is no underpinning or rationale to the [government’s] direction of travel” with oversight of police technology, and that it has so far been unable to provide evidence attesting to the efficacy of its current frameworks: “Just saying it a number of times in a pre-rehearsed form of words doesn’t make it so, and I just don’t see the evidential basis for this [approach].”

Giving the example of behind-closed-door meetings he has been involved in regarding the government’s overarching project to embed facial-recognition capabilities across UK policing, Sampson says there is a “profound tone deafness” among key decision-makers about how to approach the roll-out of new technologies in a law enforcement context.

Simply building something and then…waiting for the public to finally cotton on to why it’s good for them... is just not the right way around
Fraser Sampson, outgoing biometrics and surveillance camera commissioner

He says while “public trust and confidence is [seen as] a desired outcome”, it should instead be an “essential input” that precedes the roll-out of new tech: “You can’t hope to build this and run it successfully, particularly not in a jurisdiction that relies on policing by consent, unless you build that trust and confidence first.”

He adds: “Simply building something and then…waiting for the public to finally cotton on to why it’s good for them – which is the presumption at the heart of it – is just not the right way around.”

On building trust and confidence in policing rather than technologies, Sampson also warns against “predictive policing” practices and the dangers of using algorithms or AI to make predictions about people’s potential for criminal behaviour. He says such approaches rely heavily on assumption and create a baseline level of suspicion around members of the public.

“We failed spectacularly as a nation trying to predict whether a 17-year-old was going to pass their A-levels in a couple of subjects using algorithms, why would we presume to think we’re going to be any better at predicting whether that same teenager is going to be convicted of street robbery in the next five years?” he says.

“And actually, why would we even presume to try when there are so many other areas we could use this technology for, that don’t rely on those assumptions, don’t have that track record, and don’t have that existing level of suspicion?”

Following his departure from his dual roles on 31 October 2023, Sampson is now set to become a professor of governance and national security at Sheffield Hallam’s Centre of Excellence in Terrorism, Resilience, Intelligence and Organised Crime Research (CENTRIC), where he will be continuing research into emerging biometric and AI technologies in a range of security and policing contexts.
