Met Police could deploy facial recognition against protesters
Live facial recognition will be rolled out operationally by the Met Police, but police monitoring group Netpol believes it will hamper people’s ability to exercise their right to protest
The operational use of live facial recognition (LFR) by UK law enforcers could artificially inflate tension between political activists and police during protests, as participants may feel under pressure to conceal their identities because of privacy concerns.
According to the Network for Police Monitoring (Netpol), which monitors and resists policing that is excessive, discriminatory or threatens civil liberties, individuals who seek to conceal their identities to evade LFR on privacy grounds could unwittingly attract more attention from police, who may assume they are “troublemakers”. It could also dissuade them from participating in political action altogether, the group added.
“It seems highly likely that more people will seek to cover their faces to protect their privacy, and it is certainly the case that if only a small number make this choice, they face an increased risk of aggressive policing and a greater likelihood of arrest, simply on the basis that a face covering is viewed as ‘suspicious’,” said Kevin Blowe, a coordinator at Netpol.
“The real challenge for the police, however, is if thousands of people turn up for a protest wearing a mask – if live facial recognition finally makes wearing one normal. We plan to actively encourage the campaigners we work with to do so.”
The Metropolitan Police Service (MPS) set out plans on 24 January 2020 to start deploying LFR operationally, having trialled the use of the technology since 2016.
LFR technology acts as a biometric checkpoint, enabling police to identify people in real time by scanning faces and matching them against a set of selected custody images, known as “watch lists”.
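The matching step described above can be sketched in a few lines of Python. This is a minimal illustration only, not the Met’s actual system: real deployments use proprietary face-recognition models producing high-dimensional embeddings, and the names, vectors and threshold below are invented for the example.

```python
import math

# Hypothetical watch list: each entry maps a subject to a face "embedding",
# a fixed-length vector a recognition model would produce from a custody image.
# All names and values here are invented for illustration.
WATCH_LIST = {
    "subject_a": [0.1, 0.9, 0.3],
    "subject_b": [0.8, 0.2, 0.5],
}
MATCH_THRESHOLD = 0.95  # similarity above this flags a possible match

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def check_face(embedding):
    """Compare one live-camera face embedding against each watch-list entry."""
    for name, ref in WATCH_LIST.items():
        if cosine_similarity(embedding, ref) >= MATCH_THRESHOLD:
            return name  # possible match, flagged for officer review
    return None  # no match; a real system would then discard the scan
```

In a live deployment this comparison would run continuously against every face the camera detects, which is why the scale of the underlying watch list matters so much to the debate below.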
In Hong Kong, where police have been using facial recognition to identify protesters, there has been a widespread adoption of masks by activists, who are also using other tools such as laser pointers against the cameras, as well as pulling down lampposts with LFR technology attached.
In the UK, South Wales Police (SWP) have already deployed facial recognition at two recent Cardiff City v Swansea City football matches, to which many fans responded by wearing masks and bringing banners that read “No facial recognition”.
In the UK, wearing a face covering at a protest is entirely legal, and only in the limited circumstances of a Section 60 order can police require a person to remove it.
During its penultimate trial deployment, in Romford in February 2019, the MPS issued a £90 public order fine to a local resident who swore at officers after being told to remove his face covering.
Computer Weekly contacted the MPS for clarification about whether or not the technology would be used at protests, and whether there was any guidance on how it would be deployed in these situations.
In response, the MPS shared an extract from its press release, claiming this will offer an indication about how the technology will be used in the field.
“The Met will begin operationally deploying LFR at locations where intelligence suggests we are most likely to locate serious offenders. Each deployment will have a bespoke ‘watch list’, made up of images of wanted individuals, predominantly those wanted for serious and violent offences,” it said.
“At a deployment, cameras will be focused on a small, targeted area to scan passers-by. The cameras will be clearly signposted and officers deployed to the operation will hand out leaflets about the activity. The technology, which is a standalone system, is not linked to any other imaging system, such as CCTV, body-worn video or ANPR.”
While the MPS statement neither confirms nor denies whether the technology will be used in a protest setting, Blowe makes the point that facial recognition has already been used on protesters outside an arms fair in Cardiff during LFR trials by SWP.
“I don’t think there are any doubts they would use facial recognition at protests whatsoever – there’s been no restraint on other forms of surveillance, and the routine filming of demonstrations is now something that happens at even the smallest of protests,” he said. With this in mind, some protesters may be prompted to rethink their involvement.
“If you know you’re being constantly scanned for participation in a protest, and you have no idea whether you’re appearing on the data set they’re using, then the chances are that you’re much less likely to attend that demonstration too.”
How protesters could be on LFR watch lists
Civil liberties group Big Brother Watch (BBW) has previously called for a moratorium on the use of LFR technology, and has sought legal action against the MPS over its use, claiming that it “tramples over civil liberties”.
While the Information Commissioner’s Office (ICO) states in a report that “the custody images database is almost always the source of images for watch lists”, it added that data protection impact assessments (DPIAs) conducted by both the MPS and SWP “leave enough room for a range of sources for watch list images”.
Blowe said that there are several police databases that can be used as a “starting point” for any watch lists used to police protests, including those where images are stored of people considered to be “domestic extremists”.
“As we’ve flagged up in the work that we do around ‘domestic extremism’, much of the thinking around this is a paranoid suspicion of anybody who is involved in any form of protest, and that there is absolutely nothing wrong with gathering information on them because it might be useful, with no real controls, transparency or accountability,” he said.
“Take Extinction Rebellion for instance, they’re probably going to be doing another big action in London this April. If facial recognition is used, there would undoubtedly be a list of people that would be identified as being prominent organisers or well-known people it will be looking out for, not because they are violent in any way, but because they are involved in some kind of protest or direct action.”
On 10 January, the Guardian reported on a counter-terrorism police briefing document distributed to medical staff and teachers as part of the government’s anti-radicalisation Prevent programme.
In it, Counter-Terrorism Policing listed a number of groups it viewed as “extremist”, including Extinction Rebellion, Stop the Badger Cull and the Campaign for Nuclear Disarmament, alongside fascist groups such as Combat 18 and Generation Identity.
“Unfortunately, as we have seen from last week, the organisations that are seen as potential risks or threat elements are pretty much anybody involved in any form of campaigning,” said Blowe.
He added that it remains difficult for people to find out if their information is being retained on secret police databases, citing the MPS’s failure to respond to subject access requests, for which the ICO issued two separate enforcement notices covering breaches both before and after the introduction of the Data Protection Act 2018.
Since publication, the Home Office has contacted Computer Weekly to clarify that Extinction Rebellion is not considered an extremist group, and that the group’s inclusion in the guidance was an error by the police unit that created it.
It also said that the guidance was not part of the Prevent programme, although the Guardian said that, since the document surfaced on 10 January, it has been approached by teachers and council workers from across the country who did receive Prevent training that referenced Extinction Rebellion.
Read more about live facial recognition technology
- Live facial recognition technology should not be deployed by UK law enforcement under current circumstances, and the retention of millions of custody images would not hold up under another legal challenge, MPs have been told.
- As adoption of facial recognition systems continues to grow worldwide, there is increasing concern about how this technology could undermine fundamental privacy rights and how it can be kept in check.
- UK privacy watchdog is to investigate whether the use of live facial recognition technology at King’s Cross complies with data protection laws.