The Metropolitan Police Service has completed its ninth deployment of live facial recognition technology, which took place in Romford town centre on 31 January. Eight people were arrested during the eight-hour deployment, said the Met.
Three of the arrests were a direct result of the technology identifying people who were wanted for violent offences, although one of them, a 15-year-old boy, was later released without further action. The other five arrests were part of the wider operation by the Met in the area.
A final deployment of facial recognition technology, which would have concluded the Met’s trial of it, was due to take place on 1 February, but was cancelled because of adverse weather conditions and will be rearranged for another date.
The Met has been trialling facial recognition since 2016, beginning with that year’s Notting Hill Carnival.
The technology essentially acts as a biometric checkpoint, with a facial recognition-linked camera scanning public spaces and crowds, enabling the police to identify people in real time by matching faces against a database.
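The matching step described above can be sketched in a few lines: each face is reduced to a numeric embedding vector and compared against a watchlist database using a similarity threshold. Everything below – the names, the three-dimensional vectors and the 0.8 threshold – is an illustrative assumption, not a detail of the NEC system the Met actually uses.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity between two face embeddings, in [-1, 1]
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(scan, watchlist, threshold=0.8):
    """Return the watchlist identity most similar to the scanned face,
    if it clears the threshold; otherwise None (no alert raised)."""
    best_id, best_score = None, threshold
    for identity, embedding in watchlist.items():
        score = cosine_similarity(scan, embedding)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id

# Toy watchlist with 3-dimensional embeddings (real systems use hundreds of dimensions)
watchlist = {
    "suspect_a": np.array([0.9, 0.1, 0.0]),
    "suspect_b": np.array([0.0, 1.0, 0.0]),
}
print(match_face(np.array([0.95, 0.05, 0.0]), watchlist))  # close to suspect_a's embedding
print(match_face(np.array([0.0, 0.0, 1.0]), watchlist))    # no embedding clears the threshold
```

In a real deployment the threshold trades off false alerts against missed matches, which is why evaluation of the error rates matters so much to the debate below.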
The Met’s website said: “The technology is being used to assist in the prevention and detection of crime by identifying wanted criminals.”
However, the deployments have attracted criticism from privacy activists and campaign groups, such as Liberty and Big Brother Watch, both of which were present at the Romford deployment.
Hannah Couchman, policy and campaigns officer at Liberty, said: “Our key concern is that this is an enormous threat to your privacy. Every single person that walks past that facial recognition van will have a scan taken of their face, which is deeply sensitive biometric information.
“It will happen without their consent and probably without their knowledge, because there is such a lack of information available to them.”
Couchman said the Met was reluctant to give information about these operations to the public, pointing out that the press statement about the Romford operation was issued only at 4pm the previous day.
“While anyone who declines to be scanned will not necessarily be viewed as suspicious, officers will use their judgement to identify any potential suspicious behaviour,” said the Met’s statement.
However, Romford resident Chris Johnson said he was stopped by uniformed officers for covering his face near the deployment van; he had covered it after another member of the public told him the police were trialling the technology.
Johnson then received a £90 fine for a public order offence. He said: “I don’t want my face on anything – I’m not a criminal. If I want to cover my face, I’ll cover my face but the police officer, because I swore, gave me a fine – all because I decided to walk down the street and I didn’t want them looking at my face, which I think I’ve got a right to do.”
According to Big Brother Watch, another man was also stopped by police for covering his face.
The Met was approached for comment on whether people can refuse to be scanned, but did not respond.
Griff Ferris, legal and policy officer at Big Brother Watch, said the group was present to protest at the Met’s use of the technology and to inform the public about the trials. “The [public] reactions have been mixed,” he said. “Lots of people are interested – they are interested to know more about what’s going on, they want to take a leaflet and find out more information.
“Some people obviously walk by if they’re busy, but lots of people are stopping to chat, saying they don’t like what the police are doing, saying they don’t feel comfortable with it.”
Ferris added: “We have also got a legal challenge against this technology, which has so far been an expensive waste of time. They’ve only made two arrests and have an almost 100% misidentification rate.”
In a report published in May 2018, Big Brother Watch said that over 98% of the “matches” made using facial recognition technology wrongly identified innocent members of the public, with only two people being correctly identified by the technology.
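The arithmetic behind such a rate is straightforward: the misidentification rate is simply false positives as a proportion of all matches flagged. The counts below are hypothetical, chosen only to show how a figure above 98% arises from two correct identifications; they are not the report’s actual totals.

```python
# Hypothetical illustration: if a system flags `total_matches` people and only
# `correct` of them are genuinely on the watchlist, the rest are false positives.
total_matches = 104   # assumed number of "matches" flagged (illustrative)
correct = 2           # people correctly identified
false_positives = total_matches - correct
rate = false_positives / total_matches
print(f"{rate:.1%} of matches were misidentifications")
```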
The Independent recently reported that, so far, the Met has spent more than £200,000 on automatic facial recognition, spending £198,000 on software from Japan’s NEC Corporation and £23,800 on hardware such as cameras.
The Met has also deployed uniformed and undercover officers at each trial of the technology, but the cost of this is unclear.
Ferris added: “There is no legislation, very little oversight, and there has been no consideration by Parliament. This is a completely lawless use of an authoritarian surveillance tool that we believe infringes people’s right to privacy, people’s right to freedom of expression and people’s right to freedom of association under the Human Rights Act.”
Detective chief superintendent Ivan Balhatchet, strategic lead for live facial recognition technology at the Met, said: “Following the final deployments this week, a full independent evaluation of the deployments and the technology itself will commence.
“Tackling violent crime is a key priority for the Met and we are determined to use all emerging technology available to support standard policing activity and help protect our communities.
“The technology being tested in this trial is developing all the time and has the potential to be invaluable to day-to-day policing.”