
Met Police to launch facial recognition operationally

Despite the continuing controversy around its use, the Metropolitan Police will be deploying live facial recognition across the capital

The Metropolitan Police Service (MPS) has announced that it will begin using facial recognition technology operationally for the first time, and has moved to assure privacy experts that its deployments will not infringe on civil liberties.

London police have been trialling live facial recognition (LFR) technology since 2016, beginning with that year’s Notting Hill Carnival. A further nine trial deployments have been made since, with the last taking place in February 2019.

LFR technology acts as a biometric checkpoint, enabling police to identify people in real time by scanning faces and matching them against a set of selected custody images, known as “watch lists”.
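
To illustrate the principle in the simplest terms, the sketch below checks a single scanned face against a small watch list using a similarity threshold. It assumes the faces have already been converted into numeric embedding vectors by a separate recognition model; the function names, threshold and data are hypothetical stand-ins, not a description of the NEC system the Met is deploying.

    # Minimal illustrative sketch of watch-list matching. Embeddings would
    # come from a face-recognition model (not shown); all names, values and
    # thresholds here are hypothetical.
    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Similarity between two embedding vectors, in [-1, 1]."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def check_against_watchlist(probe, watchlist, threshold=0.6):
        """Return watch-list identities whose similarity to the probe face
        meets the (assumed) alert threshold, best match first."""
        hits = [(name, cosine_similarity(probe, ref)) for name, ref in watchlist.items()]
        return sorted((h for h in hits if h[1] >= threshold), key=lambda h: h[1], reverse=True)

    # Example run with random stand-in embeddings
    rng = np.random.default_rng(0)
    watchlist = {"subject_A": rng.normal(size=128), "subject_B": rng.normal(size=128)}
    probe = watchlist["subject_A"] + rng.normal(scale=0.1, size=128)  # noisy view of subject_A
    print(check_against_watchlist(probe, watchlist))

In a live deployment, the equivalent comparison would run continuously against every face detected in the camera feed, with the threshold governing the trade-off between missed suspects and false alerts.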

“This is an important development for the Met and one which is vital in assisting us in bearing down on violence. As a modern police force, I believe that we have a duty to use new technologies to keep people safe in London,” said assistant commissioner Nick Ephgrave.

“We are using a tried-and-tested technology, and have taken a considered and transparent approach to arrive at this point. Similar technology is already widely used across the UK, in the private sector. Ours has been trialled by our technology teams for use in an operational policing environment.”

The MPS said the technology, which is provided by Japan’s NEC Corporation, will be deployed in locations where “intelligence suggests we are most likely to locate serious offenders”, and that the cameras will be focused on small, targeted areas to scan passers-by.

It added that “the cameras will be clearly signposted and officers deployed to the operation will hand out leaflets about the activity”.

In October 2019, the Information Commissioner’s Office (ICO) concluded a 17-month investigation into police forces’ use of LFR, and recommended in a report that the government introduce a statutory and binding code of practice on its deployment, which has not yet been done.

For information commissioner Elizabeth Denham, the report had such far-reaching implications that she felt it necessary to publish her first ever commissioner’s opinion.

In Denham’s view, police were not going far enough to identify “a lawful basis for the use of LFR”, adding that more detail was required in data protection impact assessments (DPIAs) to meet the legislative requirement that this type of data processing is “strictly necessary”.

“The Metropolitan Police Service has incorporated the advice from our opinion into its planning and preparation for future LFR use. Our opinion acknowledges that an appropriately governed, targeted and intelligence-led deployment of LFR may meet the threshold of strict necessity for law enforcement purposes,” said the ICO in a statement.

“We have received assurances from the MPS that it is considering the impact of this technology and is taking steps to reduce intrusion and comply with the requirements of data protection legislation. We expect to receive further information from the MPS regarding this matter in forthcoming days.”

The ICO added that the MPS had committed to reviewing each deployment, and that it would continue to observe the service’s use of the technology.

“We reiterate our call for government to introduce a statutory and binding code of practice for LFR as a matter of priority. The code will ensure consistency in how police forces use this technology and improve clarity and foreseeability in its use for the public and police officers alike,” it said.

Tried-and-tested technology?

According to Ephgrave, the MPS has taken “a considered and transparent approach” to LFR, which he described as a “tried-and-tested technology”.

However, in a report published in May 2018, civil liberties group Big Brother Watch (BBW) found that over 98% of the “matches” made using facial recognition technology wrongly identified innocent members of the public, with only two people correctly identified.
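
As a rough illustration of what that figure means, the false-match rate is the share of alerts that flagged the wrong person. The totals below are hypothetical, chosen only to be consistent with the two correct identifications and the “over 98%” figure quoted above.

    # Illustrative arithmetic only; the totals are hypothetical.
    def false_match_rate(total_alerts: int, correct_alerts: int) -> float:
        """Share of alerts that flagged the wrong person."""
        return (total_alerts - correct_alerts) / total_alerts

    # With only two correct identifications, a false-match rate above 98%
    # implies more than 100 alerts in total, since (100 - 2) / 100 = 0.98.
    for total in (101, 104, 150):
        print(total, f"{false_match_rate(total, 2):.2%}")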

“This decision [by the MPS] represents an enormous expansion of the surveillance state and a serious threat to civil liberties in the UK,” said Silkie Carlo, director of BBW.

“This is a breathtaking assault on our rights and we will challenge it, including by urgently considering next steps in our ongoing legal claim against the Met and the Home Secretary. This move instantly stains the new government’s human rights record and we urge an immediate reconsideration.”

In September 2019, BBW joined 24 other rights groups in calling on UK police, as well as private companies, to immediately stop using LFR for public surveillance.

In the same month, a High Court ruling on the use of LFR by South Wales Police stated that, although its use was lawful, any future development of the technology was “likely to require periodic re-evaluation of the sufficiency of the legal regime”.

In her commissioner’s opinion, Denham said the court’s decision should not “be seen as a blanket authorisation to use LFR in all circumstances”.

Private sector surveillance of public spaces

The ICO is currently conducting a separate investigation into the use of LFR in the private sector, including where it is used in partnership with law enforcement.

This follows revelations from September 2019 that the owners of the King’s Cross estate were using LFR without telling the public, and that both the MPS and British Transport Police had supplied the company with images for its database, despite initially denying involvement.

While it does not explicitly mention facial recognition technology, the National policing digital strategy 2020-2030, released on 21 January, states that UK police “will strengthen our relationships with the private sector to empower it to appropriately share in public safety responsibilities”.

In July 2019, surveillance camera commissioner Tony Porter called for an independent review of surveillance in public spaces, saying “the use of technology-enhanced surveillance has to be conducted and held to account within a clear and unambiguous framework of legitimacy and transparency”.

Tweeting about the MPS’s decision to launch LFR operationally, he said: “I still call for robust legislation to provide clarity to law enforcement as to its use. Government manifesto has accepted that – let’s see it delivered. I am engaging with MPS prior to the commencement of the use of LFR.”

The European Commission is currently considering a five-year ban on the use of LFR in public spaces due to privacy and data protection concerns.

Read more about facial recognition technology

  • UK privacy watchdog is to investigate whether the use of live facial recognition technology at King’s Cross complies with data protection laws.
  • The first independent report into use of facial recognition technology by the police found a number of shortcomings around trials that would not withstand legal scrutiny.
  • The Information Commissioner’s Office is calling for a statutory code of practice to govern how police in the UK deploy live facial recognition technology while controversy surrounding its use continues.
