
ICO issues warning about using facial recognition technology

The UK privacy watchdog has found that any police force or private organisation using live facial recognition technology is processing personal data and must comply with data protection law

Live facial recognition (LFR) technology that can scan crowds and then check large databases for matches in seconds is processing personal data, according to the Information Commissioner’s Office (ICO).

This is a key preliminary finding of an ICO investigation into police trials of the technology, and it comes just days after an independent report into one UK force's application of LFR found that the Metropolitan Police's use of the technology could be held unlawful if challenged in court.

The report by the Human Rights, Big Data and Technology Project raises a number of concerns around the procedures, practices and human rights compliance during LFR trials, including concerns about consent, public legitimacy and trust.

In March, the UK’s Science and Technology Committee warned that LFR technology should not be deployed by UK law enforcement until concerns around its effectiveness were resolved.

The South Wales Police and the Met Police have been trialling LFR technology in public spaces to identify individuals at risk or those linked to a range of criminal activity.

Invasion of privacy

“We understand the purpose is to catch criminals. But these trials also represent the widespread processing of biometric data of thousands of people as they go about their daily lives. And that is a potential threat to privacy that should concern us all,” said information commissioner Elizabeth Denham.

“Legitimate aims have been identified for the use of LFR. But there remain significant privacy and data protection issues that must be addressed, and I remain deeply concerned about the roll-out of this technology,” she wrote in a blog post.

According to Denham, there needs to be “demonstrable evidence” that the technology is necessary, proportionate and effective, considering the invasiveness of LFR.


“There is also public concern about LFR; it represents a step change from the CCTV of old. There is also more for police forces to do to demonstrate their compliance with data protection law, including in how watch lists are compiled and what images are used. And facial recognition systems are yet to fully resolve their potential for inherent technological bias; a bias which can see more false positive matches from certain ethnic groups,” she said.

Denham pointed out that a court case is currently considering whether a detailed framework of safeguards is needed before decisions are made to implement LFR systems, and to govern their use at all stages.

The case relates to a challenge to the lawfulness of South Wales Police’s use of LFR by a member of the public, supported by civil rights group Liberty. “The resulting judgment will form an important part of our investigation and we will need to consider it before we publish our findings,” said Denham, adding that any police force deploying LFR needs to consider a wide range of issues.

On National Surveillance Camera Day on 21 June 2019, the UK surveillance camera commissioner Tony Porter called for a strengthening of the code of practice for the surveillance camera industry in the face of new privacy regulation and surveillance technologies such as facial recognition.

“I am a big fan of listening to people who hate surveillance. I love surveillance, but it is important that we understand where the pressure points are for those who challenge it so that we can mitigate against them,” he said.

According to ICO guidance, police forces considering LFR technology should:

  • Carry out a data protection impact assessment (DPIA) and update it for each deployment, given the sensitive nature of the processing involved in LFR, the volume of people affected, and the intrusion that can arise. Law enforcement organisations are advised to submit their DPIAs to the ICO for consideration, with a view to early discussions about mitigating risk.
  • Produce a bespoke ‘appropriate policy document’ covering the deployments, setting out why, where, when and how the technology is being used.
  • Ensure the algorithms within the software do not treat individuals unfairly on the basis of race or sex.

The ICO said police forces should also ensure they have familiarised themselves with its Guide to law enforcement processing, which covers Part 3 of the Data Protection Act 2018.

Although data protection law differs for commercial companies using LFR, Denham said the technology was the same and the intrusion that can arise could still have a detrimental effect.

“In recent months, we have widened our focus to consider the use of LFR in public spaces by private sector organisations, including where they are partnering with police forces. We’ll consider taking regulatory action where we find non-compliance with the law,” she warned.

The ICO plans to report on all its findings about surveillance technology once the judgment in the South Wales Police case has been issued, and set out what action needs to be taken.

Calls for facial recognition ban

In May, the US city of San Francisco voted to ban police from using facial recognition applications, while California is considering similar moves. 

Earlier this week, US non-profit organisation Fight for the Future launched what it claims is the first national campaign in the US calling for a federal prohibition on all uses of facial recognition technology by governments, the Seattle Times reports.

The BanFacialRecognition.com campaign enables members of the public to contact their congressional and local representatives to urge them to ban facial recognition surveillance technology.

The campaign argues that a ban on facial recognition is necessary because regulation alone is not enough to protect against the dangers of the technology.

According to the campaign website, facial recognition technology identifies the wrong person up to 98% of the time, resulting in wrongful imprisonment and deportation.

The campaign also raises concerns about bias within the technology and about the safety of biometric information. “Once our biometric information is collected and stored in government databases, it’s an easy target for identity thieves or state-sponsored hackers,” the campaign claims, arguing that facial recognition is unlike any other form of surveillance.

“It enables automated and ubiquitous monitoring of an entire population, and it is nearly impossible to avoid. If we don’t stop it from spreading, it will be used not to keep us safe, but to control and oppress us – just as it is already being used in authoritarian states.”
