
ICO to probe facial recognition at King’s Cross

UK privacy watchdog is to investigate whether the use of live facial recognition technology at King’s Cross complies with data protection laws

The Information Commissioner’s Office (ICO) has reiterated its warnings about the use of facial recognition technology amid concerns about privacy, and begun an investigation into use of the technology at King’s Cross in London.

The investigation comes in response to concerns about the use of facial recognition technology at the King’s Cross Central development, which includes King’s Cross and St Pancras International stations, as well as restaurants, shops and cafes.

The issue has also attracted the attention of London mayor Sadiq Khan, who has reportedly written to the site’s developer Argent seeking reassurances that its use of facial recognition technology is legal. He has also called for new laws clarifying the use of the technology, according to the BBC.

In early July, the ICO issued a key preliminary finding in its investigation into police trials of the technology: live facial recognition (LFR), which can scan crowds and then check large databases for matches in seconds, involves the processing of personal data.

Private sector organisations using LFR need to ensure there is “demonstrable evidence” that the technology is necessary, proportionate and effective, the ICO warned.

The ICO’s finding was issued just days after an independent report into the application of the technology by a UK police force found that the use of LFR by the Metropolitan Police could be held unlawful if challenged in court.

The report by the Human Rights, Big Data and Technology Project raises a number of concerns around the procedures, practices and human rights compliance during LFR trials, including concerns about consent, public legitimacy and trust.

In March, the UK’s Science and Technology Committee warned that LFR technology should not be deployed by UK law enforcement until concerns around its effectiveness were resolved.

The ICO investigation into the use of LFR at King’s Cross will assess how the technology is used at the site, which is visited by thousands of people every day, to determine whether it complies with data protection law.

Argent has stated that the use of facial recognition technology at King’s Cross is to “ensure public safety”, but has so far given no indication of how long the system has been in operation at the site or how it protects the data it collects.

“Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all,” said information commissioner Elizabeth Denham in a statement. “That is especially the case if it is done without people’s knowledge or understanding.”

The growing use of LFR technology in public spaces by the private sector as well as law enforcement agencies is a cause for “deep concern”, said Denham.

“My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used,” she said.

“Facial recognition technology is a priority area for the ICO and when necessary, we will not hesitate to use our investigative and enforcement powers to protect people’s legal rights.”

Denham reiterated that any organisation wanting to use facial recognition technology must comply with the law.

“And they must do so in a fair, transparent and accountable way. They must have documented how and why they believe their use of the technology is legal, proportionate and justified.

“We support keeping people safe, but new technologies and new uses of sensitive personal data must always be balanced against people’s legal rights,” she said.


Controversially, the South Wales Police and the Met Police have been trialling LFR technology in public spaces to identify individuals at risk or those linked to a range of criminal activity.

“We understand the purpose is to catch criminals. But these trials also represent the widespread processing of biometric data of thousands of people as they go about their daily lives,” Denham wrote in a July blog post.

“Legitimate aims have been identified for the use of LFR. But there remain significant privacy and data protection issues that must be addressed,” she said.

Denham pointed out that a court case is currently considering the need for a detailed framework for safeguards prior to making decisions to implement LFR systems and governing its use at all stages.

The case relates to a challenge to the lawfulness of South Wales Police’s use of LFR by a member of the public, supported by civil rights group Liberty.

“The resulting judgment will form an important part of our investigation and we will need to consider it before we publish our findings,” said Denham, adding that any police force deploying LFR needs to consider a wide range of issues.

The ICO plans to report on all its findings about surveillance technology once the judgment in the South Wales Police case has been issued, and set out what action needs to be taken.

On National Surveillance Camera Day on 21 June 2019, the UK surveillance camera commissioner Tony Porter called for a strengthening of the code of practice for the surveillance camera industry in the face of new privacy regulation and surveillance technologies such as facial recognition.

“I am a big fan of listening to people who hate surveillance. I love surveillance, but it is important that we understand where the pressure points are for those who challenge it so that we can mitigate against them,” he said. 
