Clearview AI, the controversial facial recognition technology company, is to face a joint investigation by the UK’s Information Commissioner’s Office (ICO) and the Office of the Australian Information Commissioner (OAIC) focusing on its alleged use of scraped data and biometrics of individuals.
The two agencies confirmed the investigation and said it highlighted “the importance of enforcement cooperation in protecting the personal information of Australian and UK citizens in a globalised data environment”.
It will be conducted in accordance with relevant Australian and UK privacy and data protection laws under the Global Privacy Assembly’s Global Cross Border Enforcement Cooperation Arrangement, and may involve other data protection authorities if appropriate and relevant.
Clearview’s data collection methods are said to have involved scraping images of people from social media platforms and other sites on the public internet for inclusion in its database, without seeking permission to do so.
It sells access to its database to law enforcement agencies, which can then upload a photo of an individual and use its proprietary facial recognition algorithm to seek matches.
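Clearview's matching algorithm is proprietary and its details are not public. As a general illustration of how facial recognition search of this kind typically works, systems convert each face image into a numeric "embedding" vector and then look for the gallery entry most similar to the probe image, accepting a match only above a similarity threshold. The sketch below is a minimal, hypothetical example of that nearest-neighbour step using toy four-dimensional vectors standing in for real face descriptors; the labels, dimensions, and threshold are all illustrative assumptions, not anything drawn from Clearview's system.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def best_match(probe, gallery, threshold=0.9):
    """Return the gallery label whose embedding is most similar to the
    probe, or None if no candidate clears the similarity threshold."""
    best_label, best_score = None, threshold
    for label, embedding in gallery.items():
        score = cosine(probe, embedding)
        if score >= best_score:
            best_label, best_score = label, score
    return best_label

# Toy 4-dimensional "embeddings" standing in for real face descriptors.
gallery = {
    "person_a": [0.9, 0.1, 0.0, 0.4],
    "person_b": [0.1, 0.8, 0.5, 0.2],
}
probe = [0.88, 0.12, 0.05, 0.41]  # a probe photo close to person_a
print(best_match(probe, gallery))  # → person_a
```

Real systems use deep-network embeddings with hundreds of dimensions and approximate nearest-neighbour indexes to search billions of faces, but the core decision, most-similar candidate above a threshold, is the same.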
The firm claims its technology has helped in hundreds of investigations, tracking down serious criminals, exonerating the innocent, and helping to identify and protect victims.
But its critics say its practices are highly unethical, and it has been the subject of cease-and-desist orders from the likes of Google and Twitter. Earlier in 2020, an investigative journalist from the New York Times alleged that the firm’s management had used its relationship with the police to intimidate and harass them in the course of their work. It has also been accused of links to far-right actors and white supremacists.
Toni Vitale, partner and head of data protection at JMW Solicitors, told Computer Weekly: “The technology allowing companies to scrape data from the internet and combine it with information about users has been around for many years, but what Clearview appears to have done is to combine personal data with photos of individuals obtained from the internet.
“If the data is used strictly for the purposes of law enforcement, consent of the individuals is not required under either UK or Australian laws, but it raises the question of how transparent Clearview has been about its practices and what it does with the data of unmatched data subjects.”
Vitale said that if Clearview was using the data it collected for any purpose other than assisting law enforcement, such as to help venues collect information on their visitors, it was likely to be in breach of data protection laws, particularly if the subjects were not informed of this.
“Transparency is one of the key tenets of the GDPR [General Data Protection Regulation], and at face value I can’t see how Clearview has met this principle,” he said.
Leonie Power, privacy partner at law firm Fieldfisher, said the opening of the joint investigation highlighted growing concerns about the use of biometric data, especially with regard to developments around facial recognition.
“Processing of biometric data to uniquely identify a natural person is subject to strict conditions under European privacy laws [GDPR],” said Power. “The investigation is set to focus on the use of such data, as well as the use of ‘scraped’ data from publicly available sources.
“Just because photos are available via social media does not make them ‘fair game’ for any use by public or private organisations. While European privacy rules do not necessarily block the data being used, organisations that collect it must comply with those rules.”
Fieldfisher technology partner Simon Briskman added: “Clearview AI might prefer not to be in the spotlight on this one, but it is an interesting test case. Visual search is just one example of how AI applications can glean data from a vast array of data sources and very large data sets. In the future, unlocking the value of data to deliver insights will help not just crime-fighting – as Clearview aims to do – but support economic forecasting, disease control, vehicle safety and many other applications.
“However, laws apply when accessing and using data, and personal data most notably. Clearview identifies on its own website that it uses public information. Information being publicly accessible does not mean that it can be used free of conditions. Not only might the public site itself impose terms, for example restricting commercial exploitation, but in this case, underlying images might carry copyright. And, as the ICO’s investigation highlights, use of personal data is regulated by data protection law.”
Briskman added: “Hopefully, whatever the results of the ICO’s investigation, we will see further clarity on data scraping emerging. Being able to legally exploit data in a way consistent with the rights of others can unlock real benefits for society.”
Computer Weekly contacted Clearview AI, but the firm had not responded to a request for comment at the time of publication.