There is no “justifiable basis” for Police Scotland to invest in and use live facial recognition (LFR) technology, a Scottish parliamentary committee has concluded, citing human rights and data protection concerns.
The Scottish Parliament’s Justice Sub-Committee on Policing said in a report published today (11 February) that Police Scotland would need to demonstrate the legal basis of its use of LFR, as well as eliminate the biases that discriminate against ethnic minorities and women, for use of the technology to be approved.
LFR technology acts as a biometric checkpoint, enabling police to identify people in real time by scanning faces and matching them against a set of selected custody images, known as “watch lists”.
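The matching step described above can be sketched in simplified form. In a typical system, each face is reduced to a numeric embedding vector, and a face captured live is flagged when its similarity to a watch-list embedding clears a threshold. The function names, threshold and tiny 3-dimensional vectors below are purely illustrative assumptions, not a description of Police Scotland's actual system (real deployments use embeddings with hundreds of dimensions produced by a trained neural network):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(live_embedding, watchlist, threshold=0.8):
    """Return the watch-list identity whose embedding is most similar to
    the live capture, if it clears the threshold; otherwise None."""
    best_name, best_score = None, threshold
    for name, embedding in watchlist.items():
        score = cosine_similarity(live_embedding, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Illustrative watch list built from custody-image embeddings.
watchlist = {
    "suspect_a": [0.9, 0.1, 0.2],
    "suspect_b": [0.1, 0.8, 0.5],
}

# A live capture very close to suspect_a's embedding is matched.
print(match_against_watchlist([0.88, 0.12, 0.19], watchlist))  # suspect_a
```

The threshold choice is exactly where the bias and accuracy concerns raised in the report bite: set it too low and the system produces false matches, which studies have shown fall disproportionately on ethnic minorities and women.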
“The sub-committee believes that there would be no justifiable basis for Police Scotland to invest in technology which is known to have in-built racial and gender bias, and unacceptably high levels of inaccuracy,” said the report.
The committee said it had not received sufficient evidence to justify the introduction of LFR technology, or to show that the technology can even be used in a “proportionate” way.
“Its use on people who attend legitimate and legal pursuits, such as peaceful protests, concerts or sporting events, is not necessary or proportionate,” the report said.
Although Police Scotland does not currently use LFR, plans to introduce it were included in its 10-year Policing 2026 strategy, which the committee said must be reviewed and updated if the police still plan to deploy the technology.
“The Scottish Police Authority must ensure that comprehensive human rights, equalities, community impact, data protection and security assessments are carried out,” it said, adding these must all be made publicly available.
The report also considered Police Scotland’s use of retrospective facial recognition (RFR), whereby facial recognition technology is used to search through recorded surveillance camera or other video footage to match people’s faces against a database of images.
It said that custody images, which are used to build both LFR and RFR “watch lists”, are often retained indefinitely by police in the UK because of a lack of legislation governing their use.
In March 2019, UK biometrics commissioner Paul Wiles confirmed to the UK’s Science and Technology Committee that the Police National Database (PND), which is also used by Police Scotland, held 23 million custody images, including images of people who were never subsequently convicted.
The sub-committee’s report recommends that the Scottish Police Authority should review the use of RFR, including its use of the PND and the legal basis for uploading images to it.
“It should also include consideration of the consequences of their access to, and use of, any images of innocent people held illegally on that database,” said the report.
Public consent for LFR
The committee said Police Scotland must also demonstrate that there is public consent for its use of the technology, “as a lack of public consent risks undermining the legitimacy of the technology and, potentially, public confidence in policing”.
It added: “The use of live facial recognition technology would be a radical departure from Police Scotland’s fundamental principle of policing by consent.”
According to a national study released by the Ada Lovelace Institute in September 2019, people have mixed feelings about the use of LFR.
Almost half (46%) wanted the ability to opt out, a figure that is higher for people from ethnic minority backgrounds (56%), while 55% wanted the government to impose restrictions on police use of the technology. The vast majority of people surveyed (77%) also did not trust private companies to use the technology ethically.
The committee also expressed concern over the use of LFR by private companies, citing the sharing of custody images between the Metropolitan Police Service, British Transport Police and the King’s Cross Estate Development Company as an example.
It suggested that any new legislation developed to govern the use of LFR technology should also cover private companies in order to hold them to the same standard.
“Whether this technology is being used by private companies, public authorities or the police, the Scottish government needs to ensure there is a clear legal framework to protect the public and police alike from operating in a facial recognition Wild West,” said sub-committee convener John Finnie.
“The sub-committee is reassured that Police Scotland has no plans to introduce live facial recognition technology at this time. It is clear that this technology is in no fit state to be rolled out or indeed to assist the police with their work.”
Lack of legal frameworks
In December 2019, the justice sub-committee backed a bill that would create a dedicated biometrics commissioner for Scotland and establish a statutory code of practice for the use of biometric data by Scottish police.
In November, the Information Commissioner’s Office (ICO) also called for a statutory code of practice to govern how UK police deploy facial recognition technology, saying the lack of one contributes to inconsistent practice, increases the risk of compliance failures, and damages public confidence in the technology.
In his 2019 annual report, Wiles noted the lack of a legislative framework and clear laws governing the development of biometric databases.
“There is nothing inherently wrong with hosting a number of databases on a common data platform with logical separation to control and audit access, but unless the governance rules underlying these separations are developed soon, then there are clear risks of abuse,” he said.
According to the Scottish justice sub-committee, a number of individuals and organisations expressed the need for a moratorium on the technology in their submitted evidence.
One example is Tatora Mukushi, a legal officer at the Scottish Human Rights Commission, who said that if human rights and data protection standards cannot be built into the design of the technology, then it should not be introduced at all.
In July 2019, the UK’s Science and Technology Committee called on the government to issue a moratorium on the use of LFR, saying: “No further trials should take place until a legislative framework has been introduced and guidance on trial protocols, and an oversight and evaluation system, has been established.”
The European Commission is currently considering banning the use of LFR in public spaces for five years because of privacy and data concerns.
A report released in February 2020 by the Committee on Standards in Public Life, Artificial intelligence and public standards, said that, at present, the government and other public sector bodies are not sufficiently transparent about their use of artificial intelligence (AI).
“Our evidence suggests that this lack of transparency is particularly pressing in policing and criminal justice,” it said. “This is particularly concerning given that surveillance technologies like automated facial recognition have the potential to undermine human rights.”
It added that no public body should implement AI “without understanding the legal framework governing its use”.
The report was authored by Jonathan Evans, ex-director general of UK security service MI5.