A publicly accessible record on the collaborative uses of live facial recognition (LFR) should be created to reduce the secrecy around public-private partnerships, says an advisory body to the Home Office.
The Biometric and Forensic Ethics Group (BFEG) – an advisory, non-departmental public body sponsored by the Home Office with a remit to focus on the ethical aspects of technologies that produce biometric data and identifiers – has outlined a number of issues that should be addressed before public-private collaborations in the use of LFR go ahead.
The publication of a briefing note in January 2021 follows a nearly year-long evidence-gathering exercise by the BFEG, which focused in particular on how the technology is used collaboratively between police forces and private entities.
On creating a publicly accessible record, the BFEG said police forces should actively list documents on their websites related to each deployment of LFR technology that identify the purpose of the collaboration, the identity of the private company involved, and the types and amount of data being shared, with whom and for how long.
The most notable example of UK law enforcement collaborating with private sector entities on LFR is the King’s Cross Estate, which was revealed to be using the technology across a 67-acre area of central London in August 2019.
Having initially denied any involvement, the Metropolitan Police Service (MPS) and British Transport Police eventually admitted in September 2019 to supplying the King’s Cross Estate with images for its database.
A number of major technology companies have also come under scrutiny for their ties to law enforcement in the wake of George Floyd’s killing in the US on 25 May 2020, with IBM, Amazon and Microsoft all agreeing to temporarily halt their sales of LFR technology to US law enforcement in June 2020.
The BFEG noted that public-private collaborations had the potential to exacerbate the discrimination and bias that already exist in LFR technologies, “particularly in cases where a public authority does not scrutinise the private entity’s training dataset and algorithm testing”. It added that such collaborations are central to the technology’s rapid evolution “because private organisations are extending and expanding what public authorities can do with it”.
The BFEG added: “Most of these [use cases] could not accurately be described as partnerships, in the sense of a clearly defined formal or contractual relationship between two parties. However, they all involve collaboration, which means that there is a flow of data, computational infrastructure (hardware, software, platforms) and knowledge that crosses public–private boundaries.”
As well as calling for the creation of a publicly accessible record, the BFEG recommended that, in the absence of a clear legislative framework, the collaborative use of LFR should only proceed if authorised by an officer of the rank of superintendent or above, and that an independent ethics group should be established to oversee these partnerships.
“To maintain public confidence, the BFEG recommends that oversight mechanisms should be put in place,” it said. “The BFEG suggests that an independent ethics group should be tasked to oversee individual deployments of biometric recognition technologies by the police and the use of biometric recognition technologies in public-private collaborations (P-PCs).
“This independent ethics group would require that any proposed deployments and P-PCs are reviewed when they are established and monitored at regular intervals during their operation.”
Other recommendations included that police should only be able to share data with “trustworthy private organisations”, specific members of which should also be thoroughly vetted; that data should only be shared with, or accessible to, the absolute minimum number of people; and that arrangements should be made for the safe and secure sharing and storage of biometric data.
The BFEG’s note also made clear that any public-private collaborations must be able to demonstrate that they are necessary, and that the data sharing between the organisations is proportionate.
“It is generally not permitted for the police to share information about members of the public with private organisations,” it said. “Any deviation from this fundamental ethical principle can be justified only if it serves an important public interest that could not be achieved without this collaboration.
“If the police cannot discharge their responsibilities without collaborating with private organisations, then the data or information that is shared by the police or, indeed, by private organisations with the police in these collaborations, should be only what is necessary for the police to perform their role.”
The BFEG added: “The benefits to policing must also be sufficiently great to justify any loss of privacy involved in the sharing of information either by the police or by private organisations with the police.”
The evidence used to create the briefing note, which was gathered by the BFEG’s Facial Recognition Working Group, came from stakeholders in industry, regulation, civil liberties and policing.