
Interview: Tony Porter, chief privacy officer, Corsight AI

Tony Porter speaks to Computer Weekly about the changes in facial recognition during his time as surveillance camera commissioner, the ethics of using the technology, and his new role as chief privacy officer at Corsight AI

Following a seven-year stint as the UK’s surveillance camera commissioner (SCC), Tony Porter has joined Corsight AI, a facial recognition supplier launched in 2019, to act as the firm’s chief privacy officer (CPO).

Facial recognition technology acts as a biometric checkpoint, enabling its users to identify people in real time by scanning faces and matching them against a set of selected images.
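As a rough illustration of how that matching step typically works (a sketch under general assumptions, not a description of Corsight’s system; the names and threshold below are hypothetical), a captured face is reduced to a numerical embedding and compared against the embeddings of the selected watchlist images, with a match declared only above a similarity threshold:

```python
# A minimal sketch of watchlist matching, assuming faces have already been
# converted into fixed-length embedding vectors by some face recognition model.
# The names and the 0.6 threshold are illustrative, not any vendor's real values.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face_embedding, watchlist, threshold=0.6):
    """Return the identity of the best watchlist match above the threshold, or None."""
    best_id, best_score = None, threshold
    for person_id, ref_embedding in watchlist.items():
        score = cosine_similarity(face_embedding, ref_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

# Example: a two-person watchlist and one probe face (random vectors stand in
# for real embeddings here).
rng = np.random.default_rng(0)
watchlist = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
probe = rng.normal(size=128)
print(match_against_watchlist(probe, watchlist))  # most likely prints None
```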

It has become increasingly controversial as its use has expanded, especially when deployed by law enforcement, with critics claiming it erodes privacy, reinforces prejudices against people of colour and contributes to the mass surveillance practices of nation states.

Speaking to Computer Weekly, Porter says that facial recognition technology has come on in leaps and bounds since he became SCC in March 2014, at which point surveillance was largely considered to be an old-fashioned, analogue process revolving around the capture of CCTV footage.

“[Facial recognition] first crossed my eyes at a security fair run by the Home Office, and it seemed very fantastic and fanciful, but it didn’t really work,” he says, adding that it was clear using such a technology would also require “clear rules of engagement” going forward.

“The public needs to have clear expectations – if there are trial and pilot periods, people should know what they are; if there are performance measures, people should understand that – but because this was a new technology, none of that was in place.”

Porter claims that since then, he has seen a much higher level of maturity in both the law enforcement and commercial sectors using the technology, which he says now understand that “this isn’t just CCTV”. He adds there have also been significant improvements in the accuracy and speed of the technology itself.

Benefit to society

Porter adds that while facial recognition is a challenge that covers a lot of different but inter-related ground – from human rights and privacy to data protection, surveillance and other relevant laws – the technology “should and ought to be used for the benefit of society”.

“I’ve always had confidence in both the democratic process in this country and in humanity to define laws and regulations that can make sure it is used for the public good, and not used as a cause of harm,” he says.

Since joining Corsight on 12 January 2021, Porter says he has spent a significant amount of time engaging with employees from across the company in an attempt to gain the deepest level of knowledge and understanding possible about its processes and technology.

“When Corsight approached me, I wanted to know what their attitude was to ethics and transparency – I wanted to know how their scientists would feel when I go into their system and challenge them on privacy and privacy settings, and how they intend to respect the right of the citizen,” he says, adding that chief executive Rob Watts was unequivocally clear that Corsight wanted a strong and outspoken culture, both internally and when facing the public.

“The second step has been to agree with the chief executive that to have an ethical position the framework is paramount for the organisation – that will include how we come to decision making and how we will push our solution further, but in a way that is compatible with those aims [of ethics and transparency].”

Conflict of interest

When asked whether there is a conflict of interest in moving to work for a private company in the space he was previously acting as a regulator for, Porter says he was asked a similar question when moving from his role as a commander in counter-terrorism policing into the SCC position.

“I said ‘let’s have a chat in 12 months’ time, see what I’ve done, ask yourself if I’ve covered this with a clean pair of hands and if I’ve been honest and direct’, because you only lose your reputation once,” he says.

“Moving into this role [at Corsight], I would see a conflict of interest if I had argued against the use of facial recognition – that would have been a difficult position. I have to say, three or four years ago when I first started writing about it and arguing with the likes of ministers and being difficult in certain quarters, I never once said this technology should not be used.

“In fact, I always said it should be used because it can bring no end of good.”

The ethics of different use cases and community engagement

Despite his optimism about the technology, Porter says he recognises there needs to be a balance struck between security and privacy, as well as between the opinions of different groups and stakeholders towards its deployment, with whom there should be extensive engagement.

“I do believe there are inappropriate uses. In this country, we’ve got a democracy … where the right to protest is part of our freedom, part of our liberties,” he says. “I think to surveil people and infringe their Article 8 rights [to privacy] is a matter of serious concern, and one I would argue there need to be parameters around in that particular circumstance.”

He adds, for example, that the use of facial recognition against protestors should not be so impermissibly wide that police are “capturing people just because we can” in dragnet-style surveillance, and that the grounds for using the tech must be justified and proportionate.

“If you’ve got concerns about 11-year-old schoolchildren playing in the school ground in a manner that is not conducive to a particular headmaster’s desires, personally I think that would step into very difficult ethical territory, but do I think it’s unethical to use facial recognition where we have positive and active intelligence that young children are being trafficked? No,” he says, adding that more thought needs to be given to the risks and limitations of the technology in specific use cases.

To do this, Porter claims that transparent community engagement is essential to deploying facial recognition technology that people can actually trust, especially if that community will be disproportionately subjected to its use.

Tech in retail

In the context of deploying the tech in retail situations, for example, Porter says commercial entities should engage with other retail groups, and can approach local MPs and councillors, as well as local police, with evidence of why it would be useful. From here, he adds, they should take a chance and publicise its use to explain how and why it is being done.

“In relation to where technologies are used disproportionately against minority ethnic groups [by the police], there are circumstances where there has been a large-scale deployment that was seen, and perceived, as being directly discriminatory towards a particular racial group – that caused no end of embarrassment to the police and they were compelled to cease the project,” he says, adding that this was a byproduct of poor communication and transparency.

Computer Weekly has previously written about the high number of facial recognition suppliers that developed algorithms throughout 2020 capable of identifying people with hidden or masked faces, which has largely been billed as a way of promoting public health during the pandemic. This includes Japan’s NEC Corporation, which provides facial recognition equipment and software to both the Metropolitan Police Service (MPS) and South Wales Police (SWP) in the UK.

When asked about this development, which Corsight is also a part of, Porter says that if the individual wearing a mask is identified from the system’s watchlist, and they have been placed on that watchlist legitimately, then “it’s worked and the use is lawful”.

“I think it’s a different position if somebody wears a full face covering to avoid facial recognition, because everybody has the right to walk around, certainly in UK and Europe, without being stopped because you don’t want your face to be viewed,” he says.


Speaking about the MPS’s penultimate trial deployment of facial recognition tech, which took place in February 2019, Porter says it was wrong that police stopped a man who was not on a watchlist for covering his face in an attempt to avoid having it captured by the system.

“I think there has to be a maturity among law enforcement to say ‘if you’ve got no grounds to think somebody is a bad guy, and somebody doesn’t want to be biometrically processed, [then] I don’t think anybody has an absolute right to submit [someone else] to being biometrically processed’,” he says.

He adds that Corsight’s technology specifically processes people’s biometric signatures only momentarily, rapidly disposing of them if the information is irrelevant.

“Now, that still suggests an impingement of your Article 8 human rights, but it’s mitigating it to a great extent because data retention is minimal and it’s removed [quickly],” he says.

“The other element it requires is for the user to be fully au fait with the laws and what their operational procedures are.”
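As a hedged sketch of what such a retain-only-on-match approach could look like in code (an illustration built on the matching sketch above, not Corsight’s actual implementation; all function names here are hypothetical), the biometric template exists only for the duration of the watchlist check and is discarded immediately when there is no hit:

```python
# Illustrative data-minimisation flow: the embedding is held only for the
# duration of the watchlist check and is never persisted on a non-match.
# The names are hypothetical and do not describe any vendor's system.

def process_detection(face_embedding, watchlist, match_fn, alert_fn):
    """Check one detected face against the watchlist, retaining nothing on a miss."""
    match = match_fn(face_embedding, watchlist)
    if match is None:
        # No hit: the embedding goes out of scope here and is never stored.
        return None
    # Hit: only a confirmed match is passed on for human review.
    alert_fn(match)
    return match

# Hypothetical usage with stand-in match and alert functions.
result = process_detection(
    face_embedding=[0.1, 0.2, 0.3],
    watchlist={"person_a": [0.1, 0.2, 0.3]},
    match_fn=lambda emb, wl: next((pid for pid, ref in wl.items() if ref == emb), None),
    alert_fn=lambda pid: print(f"alert raised for {pid}"),
)
```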

The new biometrics and surveillance camera commissioner role

Before leaving his position as SCC, Porter published a 72-page guidance document in November 2020 on the use of facial recognition by UK police, which urges forces to seek legal advice before establishing programmes that make use of the technology.

It marked the first guidance published following the Bridges case, in which the use of facial recognition by SWP was found to be unlawful on the grounds that it violated the claimant’s privacy rights, and that the force did not properly consider how its policies and practices related to the tech could be discriminatory.

“Too much discretion is currently left to individual police officers,” said the Court of Appeal at the time. “It is not clear who can be placed on the watchlist, nor is it clear that there are any criteria for determining where AFR [automated facial recognition] can be deployed.”

Porter claims the Bridges case “brought about recognition that there needs to be a tighter code of practice among the police,” and that his guidance, as well as the efforts of the police themselves, “lay a much smoother path for all to consider legitimate use”.

When asked if there still needs to be an explicit law or regulation to deal specifically with the use of facial recognition, Porter says the existing frameworks can still be “strengthened” and “streamlined”, adding that legislation often lags behind technology.

Amalgamating roles

In July 2020, the Home Office announced it would be amalgamating the roles of the biometrics and surveillance camera commissioners, a move Porter condemned at the time for diluting both roles.

“I think there was an assumption from the Home Office that biometric meant surveillance when actually, while biometrics is currently an important part of surveillance, Bridges’ case showed that it was so much more – it was conduct, it was the overlapping of laws, it was being impermissibly wide,” he says.

“We have been hit by, for many years, people saying ‘well, surveillance is only a data protection issue’ – it’s not. Surveillance by the state is a significant intrusion of your civil rights, it has to be done for lawful reasons.

“These are above and beyond issues that are covered in the Data Protection Act, so while I did not support the move and I do not support it, I truly wish my successor the best of luck.”
