
TfL under fire for relying on Uber facial-verification data in licensing decisions

TfL is facing numerous legal appeals from Uber drivers as a result of its decisions to revoke their private hire licences on the basis of mistaken information from Uber’s systems

Transport for London’s (TfL) reliance on information from Uber’s facial verification-based driver ID system is coming under increased scrutiny following multiple instances of misidentification leading to unionised drivers losing their private hire licences.

The drivers and their union say that, despite the technology’s questionable accuracy, TfL has been overly reliant on it when making licence revocation decisions, which can severely affect drivers’ livelihoods if they are unable to legally operate their vehicles.

TfL’s uncritical acceptance of the evidence provided by Uber’s Microsoft-based verification system has resulted in numerous legal challenges from drivers who say their licences have been unfairly revoked.

Earlier this month (5 July 2021), for example, a magistrates’ court ordered TfL to reinstate the licence of an Uber driver, which it found had been revoked without notice or appeal following a failed facial-verification check.

The City of London Magistrates’ Court found TfL had moved straight to revoking the driver’s private hire licence on identity fraud grounds, relying solely on evidence from Uber’s facial-verification software without conducting its own investigation into the incident as a licensing authority.

“The Court noted that at no stage had there been any risk to the public and that TfL had proceeded straight to revocation, despite other sanctions being available to them,” wrote the driver’s solicitor, Abbas Nawrozzadeh of Eldwick Law, in his notes of the judgment.

“They had relied on the actions of a body that fell outside of their licensing power and revoked the licence without any investigation into actual events on that day. The Court described the decision as perverse and unreasonable, stating that as a licensing body TfL should be ashamed of bringing this case to court as it represented a waste of public funds.”

Facial verification

Speaking to Computer Weekly, Nawrozzadeh added that he had been “dealing with a raft of these cases” since Uber introduced its Real-Time ID Check facial-verification system.

In the case from 5 July, the misidentification occurred while the driver, a member of the App Drivers & Couriers Union (ADCU), was working for the ride-hailing app’s food delivery arm, Uber Eats. The error, however, resulted in the company deactivating both his Uber Eats and driver accounts.

Facial recognition vs facial verification

Uber’s Real-Time ID Check system uses Face API, face-matching software developed by Microsoft that can be used for either facial verification or facial recognition. It essentially acts as a comparison tool, checking selfies taken by couriers and drivers as they log in against photographs held in Uber’s database to confirm their identities.

Facial recognition is a one-to-many process that people cannot opt out of: it uses cameras to scan every face in view and compares them against watchlists for a match, and individuals may not even be aware they have walked through a camera’s field of view.

Facial verification, meanwhile, is a one-to-one process in which the individual knows the biometric processing is taking place and is actively involved in it, providing their information via a personal device.
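To illustrate what a one-to-one check of this kind involves, the sketch below shows roughly how a verification request to Microsoft’s Face API could be made over its REST interface: one call detects a face in the freshly taken selfie, another detects a face in the stored reference photograph, and a final call asks whether the two faces belong to the same person. The endpoint, subscription key and image paths are placeholders, and this is a general illustration of the API rather than a description of Uber’s actual integration.

# Illustrative sketch of a one-to-one face verification check against
# Microsoft's Face API (REST). The endpoint, key and file paths are
# placeholders; this is not Uber's implementation, only an example of
# the general one-to-one approach described above.
import requests

ENDPOINT = "https://example-region.api.cognitive.microsoft.com"  # placeholder
KEY = "YOUR-FACE-API-KEY"  # placeholder


def detect_face_id(image_path: str) -> str:
    """Detect a single face in an image and return its temporary faceId."""
    with open(image_path, "rb") as f:
        resp = requests.post(
            f"{ENDPOINT}/face/v1.0/detect",
            params={"returnFaceId": "true"},
            headers={
                "Ocp-Apim-Subscription-Key": KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )
    resp.raise_for_status()
    faces = resp.json()
    if not faces:
        raise ValueError(f"No face detected in {image_path}")
    return faces[0]["faceId"]


def verify(selfie_path: str, reference_path: str) -> dict:
    """Ask the service whether the selfie matches the reference photo (one-to-one)."""
    body = {
        "faceId1": detect_face_id(selfie_path),
        "faceId2": detect_face_id(reference_path),
    }
    resp = requests.post(
        f"{ENDPOINT}/face/v1.0/verify",
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json=body,
    )
    resp.raise_for_status()
    # Returns e.g. {"isIdentical": false, "confidence": 0.42}
    return resp.json()


if __name__ == "__main__":
    result = verify("login_selfie.jpg", "database_photo.jpg")
    print(result["isIdentical"], result["confidence"])

The service returns only a yes/no match plus a confidence score, which is why Uber says, later in this article, that flagged mismatches go through manual human review before any decision to remove a driver.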

Under London’s transport regulations, Uber is obliged to inform TfL via a specific form when a driver is dismissed, which Computer Weekly understands should trigger a “fitness assessment” by the regulator to decide whether the driver in question should continue to hold a private hire licence.

This process would also normally entail TfL notifying the driver in writing that it had received an adverse report about them, giving them the opportunity to respond, and to appeal if TfL decides to revoke their licence.

According to the general secretary of the ADCU, James Farrar, drivers can usually continue working while a decision is appealed, except in more serious cases where there is a perceived safety risk or fraud.  

It should be noted that Uber itself was allowed to continue operating while it appealed its own unfavourable licensing decision.

Uber’s appeal

In November 2019, TfL concluded that Uber was “not fit and proper” to hold a private hire operator licence after finding that a change to Uber’s systems allowed unauthorised drivers to upload their photos to other Uber driver accounts.

This allowed them to pick up passengers as though they were the booked driver, which happened on at least 14,000 trips.

“Another failure allowed dismissed or suspended drivers to create an Uber account and carry passengers, again compromising passenger safety and security,” said TfL in a press release at the time.

Uber, however, appealed the decision, and was granted a new private hire operator licence in September 2020 on the basis it had made efforts to address the issues raised.

In response to these failings, Uber implemented its facial-verification system in April 2020, with the approval of TfL.

Referring to those more serious cases, Farrar said TfL can act straight away. “Then they can immediately revoke,” he told Computer Weekly. “You can continue to appeal to the magistrates’ court, but you can’t work in the meantime.”

Following the driver’s misidentification by Uber’s facial-verification system, and the subsequent deactivation of his accounts, TfL was notified and immediately revoked his licence, Farrar added. This is a course of action the regulator usually reserves for serious allegations, such as sexual or physical violence.

He further claimed the TfL decision was based on little or no data, and that Uber was only asked by the regulator to substantiate the allegation when the driver challenged TfL’s decision. “Even if he had engaged in any kind of identity fraud, which he didn’t [as the court decision found], he would only have done it as a delivery driver, but TfL revoked the [private hire] licence anyway,” he said. “There was no question he did anything wrong while acting as a licensed driver.”

Computer Weekly asked Uber if it would like to comment on why both the Uber Eats and driver accounts were deactivated, but received no response.

As a result of the court’s findings, the driver was also awarded his legal costs against TfL, which Farrar described as “extraordinary”. “Usually they’re immune to cost claims because, if you’re a licensing authority in the interest of public safety, only under exceptional circumstances will a cost be awarded against you for an adverse licensing decision, even if it’s overturned,” he said.

Issuing a general response to Computer Weekly’s questions, a TfL spokesperson said: “The safety of the travelling public is our top priority and where we are notified of cases of driver identity fraud, we take immediate action to revoke a driver’s licence so that passenger safety is not compromised.”

Uber was also asked if it would like to respond to the ruling, but Computer Weekly received no response.

Not an isolated incident

The plight of the ADCU driver whose case the Magistrates’ Court heard this month is far from an isolated incident of Uber drivers losing their licences as a result of facial-verification errors.

In March 2021, the ADCU and its associated data trust, Workers Info Exchange (WIE), said they had identified a further seven cases of Uber drivers losing their jobs and having their licences revoked by TfL because the company’s Real-Time ID Check system failed to recognise their faces.

Incidentally, when TfL was asked to comment on the July 2021 ruling, it made an unsuccessful request for the name of the driver so it could confirm it was commenting on the correct case, which gives some indication of the number of cases of this nature it is involved in.

The ADCU did not wish to disclose the name of the driver to Computer Weekly out of a concern they could face repercussions from TfL.

As to why Uber’s facial-verification tool may be considered error-prone, research into Microsoft’s systems, as well as action taken by Microsoft itself, suggests there is bias against particular groups, particularly people of colour and women.

In 2018, research from MIT indicated that Microsoft’s facial-recognition and detection systems – specifically the Face API being used by Uber – had gender and racial biases, finding it had much higher error rates when identifying women or people with darker skin.

“The substantial disparities in the accuracy of classifying darker females, lighter females, darker males and lighter males in gender classification systems require urgent attention if commercial companies are to build genuinely fair, transparent and accountable facial analysis algorithms,” said authors Joy Buolamwini and Timnit Gebru.

Sales suspended

In June 2020, Microsoft – alongside Amazon and IBM – suspended sales of its facial-recognition technologies to US law enforcement agencies in response to several weeks of mass protests against the police murder of George Floyd on 25 May.

Microsoft President Brad Smith previously told ITV in January 2019 that one of the challenges with the technology in its current form was that “it doesn’t work as well for women as it does for men, it doesn’t work as well for people of colour”, adding that it was more likely to find errors, mismatch and generally “fail to identify” people from these groups.

Commenting on the union’s allegations regarding Uber’s Real-Time ID Check system when they were first made in March, a Microsoft spokesperson said the company was “committed to testing and improving Face API, paying special attention to fairness and its accuracy across demographic groups”.

“We also provide our customers with detailed guidance for getting the best results and tools that help them to assess fairness in their system.”

Read more about workers in the gig economy

  • Ride-hailing firm will pay its UK drivers minimum wage following court ruling, but has diverged from the court’s interpretation that drivers should be paid from when they log in, not just when passengers are on board.
  • Unionised Uber drivers have condemned the ride-hailing firm for “cynically” framing its UK service fee rise as a way of improving equality among drivers, rather than a pay cut.
  • Uber and GMB sign “landmark” collective bargaining agreement with no ability to negotiate on earnings, attracting criticism from smaller unions that it lacks teeth.

An Uber spokesperson added that the system was designed to protect the safety and security of passengers by ensuring the correct driver or courier is using the account.

“While no tech or process is perfect and there is always room for improvement, we believe the technology, combined with the thorough process in place to ensure a minimum of two manual human reviews prior to any decision to remove a driver, is fair and important for the safety of our platform,” they said.

In an April 2021 letter sent directly to the ADCU, which was also shared with Computer Weekly, Microsoft told the union that “errors in the human review process associated with Uber’s implementation of our facial recognition technology do not provide a basis for Microsoft to terminate its license to use the technology, especially when Uber has acknowledged the failure and is committed to improvement”.

As part of an investigation by Wired in early March 2021, a further 14 Uber Eats couriers shared evidence with journalist Andrew Kersley showing how the technology failed to recognise their faces, leading to threats of termination and account closure.

In terms of the impact on drivers, Farrar said “they’ve either had to fall back on Universal Credit or try to find some alternative means of work, and haven’t been able to get it – it’s had a devastating effect and a lot of these guys are main breadwinners in their families as well”.

Computer Weekly asked Uber if it would like to comment on the claim that its facial-verification software has led to the wrongful deactivation of multiple drivers’ accounts, but received no response.

Drivers fired by other automated processes

The ADCU is also appealing a series of other licence revocation decisions by TfL at the Magistrates’ Court, which are similarly based on mistaken information from Uber’s systems that led to accusations of fraud.

In April 2021, for example, Uber was ordered in a default judgement by the district court of Amsterdam (where Uber’s European headquarters is located) to reinstate six drivers because the decision to deactivate their accounts and terminate their employment was “based solely on automated processing, including profiling”, which goes against Article 22 of the General Data Protection Regulation (GDPR).

London-based driver Abdifatah Abdalla, for example, claimed he was accused, without evidence, of sharing his account details with a third party when the app detected two sign-in attempts from different locations, resulting in his deactivation.

His private hire licence was revoked by TfL a month later, leaving him unable to drive for alternative ride-hailing apps such as Kapten and Bolt.

In the same month, the City of London Magistrates’ Court separately ordered TfL to reinstate Abdalla’s private hire licence, concluding that “no investigation has taken place”, and further criticised the regulator’s “willingness to accept” the evidence provided by Uber.

Appeals

All of the drivers involved in the Amsterdam case are now having to separately appeal TfL’s licensing decisions in the UK magistrates’ courts, as the immediate revocation of a licence means it cannot be appealed at the TfL level, said Farrar.

“As soon as TfL got these dismissal notices [from Uber], it took a very harsh view that these were safety threats and immediately revoked those licences,” said Farrar. “Normally, immediate revocation would be for something like physical or sexual violence, but in these cases TfL said they were immediate and those drivers had no notice – just ‘boom’, Kafkaesque, you’re deactivated from Uber and you’ve lost your licence.

“We [now] have to separately turn around and fight for the restoration of the licences … appealing to the magistrates to review the TfL decisions, but in order to do that the bar is pretty high because you’ve got to prove the decision is wrong … it’s not just about presenting an alternative view, you have to reach a standard.”

Computer Weekly asked TfL if it would like to comment on the fact that multiple magistrates’ rulings have found it relied solely on evidence from Uber without conducting its own investigation as a licensing authority into the actual events, but received no response on this point.

Rushed facial-verification implementation

The ADCU has previously claimed, and maintains, that Uber rushed to implement its identification system in a bid to win back its London operating licence after TfL decided in late 2019 that it would not be renewed because of problems with unauthorised drivers using the platform.

Evidence given to TfL by Uber during its licensing appeal in September 2020 showed that, because of failures in the company’s manual identification process, it had started to roll out the system from April 2020 onwards.

“TfL has taken an active interest in [Uber’s] proposals with regard to this product. There are clear benefits to the product and TfL supports any technology which increases passenger safety by ensuring the driver is licensed by TfL and permitted to use the Uber app,” wrote TfL’s director of licensing, regulation and charging, Helen Chapman, in her witness statement.

“I consider the use of this product a step in the right direction, although clearly its implementation is still at a very early stage,” said Chapman. “I therefore cannot meaningfully comment on the effectiveness of it at this stage.”

She added that TfL had received a Data Protection Impact Assessment (DPIA) for the system from Uber in March 2020.

DPIAs are of note because they are designed to increase awareness of privacy and data protection issues within organisations, allowing them not only to comply with relevant laws, but also to identify and fix any issues at an early stage, before harm is caused.

“A DPIA is not simply a rubber stamp or a technicality as part of a sign-off process. It’s vital to integrate the outcomes of your DPIA back into your project plan,” says the Information Commissioner’s Office (ICO) in its guidance on conducting DPIAs. “You should not view a DPIA as a one-off exercise to file away. A DPIA is a ‘living’ process to help you manage and review the risks of the processing and the measures you’ve put in place on an ongoing basis.”

Letter to the mayor

In March 2021, the ADCU, WIE and digital rights group Big Brother Watch co-signed a letter to London mayor Sadiq Khan relating to the case of another driver misidentified by Real-Time ID Check.

In it, they claimed TfL had “placed significant pressure on Uber, under threat of losing their license, to rapidly introduce facial recognition identity technology in London”, adding that TfL had also refused to share the DPIA with the ADCU.

“It is our view that the mandatory use of such technology cannot be justified and is entirely disproportionate to the risk. In the evidence TfL presented at Uber’s licensing appeal, only 21 drivers were found to share access to their driver app out of 90,000 analysed over several years. This was only possible at the time due to a security problem in Uber’s app which has since been rectified,” said the letter, seen by Computer Weekly.

“The use now of such intrusive, inaccurate, and dangerous technology is disproportionate and unnecessary since there are other more proportionate means to achieving the same end. TfL’s promotion of Orwellian levels of surveillance of 100,000 Londoners working in the gig economy is not a reasonable or proportionate response for the failure of the gig employer to audit and test the security of their systems.”

It added: “Checks, balances, and due diligence has failed at every level. TfL has bypassed all due process in its political rush to be seen to be an aggressive enforcement authority.”

Given multiple reports of misidentification leading, in turn, to deactivations by Uber and licence revocations by TfL, Computer Weekly asked the regulator if it still supported the roll-out of the technology, but received no response on this point.

Computer Weekly also asked TfL whether it would like to comment on the ADCU’s claim it had placed significant pressure on Uber to introduce facial-recognition technology, but again received no response.
