
Uber faces legal action over ‘racist’ facial verification system

Two UK-based unions are taking Uber to court, claiming their members have been unfairly dismissed as a result of misidentification by the company’s facial verification system

Unionised Uber drivers are taking legal action against the ride-hailing app provider, alleging that its use of “racially discriminatory” facial verification technology has led to dozens of unfair dismissals.

Uber’s Real-Time ID Check system uses Face API, face-matching software developed by Microsoft that can be used for either facial verification or recognition. It essentially acts as a comparison tool, checking selfies taken by couriers and drivers as they log in against photographs in Uber’s database to confirm their identities.
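Face-verification systems of this kind typically reduce each photo to a numeric embedding and accept a login only if the live selfie’s embedding is sufficiently close to the stored reference. The article does not describe Uber’s internals, so the following is only a minimal, hypothetical sketch of that general pattern; the function names and the threshold value are invented for illustration.

```python
import math

# Hypothetical acceptance threshold -- real deployments tune this value,
# and a poorly calibrated threshold can inflate false-reject rates for
# groups the underlying model handles less accurately.
MATCH_THRESHOLD = 0.85

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify_identity(selfie_embedding, reference_embedding):
    """Return (matched, score) for a real-time ID check.

    In practice both embeddings would be produced by a face-recognition
    model applied to the login selfie and the stored reference photo.
    """
    score = cosine_similarity(selfie_embedding, reference_embedding)
    return score >= MATCH_THRESHOLD, score
```

In a flow like the one described above, a score below the threshold would flag the login as a mismatch; Uber says such cases then go to human review rather than triggering automatic account action.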

The legal action is being brought by two separate unions – the App Drivers and Couriers Union (ADCU) and the International Workers’ Union of Great Britain (IWGB) – which claim that Uber’s use of the technology has led to the wrongful suspension of their members following misidentification by the system.

“Workers are prompted to provide a real-time selfie and they face dismissal if the system fails to match the selfie with a stored reference photo,” said the ADCU. “In turn, private hire drivers who have been dismissed also faced automatic revocation of their private hire driver and vehicle licences by Transport for London [TfL].”

In July 2021, Computer Weekly reported that the transport regulator was facing numerous legal appeals from Uber drivers as a result of its decisions to revoke their private hire licences on the basis of erroneous information from Uber’s systems.

In the ADCU case, which is being supported by its associated data trust Worker Info Exchange and the Equality & Human Rights Commission (EHRC), the union has filed a claim at the Central London Employment Tribunal on behalf of former UberEats courier Pa Edrissa Manjang and former Uber driver Imran Javaid Raja.

“It is clear that artificial intelligence and automated decision-making can have a discriminatory impact. The consequences, in the context of deciding people’s access to work, can be devastating. These cases are enormously important,” said the pair’s lawyer, Paul Jennings, a partner at Bates Wells. “AI is rapidly becoming prevalent in all aspects of employment and important principles will be established by the courts when determining these disputes.”

The IWGB has also filed a separate claim for indirect racial discrimination on behalf of an unnamed member, whose account it claims was terminated following a facial recognition error. The union further claims to have represented more than 200 drivers and couriers who were unfairly terminated by Uber in the past year alone on a range of grounds, including facial recognition failures.


Both unions stressed that multiple studies have called into question the effectiveness and accuracy of facial verification technology, particularly when used to identify people of colour.

In 2018, for example, research from MIT indicated that Microsoft’s facial recognition and detection systems – specifically the Face API being used by Uber – had gender and racial biases, finding it had much higher error rates when identifying women or people with darker skin.

The potentially discriminatory nature of Uber’s facial verification system was first highlighted as an issue affecting workers in March 2021 when, as part of an investigation for Wired, 14 Uber Eats couriers shared evidence with independent journalist Andrew Kersley that showed how the technology failed to recognise their faces, leading to threats of termination and account closure.

“Our Real-Time ID Check is designed to protect the safety and security of everyone who uses the Uber app by helping ensure the correct driver is behind the wheel,” claimed an Uber spokesperson in response to the separate legal actions being taken by the unions, as well as allegations that its facial verification system is racially discriminatory.

“The system includes robust human review to make sure that this algorithm is not making decisions about someone’s livelihood in a vacuum, without oversight.”

Alongside legal action, the IWGB organised a 24-hour boycott of Uber on 6 October 2021 and an accompanying protest outside the company’s London headquarters on the same day, which was supported by Black Lives Matter UK (BLM UK).

Demands made by the IWGB included increased earnings for drivers after a recent increase in the commission taken by Uber, as well as a fair, transparent process for account terminations.

“The impact of Uber’s facial recognition algorithm reflects a complete lack of care for black people and their livelihoods. The gig economy which already creates immense precarity for black key workers is now further exacerbated by this software that prevents them from working at all, purely based on the colour of their skin. Racist practices such as these must come to an end,” said BLM UK.

Henry Chango Lopez, general secretary of the IWGB, added: “Hundreds of drivers and couriers who served through the pandemic have lost their jobs without any due process or evidence of wrongdoing, and this reflects the larger culture at Uber which treats its majority-BAME workers as disposable. Uber must urgently scrap this racist algorithm and reinstate all the drivers it has unfairly terminated.”

A separate strike action organised by the ADCU at the end of September 2021 made similar demands of Uber, including that it respect a Supreme Court ruling that drivers should be paid from the moment they log in, rather than only from when they are assigned a trip, as Uber decided a month after the judgment.

James Farrar, general secretary of the ADCU and director of Worker Info Exchange, said Uber implemented the facial recognition system only to secure the renewal of its licence, despite knowing it would generate unacceptable failure rates when used against a workforce mainly composed of people of colour.

“Uber then doubled down on the problem by not implementing appropriate safeguards to ensure appropriate human review of algorithmic decision-making,” he added.

The ADCU and IWGB have not been officially recognised by Uber, which instead chose to sign a collective bargaining agreement with GMB in May 2021.

While this was the first time Uber had recognised a union of its drivers anywhere in the world, the agreement does not allow for collective bargaining over drivers’ earnings, including the firm’s implementation of the minimum wage.

