Ride-hailing app Uber has launched a “dynamic pricing” algorithm in London to automatically set variable pay for drivers and fares for passengers, but the drivers’ union claims there is no transparency over how people’s data will be used.
Uber’s introduction of the algorithm in London at the start of February 2023 allowed the company to set variable pay and pricing levels based on real-time data for market conditions, such as time and distance, predicted route, estimated traffic, and number of users requesting or providing services.
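Uber has not published the formula behind this pricing, but the inputs it names — time, distance, traffic and the balance of riders to drivers — suggest the general shape of such a model. The following is a minimal illustrative sketch only; the weights, the surge cap and the formula itself are invented for illustration and do not reflect Uber's actual algorithm.

```python
# Toy dynamic-pricing function built from the market signals described in the
# article. All coefficients and the surge cap are invented assumptions.

def dynamic_price(base_rate: float, minutes: float, miles: float,
                  riders_requesting: int, drivers_available: int,
                  traffic_factor: float = 1.0) -> float:
    """Return a trip price from real-time market signals (toy model)."""
    # Classic time-and-distance baseline, as in Uber's older pricing.
    time_distance = base_rate + 0.4 * minutes + 1.2 * miles
    # Real-time demand signal: riders requesting vs. drivers available.
    demand_ratio = riders_requesting / max(drivers_available, 1)
    # Surge multiplier kicks in only when demand outstrips supply, capped at 3x.
    surge = max(1.0, min(demand_ratio, 3.0))
    return round(time_distance * surge * traffic_factor, 2)

# Quiet period: price stays at the time-and-distance baseline.
print(dynamic_price(2.5, minutes=20, miles=5,
                    riders_requesting=10, drivers_available=20))  # 16.5
# Busy period: the same trip costs more as demand outstrips supply.
print(dynamic_price(2.5, minutes=20, miles=5,
                    riders_requesting=40, drivers_available=20))  # 33.0
```

The point of the sketch is that once pricing is driven by live signals rather than a fixed tariff, the same journey can be priced very differently minute to minute — which is precisely what makes transparency over the inputs contentious.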
In correspondence sent from Uber to its UK drivers in August 2022 regarding the roll-out of the algorithm, published by Brave New Europe, the company said: “If you joined Uber years ago, you will have joined when prices were quite simple.
“We set prices based on time and distance, and then surge [pricing] helped increase the price when demand was highest,” it added.
“Uber has come a long way since then, and we now have advanced technology that uses years of data and learning to find a competitive price for the time of day, location and distance of the trip.”
While Uber itself has not disclosed exactly which data its algorithm will be using to set pay and prices, the App Drivers and Couriers Union (ADCU) said it would likely use personal data and profiles of drivers and passengers to make decisions.
The ADCU added that, in the case of consumers, the algorithm could unfairly impact passengers in vulnerable situations (for example, those travelling home late at night or elderly people trying to reach hospital appointments).
For drivers, the ADCU said the algorithm could push down working conditions by targeting drivers based on their willingness and ability to accept lower fares.
“In both scenarios, competing algorithms are monitoring market signals as well as the behaviour of customers and drivers in real time,” said the ADCU. “As a result, passengers are offered higher prices than they otherwise would be. In the same way, driver pay is kept artificially lower, with drivers being effectively black- and grey-listed in the workplace.”
An Uber spokesperson said it is “categorically false” that Uber uses either personal data or profiling in the algorithm, and flatly rejected the claim that it would create “blacklists” of drivers. They also said Uber had worked closely with the GMB union, with which it signed an agreement in May 2021, to consult with drivers on the algorithm and incorporate their feedback ahead of the London launch.
“Upfront Pricing is designed to improve reliability for riders, which in turn helps to create more trips for drivers,” they said. “Overall, Uber’s average take rate remains the same, but each trip is now based on real-time information to provide the best price to appeal to the drivers in the area.”
Responding to Uber’s assertion that no personal data is used by the algorithm, ADCU general secretary James Farrar said the definition of personal data is any information that relates to an identifiable individual, which would include data collected by the company about, for example, people’s locations or trips taken.
“The dynamic pricing algorithm is clearly driven by machine learning using driver and passenger personal data gathered over time,” he said. “Direct, indirect and tacit personalisation is a creeping risk over time as Uber and other apps seek to optimise their pricing and pay mechanisms. Without either transparency to stakeholders or adequate regulatory oversight from Transport for London, the safeguards are simply not there to prevent abuses now or in the future.”
The ADCU also pointed to a January 2022 Harvard study on the consumer harms of dynamic pricing algorithms, which found that the use of such technologies opens up the possibility of “tacit collusion” between firms, as their automated monitoring of rivals’ prices in combination with the ability to rapidly alter pricing creates an unspoken understanding that deviations will be met with retaliatory undercutting, thus tacitly establishing a supracompetitive price.
The authors added that a single firm can also “initiate a cycle of consumer harm simply by employing a superior pricing algorithm”, because its ability to quickly reprice goods and services, and to autonomously observe and react to competitors’ price changes, removes all incentive for less technologically advanced players to compete on price, creating a situation where “all firms will charge above the competitive price”.
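The dynamic the study describes can be sketched in a toy simulation: one bot posts a high price and would instantly match any undercut, so its rival's best response is simply to follow the price upward. All the numbers and decision rules below are invented for illustration; they are not drawn from the study or from any real pricing system.

```python
# Toy model of tacit collusion between two pricing bots that observe each
# other's prices every round. Prices and rules are illustrative assumptions.

COMPETITIVE_PRICE = 6.0   # what head-to-head price competition would yield
LEADER_TARGET = 10.0      # the supracompetitive price the leader signals

def leader(rival_price: float) -> float:
    # The "superior" algorithm holds the high target. Because it can reprice
    # instantly, any undercut would be matched at once, so the rival gains
    # nothing by pricing lower.
    return LEADER_TARGET

def follower(rival_price: float) -> float:
    # With undercutting futile, the best response is to follow the observed
    # price upward rather than compete on price.
    return rival_price

def simulate(rounds: int = 5):
    a, b = COMPETITIVE_PRICE, COMPETITIVE_PRICE
    history = [(a, b)]
    for _ in range(rounds):
        # Each round, each bot reacts to the price its rival posted last round.
        a, b = leader(b), follower(a)
        history.append((a, b))
    return history

print(simulate())  # both prices drift from 6.0 up to the 10.0 target
```

No message ever passes between the two bots, yet both end up charging above the competitive price — the “unspoken understanding” the study calls tacit collusion.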
Farrar described the pricing algorithm as “dangerous and predatory”, and further accused the Mayor of London and Transport for London (TfL) of “failing in their duty” by giving the algorithm the green light.
“Passengers and drivers are, directly and indirectly, unfairly targeted for personal auto-exploitation,” he said. “Vulnerable passengers are placed at risk when service is denied or unfairly priced, while driver pay is unfairly held down by means of employer-tacit collusion and with individuals targeted for black- and grey-listing.
“In post-Brexit Britain, where the government plans to strip away the few protections we have against algorithmic abuse, the Mayor must step forward and insist on high standards of data protection for London passengers and workers,” said Farrar. “We call on the Mayor to immediately ban the use of dynamic pricing algorithms in the regulated London minicab market.”
Read more about algorithms at work
- MEPs vote to amend platform worker directive: MEPs have voted in favour of amendments to the European Commission’s platform worker directive that would introduce a presumption of employment and increase algorithmic transparency.
- AI adopted without due consideration for workers, MPs told: MPs have been warned that the rapid roll-out of artificial intelligence in workplaces has changed UK enterprises’ management practices so much that current employment law is no longer fit for purpose.
- Workplace surveillance ‘spiralling out of control’, says TUC: Trade union body pushes for workers to be consulted on the implementation of new technologies at work, warning that invasive surveillance practices are getting out of hand.
Computer Weekly contacted Uber about these concerns over the impacts on both drivers and consumers, as well as its position on the Harvard study. The company reiterated its rejection of the claim that personal data or profiling are used in the algorithm, but did not respond to questions about the study.
Computer Weekly also contacted TfL and the Mayor of London, but received no response.
Issues around Uber’s lack of algorithmic transparency are longstanding. In June 2022, the ADCU called on Uber to provide full algorithmic transparency so that drivers can understand how they are being profiled, how their performance is managed, and on what basis work has either been allocated or withheld.
In December 2021, a report published by Worker Info Exchange – a campaign group set up to help workers access and gain insight from data collected from them at work – noted that there are “woefully inadequate levels of transparency” about the extent of the algorithmic surveillance and automated decision-making that workers are subject to throughout the gig economy.
The ADCU added that Uber is still failing to provide algorithmic transparency when drivers submit data subject access requests to the company for information it holds on them. Computer Weekly contacted Uber about this alleged failure, but received no response on this point.
In March 2021, following legal action brought by the ADCU on behalf of six Uber drivers, Amsterdam’s District Court ruled that both Uber and Ola must disclose – to different extents – more of the data used to make decisions about drivers’ work and employment.
These cases arose from the operators withholding information from drivers who had submitted subject access requests for data held on them by the companies.
The court also rejected both Uber’s and Ola’s claims that drivers collectively taking action to access their data amounts to an abuse of their individual data access rights, laying the ground for drivers to form their own union-controlled data trust.