Workplace monitoring needs worker consent, says select committee

Employers looking to monitor their employees through connected devices should only do so with the consent of those affected, due to the negative impacts such surveillance can have, including work intensification and harm to mental health

Enterprises should only be able to digitally monitor and surveil employees with their explicit consent, according to a select committee report on the impacts of connected internet of things (IoT) technologies.

Published 7 August 2023, the Culture, Media and Sport (CMS) Committee’s Connected tech: smart or sinister? report concluded its year-long inquiry into the potential benefits and harms of connected technologies, which it defines as any physical object connected to the internet or other digital networks.

In relation to the workplace, it said the monitoring of employees via connected technologies “should only be done in consultation with, and with the consent of, those being monitored”, adding that the UK government should commission research to improve the evidence base around the deployment of automated data collection systems at work.

Connected technologies at work

The committee noted that data collection and processing is “fundamental” to the utility of connected technologies: “Directly, user data enables functionalities like monitoring, measuring and tracking (e.g. for fitness wearables), activation (e.g. for smart speakers), customisation, issuing alerts, notifications and recommendations, initiating interventions (e.g. in healthcare solutions) and decision-making, reasoning and other types of remote and/or autonomous operation.”

It added that, indirectly, the data collected by connected IoT technologies can also be leveraged and repurposed for other means, including for product development, targeted advertising (based on either observed or inferred preferences), and data sharing with third parties.

While the committee looked at the impacts of a wide array of connected technologies and deployment scenarios – including in homes, schools and smart cities – it also specifically investigated the proliferation of “smart workplaces”, noting that enterprises across a wide range of sectors have increasingly deployed such technology in recent years.

Giving a range of examples – from industrial robotics and algorithmic logistics management to customer service chatbots and self-service checkouts – the committee said connected devices can bring a wide range of benefits to workplaces.

“Dr Asieh Tabaghdehi, senior lecturer in strategy and business economics, Brunel University London, argued that connected tech can lead to better and more efficient production, particularly where performance can be optimised through quicker or more proactive workflows, communication and feedback,” it said.

“Connected tech can also be used to empower people traditionally excluded from forms of work: Dr Efpraxia Zamani notes that technologies that facilitate working from home have allowed people with disabilities or living in rural areas to secure otherwise-unavailable jobs, albeit with the challenges of developing social relationships at work.”

Citing Amazon’s written evidence, the committee said the e-commerce giant argues that the combination of robotics, machine learning and other technologies in its fulfilment centres has reduced the physical burden on employees, freed them up to focus on more sophisticated tasks beyond the scope of automation, and driven improvements in operational safety.

However, noting the “inherent power imbalance” of the employer-employee relationship, the committee added that connected workplace environments also have clear negatives.

“Dr Tabaghdehi and Dr Matthew Cole, post-doctoral researcher at the Fairwork Project based at the [Oxford Internet Institute] OII, described to us instances where the micro-determination of time and movement tracking through connected devices, which had been introduced to improve productivity, such as in warehouses, had also led to workers feeling alienated and experiencing increased stress and anxiety,” it said.

“Dr Cole also argued that, more broadly, technological transformation would likely lead to a change in task composition and a deskilling of many roles as complex tasks are broken up into simpler ones to allow machines to perform them.”

Similar sentiments were shared with the Business, Energy and Industrial Strategy (BEIS) Committee in November 2022, when a number of witnesses challenged the characterisation of automation at Amazon by noting an uptick in union members reporting work intensification and negative mental health impacts as a result of the automated systems introduced.

Citing Cole further, the CMS Committee said there are few options for recourse available to employees despite these negative effects.

Aside from calling for worker consultation and consent, the committee said the Information Commissioner’s Office (ICO) should develop its existing draft guidance on Employment practices: monitoring at work into a principles-based code for designers and operators of workplace connected tech.

It said the government should clarify whether its proposals to regulate artificial intelligence (AI) will apply to the Health and Safety Executive (HSE), and how that regulator can be supported in fulfilling its remit.

While the government said in its AI whitepaper that it would empower existing regulators – including the HSE – to create tailored, context-specific rules that suit the ways AI is being used in the sectors they scrutinise, the Ada Lovelace Institute said in July 2023 that, because “large swathes” of the UK economy are either unregulated or only partially regulated, it is not clear who would be responsible for scrutinising AI deployments in a range of different contexts.

Responding to the connected technologies report, Andrew Pakes, deputy general secretary of Prospect Union, said that although the monitoring of employees through various devices is becoming increasingly commonplace, regulation is lagging well behind implementation.

“These are important recommendations from the Culture, Media and Sport committee report and would go some way to identifying the true scale of the issue, through government research, and catching up with the reality of worker surveillance. In particular, it is vital that workers are fully informed and involved in the design and use of monitoring software and what is being done with the data collected,” he said.

“It’s also important that any change or expansion of the role of HSE be matched with a significant funding boost to the agency which is already stretched to breaking point.”

Prospect Union has long argued that employees should be involved in the “design, construction, testing and implementation” of any technologies used to control or monitor their work, and has also campaigned for a “right to disconnect” as a way of combating digital surveillance and work intensification.

Ongoing issues

While the committee looked specifically at the surveillance of employees through connected devices, politicians and unions have long decried a range of surveillance-enabling technologies in the workplace.

A Parliamentary inquiry into AI-powered workplace surveillance conducted by the All-Party Parliamentary Group (APPG) for the Future of Work previously found in November 2021 that AI was being used to monitor and control workers with little accountability or transparency, and called for the creation of an Accountability for Algorithms Act.

In March 2022, the Trades Union Congress (TUC) also said the use of surveillance technology in the workplace is “spiralling out of control”, and could lead to widespread discrimination, work intensification and unfair treatment without stronger regulation to protect workers.

In April 2023 – a month after the government published its AI whitepaper – the TUC further warned that the government is failing to protect workers from being “exploited” by AI technologies, noting that the whitepaper only offered a series of “vague” and “flimsy” commitments for the ethical use of AI at work, and that its separate Data Protection and Digital Information Bill (DPDI) has set a “worrying direction of travel”.

In May 2023, Labour MP Mick Whitley introduced “a people-focused and rights-based” bill to regulate the use of AI at work, which includes provisions for employers to meaningfully consult with employees and their trade unions before introducing AI into the workplace.

Regarding privacy, Whitley said that “it would protect workers from intrusion into their private lives” through the creation of a formal “right to disconnect”, and require the government to publish statutory guidance for employers on how they can protect the privacy and work-life balances of their employees.

Although 10-minute rule bills rarely become law, they are often used as a mechanism to generate debate on an issue and test opinion in Parliament. As Whitley’s bill received no objections, it has been listed for a second reading on 24 November 2023.
