Online Safety Bill needs complete overhaul, say rights groups

Civil society groups write to new digital minister to challenge various aspects of the upcoming Online Safety Bill, which they say threatens rights to privacy and freedom of expression

The Online Safety Bill needs to be “completely overhauled” to protect freedom of speech and privacy in the UK, says a coalition of civil society groups in an open letter to the UK government.

Addressed to new digital minister Michelle Donelan, the letter sets out the coalition’s main concerns over the Bill. These include: the provision to compel online companies to scan the content of users’ private messages; the extensive executive powers granted to the secretary of state to define what constitutes lawful speech; and the duty it would impose on technology platforms to deal with “legal but harmful” content.

Signed by privacy campaign group Big Brother Watch and six other civil society organisations – including Liberty, Article 19, English PEN, Open Rights Group, Global Partners Digital and Index on Censorship – the letter also asks Donelan to meet with the coalition to discuss the Bill, the passage of which was paused in July 2022 over parliamentary timetabling issues.

Computer Weekly contacted Donelan for comment, as well as confirmation of whether she would meet the coalition, but had received no response by the time of publication.

Under the Bill’s duty of care, tech platforms that host user-generated content or allow people to communicate will be legally obliged to proactively identify, remove and limit the spread of both illegal and “legal but harmful” content, or could be fined up to 10% of their turnover by the online harms regulator, Ofcom.  

“The law should be upheld online as it is offline, but as currently drafted, the Bill would impose a two-tier system for freedom of expression, with extra restrictions for categories of lawful speech, simply because they appear online,” said Big Brother Watch and others in the open letter, adding that the clause containing this provision should be dropped in its entirety.

While Ofcom has said it will not require the outright removal of legal content, larger platforms that fall into “category 1” – services with the highest risk functionalities and the highest user-to-user reach – will be required to set out how “priority content that is harmful to adults”, such as suicide-related material, is dealt with by their service.

Although Parliament is yet to specify the types of harmful content covered by the Bill, service providers will be required to balance their limiting of such content with the need to protect users’ freedom of speech.

Others, including the House of Lords Communications and Digital Committee and the Legal to Say. Legal to Type campaign group, have also been highly critical of the “legal but harmful” provision, arguing, for example, that the threat of fines would push platforms to err on the side of caution and therefore be much quicker to censor users’ content.

Regarding the new powers of the secretary of state, the open letter added: “It has been widely observed that the Bill gives the secretary of state excessive executive powers to define categories of lawful speech to be regulated and influence the limitations of our online expression. We believe that these powers would be vulnerable to politicisation by a future government.”

The letter further warned that the Bill poses a serious threat to privacy in the UK by creating a new power to compel online intermediaries to use “accredited technologies” to conduct mass scanning and surveillance of citizens on private messaging channels.

“These measures also put at risk the underlying encryption that protects private messages against being compromised by bad actors,” the letter said. “The right to privacy is deeply entwined with the right to freedom of expression and these proposals risk eroding both, with particularly detrimental effects for journalists, LGBTQ+ people and other communities. The Bill must not compel online intermediaries to scan the content of our private messages.”

In July 2022, then-home secretary Priti Patel published an amendment to the Bill that will give powers to regulators to require tech companies to develop or roll out new technologies to detect harmful content on their platforms.

The amendment specifically requires tech companies to use their “best endeavours” to identify, and to prevent people seeing, child sexual abuse (CSA) material posted publicly or sent privately.

Telecommunications regulator Ofcom will have the power to impose fines of up to £18m or 10% of the turnover of companies that fail to comply.

While ministers have argued that end-to-end encryption makes it difficult for tech companies to see what is being posted on messaging services, critics say the technology could be subject to “scope creep” once installed on phones and computers, and could be used to monitor other types of message content, potentially opening up backdoor access to encrypted services.

In September 2021, the Department for Digital, Culture, Media and Sport (DCMS) launched the Safety Tech Challenge Fund, which aims to develop technologies to detect CSA material in end-to-end-encrypted services while, it claims, respecting the privacy of users.

The government announced five winning projects in November 2021, but has yet to publish an independent assessment of whether the technologies are effective at detecting abuse material and protecting the privacy of people using end-to-end encryption.
