The government’s plan to impose a duty on technology platforms to deal with “legal but harmful” content in the Online Safety Bill would be ineffective and threatens freedom of speech, a House of Lords report has warned.
Under the Bill’s duty of care, tech platforms that host user-generated content or allow people to communicate will be legally obliged to proactively identify, remove and limit the spread of both illegal and legal but harmful content – such as child sexual abuse, terrorism and suicide material – or they could be fined up to 10% of their turnover by the online harms regulator, now confirmed to be Ofcom.
In its report, published on 22 July 2021, the House of Lords Communications and Digital Committee said that although it welcomes the Bill’s proposals to oblige tech platforms to remove illegal content and protect children from harm, it does not support the government’s plan to make companies moderate content that is legal, but may be objectionable to some.
Instead, the Lords argued that existing laws – such as those on harassment or grossly offensive publications – should be properly enforced, and any serious harms not already made illegal should be criminalised.
“For example, we would expect this to include any of the vile racist abuse directed at members of the England football team which is not already illegal,” peers wrote in the report.
“We are not convinced that these proposals are workable or could be implemented without unjustifiable and unprecedented interference in freedom of expression. If a type of content is seriously harmful, it should be defined and criminalised through primary legislation.
“It would be more effective – and more consistent with the value which has historically been attached to freedom of expression in the UK – to address content which is legal but some may find distressing through strong regulation of the design of platforms, digital citizenship education, and competition regulation.”
In terms of dealing effectively with illegal content online, the peers said platforms should also be made to contribute more resources to help police enforce pre-existing laws.
The report also pointed out that platforms’ moderation decisions were often “unreasonably inconsistent and opaque” and could be influenced by commercial or political motivations.
It added that, given the market is dominated by a handful of powerful companies such as Facebook and Google, “rather than allowing these platforms to monopolise the digital public square, there should be a range of interlinked services between which users can freely choose and move”.
To achieve this, peers said the Digital Markets Unit (DMU) – which was set up to scrutinise the dominance of tech giants in the UK economy and has begun its work on developing legally binding codes of conduct to prevent anti-competitive behaviour in digital markets – should make structural interventions to increase competition, which would include mandating interoperability between social media services.
“The benefits of freedom of expression online mustn’t be curtailed by companies such as Facebook and Google, which are too often guided more by their commercial and political interests than by the rights and wellbeing of their users,” said committee chair Lord Gilbert.
“People have little choice but to use these platforms because of the lack of competition. Tougher regulation is long overdue and the government must urgently give the Digital Markets Unit the powers it needs to end these companies’ stranglehold.”
Because of the key role search engines play in facilitating freedom of expression, both through disseminating individuals’ and publishers’ content and providing access to information from which opinions can be formed, the report added: “The lack of competition in this market is unacceptable.”
It said the DMU should therefore make further structural interventions in the search engine market, which would include “forcing Google to share click-and-query data with rivals and preventing the company from paying to be the default search engine on mobile phones”.
Gilbert added that while freedom of speech is not an unfettered right, the right to speak your mind is the hallmark of a free society. “The rights and preferences of individuals must be at the heart of a new, joined-up regulatory approach, bringing together competition policy, data, design, law enforcement and the protection of children,” he said.
At the end of June 2021, the newly formed campaign group Legal to Say. Legal to Type. criticised the Online Safety Bill as overly simplistic, arguing that it cedes too much power over freedom of speech in the UK to Silicon Valley firms.
Speaking at a press conference to launch the group, Conservative MP David Davis, who characterised the Bill as a “censor’s charter”, said: “Silicon Valley providers are being asked to adjudicate and censor ‘legal but harmful’ content. Because of the vagueness of the criteria and the size of the fine, we know what they’re going to do – they’re going to lean heavily into the side of caution.
“Anything that can be characterised as misinformation will be censored. Silicon Valley mega-corporations are going to be the arbiters of truth online. The effect on free speech will be terrible.”