
Communications watchdog Ofcom to gain extra powers as online harms regulator

The government has released its initial response to 2019’s Online Harms White Paper, setting out a number of decisions about how it will regulate harmful content on the internet

Ofcom is set to become the UK’s online harms regulator, for which it will be given a range of new powers to protect users from harmful and illegal content, the government has said.

The announcement comes as the government releases its initial response to the Online Harms White Paper, a joint proposal from the Department for Digital, Culture, Media and Sport (DCMS) and the Home Office, published in April 2019.

The white paper introduced the world’s first framework designed to hold internet companies accountable for the safety of their users, and set out the government’s intention to place companies under a statutory duty of care to protect their users from harm.

While the government will publish a full consultation response to the white paper in spring 2020, the initial response sets out a number of decisions the government has already made.

Under the government’s online harms regulation proposals, Ofcom will be given clear responsibilities to protect users online, which will include paying due regard to safeguarding free speech, defending the role of the press, promoting tech innovation and ensuring businesses do not face disproportionate burdens.

The regulator would also require technology companies to explicitly state what content and behaviour is acceptable on their services in clear and accessible terms and conditions.

“It is incumbent on tech firms to balance issues of privacy and technological advances with child protection,” said home secretary Priti Patel. “That’s why it is right that we have a strong regulator to ensure social media firms fulfil their vital responsibility to vulnerable users.”


However, only companies that allow the sharing of user-generated content will be covered, which the government claims is fewer than 5% of UK businesses. At this stage, it is unclear whether every company with a comment section on its website will be included.

To help businesses understand whether their services fall within the scope of the new regulation, Ofcom will also be responsible for developing and providing guidance.

The government has also promised to carry out an economic impact assessment to determine the anticipated effect the regulation will have on businesses and the UK economy as a whole.

Ofcom the obvious choice

A major reason for choosing Ofcom is the government’s belief that, because of its previous experience in the broadcasting and telecoms sectors, it has the expertise and independence necessary for the job.

“We will give the regulator the powers it needs to lead the fight for an internet that remains vibrant and open but with the protections, accountability and transparency people deserve,” said DCMS secretary of state Nicky Morgan.

“With Ofcom at the helm of a proportionate and strong regulatory regime, we have an incredible opportunity to lead the world in building a thriving digital economy, driven by groundbreaking technology, that is trusted by and protects everyone in the UK.”

Privacy concerns

However, civil liberties group Big Brother Watch railed against the regulation proposals in a series of tweets, citing concerns that it would undermine users’ privacy and free speech.

“To be clear the government is making private companies legally responsible for what we, individuals, say to each other. This explicitly requires mass surveillance & censorship. This is going to be a disaster for privacy & free speech online,” it said.

“The ‘duty of care’ regulation proposed today gives social media companies legal responsibility for preventing psychological ‘harm’ – undefined – arising from online chats between members of the public. The result? Social media companies have to police private conversations.”

Code of practice

In terms of next steps, the government said it would also be introducing an interim code of practice which, although voluntary, is intended to bridge the gap until the regulator becomes operational. This code will provide guidance to companies on how to tackle online terrorist and child sexual exploitation and abuse (CSEA) content and activity.

Vinous Ali, associate director of policy at trade association TechUK, said it is vital that Ofcom be given the appropriate resources and skills for the challenge ahead, and that more clarity is provided on the regulation going forward.

“Whilst the direction of travel is encouraging, much more work is needed to deliver clarity on questions of scope, process, legal but harmful content and enforcement. TechUK will continue to work with government, Ofcom and other stakeholders to ensure the new framework has the clarity and proportionality necessary to give confidence to users and business alike,” she said.

Read more about online safety

  • Government announces £30m of new cash to tackle online child harm as part of Spending Review 2019.
  • The UK Council for Internet Safety has an expanded scope to tackle digital abuse and will inform future policy development.
  • The Home Office has boosted its technology stack aimed at fending off digital child abuse with three new tools.

 
