The UK government has introduced the world’s first framework designed to hold internet companies accountable for the safety of those using their services, as well as to tackle potential harm to users.
Under the Online Harms whitepaper released 8 April 2019, which is a joint proposal from the Department for Digital, Culture, Media and Sport (DCMS) and the Home Office, social media and technology companies will need to take “reasonable steps” to protect their customers from threats stemming from their offerings.
These range from cyber bullying, terrorism, disinformation and child sexual exploitation to encouragement of behaviours that may not be illegal but are still highly damaging. The proposed laws will apply to companies of any size that allow sharing or access to user-generated content or interaction with other users online.
Announcing the new measures, prime minister Theresa May argued that internet companies “have not done enough for too long” to protect their users, particularly young people, from harmful content.
“That is not good enough, and it is time to do things differently. We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe,” said May.
“Online companies must start taking responsibility for their platforms, and help restore public trust in this technology.”
An independent regulator will oversee and enforce the statutory duty of care. The DCMS and the Home Office are currently consulting on powers to shut down websites, issue “substantial fines” and hold individual members of senior management to account if companies fail to comply with the new laws.
A variety of technology companies will be subject to the laws, including social media platforms, file hosting sites, public discussion forums, messaging services, and search engines.
“The era of self-regulation for online companies is over,” said digital secretary Jeremy Wright, who added that voluntary actions from companies to tackle online harm have not been consistent or good enough.
“Tech can be an incredible force for good, and we want the sector to be part of the solution in protecting their users. However, those that fail to do this will face tough action,” he said.
The industry welcomed the new measures but raised many of the questions still open for consultation. Commenting on the Online Harms whitepaper, TechUK’s head of policy, Vinous Ali, said that while the new measures are a “significant step forward”, the UK is still a long way from achieving its goals.
“Delivering this framework will not be easy and will not be achieved if difficult problems and trade-offs are ignored. Some of the key pillars of the government’s approach remain too vague,” she said.
“It is vital that the new framework is effective, proportionate and predictable. Clear legal definitions that allow companies in scope to understand the law and therefore act quickly and with confidence will be key to the success of the new system.”
Ali noted that not all of the legitimate concerns about online harms can be addressed through regulation.
“The framework must be complemented by renewed efforts to ensure children, young people and adults alike have the skills and awareness to navigate the digital world safely and securely,” she said, adding that the duty of care is a “deceptively straightforward-sounding concept” that is not clearly defined and is “open to broad interpretation”.
Clarification of the term’s legal meaning, and of how the government expects companies to comply with such a potentially broad obligation, which could conflict with other rights, is needed, Ali said, particularly when it comes to private communications on their platforms.
TechUK was also critical of the whitepaper’s references to a regulatory body as central to enacting the new proposals. The trade body said clarity about the intended goals, and the trade-offs necessary to achieve them, is key.
“A regulator, whether new or existing, will not thank anyone for being handed a vague remit,” Ali said. However, TechUK was satisfied with the government’s commitment to a risk-based, proportionate approach.
“A one-size-fits-all approach would have been inappropriate and inevitably ineffective. We hope this more nuanced approach will help produce better outcomes and reflect the diversity of services that are brought into scope,” Ali said.
“The regulator must become engaged effectively and work with businesses to help improve and develop effective practices that build user trust and confidence,” she added.
The announcement of the Online Harms proposals follows a government review of technology giants carried out by an expert panel led by former US president Barack Obama’s chief economic advisor, Jason Furman. The review’s recommendations for the UK digital economy included a new code of conduct for large companies.
The measures set out in the Online Harms whitepaper include:
- A new statutory “duty of care” to hold companies accountable for the safety of their users, as well as a commitment to tackle the harm caused by their services.
- Further stringent requirements on tech companies to ensure child abuse and terrorist content is not disseminated online.
- Powers for the regulator to force social media platforms and others to publish transparency reports on the amount of harmful content on their platforms and what they are doing to address it.
- A requirement for companies to respond to users’ complaints and act to address them quickly.
- Codes of practice, issued by the regulator, with measures such as requirements to minimise the spread of misleading and harmful disinformation with dedicated fact checkers, particularly during election periods.
- A new “safety by design” framework to help companies incorporate online safety features in new apps and platforms from the start.
- A media literacy strategy to equip people with the knowledge to recognise and deal with a range of deceptive and malicious behaviours online, including catfishing, grooming and extremism.