What are the options for regulating internet companies?

The UK, like other countries, is likely to introduce new regulations for internet and tech companies such as Facebook and Google. We consider what sort of rules it might introduce.

Much has been written recently about the responsibilities of digital platforms to better regulate the content that is shared on their sites. There has also been recognition of the power that tech giants can exert on elections and public discourse. 

In response, the UK government will shortly publish its Online Harms White Paper, setting out how it intends to address these concerns. What are its recommendations likely to include?

Platform or publisher?

Platforms like Facebook have traditionally argued that they are not responsible for the content published on their sites, positioning themselves as a mere conduit for others’ content rather than a publisher.

However, this view does not reflect the active role social media platforms play in curating content for users, nor does it recognise the adverse effects harmful content can have.

It’s likely that any forthcoming legislation will create a new legal category for digital platforms, recognising that they are neither just a conduit for content nor a publisher.

Defining this new “intermediary” category will be difficult and will require careful consideration to ensure it does not unfairly prejudice smaller players in an already monopolised market.

Under the “intermediary” category, platforms would assume legal liability for hosted content that is identified as harmful to users.

It’s likely that platforms will need to review any content that is potentially illegal under UK law and decide whether to remove, block or flag it. Failing to take these steps within a predetermined timeframe may expose the platform to large fines.

Social media code of practice

The government has already confirmed that it will bring forward a statutory social media code of practice. The code will set a higher bar for the safety provisions platforms are required to offer, together with requirements around transparency reporting.

The code was originally intended to be voluntary, but recent events such as the Momo Challenge and the Christchurch mosque shootings have led to increasing calls for the code to be mandatory.

It’s expected that any code of practice will be overseen by an independent regulator, possibly Ofcom. Whether that role falls to Ofcom or a newly created body, the regulator will need strong enforcement powers, including the ability to levy large fines and conduct audits, similar to the role the Information Commissioner’s Office plays under data protection legislation.

Age-appropriate design

There is clearly a need to better regulate how children use and interact with digital platforms.

The Information Commissioner’s Office has already consulted on the Age Appropriate Design Code, which sets out the design standards that providers of online services and apps used by children will be expected to meet when processing children’s data. The code is expected to be published later this year.

Other likely changes include safety-by-design principles built into the accounts of users under 18. This is likely to mean strong security and privacy settings switched on by default and geolocation settings turned off. Strategies designed to prolong children’s engagement, such as targeted advertising, may also be regulated or prohibited.

Finally, it’s probable that digital platforms will owe a duty of care to users under 18 and to vulnerable adults, requiring them to act with reasonable care to avoid harm. Currently, Facebook’s terms of service exclude liability for any “offensive, inappropriate, obscene, unlawful, or otherwise objectionable content posted by others that users may encounter on their products”, but this exclusion is likely to be overridden by a new statutory liability.

Fair competition

The Cairncross Review, published in February, looked at some of the challenges facing high-quality journalism in the UK. The review found that people’s choice of news provider is increasingly influenced by what online platforms show them, and is thus largely driven by data analytics and algorithms, the operation of which is often opaque.

Social feeds and search results surface snippets and single articles, which Cairncross recognised creates a more disaggregated news experience than reading a traditional news publication. Because online ad revenue depends on clicks, publishers aim to maximise the number of times readers click on content, which encourages clickbait and sensationalist headlines.

As a consequence of these findings, it is likely that the Competition and Markets Authority, possibly via a new digital unit, will be given increased powers to ensure fair competition and beneficial outcomes for consumers and businesses. A key requirement will be the ability to audit digital platforms’ use of algorithms.

Action is needed

Digital platforms are no longer just a conduit for information. What started out as a way to connect with friends is now where a significant proportion of society consumes much of its news.

Facebook now has more than 2.3 billion users, affording the company enormous power to shape how we view events. While digital platforms have undoubtedly benefited society in various ways, the size and power of these companies mean they must be held to objective standards of conduct.

A failure by law and policy makers to act may undermine quality investigative journalism and result in ongoing harm to young people and vulnerable adults.

Imposing legal liability on platforms and giving regulators increased power seems to be the only way to hold digital platforms to account.
