The UK government should act immediately to deal with a “pandemic of misinformation” and introduce a draft online harms bill as a matter of urgency to rebuild trust in democratic institutions, warns a report from the Lords Democracy and Digital Technologies Committee.
The report, Digital technology and the resurrection of trust, makes 45 recommendations which, if implemented, could help restore public trust in government institutions and limit the power that has been ceded to a “few unelected and unaccountable digital corporations”, primarily Facebook and Google, it said.
The recommendations are built around the understanding that digital platforms are primarily advertising platforms – designed to optimise the performance of ads for their business customers – and, as such, act as intermediaries between those customers and user content that they did not create.
“We are living through a time in which trust is collapsing,” said the committee’s chairman, Lord Puttnam. “People no longer have faith that they can rely on the information they receive, or believe what they are told. That is absolutely corrosive for democracy. Part of the reason for the decline in trust is the unchecked power of digital platforms.
“These international behemoths exercise great power without any matching accountability, often denying responsibility for the harm that some of the content they host can cause, while continuing to profit from it.
“We have set out a programme for change that, taken as a whole, can allow our democratic institutions to wrestle power back from unaccountable corporations and begin the slow process of restoring trust. Technology is not a force of nature and can be harnessed for the public good. The time to do so is now.”
The report urges the government to immediately publish a draft online harms bill, which should clearly cover the impact of disinformation in its scope and give Ofcom, the proposed online harms regulator, the power to hold digital platforms legally responsible for content they recommend to large audiences, or that is produced by users with a large following on the platform.
The committee said online harms legislation should make clear that platforms’ duty of care extends to preventing generic harm to democracy, as well as specific harm to an individual, and that the legislation should be introduced within a year of the report’s publication.
The proposed sanctions that Ofcom should be able to impose include fines of up to 4% of an organisation’s global turnover, and the ability to enforce internet service provider (ISP) blocking for serial offenders, the committee said.
The committee also called on the government to appoint a digital ombudsman for content moderation, which would provide a point of appeal for people who feel let down by technology platforms.
“This ombudsman’s decisions should be binding on the platform and, in turn, create clear standards to be expected for future decisions for UK users,” said the report. “These standards should be adjudicated by Ofcom, with platforms able to make representations on how they are applied within their moderation processes.”
To ensure that technology platforms are transparent about their use of data and algorithms, the report recommends giving Ofcom the power “to compel companies to facilitate research on topics that are in the public interest”, after a number of witnesses told the committee that “no one had the types of data needed to do the necessary research” on what is really happening on these platforms.
The report noted that Facebook had given researchers access to a dataset containing more than one billion data points in February 2020, but the researchers have since complained that the social media giant’s “restrictive” interpretation of the General Data Protection Regulation (GDPR) has made data sharing between them difficult.
“It is worth noting that it appears that Facebook do not have such a restrictive interpretation of GDPR when it comes to sharing data with their commercial partners,” said the committee. “Facebook’s commercial partners have greater data access in some areas than external researchers.”
It added that for research to be truly independent, it must be regulators and academics, rather than platforms themselves, who choose the research topic.
On top of this, said the committee, Ofcom should be given the power – and be properly resourced – to conduct periodic audits of the platforms’ algorithmic capabilities, which includes access to the systems’ training data and comprehensive information from the platforms on what content is being recommended.
“Ofcom should have the power to request any data relevant to ensure that platforms are acting in accordance with their duty of care,” it said.
In response to questions from Computer Weekly during a virtual briefing on the report, Democracy and Digital Technologies Committee member Lord Holmes said that “of course there will be” resistance to the proposed transparency measures.
“Will there be attempts to drag feet, put in work-arounds, try and defend those [business] models to the hilt? Of course they will, because… look at the bucks those models generate,” he said.
“It’s an absolutely stunning economic model. It’s absolutely double, triple, quadruple-dipping in terms of how the thing works and how every element is monetised and remonetised, but often off the back of extreme, often hateful, often divisive content.”
The report recommended that all these new regulatory functions should be overseen by a joint parliamentary committee from both Houses, which is constituted in such a way that there can be no government majority among its members.
A committee of regulators – which would include the Competition and Markets Authority, the Information Commissioner’s Office and Ofcom among others – should also be established separately to allow for joint investigations between different regulators, said the committee.
Both Lords and MPs have previously expressed frustration about delays in the online harms bill, as well as the lack of a full government response to the online harms whitepaper published in April 2019, which put forward the world’s first framework designed to hold internet companies accountable for the safety of their users.
The government gave its initial response to the whitepaper in February 2020, when it put forward the proposal for Ofcom to be the online harms regulator. But in the government’s own press release announcing its initial response, it said the full response would be “published in the spring”.
Lord Puttnam added: “It is time for the government to get a grip of this issue. They should start by taking steps to immediately bring forward a draft online harms bill. We heard that, on the current schedule, the legislation may not be in place until 2024. That is clearly unacceptable.”
The report’s conclusions are the result of more than 100 pieces of written evidence, which were accompanied by 66 witness testimonies across 26 evidence-gathering sessions.