DCMS Select Committee claims Facebook ‘deliberately frustrated’ fake news inquiry

Facebook’s UK public policy manager defends its participation in MPs' inquiry into the spread of fake news and disinformation online, which concludes by declaring an end to self-regulation by social media firms

Facebook stands accused of attempting to “deliberately frustrate” a parliamentary inquiry into what role social media sites should play in halting the spread of fake news and disinformation online.

The Digital, Culture, Media and Sport (DCMS) Select Committee marked the conclusion of its investigation into this area by hitting out at Facebook CEO Mark Zuckerberg for refusing to cooperate personally with its work, and for impeding it in other ways.

The accompanying 111-page concluding report by the cross-party committee of MPs claims Zuckerberg’s snub demonstrates his contempt towards the UK Parliament’s efforts to clamp down on the spread of fake news.

“By choosing not to appear before the committee and by choosing not to respond personally to any of our invitations, Mark Zuckerberg has shown contempt towards both the UK Parliament and the International Grand Committee, involving members from nine legislatures from around the world,” it states.  

In an accompanying statement, committee chair Damian Collins MP went a step further by claiming Zuckerberg needs to realise that while he may not believe he has a case to answer where Parliament is concerned, he does have a duty of care towards the billions of people who use Facebook around the world.  

“Evidence uncovered by my committee shows he still has questions to answer yet he’s continued to duck them, refusing to respond to our invitations directly or sending representatives who don’t have the right information,” said Collins.

“Mark Zuckerberg continually fails to show the levels of leadership and personal responsibility that should be expected from someone who sits at the top of one of the world’s biggest companies,” he added.

“We believe that in its evidence to the committee, Facebook has often deliberately sought to frustrate our work, by giving incomplete, disingenuous and at times misleading answers to our questions.”

In a statement to Computer Weekly, Karim Palant, UK public policy manager at Facebook, defended the scale of its involvement in the committee’s inquiry, and reiterated its commitment to addressing government concerns about the spread of fake news online.

“[We] are pleased to have made a significant contribution to their investigation over the past 18 months, answering more than 700 questions and with four of our most senior executives giving evidence,” said Palant.

The committee’s inquiry also acknowledged the impact disinformation and fake news can have on the outcome of elections, and it is leading calls for a revamp of electoral communications laws to bring them into line with the digital age.

According to Facebook, this is an area it has already started to proactively work on by introducing changes that provide a digital paper trail for every political advert that appears on its site.

“We are open to meaningful regulation and support the committee’s recommendation for electoral law reform. But we’re not waiting. We have already made substantial changes so that every political ad on Facebook has to be authorised, state who is paying for it and then is stored in a searchable archive for seven years. No other channel for political advertising is as transparent and offers the tools that we do,” said Palant.

“While we still have more to do, we are not the same company we were a year ago. We have tripled the size of the team working to detect and protect users from bad content to 30,000 people and invested heavily in machine learning, artificial intelligence and computer vision technology to help prevent this type of abuse.”

Clamping down on disinformation

Overall, the inquiry concludes that social media sites must take greater responsibility for the actions of their users and the content those users share, if the spread of fake news and disinformation across the internet is to be halted.

As such, the committee is calling for social media sites, such as Facebook and Twitter, to take a more proactive approach to removing harmful content and known sources of disinformation from their sites.  

“Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised ‘dark adverts’ from unidentifiable sources, delivered through the major social media platforms we use every day. Much of this is directed from agencies working in foreign countries, including Russia,” said Collins.

“The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data privacy rights.”

The accompanying report expands on this point, and makes the case for bringing in an independent regulator to oversee the activities of social media firms, accompanied by the creation of a compulsory code of ethics that defines what harmful content is.

This would be similar in nature to the powers Ofcom wields in regulating broadcast content, ensuring, for example, that it does not incite abuse or crime and is age appropriate for the time it is shown.

Similarly, the regulator should have powers to take punitive action against social media firms that do not adhere to the code of ethics, including the option to issue large financial penalties.

“If tech companies (including technical engineers involved in creating the software for the companies) are found to have failed to meet their obligations under such a code, and not acted against the distribution of harmful and illegal content, the independent regulator should have the ability to launch legal proceedings against them,” the report states.

Ending an era of self-regulation

Social media firms have long claimed that they merely provide a platform for content to be shared, and are not liable for policing it, but the report states that this stance has to change.

“Social media companies cannot hide behind the claim of being merely a ‘platform’ and maintain that they have no responsibility themselves in regulating the content of their sites,” the report states.

The report goes on to call for a new category of technology company, so that firms are no longer described as merely a “platform” or a “publisher”, and instead become legally liable for any harmful content their users share.

“There is now an urgent need to establish independent regulation. We believe that a compulsory Code of Ethics should be established, overseen by an independent regulator, setting out what constitutes harmful content,” the report states.

“The independent regulator would have statutory powers to monitor relevant tech companies; this would create a regulatory system for online content that is as effective as that for offline content industries.”

As it stands, the social media industry operates on a system of self-regulation, which has proved insufficient to protect its users and society at large, said Collins.

“We need a radical shift in the balance of power between the platforms and the people. The age of inadequate self-regulation must come to an end. The rights of the citizen need to be established in statute, by requiring the tech companies to adhere to a code of conduct written into law by Parliament, and overseen by an independent regulator,” he said.

“These are issues that the major tech companies are well aware of, yet continually fail to address. The guiding principle of the ‘move fast and break things’ culture often seems to be that it is better to apologise than ask permission.”
