A survey of 2,000 UK consumers shows that 83% believe Facebook should be regulated, supporting the view of MPs who think action is needed to curb the power of the social media firm.
Only 4% said they do not believe Facebook needs to be regulated at all, while 73% felt the social platform is damaging people’s mental health and 70% said they believe fake news is damaging democracy, according to the survey, carried out by One Poll on behalf of Eskenzi PR, which focuses on the cyber security sector.
The survey also revealed huge variations in opinions based on respondents’ age groups, with 70% of the 55-plus group strongly agreeing that Facebook should be regulated, compared with just 36% of those aged 25-34.
The poll also shows UK consumers are concerned that Facebook is not acting responsibly with their data, with many citing privacy issues, data misuse, cyber bullying and the platform's exploitation by cyber criminals.
The concerns about Facebook being used by cyber criminals are borne out by a newly published report by a researcher at the University of Surrey, which found that social media-enabled cyber crimes generate at least $3.25bn a year in global revenue, and that one in five organisations has been infected with malware distributed via social media.
Last week, a report from the UK Parliament's Digital, Culture, Media and Sport (DCMS) Committee concluded that Facebook founder Mark Zuckerberg has failed to show “leadership or personal responsibility” over fake news and that the company is failing in its duty of care to its users.
The report also said social media sites must take more accountability for the actions of their users and the content they share, to halt the spread of fake news and disinformation across the internet.
The report called for a compulsory code of ethics for social media companies, to be overseen by an independent regulator with powers to take punitive action, including large financial penalties, against firms that fail to adhere to it.
Tim Erlin, vice-president at security firm Tripwire, said that although the One Poll survey shows that the vast majority of respondents are concerned about Facebook, regulation could mean many things.
“These results are a good indication that lawmakers’ attempts to regulate Facebook will be greeted positively, but regulation is often a blunt instrument applied to problems with nuance,” he said. “In this case, any regulation developed would have to apply more broadly than to just Facebook. There is room for unintended consequences to other industries and organisations.”
Corin Imai, senior security adviser at DomainTools, said Facebook is entering dangerous territory. “The social media giant once thought of as too big to fail appears fallible for the first time, with such a wholehearted reprimand from the British public,” she said.
“The Cambridge Analytica scandal, combined with the government’s own damning findings about Facebook’s influence over democratic process, means they need to adapt – or face the consequences. While it would be unfair to say they have not done anything to protect the public from fake news and disinformation aimed at influencing democratic processes, there is still a mountain for them to climb in the public consciousness.
“Members of the public have wised up to the fact that elections are the critical infrastructure of our democracy, and that social media is an unpatched vulnerability within it. It’s up to Facebook to patch it, or governments around the world will be forced to intervene.”
Paul Bischoff, privacy advocate at Comparitech.com, said that because Facebook is one of the largest social media platforms, regulation of it would set a precedent that other social media companies would be likely to follow.
“There are other reasons why Facebook deserves more scrutiny as well,” he said. “Facebook pesters users to enter profile information about their relationship status, employers, home town, education, and much more. That stuff is a gold mine for people who want political demographic data to use in targeted advertisements and posts containing fake news. Instagram and Twitter, for example, have much simpler profile pages with one-line bios and fewer personal details.”
The way Facebook deals with developers also makes it a target for regulation, said Bischoff. “Third-party apps and websites allow users to log in with their Facebook accounts in return for personal details stored in those accounts,” he said.
“Other social networks do this, too, but Facebook and Google accounts are by far the most popular ones to log in with. Although Facebook has strict guidelines about how that data can be gathered and used, it has little means of enforcement until after the damage is done. This was the case with Cambridge Analytica.
“Facebook has vastly improved how it deals with third-party apps and websites, such as by removing third-party access to websites and apps that the user hasn’t interacted with recently. But Facebook, in the past, has given certain developers preferential treatment by allowing them to circumvent those rules or be grandfathered into older guidelines, and that is certainly a reasonable cause for EU regulators to step in.”
Bischoff added: “Facebook also needs to take greater care in vetting advertisers to weed out fake news. I think this has improved since Cambridge Analytica, but lawmakers don’t seem satisfied with Facebook’s self-policing efforts.”
Responding to the DCMS report, Facebook said it is committed to addressing government concerns about the spread of fake news online. In a statement to Computer Weekly, Karim Palant, UK public policy manager at Facebook, said the company has already begun addressing the impact that disinformation and fake news can have on election outcomes, introducing changes that provide a digital paper trail for every political advert that appears on the site.
“We are open to meaningful regulation and support the committee’s recommendation for electoral law reform,” said Palant. “But we are not waiting. We have already made substantial changes so that every political ad on Facebook has to be authorised, state who is paying for it and then is stored in a searchable archive for seven years. No other channel for political advertising is as transparent and offers the tools that we do.
“While we still have more to do, we are not the same company we were a year ago. We have tripled the size of the team working to detect and protect users from bad content to 30,000 people and invested heavily in machine learning, artificial intelligence and computer vision technology to help prevent this type of abuse.”