The controversy over Cambridge Analytica’s alleged use of Facebook data for political purposes has damaged not only Facebook’s reputation, but also how businesses think about their staff’s use of technology, a survey has shown.
More than half (55%) of 350 IT decision-makers polled by Barracuda at Cloud Expo Europe in London said they trust Facebook less after the revelations, while 12% said they had deleted their account since the news broke and 29% had amended their personal security and sharing settings.
The survey shows that the scandal has highlighted issues around data privacy, sharing and security, and will affect business policies on how employees use the corporate network.
Of those IT decision-makers who allow access to Facebook through the corporate network, almost half (46%) said that they were planning to educate staff on how to protect themselves and their data, 8% said they were taking more drastic steps and blacklisting the social network altogether, and 7% said they would implement stricter controls on who had access to Facebook.
“While it is perhaps no surprise to learn that use of Facebook both inside and outside the workplace would be affected, this story certainly appears to have had a broader impact,” wrote Chris Ross, international senior vice-president at Barracuda, in a blog post.
Almost two-thirds of respondents (62%) said that as a result of the revelations about Facebook, they had reviewed their corporate policy for allowing user access to non-business-related sites and apps, either in terms of providing new guidance or restricting access. Only 20% planned to maintain a policy to allow free access to non-business sites and apps.
“While restricting access to non-business apps from the workplace can improve productivity, it may not impact Facebook’s ability to collect and share the personal information of users,” said Ross.
“However, these privacy concerns have raised some strong feelings in the business community around Facebook’s viability as a business tool. Our recommendation is that organisations that continue to leverage Facebook as a business platform should review some basic controls.”
Barracuda recommends that organisations:
- Create policies governing who is allowed to share data, so that only authorised groups can post to the corporate Facebook account.
- Be aware of attempts to use information that is available in the company’s social profile for malicious activities, such as spear phishing to target executives.
- Train your users to recognise such attempts and take appropriate action.
“While the longer-term effect on Facebook’s reputation remains to be seen, we expect to see organisations making decisions about whether the platform poses a security risk and how to minimise the threat on those occasions where an alternative option just doesn’t exist,” said Ross.
Facebook is not only counting the cost of losing users’ trust, but also investors’ trust, with tens of billions of dollars wiped off the company’s value since news of the scandal broke.
Facebook CEO and founder Mark Zuckerberg came under fire for being slow to respond to the revelations, but as the company’s stock value took a beating, he promised that Facebook would crack down on abuse of the platform, strengthen its policies and make it easier to revoke apps’ ability to use personal data.
This was quickly followed by the announcement of several steps to give users more control over their privacy. But less than a week later, the company was forced to apologise for retaining videos that users had recorded with Facebook’s webcam tool and then deleted, saying a “bug” had caused them to be stored indefinitely instead.
The bug was reportedly discovered when users downloaded all their Facebook data to see what the company was storing and found that it included videos that were recorded years ago and never posted.
In a statement to TechCrunch, Facebook said: “We discovered a bug that prevented draft videos from being deleted. We are deleting them and apologise for the inconvenience.”
In its latest attempt to regain users’ trust, Facebook has announced changes to part of its ad ecosystem that allows third parties to upload contact lists to target customers on Facebook. From now on, Facebook will require companies to certify that they obtained user consent to use the information in that specific way, according to a leaked email seen by TechCrunch.
In his latest statements related to the scandal, Zuckerberg lashed out at Apple CEO Tim Cook, who has called for stronger privacy regulations that prevent the misuse of data by applying it in new ways without the knowledge of the data owners.
Cook said: “The ability of anyone to know what you’ve been browsing about for years, who your contacts are, who their contacts are, things you like and dislike and every intimate detail of your life – from my own point of view, it shouldn’t exist.”
The Apple CEO also criticised the business model used by Facebook of “selling users to advertisers”, but Zuckerberg told website Vox that he finds “extremely glib” the argument that “if you’re not paying, that somehow we can’t care about you”.
He added: “The reality here is that if you want to build a service that helps connect everyone in the world, then there are a lot of people who can’t afford to pay. And therefore, as with a lot of media, having an advertising-supported model is the only rational model that can support building this service to reach people. That doesn’t mean that we’re not primarily focused on serving people.”