
ICO could use new GDPR powers in Facebook probe

UK privacy watchdog is considering using new powers in its investigation of Facebook, while the government is considering new laws to police social media firms

The Information Commissioner’s Office (ICO) could use new powers under the EU’s General Data Protection Regulation (GDPR) to examine software code to check whether Facebook’s processes comply with the new data protection rules.

Specifically, from 25 May, the GDPR grants the ICO the power to inspect a private company’s intellectual property through algorithm audits.

This will enable ICO investigators to check whether the software used by Facebook and other companies is processing European citizens’ personal data in a fair and transparent way, as required by the GDPR.

According to the ICO, these algorithm audits are not just a technology process, and typically involve interviews with company executives and algorithm designers about how data is processed.

Typically, the ICO will liaise with the company, but it does have the power to carry out a no-notice inspection if necessary.

Any company that fails to comply with the GDPR after 25 May could face fines of up to 4% of its global annual turnover, which in Facebook’s case would be $1.6bn, based on 2017 figures.

Facebook is among 30 organisations under investigation by the ICO for misusing personal data for political and other purposes.

The ICO announced the investigation in April following the data exploitation scandal involving London-based data mining firm Cambridge Analytica.

Nearly 1.1 million Britons were among the 87 million Facebook users whose profile data was extracted by a quiz app downloaded by just 305,000 people.

Steve Wood, the ICO’s deputy commissioner, told The Telegraph that enforcement officers will visit Facebook’s Dublin site to look “behind the scenes” if they suspect that the company is processing personal data improperly.

ICO investigators could also use the new GDPR powers to check that Facebook’s software processes personal data in the way the social network claims it does.  

While the ICO’s investigation continues, it emerged at the weekend that the UK government is considering new laws to police social media firms.

In a series of interviews, Matt Hancock, secretary of state for digital, culture, media and sport, said new legislation is needed because self-policing by social media firms has not worked.  


One of the challenges is that while the government engages with big firms such as Facebook, Google and Twitter, many smaller platforms with thousands of members are not engaging with the government.

Hancock revealed that out of 14 social media companies that were invited to hold talks with the government, only four had responded.

The government is particularly keen to find ways of ensuring that social media platforms are not accessible to children who are not old enough to have an account.

In an interview with the BBC, Hancock said the government is aiming to get to a position where all users of social media have to have their age verified.

Government research shows that 60% of UK citizens polled had witnessed inappropriate or harmful content online, 40% had experienced online abuse and 40% thought their concerns were not taken seriously by social media companies.

Hancock wrote in The Telegraph: “I want the UK to be the best place in the world to be a digital citizen, where we can all benefit from an internet that is transformative, exciting and free, but that still offers protection from the worst online harms.”

He said that for too long, social media platforms had been allowed “to mark their own homework” and that although some progress has been made with the likes of Google, Facebook and Twitter, the government is considering new legislation that will improve online safety without stifling innovation.

“Hand in hand with the new home secretary, we will publish a white paper later this year setting out a range of measures to address both legal and illegal online harms,” he wrote.

Hancock said the legislation could make it mandatory for social media companies to sign up to a code of practice that requires them by law to be more transparent on the extent of harmful content on their platforms, as well as what they are doing to stamp it out.

But in an interview with ITV, he admitted that any new legislation could take “a couple of years” in order to “get the details right”.
