Ireland’s Data Protection Commission (DPC), which is responsible for regulating big tech companies with headquarters in Ireland, has called for objective metrics to measure the effectiveness of European data protection regulators.
The comments of Ireland’s Data Protection Commissioner Helen Dixon follow complaints that the DPC is failing to stand up to big tech companies, such as Facebook and Microsoft.
The Irish Council for Civil Liberties argued last year that Ireland has failed to deliver decisions on big tech companies, leaving European Union (EU) enforcement “paralysed”.
This week, Facebook whistleblower Frances Haugen told an Irish parliamentary committee that there should be an independent review of the Irish DPC’s regulation of big tech.
Dixon used her annual report published this week to argue that the EU data regulators must agree metrics to measure the effectiveness of enforcement.
“If the collective goal of all of us is to ensure better protection of people from misuses of their personal data and, indeed, to ensure they are not disadvantaged by ‘over-implementation of GDPR [General Data Protection Regulation] rules’, the types of quantitative and qualitative metrics that need to be assessed must be carefully laid out,” she said.
“Further, enforcement priorities must be set and the impact of different enforcement measures and sanctions must be tracked and analysed over time for impact and value for money,” she added.
“Such metrics must, however, move past both superficial totting exercises and assumptions to the effect that the bigger the fine, the greater the change of behaviour it heralds,” she wrote in the DPC’s 2021 annual report.
GDPR enforcement at risk of damage
Dixon said that “in some respects at least”, the DPC needs to do better and that it would be beneficial for regulators to have a “shared understanding” of what measures they are tracking.
“In the absence of an agreed set of measures to determine achievements or deficiencies, the standing of the GDPR’s enforcement regime in overall terms is at risk of damage,” she said.
Dixon said that this was particularly the case “when certain types of allegations” levelled against the Irish DPC “serve only to obscure the true nature and extent of the challenges” presented by the EU regulatory framework – which requires member states to legislate for the enforcement of data protection across the EU.
“We operate in an environment in which, as things stand, there is no agreed standard by which to measure the impact and success (or otherwise) of our regulatory interventions,” said Dixon.
That has created a vacuum, and “a narrative has emerged” in which the number of cases and the quantity and size of administrative fines levied are treated as the sole measures of success, on the assumption that financial penalties are effective at driving changes in behaviour.
Luxembourg and Ireland were cited at the top of a league table for fines in the EU, but this tells us little about how effective regulation under GDPR has been, Dixon stated.
Figures comparing the number of cross-border regulatory cases also provide little meaningful insight, as the decisions vary widely in complexity and the investigative procedures applied.
Dixon said that, for example, a decision by the DPC “running to several hundred pages, touching on the complex processes of large multi-national organisations and impacting millions of people is measured side by side with a two-line treatment of a comparatively simple issue that has minimal ramifications for data subjects in general”.
“This is clearly not an informative means of measuring the success (or otherwise) of the GDPR,” she added.
The DPC is working alongside other data protection authorities to agree a set of metrics to measure regulatory outputs across the EU “on a like-for-like basis”, with a view to addressing questions about the effectiveness of regulatory interventions, said Dixon.
Regulating big tech
Dixon said that much of the public commentary around the EU about the effectiveness of data protection regulation was set against concerns about the control exercised by large-scale social media platforms.
In Europe, she said, “there is no question” that – even allowing for its imperfections – GDPR will continue to provide “the best-available framework within which the data protection rights of individuals can most effectively be vindicated”.
“We need to identify, with precision, the particular harms and risks we are looking to reduce and/or eliminate,” she added.
Dixon said it is not the role of the DPC or any data protection authority to target “all manifestations” of power exercised by technology platforms.
Dixon also questioned the effectiveness of the EU’s “one-stop shop”, which allows technology companies to be regulated by a single country’s data protection authority, rather than by multiple data protection authorities across Europe.
She said that the one-stop shop had streamlined the regulatory and administrative challenges faced by technology companies, but that it had been less successful at ensuring a harmonised interpretation of GDPR and creating a level-playing field across the EU member states.
Dixon said that not all the activity of multinational companies falls under the scope of the one-stop shop arrangements, leading to decisions by different EU supervisory bodies that are difficult to reconcile.
“That so much cross-border activity can sit outside the one-stop shop brings into question the effectiveness of the coordination efforts that were intended to be a feature of the regulation of cross-border processing operations,” she said. “It may also be said to undermine the idea central to the GDPR – that a level playing field could be created across Europe.”
Cross-border inquiries and decisions
At the end of 2021, the DPC had 30 cross-border inquiries underway, according to the annual report.
They include three ongoing inquiries into Facebook – now known as Meta – examining data breaches, the processing of children’s data on Facebook’s Instagram, and the legal basis Facebook relies on to process personal data.
Following complaints by Austrian lawyer Max Schrems, the DPC is seeking comments from Facebook over the lawfulness of data transfers from the EU to the US, in addition to comments on the legal basis for Instagram and WhatsApp to process personal data.
Other inquiries are underway into tech companies – this includes Google over its processing of location data, LinkedIn over its processing of personal data for advertising, as well as Apple, Twitter, Yahoo, and TikTok.
The DPC received 10,888 queries and complaints from individuals in 2021, which is an increase of 7% on 2020 figures, of which 8,017 had been concluded by the end of the year.
In one notable decision, the DPC imposed a fine of €225m on WhatsApp for a range of compliance failures in 2021.
The DPC submitted eight EU-wide cross-border draft decisions for review by other data protection authorities between May 2018 and December 2021.
Although most data protection authorities have agreed with the DPC’s draft decisions, Germany has lodged six objections, Italy and Poland five, the Netherlands and France four.
Two decisions have been resolved, two are awaiting dispute resolution, and the DPC is considering four that have objections against them.
Cloud service providers will be required to protect data from unlawful access by foreign governments
Cloud service providers in Europe – including Amazon, Microsoft, Google and IBM – will be required to set up safeguards to prevent non-EU governments gaining access to EU data, under draft legislation from the European Commission.
The Data Act aims to free up data generated by internet devices, such as smart watches, connected cars and sensors, for access by consumers and businesses. The act “will raise trust” by introducing mandatory safeguards to protect data held on cloud infrastructure in the EU.
“This will avoid unlawful access by non-EU/EEA governments,” the EU said in documents published this week.
The draft law requires cloud services companies and other data processors in the EU to take “all reasonable” technical, legal and organisational measures, including contractual arrangements, to prevent governments outside the EU from unlawfully accessing non-personal data.
This could include encryption of data, frequent audits, verified compliance with security certification schemes, and corporate policies to protect data, the draft law states.
Court orders from non-EU countries seeking access to data held in the EU can be enforced only where an international agreement, such as a mutual legal assistance treaty, is in place.
Without such an agreement, third countries making data requests will have to comply with strict conditions. These include ensuring that the data provider has the right to request a review from a court in the requesting country, which must take into account their legal interests under EU law and the national law of the relevant EU member state.
Any court orders must set out why requests for data are proportionate, and establish clear links to suspected persons or infringements.
Organisations that receive a request for data from a non-EU country will also have the right to seek the opinion of a regulator over the validity of the request, particularly when the request is for commercially sensitive data, or impinges on national security or defence.
Cloud providers will be required to hand over only the minimum amount of data in response to a request from an overseas court.
They will have a duty to inform the data holder about the request before complying with it – except when the request is made for law enforcement purposes.