
Better the data you know – how GDPR is affecting UK tech companies

As the dust settles from the General Data Protection Regulation, the implications for technology firms in the UK are becoming clearer

Now the EU’s General Data Protection Regulation (GDPR) is in force, we are starting to see a shift in the attitude of UK tech companies as they learn what compliance with the new law means for them and as opinions around privacy evolve.

The UK’s implementation of GDPR is the Data Protection Act 2018 (DPA). It mirrors GDPR unless a tech company operates in, or provides technology to support, one of the following sectors: credit referencing, law enforcement, national security and intelligence, journalism, academia, the arts, crime and taxation, regulatory functions, education, health and social care.

There are, of course, carve-outs and clarifications relevant to any industry, particularly with regard to processing within an employment function or as part of corporate finance, legal advice or proceedings. These are, by their nature, limited to specific business processes.

Overall, the message is consistent – pay attention to your data processing and the privacy rights of individuals, or risk enforcement action.

One function of the DPA was to scope the roles and responsibilities of the UK regulator, the Information Commissioner’s Office (ICO), including setting out a procedure for how to deal with breach and enforcement.

One area of note affects companies marketing an information society service (ISS) to a child. The DPA confirms that the age at which a child can give consent for data processing is 13, lowering it from the European benchmark of 16.

Under this age, consent must be given by an individual with parental responsibility. Any business operating in this field must, considering the technology available, make reasonable efforts to verify that appropriate consent has been given.

An ISS is one delivered by electronic means (using electronic equipment and processing), provided at a distance (without the physical presence of the customer), and at the request of the recipient (normally for remuneration).
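In practice, the age threshold described above becomes a simple gate in an ISS sign-up flow. The sketch below is illustrative only: the function name and the jurisdiction table are assumptions, with the UK figure of 13 taken from the DPA and 16 as the GDPR default.

```python
# Illustrative sketch: deciding whether consent must come from a holder
# of parental responsibility before processing a child's data for an ISS.
# The UK threshold (13, per the DPA 2018) and the GDPR default (16) are
# from the article; other jurisdictions must be checked against local law.

GDPR_DEFAULT_CONSENT_AGE = 16

CONSENT_AGE_BY_JURISDICTION = {
    "UK": 13,  # Data Protection Act 2018
}

def parental_consent_required(age: int, jurisdiction: str) -> bool:
    """Return True if consent must be given by a person with parental responsibility."""
    threshold = CONSENT_AGE_BY_JURISDICTION.get(jurisdiction, GDPR_DEFAULT_CONSENT_AGE)
    return age < threshold

# A 14-year-old can consent in the UK, but not under the GDPR default age.
print(parental_consent_required(14, "UK"))  # False
print(parental_consent_required(14, "FR"))  # True
```

Note that, as the article says, passing this check is not the whole job: the provider must also make reasonable efforts, given available technology, to verify that the consent obtained is genuine.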

Interestingly for tech companies, the ICO has promised future publications in the form of codes of practice. These will give practical guidance regarding: the new legislation on data sharing; standards of age-appropriate design of ISS likely to be accepted by children (considering likely use and development needs); direct marketing; regulatory action including information, assessment, enforcement and penalty notices; and data processing for journalism.

Consultations on the first two – data sharing and age-appropriate design – are live and open until September 2018.

Online privacy and fake news

On 29 July 2018, parliament published an interim report on disinformation and “fake news”, which made several recommendations worth considering from a privacy angle.

The first is the need for the ICO to have additional resource, specifically technical expertise to be able to scrutinise the operations of tech companies and how they use data, both now and in relation to future trends.

With the ICO already under-resourced, by its own admission, due to the loss of staff to the private sector to run GDPR compliance programmes, significant investment will be needed to ensure the regulator can keep abreast of technology.

The suggestion is for a levy to be introduced for tech companies operating in the UK, in a similar way to how the banking sector contributes to the funding of its regulator, the Financial Conduct Authority.

Should this be a concern for tech companies? Some comfort can be taken from the fact that the report narrowly defined a “tech company” as “different types of social media and internet service providers, such as Facebook, Twitter and Google”.

This suggests that any levy would be limited to particular operators, namely the big players. On the other hand, this may indicate the direction of travel to wider application for tech companies, which should already consider whether they need to pay a data protection fee to the ICO (or risk a financial penalty).
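The data protection fee mentioned above is tiered by organisation size. As a rough sketch, the tier boundaries and amounts below reflect the 2018 charges regulations as commonly reported, but they are assumptions for illustration and should be checked against the ICO’s own fee guidance before being relied on.

```python
# Illustrative sketch of the ICO data protection fee tiers.
# Figures and thresholds are assumptions based on the Data Protection
# (Charges and Information) Regulations 2018; verify against the ICO's
# current self-assessment guidance.

def ico_fee_tier(staff: int, turnover_gbp: float) -> tuple:
    """Classify an organisation into a fee tier: (tier name, annual fee in GBP)."""
    if staff <= 10 or turnover_gbp <= 632_000:
        return ("Tier 1 - micro", 40)
    if staff <= 250 or turnover_gbp <= 36_000_000:
        return ("Tier 2 - SME", 60)
    return ("Tier 3 - large", 2_900)

print(ico_fee_tier(8, 500_000))       # ('Tier 1 - micro', 40)
print(ico_fee_tier(120, 20_000_000))  # ('Tier 2 - SME', 60)
```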

Legal loopholes

Second, the select committee report recommended closing what it deems to be a legal loophole whereby, following Brexit, tech companies may be able to avoid UK data protection law by processing the personal data of UK users from bases in the US.  

Further commentary on this is required, especially as the DPA applies to the processing of personal data in the context of the activities of a business or person operating in the UK – whether or not that processing takes place in the UK, or even in the EU. This is a very broad application.

Finally, the report identifies a need for enhanced auditing of the non-financial aspects of tech companies, including security mechanisms and algorithms, to ensure responsible operations.

The suggestion is that the ICO should take this role but, if this comes to fruition, it emphasises again the need for substantial investment for the regulator.

Although it could be an error, the recommendation was expressed for “technology companies”. This raises the question: is the intention for algorithmic auditing to extend beyond social media and internet service providers?

We await, with interest, a government white paper on these and other issues this autumn.  

What’s happening in practice? 

Within the legal market, we are seeing some key themes for tech companies:

  1. With the increased public awareness of GDPR, startups are more conscious of compliance and are keen to get it right from the outset. It is often a challenge for this sector to expend resource on compliance when funds are tight and the focus is on growth.
  2. The continued development and use of privacy-enhancing technologies.
  3. The adaptation of operational models to avoid personal data collection in the first place.
  4. The need to simplify privacy notices and terms and conditions – a criticism that has long plagued the big market players.
  5. A shift in understanding of processing. Tech companies that traditionally did not see themselves as data processors have been challenged. Even within a software-as-a-service model, the provider’s ability to tap into controller data for ongoing support and maintenance can be enough to constitute processing. This is leading to the need to consider and negotiate additional contractual protections. With a raft of data-processing agreements circulating, standard contractual clauses from the ICO would be welcome to lessen negotiation timescales.
  6. The risk of going “off scope”. Tech companies operating as processors should take care to keep within the confines of the controller’s documented instructions. Deviations are likely to result in the processor becoming a controller in its own right and, under the DPA, an offence may be committed by unlawfully obtaining, disclosing or retaining personal data without the controller’s consent.

With the ICO describing privacy and security assessments as “an evolutionary process” for organisations, these themes will doubtless continue to dominate for the foreseeable future.
