
Covid-19 contact-tracing apps: the key data protection issues

Organisations intending to deploy contact-tracing apps will need to maintain high standards of privacy compliance, security and ethics to guarantee sufficient uptake and meaningful results

Contact-tracing apps are gaining momentum as a possible way out of lockdowns, but their success will ultimately turn on the extent to which they are installed and used correctly.

Essentially, the contact-tracing process involves tracking those who have been in contact with a person infected with Covid-19 and notifying them so that appropriate steps can be taken, such as self-isolation or testing. The goal is containment of the virus. While traditional contact tracing is labour-intensive, the hope is that digital contact tracing will automate the process, allowing for a swifter and more widespread response.

Most major global health authorities are spending significant resources developing nationwide contact-tracing apps to tackle the virus. The NHS, for example, has outlined its own plans with the NHSX app, which people can use to self-report symptoms, receive advice and, in the future, elect to give the NHS further information so that it can identify hotspots.

Many private organisations are also looking into developing their own apps to provide safe environments, encouraging workers to return to work and the public to visit their venues. However, overriding concerns about the privacy and security implications remain.

Google and Apple’s contact-tracing framework

To help tackle these concerns, Google and Apple have combined forces to establish a contact-tracing framework (CTF) using Bluetooth technology, allowing links to be made between phones that have been in close proximity with each other.

When an individual contracts the virus or displays symptoms, they can notify the app, which can then alert those whose phones were near theirs during the past 14 days. The CTF does not track location, and matching happens in a decentralised way on users' handsets, which limits use of the data for wider analysis.
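
To make the decentralised model concrete, the simplified Python sketch below shows the basic flow: each handset records the rotating identifiers it observes over Bluetooth, and matching against identifiers voluntarily published by diagnosed users happens entirely on the device. The identifiers and function names are purely illustrative; the real framework derives its identifiers cryptographically rather than exchanging raw strings.

    from datetime import datetime, timedelta

    # Each handset keeps the rotating identifiers it has observed over Bluetooth.
    # Matching against identifiers published by diagnosed users happens locally,
    # so no central server ever sees the user's contact history.
    EXPOSURE_WINDOW = timedelta(days=14)

    # (identifier, time observed) pairs recorded by this handset, illustrative only
    observed = [
        ("id-a3f1", datetime(2020, 5, 1, 9, 30)),
        ("id-77bc", datetime(2020, 5, 3, 18, 5)),
    ]

    def check_exposure(observed_ids, published_diagnosis_ids, now):
        """Return True if an identifier seen in the last 14 days matches a published one."""
        recent = {ident for ident, seen_at in observed_ids
                  if now - seen_at <= EXPOSURE_WINDOW}
        return not recent.isdisjoint(published_diagnosis_ids)

    # Identifiers voluntarily uploaded by users who reported the virus or symptoms
    published = {"id-77bc", "id-0d42"}

    if check_exposure(observed, published, datetime(2020, 5, 10)):
        print("Possible exposure in the past 14 days: show self-isolation advice")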

The Information Commissioner’s Office (ICO) has given the framework its cautious approval, noting that the proposals “appear aligned with the principles of data protection by design and by default”.

The NHS’s approach

Whilst most European health authorities have adopted the CTF, the NHS, along with the French health authorities, has rejected its decentralised approach in favour of a centralised one (whereby matching happens via a server).

This departure from the crowd has raised privacy eyebrows. In the NHS’s view, a centralised system will offer better insight into the spread of the virus, as well as any infection hotspots, therefore making it more effective.

It has dismissed the argument cited by the German authority that contact-tracing apps based on the centralised approach will have a significant impact on user experience unless certain changes are made to iPhone settings (which Apple is unwilling to make). Another criticism of the centralised approach is that holding the data in one place increases the risk of misuse and/or theft.
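
For contrast, a centralised system performs that matching on a server. The simplified sketch below (all names are hypothetical, and this is not the NHSX implementation) illustrates why such a design gives a health authority more visibility into the contact graph, and equally why holding the data in one place creates a larger single target.

    # In a centralised design, a handset that reports symptoms uploads the
    # anonymous identifiers it has recently observed, and the server decides
    # who to notify. Names here are hypothetical, not the NHSX implementation.
    registered_devices = {"dev-123", "dev-456", "dev-789"}  # IDs issued at install

    def send_push_notification(device_id, message):
        print(f"notify {device_id}: {message}")  # placeholder for a real push service

    def handle_symptom_report(reporting_device, observed_device_ids):
        """Server-side matching: work out which registered devices to alert."""
        contacts = registered_devices & set(observed_device_ids)
        contacts.discard(reporting_device)
        for device_id in sorted(contacts):
            send_push_notification(device_id, "Possible exposure, please self-isolate")
        return contacts

    handle_symptom_report("dev-123", ["dev-456", "dev-999"])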

Another blow to the NHSX app has come from a recent report by the Joint Committee on Human Rights (JCHR), which concluded that although the app could potentially pave the way out of the current lockdown, there are significant concerns that, in its current form, it does not sufficiently protect the right to privacy.

Among other things, the JCHR’s report calls for new legislation specifically governing the app’s deployment, with guaranteed data and human rights protections. It suggests various obligations, including:

  • A clear description of the limited purposes of the app, so that the data cannot be utilised for any other purpose.
  • A requirement that data be held locally on a user’s device and deleted every 28 days, unless a user has notified the app they have the virus, or suspect they do, and has chosen to upload their data.
  • A requirement that all centrally held data must be subject to the highest security protections and standards.
  • A restriction limiting access to the data to those with statutory authorisation.
  • A prohibition on any reconstruction of the centrally held data to find additional information about a user.
  • A requirement that the health minister undertake a review of the app every 21 days, followed by a report to Parliament on the app’s efficacy and privacy protections.
  • Powers for a digital contact-tracing human rights commissioner to oversee the roll-out of digital contact tracing, look into individual complaints, and make binding recommendations on data protection measures.

Despite concerns raised about the NHSX app, it should be noted that the ICO has been working closely with the NHS to help ensure a high level of transparency and governance, and is currently reviewing the data protection impact assessments relating to the NHS’s pilot of the app on the Isle of Wight.

What are the data protection issues?

As contact-tracing apps involve the extensive collection and use of personal data, it is critical that users believe that their data will be processed safely, securely and fairly. It is therefore imperative that app developers keep privacy by design at the forefront of their planning.

Security – Though the security considerations will differ for centralised and decentralised CTFs, it is vital that the infrastructure of contact-tracing apps is secured with appropriately robust cryptographic and security techniques. The ICO recommends securing the data both at rest and in transit.
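
As an illustration of protection at rest, the sketch below uses the Fernet symmetric scheme from the open-source cryptography package to encrypt a stored contact record before it is written to local storage; the record structure and key handling are assumptions for the example, and a real app would keep the key in the platform's secure keystore. Data in transit would typically be protected separately, for example with TLS between the app and any server.

    from cryptography.fernet import Fernet  # pip install cryptography

    key = Fernet.generate_key()      # in practice, held in the device's secure keystore
    cipher = Fernet(key)

    # Hypothetical contact record, encrypted before it is written to local storage
    record = b'{"contact_id": "id-77bc", "seen_at": "2020-05-03T18:05:00"}'
    stored = cipher.encrypt(record)

    # Later, when the app needs the record again
    assert cipher.decrypt(stored) == record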

Data minimisation and purpose limitation – As the data is highly sensitive, privacy campaigners have raised concerns over the potential for it to be used for other purposes. The ICO advises app developers to be transparent with users about the purpose of the app (and whether this is likely to evolve), and only collect the minimum amount of data necessary to achieve said purpose. The data should only be stored for the minimum amount of time necessary, and users should be told the retention period and the reasoning behind it.
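
A retention policy of this kind is straightforward to enforce in code. The sketch below (with a hypothetical record structure) keeps proximity records for a fixed period, 28 days here to echo the JCHR recommendation above, and purges anything older on a regular sweep.

    from datetime import datetime, timedelta

    RETENTION_PERIOD = timedelta(days=28)  # the retention period disclosed to users

    def purge_expired(records, now):
        """Drop any proximity record older than the declared retention period."""
        return [r for r in records if now - r["seen_at"] <= RETENTION_PERIOD]

    records = [
        {"contact_id": "id-a3f1", "seen_at": datetime(2020, 4, 1)},
        {"contact_id": "id-77bc", "seen_at": datetime(2020, 5, 3)},
    ]
    records = purge_expired(records, datetime(2020, 5, 10))  # the April record is removed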

Transparency – Transparency is a standard that any app must meet to be General Data Protection Regulation (GDPR) compliant. Privacy notices must be comprehensive but easy to digest, and developers should be transparent about their design choices, any risks their approach poses to individual rights, the benefits the app seeks to achieve, and the purposes of their app.

User control – Users must always be able to exercise their rights. This includes rights under the GDPR, and their right to opt out without any negative consequences. App developers should decouple functions (so users can benefit from one function without being compelled to provide data for another) and ensure users can easily exercise their rights through the app.

Legal basis – The legal bases relied on for processing this data will depend on the nature of the data, how it is collected, and its purpose. Although the ICO noted in its most recent guidance to app developers that consent is likely to be the lawful basis when processing personal data in connection with use of such apps, it has previously expressed concern about how the consent may be collected, and the impact on functionality if it is withdrawn.

It has therefore not ruled out the ability for organisations to rely on other lawful bases, provided organisations can demonstrate the processing is necessary and can be achieved by no other means. It has also clarified that although use of the app must always be voluntary, this does not prevent organisations from relying on lawful bases other than consent.  

Finally, to the extent apps use tracking technology to collect proximity data or enable push notifications, the ICO has confirmed that it may not always be necessary to obtain consent under the Privacy and Electronic Communications Regulations (PECR), if use of such tracking technology is strictly necessary for the functioning of the app.

Responsibility for compliance – The named data controller of the personal data collected by the app will be responsible for the majority of GDPR compliance, but organisations must make clear to users the identities, roles and responsibilities of all parties processing personal data as part of the contact tracing.

DPIAs – The ICO considers that data protection impact assessments (DPIAs) are required for contact-tracing solutions, both prior to implementation and at relevant points through the app’s lifecycle. Technical development lifecycles and any product updates should be built to trigger the right thresholds for refreshing DPIAs, so that the app does not evolve without being assessed by the ICO.

Engaging third parties – Where developers use third parties to develop and maintain the app, procedures and processes (as well as compliant contracts) should be implemented to ensure those third parties are invested in ensuring data privacy risks are minimised.

Decommissioning – Decommissioning should be built into an app’s roadmap, including considerations of whether the app will dismantle itself once the crisis ends, and steps that will be taken to erase or anonymise the data once the contact-tracing purposes are redundant. A decommissioning process must be independently verifiable and auditable, and any decommissioning considerations should not only cover a general retirement of the app, but also instances where individuals uninstall the app.
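
As a sketch of what the erase-or-anonymise step might look like (field names hypothetical), an app could either clear its local store outright or reduce its records to aggregates that can no longer be tied to an individual:

    from collections import Counter

    def erase_all(local_store):
        """Irreversibly remove everything the app has stored on the device."""
        local_store.clear()

    def anonymise_for_retirement(records):
        """Keep only a count of exposures per day, with all identifiers dropped."""
        return Counter(r["seen_at"].date().isoformat() for r in records)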

Beyond data protection compliance, organisations will also need to consider the legal issues of deploying the apps in practice. Key questions are likely to be whether they can require their workforce, event attendees or visitors to install and use the app, and to report when they receive notifications.

While those providing devices to their workforce can mandate that workers download the app, those who operate a bring-your-own-device (BYOD) system will run into more difficulties and have limited powers.

Similarly, while a company can require workers to disclose some data to third parties, the data processed by contact-tracing apps is far more sensitive, and forcing an individual to use the app directly conflicts with the position held by the ICO and other data privacy regulators – use must be voluntary.

Much will depend on whether contact-tracing apps come to be adopted as part of generally accepted health and safety practices. However, if organisations wish to see the desired uptake of contact-tracing apps, they should make sure to abide by data protection law, and ensure the principles of fairness, transparency and proportionality are applied.

Bryony Long is a partner at Lewis Silkin, and Mithila Gupte is an associate at Lewis Silkin.
