
Data (Use and Access) Act: Reflections on an eight-month statute

It took a long time, but the government's new data legislation has finally made it onto the statute book. It holds a lot of promise, but also leaves many unanswered questions

When the Data (Use and Access) Bill had its first reading in October last year, I’m not sure many of us would have predicted it would take eight months to reach the statute book, but it did.

Before that, readers may well remember fondly the Data Protection and Digital Information Bill, numbers one and two, and, briefly, the Digital Information and Smart Data Bill - all previous incarnations of this legislation that did not make it. But the difference with the Data (Use and Access) Bill (DUA) - after eight long months, yes - is that it has finally passed, gaining Royal Assent on 19 June.

It's a good opportunity now to reflect on what the act does, what it does not do and what potential next steps should be taken.

What the DUA Act does

The big-ticket stuff includes provisions to enable the growth of digital verification services, new smart data schemes akin to open banking, and a new National Underground Asset Register. Other important changes in brief include:

Automated decision-making (ADM) - The act loosens the previous restrictions on decisions based solely on ADM that have legal or similarly significant effects. Some safeguards are included, but this is a significant change.

Subject access - The DUA clarifies the time limits for organisations to respond to subject access requests, including a “stop the clock” rule.

Children’s data protection - New rules require certain online services likely to be accessed by children to consider how to protect and support them when designing these services.

Scientific research - The definition of scientific research is clarified to include commercial research. The act allows researchers to seek consent for broad areas of related research and clearly outlines the safeguards required when using personal data.

Recognised legitimate interests - A new lawful ground for processing personal data is introduced, giving businesses more confidence to use data for crime prevention, safeguarding, responding to emergencies, and other specified legitimate interests.

International data transfers - The rules for transferring personal data internationally are simplified and clarified.

Responding to complaints - The DUA requires organisations to handle complaints from individuals who are concerned that the way their information is used breaches data protection legislation.

Storage and access technologies - The use of storage and access technologies, such as cookies, is permitted without explicit consent in certain, low-risk situations.

Changes to the Data Protection Act 2018 - Parts 3 and 4 of the 2018 act are amended, which regulate law enforcement processing and processing by the intelligence services. Some of these amendments mirror key changes being made to the UK GDPR, ensuring consistency across the data protection regimes. Others simplify the legislation, enabling those processing under the law enforcement regime to operate more efficiently.

Information Commissioner's enforcement powers - The act strengthens the regulator's enforcement powers and raises maximum fines under the Privacy and Electronic Communications Regulations to UK GDPR levels - up to £17.5m or 4% of global turnover.

It’s quite a list.

What the DUA Act doesn't do

Unfortunately, so is the list of what the DUA doesn’t do:

AI governance and cross-sector clarity - One might imagine the DUA would have been a helpful way to address many of the (data) issues related to AI. The government did not take that opportunity. Yes, ADM was included, but with safeguards and an overall structure that fall far short of what is required. And on the urgent issue of intellectual property (IP) and copyright, despite our constant efforts in the House of Lords, the act became law leaving the matter worryingly unresolved.

Uncertainty and guidance - The DUA, though long and dense in parts, still leaves much uncertain, not least because so much depends on commencement regulations that are yet to come. We are similarly waiting on guidance, on which many applications and interpretations depend entirely. And while the act may well clarify that scientific research can include commercial research, the exact boundaries of what qualifies remain to be defined through practice and guidance.

Data rights and portability - In what is becoming a dispiriting theme, the DUA also does very little in respect of data portability, barely expanding on existing frameworks. Moreover, it does not create new fundamental digital rights or significantly expand individual control over personal data beyond existing protections.

Cross-border enforcement - While the act simplifies transfer rules, it fails to establish new mechanisms for enforcement cooperation.

What happens next?

What happens next, from my perspective and that of other legislators, is manifold. The partial nature of the act, the issues it leaves unaddressed and the many rejected amendments leave us with much future action to consider, such as:

Responsible, sustainable AI and copyright - First, and by far the largest, all the unanswered questions around AI. Debates during the legislative process underscored the urgent need for cross-sector AI legislation in the UK. Consider the lack of mandatory AI impact assessments, not least in relation to matters affecting fundamental rights, questions around liability, and IP and copyright. The chances of such legislation seem as distant as ever, given the approach the government is taking.

Digital rights - The act, understandably, focuses on increasing data use and use cases. It does so, however, with far less on increasing individual rights. Should we now consider a digital Bill of Rights? When it comes to algorithms, should we not think about meaningful transparency requirements rather than just notification? And when those algorithms are in “automated decision” mode, should we not push for stronger rights and protections for individuals - to challenge, to correct - as I put forward in several amendments? And would collective redress mechanisms not be worth further consideration?

International comparisons - Denmark has proposed letting citizens copyright their own face, voice and body features. France updated its criminal code in 2024: sharing AI-created content of someone without their consent can now mean up to a year in prison, rising to two years and a €45,000 fine if shared online.

Beyond AI - The act is silent when it comes to quantum computing and its potential impact on data encryption and privacy. Silent in respect of augmented and virtual reality - currently swallowing swathes of individual biometric data - and internet of things devices quietly collecting away in the corner. All a recipe for market concentration and potential domination?

Implementation challenges - The act’s staged implementation requires that we continually monitor the effectiveness and costs of the new frameworks, considering whether the balance between innovation and protection is working.

Parliamentary committee - A big mission and remit though it would be, I think it is time to strongly consider a special parliamentary committee dedicated to digital governance. Difficult, of course; doable, certainly - the alternative, as now, is that we are always on the back foot.

After several iterations and eight months of parliamentary wrangling, the DUA does do some good, not least on smart data. But it also leaves many questions unanswered. It does not deliver cross-cutting fundamental digital rights, or anything to enable the UK’s espoused AI aspirations.

What is the government’s vision, its mission for the UK to be fully equipped, enabled, empowered to play our full role in this digital future now?
