The upcoming King’s Speech - where are the words on AI?

Despite past promises of regulation on AI, there is no indication the Labour government is planning any legislation in the next session of Parliament - and that's an economic and social mistake

In July 2024, we listened to the Labour government’s first King’s Speech. It contained a single line almost imperceptibly nodding towards something on artificial intelligence (AI). The government said it would “seek to establish the appropriate legislation to place requirements on those working to develop the most powerful artificial intelligence models”.

Some 20 months later - no bill, no consultation and very little comment.

It was said there would be a draft bill ready to go at the end of 2024. Nothing appeared. A consultation was heralded late last autumn - still, this spring, nothing. I suppose there is some consistency in this approach, but it is less about diving into the detail and more about minding the gap.

As we lurch towards the end of this Parliamentary session, it is abundantly clear: there is still no AI Bill.

The next King’s Speech is likely to be in early May. Two months to go and, somewhat surprisingly, it seems this too will contain nothing on a specific AI bill.

I say “surprising” because the government may well wish to continue with its “wait and see” approach to AI legislation, but this comes at the same time as the government is racking up an increasing number of significant issues it wants to address in relation to AI - all of which, by its own admission, will require primary legislation.

Urgent need

It is for these reasons that I recently published a Parliamentary One-Pager (POP) on the matter - in short, stating the urgent need for cross-sector, cross-economy AI legislation and subsequent regulation in the UK.

I didn’t want to lay out a voluminous report, just a straightforward one-pager making a single point: the time for talk is well past. The UK government must decide its position and its vision for AI in the UK, and bring forward the necessary legislation.

The government seems caught on the horns of that tediously recurring false dichotomy that you can only have either innovation or regulation - utter nonsense, as all history shows. Take the fintech regulatory sandbox, to cite just one illustration. The measure of its success? It has been replicated in just under a hundred jurisdictions around the world.

So, agile, adaptive legislation and right-sized regulation is good for innovation and investment; good for the citizen, the creative and the consumer; in effect, good for our country.

And the public are with the POP on this matter. Recent research from the Ada Lovelace Institute found the current government approach is “increasingly out of step with public attitudes”.

Add to this that, even taking the government at its word, it has indicated that targeted primary legislation is required for, among other areas: frontier AI; intellectual property and copyrighted works; the AI Growth Lab; and AI chatbots.

So according to the government, there will be no cross-sector legislation and no domain-specific legislation, but legislation is needed.

Whack-a-mole

When I launched the POP, Gaia Marcus, director at the Ada Lovelace Institute, warned that wait-and-see inevitably becomes whack-a-mole.

Putting the positive business case, Erin Young from the Institute of Directors reframed the conversation, pointing out that “effective governance isn’t a brake pedal, it’s how boards give companies strategic direction.”

And for the people, all of us, Hannah Perry of think-tank Demos made a compelling case that AI could be part of a new social contract between state and citizen, but only if we break out of the “democratic doom loop” with concrete protections like a declaration of digital rights.

It is more than clear that the current approach has resulted in fragmentation and uneven application: multiple regulators that are domain-specific but neither AI-expert nor experienced, and in certain sectors no regulator at all.

The day after my POP was launched, the Joint Committee on Human Rights heard from three such regulators - the Equality and Human Rights Commission (EHRC), the Information Commissioner’s Office and Ofcom - about the fast-growing impact of AI on people’s rights.

Each regulator emphasised that while they already oversee important aspects of AI - such as equality, privacy, and online safety - the pace of technological change is outstripping the speed of traditional regulation. They also highlighted serious concerns ranging from biased algorithms to gaps in oversight, especially where AI is deployed in sensitive areas like policing, welfare, and social media.

A recurring theme was capacity. Regulators face resource constraints, most starkly the EHRC, which has operated on a frozen budget for over a decade, while trying to keep up with increasingly complex technologies.

Again, the public seem to get it quicker than the government - 89% of those polled on the point by the Ada Lovelace Institute support an independent regulator. Interestingly, that is one of the key provisions of my AI (Regulation) private member’s bill.

Economic imperative

The point should be clear, not least to government, that a wait-and-see, voluntary, partial, domain-specific approach can no longer be accepted. It fails to enable, empower, and optimise the UK AI opportunity.

More importantly, it is not what the public want. There is an economic, a social, and a psychological imperative to act. We need a cross-cutting, principles-based, and outcomes-focused AI Bill. It’s time to legislate, together, on AI. It’s time to human lead. Our data, our decisions, our AI futures.

From a UK perspective - economic, social, psychological - the case is clear: we legislate or we lose. Either the government brings forward a bill, or the benefits fail to be brought forth. “Wait and see” is not a strategy; it is the perspective of the spectator, not the player. To govern is to choose.

The government must choose cross-sector AI legislation and the subsequent right-sized regulation. They must choose to put this at the heart of the next King’s Speech. May will otherwise be a more than unfortunate continuation of “may not”.
