AI: House of Lords focuses on copyright and transparency

Large language models promise to boost UK productivity, but the tech needs greater scrutiny and questions remain over intellectual property

The House of Lords Communications and Digital Committee has called on the government to support copyright holders. The committee’s first report of session 2023-24 on large language models (LLMs) and generative artificial intelligence (GenAI) urged the government not to “sit on its hands” while LLM developers exploit the work of rightsholders. 

The report’s authors called on the UK government to prepare for a period of protracted international competition and technological turbulence as it seeks to take advantage of the opportunities provided by LLMs. They recommended the government closely assess the benefits and risks of open LLMs relative to closed models, noting that while open models offer greater access and competition, they raise potential issues over the uncontrollable proliferation of dangerous capabilities. Closed models offer more control, but bring a greater risk of concentrated power, and the report called for “a nuanced approach”.

It also questioned the government’s transparency regarding the appointment of experts to advise policymakers. The report’s authors warned there is a perception of conflicts of interest, which, they said, is undermining confidence in the integrity of the government’s work on AI.

“Addressing this will become increasingly important as the government brings more private sector expertise into policymaking,” they said. “Some conflicts of interest are inevitable, and we commend private sector leaders engaging in public service, which often involves incurring financial loss.”

The report rebuked tech firms for using data without permission or compensation, recommending the government should end the copyright dispute “definitively”, including through legislation if necessary. It also advised a suite of measures including a way for rightsholders to check training data for copyright breaches, investment in new datasets to encourage tech firms to pay for licensed content, and a requirement for tech firms to declare what their web crawlers are being used for.

Looking at the potential disruption LLMs pose to employment and the jobs market, the peers noted that automating tasks commonly found in some roles risks reducing access routes for people to get a foot on the employment ladder, which in turn increases the advantage held by those with existing connections and the finances to obtain experience.

“The ongoing failure to address digital skills gaps perpetuates bottlenecks at the lower end of the supply chain and risks deepening societal divides between those able to take advantage of opportunities created by technological advances and those who are left behind,” the report stated.

The committee also assessed the viability of developing a UK government LLM. “We do not recommend using an ‘off-the-shelf’ LLM or developing one from scratch: the former is too risky and the latter requires high-tech R&D efforts ill-suited to the government,” the report’s authors wrote.

They instead recommended commissioning an LLM to high specifications and running it on internal secure facilities. The report also suggested the government could make high-end facilities available to researchers and commercial partners to collaborate on applying LLM technology to national priorities.

Tina Stowell, chair of the House of Lords Communications and Digital Committee, warned that the sector could be dominated by a small number of mega-companies.

“One lesson from the way technology markets have developed since the inception of the internet is the danger of market dominance by a small group of companies,” she said. She called on the government to ensure that exaggerated predictions of an AI-driven apocalypse, coming from some of the tech firms, do not lead it to adopt policies that close down open-source AI development or exclude innovative smaller players from developing AI services.

“We must be careful to avoid regulatory capture by the established technology companies in an area where regulators will be scrabbling to keep up with rapidly developing technology,” said Stowell.
