UK government needs to do more to sort out AI

06 May 2024

House of Lords committee wants more competition and less agreeing with Big Tech

A Lords committee is laying into the government, demanding that market competition in artificial intelligence (AI) be made “an explicit policy objective”.

The committee is also giving the government a right earful for its “inadequate and deteriorating” stance on the use of copyrighted material in large language models (LLMs).

Following the publication of the government’s response to the Communications and Digital Committee’s report on LLMs and generative AI (GenAI), committee chair Baroness Stowell has penned a letter to digital secretary Michelle Donelan. While she thanks Donelan for her engagement, she also warns of “significant areas where we believe the government needs to go beyond its current position”.

In particular, Stowell criticises the government for its lack of action to uphold competition in AI markets and guard against regulatory capture in key public bodies. She also criticises its reluctance to take meaningful action to protect creatives’ copyright.

Released in February 2024, the committee’s report sounded the alarm about a lack of competition in the UK’s AI markets; the risks of regulatory capture in the Department for Science, Innovation and Technology (DSIT) and the AI Safety Institute (AISI); and the detrimental effects of allowing AI developers to trample over copyright laws.

In a formal response published on 2 May 2024, the government said the Digital Markets, Competition and Consumers Bill (DMCC) would give the Competition and Markets Authority (CMA) the tools it needs to identify and address significant competition issues in various digital markets, including AI. It noted that the regulator had already published its initial review of the competition implications of AI foundation models.

On regulatory capture, it added: “In line with DSIT’s conflicts of interest policy, AISI requires all individuals joining [its] Research Unit to declare any conflicts of interest. These conflicts are mitigated in line with the process of the conflict agreed by the DSIT permanent secretary.”

While the government noted that the AISI “is dedicated to building new infrastructure to conduct necessary testing and evaluations of advanced AI,” Politico revealed in April 2024 that the institute had not yet been able to carry out extensive pre-deployment testing of new models, despite leading AI companies agreeing at the AI Safety Summit in November 2023 to open their models for this purpose.

Regarding AI and intellectual property, the government said it was committed to continuing the UK’s “robust” copyright framework: “The basic position under copyright law is that making copies of protected material will infringe copyright unless it is licensed, or an exception applies. However, this is a complex and challenging area, and the interpretation of copyright law and its application to AI models is disputed, both in the UK and internationally.”

The government added that it is actively engaging with the relevant “stakeholders to understand broader perspectives about transparency about the purposes of web crawlers”. It also reiterated the commitment made in its AI whitepaper to progress work on the transparency of AI models’ inputs and outputs.

While it noted several ongoing legal cases over the use of copyrighted material in AI training models, the government said, “It would not be appropriate for the government to comment on ongoing court cases. These cases are for the courts to decide on and must be allowed to conclude independently.”

The government also reiterated its commitment not to legislate on AI until it has a full understanding of the evidence on risks and their potential mitigations.

Baroness Stowell’s letter details the committee’s ongoing concerns with the UK’s approach to GenAI and LLMs.

Describing the government’s record on copyright as “inadequate and deteriorating,” Stowell said that while the committee appreciates the technical and political complexities involved, “we are not persuaded the government is investing enough creativity, resources and senior political heft to address the problem.”

She added: “The contrast with other issues, notably AI safety, is stark. The government has allocated circa £470 million to a new AI Safety Institute with high-level attention from the prime minister. On copyright, the government has set up and disbanded a failed series of roundtables led by the Intellectual Property Office. The commitment to ministerial engagement is helpful, but the next steps remain unclear. While well-intentioned, this is not enough.”

Stowell said the government’s response “declines to provide a clear view” of whether it supports applying copyright principles to LLMs and whether it is prepared to bring legislation to settle the matter legally.

“Indeed, it suggests that the government does not wish to comment to avoid prejudicing the outcome of ongoing legal cases. This contention is misguided and unconvincing,” she wrote, adding that setting out an intention to address legal uncertainty would not breach any ‘sub judice’ conventions preventing MPs from commenting on ongoing court cases: “It is therefore difficult to escape the conclusion that the government is avoiding taking sides on a contentious topic.”

Stowell concluded that the government’s reluctance to take meaningful action amounts to a de facto endorsement of tech firms’ practices.
