LONDON — Celebrities and other public figures could be offered fresh legal protections to stop artificial intelligence tools mimicking their likenesses, under plans being considered by the British government.
Ministers are preparing to launch a consultation as soon as Tuesday on controversial changes to the U.K. copyright regime that would allow AI firms to train models on copyrighted works for commercial purposes, unless rights holders expressly opt out.
But the plans — first reported by POLITICO and aimed at encouraging more AI investment in the U.K. — have sparked a fierce backlash from the creative sectors.
As a sweetener, the consultation is now expected to include the promise of a new “personality right,” according to three people briefed on the plans, granted anonymity like others in this article to speak freely.
This would offer people, particularly stars who rely heavily on protecting their image, additional legal protections against the use of generative AI tools to mimic their features and likeness without permission.
The new right could also help combat the threat of malicious deepfakes. Such personality rights already exist in several jurisdictions, including parts of the U.S.
Artists’ anger
The possibility of a new right is unlikely to assuage the creative sector’s wider concerns about the copyright shake-up, however.
At a briefing in parliament earlier this week, author Kate Mosse warned the government’s plans would “kill originality” — a stance backed by Beatles star Paul McCartney. The sector argues that a regime in which AI firms must ask content holders to explicitly “opt in” to training is fairer.
The proposals also prompted a warning this week from the Copyright Alliance, a U.S. media body, that any British move that “degrades copyright” risks “a legal environment that discourages U.K. and U.S. creators and rights holders from participating and investing in creative endeavors within the United Kingdom.”
“Copyright laws should not be cast aside in favor of new policies obligating creators to effectively subsidize AI technologies under the misguided belief that doing so is necessary to incentivize AI technologies,” a letter from the group to Technology Secretary Peter Kyle warned.
Culture Secretary Lisa Nandy said the upcoming British consultation will consider a range of options, insisting ministers “genuinely haven’t made a decision about the best way to go about this.”
One industry figure familiar with the government’s thinking said they expected the consultation to argue that the U.K. risks missing out on opportunities from AI without making it easier for companies to develop models on reams of data in the country.
A British government official said ministers believe the current situation, which has resulted in drawn-out legal battles, is unsustainable, and that a workable solution is needed to end the uncertainty.
‘Worst of all worlds’
Several technical details would still have to be ironed out, however.
Big questions remain, including on how content holders would be expected to signpost that AI companies don’t have permission to use their data.
The consultation is also likely to require AI companies to be more transparent in revealing the data their models are trained on.
Industry body TechUK has said the U.K.’s current regime represents the “worst of all worlds.” It has urged the government to set out a clear intention to move towards a commercial content licensing market underpinned by an “opt-out” model.
“This is one of the issues that we need to resolve in order to get to the point where, across the economy, we can make full use of AI,” said Antony Walker, deputy CEO of TechUK.
“The current uncertainty is not just hindering upstream AI innovation in the U.K., but also its adoption across the economy,” he said. “It’s in everybody’s interests that we move forward and resolve the issue. And I think the government recognizes that.”
The Department for Science, Innovation and Technology has been approached for comment.