A group of UK lawmakers from across the political spectrum has thrown its support behind a new report calling for AI to be regulated in the UK market in order to protect human music creators.
The report calls for the UK to pass new laws that would protect artists’ personalities from being copied by AI without permission; mandate transparent labelling of AI-generated content; and require AI developers to gain permission from copyright holders to use their materials for training, among other things.
The report, put out by the All-Party Parliamentary Group on Music – an informal group that includes members of the House of Commons and House of Lords – was funded by industry umbrella group UK Music, and is not an official report of Parliament or any parliamentary committees.
Titled Artificial Intelligence and the Music Industry – Master or Servant?, the report includes an opinion survey showing that 83% of UK adults want action to be taken on the issue of unauthorized deepfakes of artists like Taylor Swift and Drake.
The survey, carried out in March by Whitestone Insight, also found that 80% of UK adults believe the law should prevent AI from being trained on an artist’s music without their permission; 77% believe that it amounts to “theft” when an AI uses an artist’s music without acknowledgment; and 83% agree that an artist’s creative “personality” should be protected against copying by AI.
The poll also found that 69% of UK adults fear that AI will replace human creativity, and 62% are worried about the rise of deepfakes of music artists.
“We must… confront the danger that unfettered developments in AI could pose to the UK’s musicians and music businesses.”
UK Labour Party MP Kevin Brennan
“We must… confront the danger that unfettered developments in AI could pose to the UK’s musicians and music businesses,” Labour MP Kevin Brennan wrote in the report’s preamble.
“We ignore the necessity to sow policies, which will harvest the benefits of AI, and help stave off the threats it poses, at our peril. Our central insight must always be that AI can be a great servant but would be a terrible master.”
“We need the government to act now before AI tech companies destroy our world-beating industry,” UK Music Interim Chief Executive Tom Kiehl said in a statement.
“The polling UK Music has commissioned reveals the overwhelming view of the public that AI firms that take music without permission and payment to creators are guilty of theft.
“The public also voiced their concerns about the alarming growth of explicit deepfake images of music stars like Dua Lipa and the need for urgent action in this area.”
The report sets out eight recommendations, including “transparent labelling” of AI-generated materials; a requirement for AI developers to keep records of the materials used in training AI, and to gain rightsholders’ permission to use them; a rule that AI-generated content made without human creative involvement can’t be copyrighted; and the creation of a “personality right” that would protect people’s voice, image, name and likeness from AI deepfakes.
“We need the government to act now before AI tech companies destroy our world-beating industry.”
Tom Kiehl, UK Music
These principles should be implemented through a UK AI Bill, the report recommends, which would also require AI developers to abide by UK laws even if their AI algorithms were developed in other jurisdictions, “as a condition of market access.”
In practice, this would mean that, if an AI company created its AI in a country where the technology is exempt from copyright laws, it would still have to abide by UK copyright laws if it wants to make its AI available in the UK.
Some of these principles have already been put into law, or are being debated, in other jurisdictions.
For instance, the European Union’s recently passed AI Act requires AI developers to keep records of the materials they used to train their algorithms, and also requires them to obtain rightsholders’ permission to use their materials in training. However, that EU rule contains a vaguely worded carve-out for “relevant copyright exceptions and limitations.”
In the US House of Representatives, lawmakers are debating the No AI FRAUD Act, which would extend copyright protection to individuals’ voice and likeness. The Senate is currently working on the wording of a similar bill dubbed the NO FAKES Act.
Tennessee has passed its own version of such legislation, the ELVIS Act, which extends the state’s right of publicity – i.e., the right to control one’s own identity – to include voice and likeness.
Meanwhile, China has enacted new regulations that require the labelling of AI-generated content, and social media platforms, including YouTube and TikTok, have set out their own rules requiring such labelling.
“We can provide a high quality, fair, human-led AI system that protects human artistry and acknowledges every part of the value chain if we demand that approach together.”
Rachel Lyske, DAACI
While the UK report’s recommendations have widespread backing within the music industry, they have also garnered support from some AI developers, among them Rachel Lyske, CEO of DAACI, developer of an AI-powered music composition tool.
“The UK is a world leader and exporter of quality music, recording innovation and world class artists. There is a window of opportunity for the UK to also be the world leader and exporter of generative AI. These two things do not have to be separate,” Lyske said in a statement.
“The UK government must and can give the homegrown UK music and the UK music technology industry a chance to get it right. We can provide a high quality, fair, human-led AI system that protects human artistry and acknowledges every part of the value chain if we demand that approach together.”
Besides UK Music, a number of industry groups added their input to the report, including the Association of Independent Music (AIM), the British Phonographic Industry (BPI), the Ivors Academy, the Musicians’ Union, PPL, and PRS for Music.