Music industry backs new ‘TRAIN Act’ requiring transparency in materials used to train AI

Numerous music industry groups and the three majors – Sony Music Entertainment, Universal Music Group, and Warner Music Group – have thrown their support behind a proposed new US law that would require AI developers to disclose the materials they used to train their AI models.

The Transparency and Responsibility for Artificial Intelligence Networks (TRAIN) Act, introduced by Sen. Peter Welch, a Vermont Democrat, would apply only when a rightsholder suspects their works were used to train a generative AI tool.

The rightsholder would be able to ask any US district court clerk to issue an administrative subpoena to an AI developer, requiring them to hand over materials that are “sufficient to identify with certainty” whether their copyrighted works were used.

A subpoena would be granted only if the rightsholder declares that they have “a good faith belief” that their work was used to train the model.

Sen. Welch said in a statement on Monday (November 25) that the bill is meant to address the “black box” problem with AI developers’ use of copyrighted materials. Simply put, AI developers often don’t reveal what data or data sets they used to train their AI.

This makes it difficult for rightsholders to know whether their works have been used in AI training without authorization, and it puts the onus on them to prove their material was used when they take legal action against a developer.

For instance, in the cases against chatbot developer Anthropic and the generative AI music apps Suno and Udio, music companies have gone to great lengths to show similarities between the AI tools’ output and their copyrighted materials. The TRAIN Act would simplify and speed up that process.

“This is simple: if your work is used to train AI, there should be a way for you, the copyright holder, to determine that it’s been used by a training model, and you should get compensated if it was,” Sen. Welch said.

“We need to give America’s musicians, artists, and creators a tool to find out when AI companies are using their work to train models without artists’ permission. As AI evolves and gets more embedded into our daily lives, we need to set a higher standard for transparency.”

Notably, the TRAIN Act doesn’t require AI developers to compensate copyright holders for the use of their works in training AI, only to disclose such use.

A separate bill – the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act) – that was introduced in the Senate earlier this year would make it unlawful to use copyrighted works to train AI without permission – a key ask of many music industry organizations and companies.

In copyright cases before the courts, AI companies have argued, among other things, that using copyrighted works to train AI should be covered by the “fair use” exemption to copyright law. The copyright holders suing AI developers strenuously object, arguing that fair use was never meant to apply to the mass ingestion of copyrighted materials by generative AI tools that then regurgitate similar or identical content.

Several of those lawsuits now turn on whether the use of copyrighted works to train AI qualifies as “fair use.”

The TRAIN Act and COPIED Act are among a growing number of bills before Congress aimed at regulating the use of AI. Others include the NO FAKES Act, introduced in the Senate this year, which would allow individuals to sue if their voice or likeness is used in an AI deepfake. A similar bill, the No AI FRAUD Act, was also introduced in the House of Representatives this year.

The TRAIN Act is similar in intent to – but different in method from – the Generative AI Copyright Disclosure Act, which Rep. Adam Schiff, a California Democrat, introduced earlier this year in the House of Representatives.

That bill also seeks transparency from AI developers in the materials they use, but it does so by requiring AI companies to send a notice to the Register of Copyrights that includes “a sufficiently detailed summary of any copyrighted works used.”

Tech companies objected to that bill on the grounds that AI training uses so much material that it would be unworkable to send notices about all the copyrighted materials used.

The TRAIN Act partly addresses this issue by requiring AI companies to disclose only the use (or non-use) of a particular copyrighted work, and only upon request by the rightsholder.

The process created by the TRAIN Act “necessitates precise record-keeping standards from AI developers and gives rightsholders the ability to see whether their copyrighted works have been used without authorization,” said David Israelite, President and CEO of the National Music Publishers’ Association (NMPA), one of the organizations backing the bill.

“We strongly support the bill which prioritizes creators who continue to be exploited by unjust AI practices.”

Other music industry groups backing the bill are the American Association of Independent Music (A2IM), the American Federation of Musicians, the American Society of Composers, Authors, and Publishers (ASCAP), BMI, Global Music Rights, the Recording Academy, the Recording Industry Association of America (RIAA), SESAC, and SoundExchange.

A number of unions and industry groups from film, TV, news media and book publishing also back the bill.

“Senator Welch’s carefully calibrated bill will bring much needed transparency to AI, ensuring artists and rightsholders have fair access to the courts when their work is copied for training without authorization or consent. RIAA applauds Senator Welch’s leadership and urges the Senate to enact this important, narrow measure into law,” said Mitch Glazier, Chairman and CEO of the RIAA.

“The future of America’s vibrant creative economy depends upon laws that protect the rights of human creators,” said Elizabeth Matthews, CEO of ASCAP.

“By requiring transparency about when and how copyrighted works are used to train generative AI models, the TRAIN Act paves the way for creators to be fairly compensated for the use of their work. On behalf of ASCAP’s more than one million songwriter, composer, and music publisher members, we applaud Senator Welch for his leadership.”

“Some AI companies are using creators’ copyrighted works without their permission or compensation to ‘train’ their systems, but there is currently no way for creators to confirm that use or require companies to disclose it,” said Mike O’Neill, President & CEO, BMI.

“The TRAIN Act will provide a legal avenue for music creators to compel these companies to disclose those actions, which will be a step in the right direction towards greater transparency and accountability.”
