Senator introduces bill to force disclosure from AI developers

A new Senate bill aims to make it easier for human creators to determine whether their work has been used without permission to train artificial intelligence, marking the latest attempt to address the lack of transparency in AI development.

The Transparency and Responsibility for Artificial Intelligence Networks (TRAIN) Act would enable copyright holders to obtain the training records of generative AI models, if the holder can demonstrate a “good faith belief” that their work was used to train the model.

Developers would only be required to reveal the training material “sufficient to identify with certainty” whether the copyright owner’s works were used. Failure to comply would create a legal presumption, rebuttable in court, that the AI developer did use the copyrighted work.

Sen. Peter Welch, D-Vt., who introduced the bill Thursday, said the country must “establish a higher standard of transparency” as AI continues to integrate itself into the lives of Americans.

“This is simple: if your work is used to train AI, there should be a way for you, the copyright owner, to determine that it’s been used by a training model, and you should get compensated if it was,” Welch said in a statement. “We need to give America’s musicians, artists, and creators a tool to find out when AI companies are using their work to train models without the artists’ permission.”

The explosion of available generative AI technology has raised many legal and ethical questions for artists, who fear these tools will enable others to replicate their work without permission, credit or compensation.

Although most AI developers do not disclose their training data, a viral Midjourney spreadsheet lent credence to artists’ concerns earlier this year when it listed thousands of people whose work was allegedly used to train the popular AI art generator.

Companies that rely on human-created content have also taken aim at AI developers.

In recent years, media outlets such as The New York Times and The Wall Street Journal have sued AI companies such as OpenAI and Perplexity AI for copyright infringement. And the world’s major record companies joined forces in June to take two prominent AI music production companies to court, alleging that they trained their models on decades of copyrighted recordings without permission.

As the legal battles mount, more than 36,000 creative professionals — including Oscar-winning actress Julianne Moore, author James Patterson and Radiohead’s Thom Yorke — have signed an open letter calling for an end to the unlicensed use of creative works to train AI.

There is no comprehensive federal law in place to govern the development of AI, although several states have tried to push through their own AI regulations, particularly around deepfakes. In September, California signed into law two bills aimed at protecting actors and other performers from the unauthorized use of their digital likenesses.

Similar bills have been introduced in Congress, including the bipartisan “NO FAKES” Act, which aims to protect people’s likenesses from unauthorized digital replication, and the “AI CONSENT” Act, which would require online platforms to obtain consumers’ consent before using their personal data to train AI. Neither has received a vote so far.

In a press release, Welch said the TRAIN Act was supported by several organizations — including the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA), the American Federation of Musicians, and the Recording Academy — as well as major music labels — including Universal Music Group, Warner Music Group and Sony Music Group.

Only a few weeks remain in this session of Congress, however, and members are focused on what must be done to avoid a government shutdown on December 20. Welch’s office said it plans to bring the bill back next year, as any legislation that does not pass will need to be reintroduced in the new Congress when it convenes in early January.

This article was originally published on NBCNews.com