Fair Use Arguments for Training Generative AI Are "Wrong," Says AI Executive

Justin Hendrix / Nov 16, 2023

Justin Hendrix is CEO and Editor of Tech Policy Press

Yesterday afternoon, an executive at Stability AI – maker of such generative AI systems as Stable Diffusion – posted a note to X (formerly Twitter) explaining why he had resigned from the company. Ed Newton-Rex, who served as vice president of audio at Stability AI, said he quit because he “wasn’t able to change the prevailing opinion on fair use at the company,” an opinion most clearly articulated in its statement to the US Copyright Office in response to a call for public comment on the question.

Newton-Rex, who is also a composer, founded Jukedeck, an AI music generation company that was acquired by TikTok owner ByteDance in 2019. He said he was proud of his team’s work on a “state-of-the-art AI music generation product trained on licensed training data, sharing the revenue from the model with rights-holders,” but that it was clear that the company held a view on the use of unlicensed media “that is fairly standard across many of the large generative AI companies, and other big tech companies building these models.”

That view, more or less, is that anything you can hoover up on the internet is fair game to train your system. But Newton-Rex said that “training generative AI models in this way is, to me, wrong.” His reasoning is that it pulls the rug out from under the entire creative economy:

Companies worth billions of dollars are, without permission, training generative AI models on creators’ works, which are then being used to create new content that in many cases can compete with the original works. I don’t see how this can be acceptable in a society that has set up the economics of the creative arts such that creators rely on copyright.

Earlier the same day, I happened to moderate a panel discussion on the harms of AI systems to artists and human creativity at an event in Washington, D.C., hosted by the Open Markets Institute and the AI Now Institute called “AI and the Public Interest.” I was joined by Liz Pelly, a freelance journalist who writes about the music industry and is writing a book about Spotify; Ashley Irwin, the president of the Society of Composers & Lyricists; and Jen Jacobsen, a former media executive who is now executive director of the Artist Rights Alliance. My panelists took a view of fair use similar to Newton-Rex's.

Open Markets conference at the JW Marriott in Washington, DC, Wednesday, November 15, 2023. (© 2023 Michael Connor / Connor Studios)

Jacobsen, for instance, made the argument outright. Referring to cases before the courts that will have to answer this question, Jacobsen said she does not anticipate there is “going to be some blanket assertion that the use of AI in training models is going to be fair use. And in fact, we've seen in a lot of cases that the use of the works to train ends up having such a substitutional effect in the marketplace, where the works that are coming out, as I was saying before, are competing directly against the original works, that really is very strongly against the notion that that could be considered fair use.”

Irwin pointed out that there is an economy around training human musicians and composers, and that economy is at risk of being severely undermined by new AI systems.

“What we've learned as creators in whatever field we're in – I'll speak specifically to music – yes, we've learned it, we've had influences, we've had lessons as a child. Any of that stuff that we learned from was paid for; it was from copyrighted material. Even if my parents paid 25 cents for the sheet music when I was six years old to learn from Beethoven (public domain), they still had to buy the sheet. And that's what it comes down to. Everything that we've learned and been influenced with, whether it was on the radio, whether it was in a library, whether it was at lessons with sheet music, somebody paid for that intellectual property that we learned from.”

Irwin is calling for the relationship between tech firms and artists to be governed by what he calls “the three Cs: consent, credit, and compensation for our works that are injected into these machines.” He says compensation should occur before any media is crawled or ingested into training sets, not after it has already been used to train a model.

Jacobsen pointed to proposed legislation – the Protecting Working Musicians Act, put forward by Rep. Deborah Ross (D-NC) – that would give artists the ability to collectively bargain with the owners of streaming platforms and with companies that develop generative AI systems for fair terms governing the use of their creations.

Pelly pointed out that AI has already vastly reshaped the economy of the music industry and other creative industries, tilting it heavily in favor of corporations.

“It's also important to zoom out and remember that over the past 15 years, streaming services have basically reshaped what music looks like using AI: reshaped not just the creation of music, but how we discover music, the context within which we understand music, what it looks like to have a career as a musician,” she said. All of this is “powered by machine learning, data, algorithms merged with a business model that is incredibly lacking in transparency, especially for independent musicians.”

It’s in the context of this already exploitative industry that generative AI is now being applied. Despite that, the panelists agreed that there are benefits to artists in the use of artificial intelligence, and that musicians in particular have seen various forms of the technology infused in their work for years. Newton-Rex struck a similar note in his resignation post.

“To be clear, I’m a supporter of generative AI,” he wrote. “It will have many benefits — that’s why I’ve worked on it for 13 years. But I can only support generative AI that doesn’t exploit creators by training models — which may replace them — on their work without permission.”

Stability AI was sued earlier this year by Getty Images, which claimed that the tech firm had engaged in "brazen infringement of Getty Images’ intellectual property on a staggering scale."

The US Copyright Office announced yesterday that it would extend the deadline for public comments on questions related to generative AI until December 6.
