AI and Harms to Artists and Creators

Justin Hendrix / Nov 19, 2023

Audio of this conversation is available via your favorite podcast service.

On November 15, the Open Markets Institute and the AI Now Institute hosted an event in Washington D.C. featuring discussion on how to understand the promise, threats, and practical regulatory challenges presented by artificial intelligence. The event marked the release of a new report from the Open Markets Institute and the Center for Journalism and Liberty at Open Markets titled “AI in the Public Interest: Confronting the Monopoly Threat.”

At the event, I moderated a discussion on harms to artists and creators, exploring questions around copyright and fair use, the ways in which AI is shaping the entire incentive structure for creative labor, and the economic impacts of the "junkification" of online content. The panelists included Liz Pelly, a freelance journalist specializing in the music industry; Ashley Irwin, President of the Society of Composers & Lyricists; and Jen Jacobsen, Executive Director of the Artist Rights Alliance.

What follows is a lightly edited transcript of the discussion.

Justin Hendrix:

Good morning, I'm Justin Hendrix, editor of Tech Policy Press, a nonprofit media venture intended to provoke new ideas, debate, and discussion at the intersection of technology and democracy. On Wednesday of last week, I participated in an event put on by the Open Markets Institute and the AI Now Institute and held in Washington D.C. The topic was AI and the public interest, and the discussion sought to understand the promise, threats, and practical regulatory challenges in managing the advent of large-scale AI. It also looked at the power dynamics that are emerging in industries between those that develop and apply AI and everyone else in the market, and whether competition law and policy have a role to play in protecting society against excesses and negative outcomes, including [inaudible 00:00:57] democracy. Here's Sarah Myers West, AI Now's managing director, framing up the day.

Sarah Myers West:

Today's AI boom is driven at its core by commercial surveillance, and its incentive structures are shaped by the existing infrastructural dominance of a small handful of firms. This is what's driving the push to build AI and deploy it in the world at larger and larger scale, increasing the demand for resources that only big tech firms can provide, and further cementing their considerable advantage.

Justin Hendrix:

On the morning of the event, the Open Markets Institute released a report titled “AI in the Public Interest: Confronting the Monopoly Threat.” The organization's Europe director, Max von Thun, introduced its findings.

Max von Thun:

Some of you might be asking yourselves now, why does this concentration in AI matter? As our report finds, there are a whole host of problems that monopoly power over AI presents us with, some of them already very real, and some of them likely to emerge in the near future. Many, if not most, of these are amplifications of existing digital harms, ranging from the ability of a few gatekeepers to exploit the individuals and businesses that are dependent on their platforms, to their ability to exclude and eliminate rivals.

Justin Hendrix:

One of the speakers at the event was Federal Trade Commissioner Alvaro Bedoya. In a conversation with University College London Professor Cristina Caffarra, he touched on a range of issues with regard to AI and how the FTC is thinking about it. One of the examples he gave had to do with actors in Hollywood.

FTC Commissioner Alvaro Bedoya:

I'm sure people in the room followed the writers and actors strikes, but I was particularly struck by the stories about background actors, people we know as extras. They would describe... These folks don't make a lot of money, but it's a decent way to supplement your income if you live in LA, particularly if you get your union card. And they would describe getting on the set, and then suddenly towards the end of filming... These are allegations, I have not investigated them, I don't know if they're true, but they're certainly compelling. They would say, someone would tap me on the shoulder and say, actually, you've got to go to that tent over there. They said, well, why do I have to go to that tent? Well, they're going to scan you. You're going to scan me? What? They said, well, you've got to do a full body scan, from 360 degrees, 80 cameras... And some of the actors protested, and they were told, oh okay, well, yeah, if you can't do it, you're going to get fired.

And some of the stories, some of the allegations were really harrowing; some of the actors described having to do this nude. But if you know a little bit about the movie industry, you know that a lot of the way that actors get their start is by being extras. Sylvester Stallone, famously, was discovered as an actor while an extra, just being a guy in the background, where a producer or director said, who's that guy? I like that guy.

And so, if studios are able to literally map your body, and some of these scans involved full emotion capture, why would they need the living article, if later on they can just slot in the file that they captured when this was a 20-something up-and-coming actor? And so, that's what I wrote about in that op-ed. And there are similar claims being made about writers, where there was a concern that writers would be asked, for their first script that gets bought, oh, well actually we want the rights to everything you've written. We want that to be fed into our proprietary AI. This was one of the concerns they had.

Justin Hendrix:

Commissioner Bedoya said the FTC is thinking about how to apply competition law to these questions.

FTC Commissioner Alvaro Bedoya:

Antitrust law may have something to say about this. One of the things I most admire about Chair Khan is her looking at our core Section 5 authority and saying, we need to make sure we are doing right by its original purpose. It was a law meant to track innovation in industry, both good innovation and innovation in unfair methods of competition, and it empowered us to look at situations where a powerful market actor was using that power not to compete on the merits with someone, but to take them off the playing field. And I think that law may have something to say about these scenarios that are being described by actors and writers.

Justin Hendrix:

After the Commissioner spoke, I moderated a panel discussion on harms to artists and human creativity. In this episode, we're going to listen into that discussion, which is lightly edited from the live event video.

<event audio commences>

I want to thank Barry, and I also want to welcome my panelists up onto the stage. AI affects perhaps every sector of the economy, and we've heard that this morning. We're going to focus specifically on artists and creative folks, and the concerns that they have, joined by an excellent panel here. I'll go from my right to left. Jen Jacobsen, who's executive director at the Artist Rights Alliance and a former policy executive at Sony Music and Time Warner, will tell us a little bit about perhaps the intersection of that experience as we go along. Liz Pelly, a freelance journalist who specializes in the music industry. Also a colleague of mine at NYU, I should say. And she has a book coming on the subject of Spotify's business model soon enough, and we'll talk a little bit about that and how that intersects with platform power. And then, Ashley Irwin, president of the Society of Composers and Lyricists, who composed the music for the Academy Awards show, I understand. And like me, is a southerner, but maybe from south of the equator.

Ashley Irwin:

Of the equator, yeah.

Justin Hendrix:

So, quite pleased to have you here. We're going to give each of you a chance to say something to start, to set things up a bit. But maybe, actually, I'll start with you, because you mentioned that you are based in Hollywood, you've been here in D.C. quite a lot lately, and folks are listening, or at least you're getting in the door. You've had a lot of meetings lately. What's on the mind of folks in D.C. when it comes to artists?

Ashley Irwin:

I think this is my sixth trip up here since March; I've testified in front of the House Judiciary Committee on IP and copyright. And I think the thing that was most engaging in what we had to say is that we actually showed up with some proposals as opposed to just complaints. There was a lot of talk about AI and how it was going to affect our industry, and a lot of panic across all industries. But we had some suggestions of what might be considered, not solutions, but at least talking points to get the discussion going. And we began with the three Cs, consent, credit, and compensation, for our works that are ingested into these machines. And that seemed to grab hold, it was a little catchy phrase, but it seemed to grab hold and people started to run with it. And since then, I've seen it in many publications, even in Canada and whatever. So, that was where it started.

Justin Hendrix:

Liz, to you next. You're working on this book now, around Spotify, so you're thinking perhaps about AI slightly differently than folks in the room who are focused on generative AI. What's your particular angle right now?

Liz Pelly:

Yeah, I think that right now in music and creative fields generally there's a lot of conversation about generative AI and copyright, and it's incredibly important for copyrights to be protected when we're thinking about how generative AI might be used. But I think it's also important to zoom out and remember that over the past 15 years, streaming services have basically reshaped what music looks like using AI: reshaped not just the creation of music, but how we discover music, the context within which we understand music, and what it looks like to have a career as a musician. Powered by machine learning, data, and algorithms, they have emerged with a business model that is incredibly lacking in transparency, especially for independent musicians, a business model that was basically created in partnership with major record labels, while independent artists don't really have the same seat at the table in terms of input on these systems that are really determining how they're able to make a career. And streaming services have also used AI, data, and algorithms to exercise incredible gatekeeper power when it comes to playlists and discovery.

And in the report that Open Markets released today, there's talk about the way in which AI contributes to media deterioration, and I think there's also a really important point about the way data-driven streaming platforms have contributed to media deterioration, in terms of just how we understand the context of music. And then, yeah, also the shifting dynamic of artists being seen as customers more than artists and creators. So, [inaudible 00:10:03] that for now.

Justin Hendrix:

A theme we've heard throughout the day, this idea of AI being introduced not necessarily into a perfect world, but, especially in terms of media, introduced into an environment rife with exploitation, surveillance capitalism, and various other problems. Jen, how about you? You've gone so far as to maybe have a hand in getting a piece of legislation put forward?

Jen Jacobsen:

Well, it's at the very beginning stages, but we'll see. I want to, if it's okay, take a step back for one second though and pick up on something that Liz said about artists having a seat at the table. And that's what the Artist Rights Alliance is about, making sure that artists have a voice. And as Ashley was talking about, making sure that artists have the ability to give consent and get paid for how their works are being used. And in the AI context, and this goes to the legislation, artists are very concerned about not having that voice. But I do want to say upfront, artists are not averse to AI in every context. Musicians use AI in a lot of ways, as a tool in their creativity, in the production studio, on tours, countless ways that AI has been used as a tool, and the technology-content relationship is long and strong, and has peaks and valleys, but AI is something that's very valuable.

The issue here is when that work is being used without consent and compensation to create work that then competes directly against the artist's own work in the marketplace, and we can talk more about that. But when the artist doesn't have the ability not only to consent to it, but a lot of times doesn't even know, there's no transparency about when their work is being used, that's a real problem. The legislation you refer to, the Protect Working Musicians Act, is one small tool that could be helpful. It would give small independent artists and small indie record labels the ability to engage in negotiations with big AI developers, and also streaming platforms, about how their work is going to be paid for. Because a lot of times there's no transparency, as Liz was saying, into the way those relationships are structured, the way those rates are created, and there's no ability to have a voice. And we can talk more about the consent and compensation issue, but that's the framing for artists.

Justin Hendrix:

Yes. Liz, do you want to pick up on that? Since she mentioned our pre-chatter there: the problem of the general dynamic in streaming at the moment, which of course is one of the major economic superstructures for artists generally.

Liz Pelly:

When you talk about the lack of transparency in the business model: independent musicians make up 30% of the recorded music market, and right now, just a couple of weeks ago, there's a new royalty model that is going to go into effect in January on Spotify, which has also been experimented with, or put into practice, by Deezer, that says that work from artists who have under 1,000 streams a year, which doesn't sound like a lot, but it's actually a lot of artists, will be demonetized. And for example, that is a policy that independent artists have had no say in helping shape, but it is something that Universal Music Group had a lot of say in helping develop with Deezer, and then Spotify picked up on it.

So, because the major labels hold the vast majority of the IP and licenses that are seen as valuable to streaming services, they have been partners since the beginning in determining the deals and contracts that determine how these systems work. So, there's just a huge imbalance for independent artists, who are the vast majority of artists, and who aren't having a say in terms of how these systems operate.

Justin Hendrix:

And Ashley, your folks are not even unionized, right? So, you don't really even have the ability to come together to negotiate with larger entities?

Ashley Irwin:

Well, I think we're the only craft working in the film and television industry, our members are composers and songwriters who work specifically for audiovisual media, who are not unionized in Hollywood, with no collective bargaining agreement, because the studios consider us independent contractors. And that's a problem. So, we don't have the ability that most unions do. We were watching, obviously, the writers and the actors and the directors and their interactions recently with the strikes, just to see what their position on AI would be, and how that might influence some of the things that we try to get into some of the legislation that Jen's talking about, up here on the Hill. Our concerns, once again, are not so much about the use of AI, because we've been using versions of AI for probably 30 years, I think I've been using some permutation of it, but the difference is generative AI is a whole other beast, being used by people who are not arbiters of taste.

Part of what we are engaged for when we are employed on a film or a TV show is, people come to us because they like what we write as music. And what we reject from our own compositions and songs, the audience never hears. We have that taste; the machines don't have that taste. The machines just spit stuff out. And much like a slot machine, if you sat there long enough, you're going to hit a jackpot. But as for the arbiters of taste: art, in all its aspects, is a reflection of society. And everything we do in the artistic world is a reflection of today's currency, from songs to books, literature, painting, everything. If that goes away, society suffers. If the human aspect of that is hampered in any way, society suffers. And that's part of it, it's not just an economic thing, it's a cultural thing as well that's really going to have an impact.

Justin Hendrix:

Well, let's talk about that, because I think this issue, this tension between the cultural question and copyright law, ends up being the core of it. I'm interested, and I'm not asking any of you to be experts on copyright law, I know there are a few in the audience actually, but when you think about the arguments we're hearing about fair use, think about maybe some of the litigation we see that's questioning exactly how the law should think about a model hoovering up all of human production over the centuries, and then making it possible to represent some output from that input. I don't know. How are you thinking about that as you think about the rights and the livelihoods of the artists that you work with?

Jen Jacobsen:

Well, I would just say that I'm not a copyright expert, but I do work on these issues, not even tangentially, but in the policy context. Fair use, which is a defense that people can use to defend themselves against a copyright infringement lawsuit, is a very complicated test. And as you mentioned, there are a lot of cases being determined right now that could have an impact on how fair use is viewed in the AI context. But I would say there is not going to be some blanket assertion that the use of AI in training models is going to be fair use. And in fact, we've seen in a lot of cases that the use of the works to train ends up having such a substitutional effect in the marketplace, where the works that are coming out are, as I was saying before, competing directly against the original works, that it really cuts very strongly against the notion that that could be considered fair use.

We were talking about the artist having control and being able to have some kind of consent in the use of their own works. There are many layers in which the lack of consent and control can impact an artist and their rights. There's the deepfakes issue we've talked about, where an artist's name, their voice, their likeness is used, confusing consumers as to whether this is the actual artist or a fake; that's a huge problem in terms of the artist-fan relationship, in terms of what music you're getting out there. So, that's one way. Another way is simply the dilution of the market, where you're having works out there that are not only competing with the original artist's works, but devaluing the original works by flooding the market with these AI-generated works, and then decreasing substantially the amount of money that an artist can make from their original works.

So, this gets away from your original question, but it just shows how there is so much harm that can be caused by the use of their works to train that it's a very complicated test, and we'll see a lot of developments in the courts over the next few months on it.

Justin Hendrix:

Mm-hmm.

Ashley Irwin:

Can I just say something? Once again, I'm not a lawyer either, but there has been some precedent on some of these soundalikes. Years ago, when I was heavily involved in advertising, I remember there were two cases, one a Bette Midler soundalike and one a Tom Waits soundalike, that were both adjudicated in the courts as infringements on both those artists for advertising. They had gotten voiceover people in to sound like Tom Waits, and to sound like Bette Midler. And they lost the case, so there is a precedent for that in the courts already, for likeness.

Justin Hendrix:

So, let's maybe talk a little bit about your reactions. Commissioner Bedoya talked a little bit about the Writers Guild agreement, and some of these things. How closely were you all watching the WGA and SAG-AFTRA negotiations? I am less familiar with SAG-AFTRA's agreement, as one of the earlier speakers said, but under the WGA's, a writer can choose whether or not to use AI, and how to use AI in their work. The companies have to disclose if they're handing a writer material that may have been influenced by AI. And I guess most importantly, AI-generated material can't be used in some way to undermine the rights of a writer, their credits or otherwise their compensation. Are you looking to those agreements as models for the types of arrangements that you want to see for artists across the board?

Ashley Irwin:

We are, and I've actually got a meeting with the Writers Guild next week to see how we can support each other in those actions. But once again, we're working from a position without any labor backbone; we don't have that. So, we're really looking to some form of legislation to help support whatever these positions are, from our point of view. And we're working with the Copyright Office as well. But our situation is very similar, inasmuch as... Just to give you an example. When we are given, let's say, a cut of a movie or a television show, in a lot of cases there'll be what's called temp music, which is temporary music, which they'll put from another source underneath that film to show to test audiences, or studio executives, or whatever.

And there's a lot of times an expectation for us to sometimes copy that music in a way, or use the style of it. Not always, but sometimes, if they get attached, and that's really working with the test audience. Well, in those situations we try to develop the style rather than rip off the music. I think with generative AI, they'd just be ripping off the music. They'd just be basically replacing it, getting far enough away from the plagiaristic aspect of it. And look, there are certain things out there where they may have a good argument. Things like, how many Law & Orders do we have? How many CSIs do we have? How many Marvel movies do we have? Do we need a new score for every Marvel movie? I don't know, maybe not. We certainly need some characters, they keep killing them all off, but anyway, that's another...

Justin Hendrix:

I think they can bring them back to life.

Jen Jacobsen:

To your point on working with the writers, I would say that there are a lot of creators in the different industries that are working together on this, and we are partnering with and supporting and helping each other. There's a large coalition called the Human Artistry Campaign, a group of, I think, over 150 artist and creator groups that are members of it, supporting each other in the context of AI and its impact on human creativity. And that includes people from all the creative communities that you can imagine, writers, music, visual arts, everything. And so, it's really incumbent upon us to stick together as creators in this space, especially when we're working against large behemoth platforms.

Justin Hendrix:

I want to come back to something we talked about just prior, and you've brought this up a bit: this idea of flooding the market with stuff. You called it "the problem of junk" when we were talking earlier. This idea that generative AI in particular, because of the vast volume of material that is likely to be created and published in just the next bit, and the extraordinary amount of material and media that's on the internet, imagine it just hitting an exponential at this point. Talk about the problem of junk. You've already mentioned how it might push out or dilute the competitive market for artists, how do you think about it?

Jen Jacobsen:

Well, we are seeing it already. If you go to a large platform, Spotify, others, when a person wants to call up mood music, for example, it's possible that some of these platforms will create their own mood music playlists that are entirely made up of AI-generated tracks, and they don't necessarily have to pay for them because they've created them themselves and put them on their own platform. So, that's the way in which not only are you gatekeeping, but you're also driving down the price that actual human artists are going to be able to get in the market. I don't know if you have anything to add to that, but...

Ashley Irwin:

Yeah, as you said, and Liz, you touched on that too, where you're forcing people into listening to things and directing their taste.

Liz Pelly:

Yeah. And it's interesting because AI relates to this issue in two different ways. One, yes, streaming services are being flooded with what you might think of as junk music, or AI-generated music that fits really well on these mood playlists. There are also companies that are specifically providing music to Spotify for a lower royalty rate, for this exact purpose of filling up these official mood playlists, where working artists maybe would've had a spot. But in addition to AI potentially being used to generate this stock music that is taking up so much prime visual real estate in the playlist ecosystem, I think, even more than using AI to create this music, streaming services are maybe not counting on us really enjoying this music, but using AI to hyper-personalize the recommendations that you're being served, to the extent that perhaps you won't even notice or care that the music you're being served is junk, because their AI-driven hyper-personalization machine has basically completely shaped your listening behavior.

So that when you open a streaming app, you're not really thinking about searching for a specific artist, or searching for a specific album or label; you're just clicking onto a playlist that's been served to you using AI-driven hyper-personalization. And it really mirrors other areas of the platform economy, the way AI has been used to create behavior manipulation or even addictive qualities. I do think that there is a certain extent to which streaming services are using a similar tactic that contributes to habit formation around types of content that may be cheaper for them, and that is also an AI-powered risk.

Justin Hendrix:

Yeah, I'm sure some folks who are parents in the room might have found their maybe young children watching YouTube and seeing these strange AI-generated approximations of programs that they watch, that are often nonsensical, maybe don't even have any plot or narrative or whatever, but look a bit like Thomas the Tank Engine. That stuff's already happening.

Liz Pelly:

And I just want to add one other point to this discussion too, which is that I think oftentimes when we think about the contributions of generative AI to junk and fraud, there's a tendency to point to specific bad actors, but it really is a systemic AI-driven issue on these platforms. It's not just necessarily a few bad actors, kids working in their rooms, trying to figure out how to game the system. It really is services and labels working together to create a system that is deeply unfair to the vast majority of creators on the platform.

Justin Hendrix:

Yeah, I think that's something that, this idea of all of this being introduced into an environment that is driven by a culture of exploitation, where all the infrastructures are essentially designed around that, and designed around the attention economy, et cetera. I know that we might have a question or two from the audience. We've only got a couple of minutes left, and I think they're going to be read out from the back by the voice of God, which is a female voice, I believe.

Questions:

Hi. So, we have two groups of questions. The first group is asking about your thoughts on artists who choose to sign their rights over, or their images and their likenesses, or who choose to share their royalties with AI, and what opportunities that might create, or whether this is a sustainable model for the industry at large. And the second group of questions is asking about, given that AI is all about large amounts of data, and therefore the contribution of any one artist is quite marginal within that larger system, how do we think about this relative to how the current industry works, especially in terms of copyright law, and what does or does not need to change when we think about these systems?

Justin Hendrix:

So, we've seen examples already, I think, of actors in certain parts of the industry permitting themselves to be deepfaked, so that their output can be increased without them having to do any more work. There are folks who are going to do more of this.

Ashley Irwin:

Absolutely.

Justin Hendrix:

What do we think of it?

Jen Jacobsen:

I would just go back to Ashley's original point, which is that it's about consent. It's about the artist having consent. And any system like that, we believe, and I think you guys would agree, has to be an opt-in system. The artist has to be able to make that choice. And if they're making that choice, that's okay; technology can do wonderful things, and if that's what you want to do with your art, that's fine. And that goes also to that legislation, the Protect Working Musicians Act. Congresswoman Ross, by the way, I didn't mention before, is the sponsor of that; she's been awesome. It allows artists to be able to engage in the discussions that will allow them to have that choice.

Ashley Irwin:

We already have, in our ecosystem, the music ecosystem, collective licensing agencies for mechanical royalties, for digital streams, for performance royalties. And it wouldn't be very hard to model something like a SoundExchange, like the MLC, like an ASCAP, like a BMI, for collective licensing of our works. And it's not a compulsory license, but it's collective: if you want to ingest stuff, you pay a license fee, and knock yourself out. But as Jen says, it's your choice to opt into that licensing system.

Justin Hendrix:

Only have a minute or two left. Anybody on question two?

Liz Pelly:

Oh, I just wanted to add, I think there also needs to be transparency around artists knowing what they're opting into. So, something that we didn't mention yet is that there is an incredible opportunity for the FTC to do some sort of investigation into, for example, the deals between the major labels and the streaming services, so that independent musicians and all artists can better understand the systems that they're basically forced to use to have a music career. Same thing with AI systems: regulatory measures that could be taken to create a situation where there's actual transparency, I think, would be incredibly important.

Justin Hendrix:

Well, maybe if Commissioner Bedoya is still in the room, you can whisper that to him, but I saw a lot of heads nodding on that.

Questions:

So, the second group of questions was about, with each artist providing a small marginal contribution to a large data model, how do we need to rethink copyright systems, or just how we think about what is being produced by AI?

Justin Hendrix:

So, a lot of folks think maybe there'll be some extraordinary system that will do fractional payments based on the extent to which your rights were represented in the training data. Is that kind of the idea that's out there a little bit, that maybe we'll have some magical...

Ashley Irwin:

No, I think... And maybe I was not clear. I think the licensing should be on the ingest side. So, once it's in the machine, you do whatever you want with it, but you pay to put it into the machine. And then it can be manipulated however they choose to manipulate it, if they've paid the license. So, it's not on the output side.

Justin Hendrix:

And maybe I'll just push you a little bit on that too, because we were talking earlier about the idea that, for a lot of the fair use folks, it's almost like the idea is that the machine is like anybody: it's listening to lots of music and then it's coming up with something that's basically expressing something in the style of what you ask-

Ashley Irwin:

Yeah, and that's very true. It learns a lot faster than a human can learn, and that's part of their argument. But the problem with that argument is that what we've learned as creators, in whatever field we're in, I'll speak specifically to music, yes, we've learned it, we've had influences, we've had lessons as a child, any of that stuff that we learned from was paid for. It was from copyrighted material. Even if it was, as I said earlier when we were talking, my parents paid 25 cents for the sheet music when I was six years old to learn [inaudible 00:34:09]. Beethoven, public domain, they still had to buy the sheet music. And that's what it comes down to: everything that we've learned from and been influenced by, whether it was on the radio, whether it was in a library, whether it was at lessons with sheet music, somebody paid for that intellectual property that we learned from. And that's really all we're asking for, that analogy, with generative AI.

Justin Hendrix:

So, there was a working economy there somehow.

Ashley Irwin:

Yeah.

Jen Jacobsen:

I was just going to say, with respect to the copyright law part of that question, the tools to protect artists are already there in the law, they just need to be applied to this context. The tools to protect human creators and protect that unique art that only humans can give, the emotion, the nuance, it's there in the law-

Ashley Irwin:

Yeah, and it's essentially a derivative work, and there's provision for derivative works in copyright law. Whether that needs to be expanded, and how it's expanded, that's for people with more intelligence than me. Essentially, we're talking about derivative work.

Justin Hendrix:

Well, we will see, I suppose, over the next year or so, how the arguments of the fair use maximalists will play out, and whether those arguments will find purchase in the courts to the benefit of the tech firms, or whether things will even out a little more in favor of artists and creators and craftsmen and folks who make things generally. I want to thank Barry, and I thank Courtney as well for inviting me, and Karina for helping put this on. I want to thank our panelists, so give them a round of applause.
