An artist explains why Marvel's use of AI to animate a sequence is worrying
(SOUNDBITE OF KRIS BOWERS' "NICK FURY (MAIN TITLE THEME)")
SCOTT SIMON, HOST:
Marvel's latest big production wrapped up this week on Disney+ - "Secret Invasion." Nick Fury, Samuel L. Jackson's character, uncovers a plot where shape-shifting aliens impersonate high-ranking officials across the globe. Oh, that again? But close viewers recognized something unsettling about the series. Its intro sequence used artificial intelligence to animate images. The studio said using AI simply affirmed the show's theme - who's really who? Now, actors are on strike in part because they fear that AI might take away acting jobs. But the art world is already contending with those tools. Karla Ortiz, who has worked on massive Marvel properties, including "Guardians Of The Galaxy," "Loki," "Black Panther" and "Doctor Strange," testified this month in front of a Senate committee on AI and intellectual property. She joins us now from San Francisco. Ms. Ortiz, thanks so much for being with us.
KARLA ORTIZ: Thanks for having me, Scott.
SIMON: You're an illustrator and concept artist. What concerns you?
ORTIZ: Well, we are seeing generative AI models encroach on almost every space in our industry. These technologies are currently replacing jobs. But to add a level of complexity to it, they're replacing jobs using our work as training data. It's not a hypothetical for us. It's happening right now. It's existential for us, really.
SIMON: From what you've seen, what is AI doing? How does it - how does that exploit your - the images that you have created?
ORTIZ: For an AI model to be able to generate imagery that says, you know, oil painting in the style of Karla Ortiz, it still needs to know what an oil painting is. So it'll be trained upon huge numbers of oil paintings from artists all around the world, throughout history and through the present. And it also needs to know what my work looks like. There's a model with Stability AI that utilizes a large data set called LAION. It contains 5.8 billion text and image data pairs. And LAION is open for anyone to see, and that's how I found the entire body of my fine art work in those data sets. And it's shocking. It's shocking to see. It took away my ability to consent to being a part of this technology. They took away my credit - no one will know that my work powered those images unless my name is clearly linked to it. And it took any kind of compensation away. It's really painful, Scott.
SIMON: AI companies and producers say they don't need to pay and that, under the fair use doctrine, they can reproduce the images. How do you react to that?
ORTIZ: With a multitude of emotions, Scott. I don't think that, you know, theory will last. I don't - I just don't see it lasting.
SIMON: Well, we should explain. You're involved in a couple of lawsuits right now, aren't you?
ORTIZ: Yes. Yes, I am. One lawsuit, one class action lawsuit. One is all I can handle (laughter).
SIMON: Well, let me put it to you this way. Let me ask you, as an artist, how is what AI is doing different from what Andy Warhol did when he painted a Brillo box or a Campbell's soup can?
ORTIZ: Yes. So it's a matter of scale. For example, generative AI, some of the, you know, larger data sets contain about 5.8 billion text and image data pairs. And within those, you know, massive data sets exists everything. And then you have users who are encouraged to generate imagery at a scale no human artist can ever compete with. We're talking about hundreds, if not thousands, of generated images within, like, a day, maybe weeks tops - in seconds? Who knows? The technologies are getting better. And so, yeah, it's totally different.
SIMON: As a talented artist, is there a part of you that is also a little dazzled by the possibility of these technologies? And could you use it to do something new and different?
ORTIZ: You know, it's hard to not see the stolen work of my peers. You know, when I see these technologies, it's hard for me to see this as something that will be useful to me when I know it's already taken away jobs and opportunities. There might be a future where these models are trained correctly - right? - where things are built in public domain, you know, data only because that belongs to all of us. And any expansion upon that to be done via good, livable, you know, licensing - then maybe. But even at that level, to me, personally, as an artist, I wouldn't use it. I don't think I would. I love painting. I love every step of the process, from the doodle to the sketch to the drawing to figuring out the light, the colors, the composition. All of that is what makes art brilliant and wonderful. And to outsource that process to a machine feels empty to me.
SIMON: Karla Ortiz, a fine artist and concept artist who has wowed people on screens both big and little with her work. Thank you so much for being with us.
ORTIZ: Thank you for having me, Scott.
Transcript provided by NPR, Copyright NPR.
NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.