Hollywood AI Crisis: Will Artificial Intelligence Eliminate Acting and Writing Jobs?

Filmmaker Justine Bateman doesn’t think there is a thriving future for actors in filmed entertainment as we know it. She believes that artificial intelligence will ultimately suck the creative marrow out of Hollywood.
“AI can create a convincing simulation of a human actor, and the tech is improving at an alarming rate,” says Bateman, the former “Family Ties” star who has a degree in computer science and digital media management from UCLA. In a few short years, she asks, why would anyone need to pay real actors?
“I stress that this is an existential threat,” continues Bateman, who is an adviser to SAG-AFTRA on AI issues. “And if they can do this with actors, they can do it with writers, directors, cinematographers — everyone. We’ll be replaced with Frankenstein spoonfuls of our own work.”
When it comes to Hollywood, the sudden rise of AI-enabled content-creation platforms has only widened the already enormous divide between entertainment unions and the industry’s largest employers. Generative AI — advanced computing platforms that can create new text and imagery based on reams of existing reference material — has become a veritable villain in the current labor standoff. Signs on the unions’ picket lines this summer have been emblazoned with anti-AI slogans like “AI is not ART,” “Wrote ChatGPT This” and “AI’s not taking your dumb notes.”
What it comes down to is this: The WGA and SAG-AFTRA want ironclad guarantees from studios and streamers that robots won’t become replacements for performers and scribes.
Francesco Muzzi / StoryTK
“Everyone understands that AI is a tidal wave that is coming for us one way or the other,” says Sarah Moses, entertainment litigation partner at law firm Manatt, Phelps & Phillips. “But it’s hard to negotiate right now because there are so many unknowns.”
Let’s start with this one: Is the hand-wringing over generative AI overblown?
There’s a general consensus that AI can’t produce original TV shows and movies that are anything close to what audiences expect — and that it may never be a satisfying substitute for the real human deal.
“It’s fun to imagine a generative AI scriptwriter, but there is absolutely no interest in that. Because it’s lousy,” says Monica Landers, founder and CEO of StoryFit, an Austin-based company that uses AI to analyze screenplays. “It can’t hold on to the pacing to reveal the plot. It’s not made for character growth. It’s so empty right now.”
James Cameron, the filmmaker behind AI thriller “The Terminator” and blockbusters like “Avatar” and “Titanic,” has no plans to use artificial intelligence to write a script. “I just don’t believe that a disembodied mind that’s just regurgitating what other embodied minds have said will ever have something that’s going to move an audience,” Cameron recently told CTV News. He added, “Let’s wait 20 years, and if an AI wins an Oscar for best screenplay, I think we’ve got to take them seriously.”
Generative AI is transforming everything about computing, and it represents an advance on par with the introduction of the PC or the smartphone. In the right hands, it will yield a class of powerful new creative tools.
“We should look at AI as ‘augmented intelligence,’ rather than ‘artificial intelligence,’” says Sunny Dhillon, managing partner at venture-capital firm Kyber Knight Capital. “This will let filmmakers work faster and cheaper and make movies that are more compelling.”
The reason that AI is changing the way the world works is that now anyone can harness the power of generative AI tools simply by using plain-language instructions, says Jensen Huang, founder and CEO of Nvidia, the chipmaker that has emerged as a powerhouse in the AI space. All of a sudden, you don’t need specialized training in, say, coding to create something incredible.
Justine Bateman, Sharon Lawrence and “Mad Men” creator Matthew Weiner picketing outside of Netflix studios on Aug. 8 GC Images
“We’ve democratized computer science,” Huang told a rapturous crowd at the Siggraph 2023 computer-graphics confab in L.A. last week. “Generative AI is the new killer app.”
The suits in Hollywood are clearly intrigued by AI, as are executives in every industry. About 96% of AI decision-makers at media and entertainment companies said they plan to increase their spending on generative AI technology in the next 12 months, according to a survey of 6,000 employees by enterprise search-engine vendor Lucidworks conducted between May and July. They add that they’re not looking to use it to eliminate workers; just 4% of those surveyed said they expect AI adoption to result in “job displacement.”
Disney CEO Bob Iger says the company is starting to use AI to operate more efficiently. “Overall, I’m bullish about the prospects because I think they’ll create efficiencies and ways for us to basically provide better services to customers,” he told Wall Street analysts in May.
At Amazon, every one of the tech giant’s teams — including in its entertainment business — has multiple generative AI technology projects in the works, according to CEO Andy Jassy, speaking on Amazon’s Aug. 3 earnings call. He was light on specifics but said generative AI “is going to be at the heart of what we do” as Amazon looks to use it to reduce costs and “reinvent” customer experiences.
A recent Netflix job listing for a product manager in the company’s machine-learning group listed a salary range of between $300,000 and $900,000. That triggered ire among the Hollywood unions’ rank and file. “Talk about tone-deafness, FFS!” tweeted WGA member Christopher Derrick, whose writing credits include “Star Trek: Picard” and “The Equalizer.” Never mind that the Netflix job in question centers on AI developments for content personalization and optimizing payment-processing systems. With the unions coalesced around fighting the encroachment of all things AI, Netflix’s listing is now a boogeyman.
Forms of artificial intelligence, of course, have existed for decades (think spell-check programs or digital thermostats). What’s new — and, to Bateman and others, extremely alarming — is how rapidly gen AI has advanced to be able to create snippets of film or television that, if you squint, can look like actual humans in scenes produced by human directors. An AI-generated script can read like something written by a professional writer. A tipping point came in November 2022 when software firm OpenAI released ChatGPT, a chatbot that can spit out fully formed essays on a wide range of topics.
“For years, everyone has known AI was coming,” says Jason Vredenburg, an associate professor at Stevens Institute of Technology who teaches American film history. “But when ChatGPT came out, everyone was shocked. They realized it was coming faster than anyone thought.”
NVIDIA’s Jensen Huang at Computex 2023 in Taipei AFP via Getty Images
That’s why in the current entertainment industry labor disputes, AI has become a flashpoint. The WGA wants its members to be able to use AI tools but is asking for assurances that using those tools won’t count against them in determining credit and pay. The AMPTP has offered to spell out explicitly that a “writer” must be a human, but also says further discussion is needed in the area.
Meanwhile, SAG-AFTRA has alleged that studios “want to scan a background performer’s image, pay them for a half a day’s labor and then use an individual’s likeness for any purpose forever without their consent.” The AMPTP has emphatically denied this.
But background actors have become deeply worried about their employment prospects in an AI world. “We don’t know what all our scans are being used for,” says Prince Royal, an actor who says that when he worked as an extra on “The Flash,” he was required to submit to a 360-degree image scan — or go home without pay. The AMPTP says that under its proposal, producers must obtain background actors’ full consent and separately bargain for payment for each use of a digital replica.
Per the AMPTP, what is needed is “a balanced approach based on careful use, not prohibition.”
Multiple legal issues with generative AI have yet to be fully ironed out. One gray area — the subject of several lawsuits — has to do with the source material used to “train” generative AI models.
Last month, a lawsuit on behalf of Sarah Silverman and other authors accused Meta and OpenAI of illegally using copyrighted works — including Silverman’s 2010 bestselling memoir “The Bedwetter: Stories of Courage, Redemption, and Pee” — to train their AI systems. (Meta declined to comment. Silverman also declined to comment, citing the pending litigation. OpenAI did not respond to a request for comment.)
Some experts say there’s nothing in the law that prevents AI systems from using any kind of source material for training. U.S. laws stipulate that you can’t copy a specific copyrighted work or performance, says Yale Law School professor Robert Post. But actors, he says, study all kinds of performances to inform their own. “Everyone has a right to study them,” including someone who is using AI to do so, says Post. He adds, “It’s not copyright infringement to read a lot of books.”
Other commentators have argued that the U.S. needs a federal law protecting name, image and likeness that supersedes today’s state-by-state regulations.
Hilary Krane, chief legal officer at CAA, believes generative AI issues pertaining to intellectual property, as well as name, image and likeness rights, are solvable. “We want to make sure the rules of the road are clear,” she says. “Uncertainty always breeds fear.”
The music industry has been the canary in the coal mine for many technological disruptions, piracy among them, and has been among the first in the entertainment field to grapple with generative AI that mimics specific artists’ styles and voices. Earlier this year, Universal Music Group filed copyright-infringement claims to remove an AI-generated song called “Heart on My Sleeve” that sounded very much like a collab between Drake and the Weeknd, neither of whom was involved with the song.
What the industry might need is some kind of truth-in-labeling regulations for generative AI, similar to the way the FDA requires standardized food labeling, says Michael Huppe, CEO of SoundExchange, a nonprofit rights management organization for digital sound recordings. “Maybe you’ll end up paying more for a ‘human-only’ streaming service,” he says.
To date, generative AI has made the biggest inroads in VFX for tasks like de-aging actors and dubbing scenes into other languages. But given the antagonism from some on the picket lines toward all things AI, it’s a weird time to be selling AI-enabled products to Hollywood.
StoryFit, for example, works with studios and networks to break down a script and identify whether elements and characters in a story resonate with audiences. Monica Landers compares it to an audience preview but using an AI-modeled audience to react to a project when it’s at the script stage.
“We don’t want to be the bad guy,” says Landers. “But we do find ourselves in the middle of it. I often find myself responding defensively: ‘We are not generative AI; we are using AI to help creativity.’”
James Cameron and Sam Worthington on the set of 20th Century Studios’ “Avatar 2” Courtesy of Mark Fellman/20th Century Studios
Some years ago, professional language translators were very worried about AI eliminating their jobs when tools like Google Translate became more proficient. What actually happened, according to Landers, was that the AI tools took on a lot of the “boring scutwork” of translators, but they have not been able to replicate mood or convey idioms in the way human translators can. In the context of the Hollywood strikes, Landers says, “My hope is we can figure out a way that writers and actors are protected enough so that they can be part of creating in whatever this new world is going to be.”
Scott Mann, co-CEO of Flawless, which specializes in gen-AI filmmaking tools, and a director and producer whose films include “Fall” and “Heist,” says the recent uproar over AI in creative industries has highlighted “the responsibility factor” — that is, producers must obtain permission from actors and compensate them for anything based on their likeness.
“The truth is, there are huge misunderstandings around gen AI,” says Mann. “I don’t see any point in time where you take humans out of the equation. Fundamentally, films are created from human feelings and emotions.”
On AI issues, the path of least resistance for the unions and the AMPTP may be to agree to certain “broad strokes” and to table the more nitty-gritty points of contention until the next round of negotiations, says Moses. “The parties might be best served kicking the can down the road.”
But that road may be coming up fast. Joe Russo, co-director of Marvel movies including “Avengers: Endgame” and “Avengers: Infinity War,” has predicted that generative AI may be able to create feature-length movies within two years — and that the technology could put you, the viewer, into an AI-generated movie.
“At some point, perhaps, you could tell a video-streaming service, ‘Hey, I want a movie starring my photoreal avatar and Marilyn Monroe’s photoreal avatar,’” Russo said in an April interview with Collider. “It renders a very competent story with dialogue that mimics your voice. … And suddenly now you have a rom-com starring you that’s 90 minutes long. So you can curate your story specifically to you.”
Echoing Russo’s thought experiment, Bateman predicts that rights holders will sooner or later let consumers insert themselves into classic movies — imagine yourself starring in “Citizen Kane.” Or maybe AI generates some kind of derivative work cobbled together from an AI database trained on existing stories and characters trending in your feeds. Either way, in Bateman’s doomsday outlook, the market for human-created entertainment will wither away, replaced by synthetic fare produced at a fraction of the cost.
“There’s no scenario I can think of where this doesn’t happen,” Bateman says. “Generative AI can’t make great films. But it can create a high volume of content and regurgitate sequels.”
But others say that instead of fighting the future, the more productive strategy is to figure out where AI technology is most useful, while working to ensure copyright protections for individual creators as well as name, image and likeness rights for performers.
The real question the industry faces is how people can get past their fear about AI and start to unlock its potential, says CJ Bangah, who leads PwC’s entertainment and media practice.
“Some of the most impactful artisans in the future will find ways to use AI in ways we are just beginning to understand,” Bangah says, adding, “‘Hand-crafted’ doesn’t mean not using technology.”
Gene Maddaus contributed to this story.