On November 8, the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA) approved a tentative agreement with Hollywood studios, leading to the suspension of a historic 118-day strike that stretched from coast to coast. Still under review by the union's membership, the contract includes an 11% pay bump for the first contract year alone, bonuses and residuals for high-performing productions on streaming platforms, intimacy coordinators for scenes involving nudity and sexual interactions, wage increases for background actors, and ethnically and racially informed hair and makeup services, among other noteworthy provisions.
Despite these wins, artificial intelligence remains one of the most highly contested points. As the technology evolves at light speed and artists across disciplines fear that creative labor is threatened by generative models, AI guardrails continue to be a major concern for SAG-AFTRA members ahead of the December 5 ratification vote.
As it stands now, the tentative agreement reached with the Alliance of Motion Picture and Television Producers (AMPTP) secures comprehensive protections for actors concerning the replication of their voice and likeness using tools like AI or CGI. Studios will be required to obtain clear and informed consent from performers for the creation and use of their digital replicas and to compensate them at the highest applicable rate as well as with future residuals. However, some concerns remain: For instance, informed consent for the use of a digital replica isn't automatically rescinded after a performer's death, although producers must obtain consent from the performer's estate or authorized representative if none was granted during their lifetime.
Additionally, performers are responsible for negotiating their own compensation for contracted use of digital replicas created without their physical participation. If they object to how they are represented or believe the agreed-upon terms have been violated, their recourse is to sue the studio and hope for a favorable outcome.
The agreement only lightly touches on so-called "synthetic performers," meaning characters not based on real humans, such as the controversial Instagram robo-model Lil Miquela. Studios that use generative AI to create a synthetic performer that derives a specific likeness from a real person, such as by incorporating a unique facial feature, must also obtain consent.
When the strike was suspended on November 9, many were initially relieved that a foundation of protections had been established as the industry navigates the use, and potential misuse, of artificial intelligence.
Dan Jasnow, a partner at the ArentFox Schiff law firm, which provides intellectual property and ethics-related legal services to businesses that use generative AI, told Hyperallergic that “there are things for both sides to be proud of” in the new agreement.
“Studios will be able to continue exploring AI’s revolutionary creative potential, while actors, for the first time, are guaranteed compensation for the use of their AI likenesses,” Jasnow said. “In any negotiated agreement, some people are bound to be disappointed, but we shouldn’t lose sight of what this represents: Some of the first enforceable industry-wide parameters for responsible and ethical use of AI.”
Not everyone shares the sentiment of compromise, however. SAG-AFTRA’s own AI advisor, Justine Bateman, took to X to express her confusion that “a union representing human actors would give approval of those same actors being replaced by an AI object.” While raising concerns about the tentative agreement, Bateman went on to say that the endorsement of generative AI and digital replicas was “akin to SAG giving a thumbs-up for studios/streamers using non-union actors.”
Voiceover actor Sara Cravens also expressed misgivings about the deal, warning that consent requirements for digital replicas could give rise to discriminatory hiring practices that pressure performers into conceding, and voicing her distrust of relying on studios' and producers' "good-faith" assessments of appropriate compensation.
“If you inform me that you want to digitally take my voice and replicate it, and I say no … the job will just go to someone else who says yes,” Cravens said on Instagram. “How is that protection and not coercion? Why doesn’t our contract say consent CANNOT be a condition of employment?”
Bateman and Cravens are among hundreds of voices expressing distrust of the AI provisions. Some have said they'll vote "no" on the agreement, arguing that while the other bonuses and protections are welcome, they're essentially rendered useless if studios are allowed to use digital replicas and synthetic performers in lieu of human talent.
A spokesperson for SAG-AFTRA was not immediately able to comment on Cravens's and Bateman's apprehensions, but noted that the union board has addressed many concerns surrounding the AI protection clauses in the Frequently Asked Questions and AI Resources sections of its website.