California Makes It Illegal to Use AI to Replace Actors
In a groundbreaking move, Governor Gavin Newsom signed into law two California bills that make it illegal to use AI to impersonate an actor’s voice, likeness, or performance without their explicit permission. The bills, AB 1836 and AB 2602, aim to extend protections for performers in the age of synthetic media.
What the Laws Do
- AB 2602 prohibits contracts that allow a company to replace a real performer with a “digital replica” unless the performer gives informed consent and is represented by legal counsel or their union.
- AB 1836 addresses the rights related to deceased performers: it bans creating or distributing digital replicas of a deceased person’s voice or likeness in audiovisual works without consent from their estate or family.
- Together, the laws close a loophole that previously allowed AI uses under the guise of “artistic work” or derivative rights.
These protections were championed by SAG-AFTRA, the actors’ union, which argued that performers need new safeguards in an era where AI can mimic anyone’s appearance or voice.
Why This Is Significant
- This marks the first state-level law in the U.S. explicitly empowering actors against unauthorized AI impersonation of their performances.
- It represents a legal shift from focusing on copyright to protecting the right of publicity, persona, and performance rights in AI contexts.
- The legislation anticipates a future where AI might replicate entire film scenes, voices, or motion capture in place of human actors — and sets boundaries on how far that can go legally.
- For deceased actors, the law ensures their persona can’t be resurrected digitally for new works without estate consent, discouraging posthumous AI re-creation.
Challenges & Open Questions
- Defining “digital replica” is tricky. The bills don’t fully clarify how close a synthetic voice or face has to be to the original before it counts as a replica.
- Enforcement and litigation: How courts interpret these bills in practice will be key. Companies may test the boundaries, especially in blockbuster films or global streaming projects.
- Interstate coherence: Because the law is state-level, questions remain about how AI-generated works produced elsewhere will be handled, and whether other states will adopt similar laws.
- Creative flexibility vs protection: The law could slow or complicate certain creative uses of AI, especially in blending AI elements with live action, motion capture, or voice synthesis—producers will need to navigate carefully to stay compliant.

What to Watch
- Whether other states or the federal government follow California’s lead with comparable protections for performers.
- High-stakes cases in which studios use AI in film, TV, or video games, and whether lawsuits test or challenge the boundaries of these new statutes.
- How the AI, entertainment, and tech industries adapt — through licensing, contracts, transparency requirements, or new workflows.
- Whether these laws influence how AI tools are designed: e.g., built-in protections, opt-outs, watermarking, or restrictions around imitating performances.
California’s move signals that the industry is entering a new era: yes, AI can imitate — but not at the expense of performer rights.