OpenAI is making changes to its generative AI video tool Sora 2 and its ability to replicate likenesses and copyrighted material without permission. The changes come in the form of an “opt-in protocol, where all artists can choose whether they wish to participate in the exploitation of their voice and likeness using A.I.”
Similar concerns were raised by several members of the Japanese government over the use of anime and manga characters, as well as by the estate of Martin Luther King Jr., which asked OpenAI to pause the use of his likeness in videos created by Sora 2.
The latest complaint came from Breaking Bad actor Bryan Cranston, who said:
“I was deeply concerned not just for myself, but for all performers whose work and identity can be misused in this way,” Cranston said in a statement. “I am grateful to OpenAI for its policy and for improving its guardrails, and hope that they and all of the companies involved in this work, respect our personal and professional right to manage replication of our voice and likeness.”
From the press release:
Actor Bryan Cranston’s voice and likeness were able to be generated in some outputs without consent or compensation when OpenAI’s Sora 2 was initially launched in an invite-only release two weeks ago. While from the start it was OpenAI’s policy to require opt-in for the use of voice and likeness, OpenAI expressed regret for these unintentional generations. OpenAI has strengthened guardrails around replication of voice and likeness when individuals do not opt-in.
“Bryan Cranston is one of countless performers whose voice and likeness are in danger of massive misappropriation by replication technology,” said SAG-AFTRA President Sean Astin when discussing the new opt-in implementation to Sora 2. “Bryan did the right thing by communicating with his union and his professional representatives to have the matter addressed. This particular case has a positive resolution.”
“I’m glad that OpenAI has committed to using an opt-in protocol, where all artists have the ability to choose whether they wish to participate in the exploitation of their voice and likeness using A.I.,” Astin added. “This policy must be durable and I thank all of the stakeholders, including OpenAI for working together to have the appropriate protections enshrined in law. Simply put, opt-in protocols are the only way to do business and the NO FAKES Act will make us safer.”
The NO FAKES Act aims to regulate the right of publicity, also called personality rights, which allows individuals to control the commercial use of their name, image, likeness, and other aspects of their identity. The act seeks to establish control over the production and distribution of unauthorized AI-generated content that uses an individual’s likeness or voice.
From the SAG-AFTRA press release:
This new framework aligns with the principles embodied in the NO FAKES Act, pending federal legislation designed to protect performers and the public from unauthorized digital replication. OpenAI, SAG-AFTRA, Bryan Cranston and his representatives at United Talent Agency, the Association of Talent Agents and Creative Artists Agency share a strong, unified position in favor of the NO FAKES Act and endorse its goal of establishing a national standard that ensures performers’ voices and likenesses cannot be used without permission in AI generated content. Together, we believe consent and compensation are the foundation of a sustainable and ethical creative ecosystem for entertainment and tech.
OpenAI publicly endorsed the NO FAKES Act, and CEO Sam Altman added:
“OpenAI is deeply committed to protecting performers from the misappropriation of their voice and likeness. We were an early supporter of the NO FAKES Act when it was introduced last year, and will always stand behind the rights of performers.”