Scarlett Johansson’s OpenAI clash is just the start of legal wrangles over artificial intelligence


When OpenAI’s new voice assistant said it was “doing fantastic” in a launch demo this month, Scarlett Johansson was not.

The Hollywood star said she was “shocked, angered and in disbelief” that the updated version of ChatGPT, which can listen to spoken prompts and respond verbally, had a voice “eerily similar” to hers.

One of Johansson’s signature roles was as the voice of a futuristic version of Siri in the 2013 film Her and, for the actor, the similarity was stark. The OpenAI chief executive, Sam Altman, appeared to acknowledge the film’s influence with a one-word post on X on the day of the launch: “her”.

In a statement, Johansson said Altman had approached her last year to be a voice of ChatGPT and that she had declined for “personal reasons”. OpenAI confirmed this in a blogpost but said she had been approached to be an extra voice for ChatGPT, after five had already been chosen, including the voice that had alarmed Johansson. She was approached again days before the 13 May launch, OpenAI added, about becoming a “future additional voice”.

OpenAI wrote that AI voices should not “deliberately mimic a celebrity’s distinctive voice” and that Sky, the voice in question used by the new GPT-4o model, was not an imitation of Johansson but “belongs to a different professional actress using her own natural speaking voice”.

The relationship between AI and the creative industries is already strained, with authors, artists and music publishers bringing lawsuits over copyright infringement, but for some campaigners the furore is emblematic of tensions between wider society and a technology whose advances could leave politicians, regulators and industries trailing in its wake.


Christian Nunes, the president of the National Organization for Women, which has spoken out on the issue of deepfakes, said “people feel like their choice and autonomy is being taken from them” by the technology. Sneha Revanur, the founder of Encode Justice, a youth-led group that campaigns for AI regulation, said the Johansson row highlighted a “collapse of trust” in AI.

OpenAI, which has dropped Sky, wrote in another blogpost this month that it wanted to contribute to the “development of a broadly beneficial social contract for content in the AI age”. It also revealed it was developing a tool called Media Manager that would allow creators and content owners to flag their work and whether they wanted it included in training of AI models, which “learn” from a mass of material taken from the internet.

When OpenAI talks of a social contract, however, the entertainment industry is seeking something more concrete. Sag-Aftra, the US actors’ union, feels this is a teachable moment for the tech industry.

Jeffrey Bennett, the Sag-Aftra general counsel, says: “I am willing to bet there are quite a few companies out there that don’t even understand that there are rights in voice. So there is going to be a lot of education that has to happen. And we are now prepared to do that, aggressively.”

Sag-Aftra, whose members went on strike last year over a range of issues that included use of AI, wants a person’s image, voice and likeness enshrined as an intellectual property right at federal – or countrywide – level.

“We feel like the time is urgent to establish a federal intellectual property right in image, voice and likeness. If you have an intellectual property right at the federal level you can demand online platforms take down unauthorised uses of digital replicas,” Bennett says.

To that end, Sag-Aftra is backing the No Fakes Act, a bipartisan bill in the US Senate that seeks to protect performers from unauthorised digital replicas.

Chris Mammen, a partner and specialist in IP at the US law firm Womble Bond Dickinson, sees an evolving relationship between Hollywood and the tech industry.

“I think the technology is developing so rapidly, and potential new uses of the technology also being invented almost daily, there are bound to be tensions and disputes but also new opportunities and new deals to be made,” he said.


When Johansson made her comments on 20 May, she said she had hired legal counsel. It is unclear whether she is considering legal action now that OpenAI has withdrawn Sky. Her representatives have been contacted for comment.

However, legal experts contacted by the Guardian believe she could have a basis for a case and point to “right of publicity” claims that can be brought under state law, including in California. The right of publicity protects someone’s name, image, likeness and other distinguishing features of their identity from unauthorised use.

“Generally, a person’s right of publicity can be deemed violated when a party uses the person’s name, image, or likeness, including voice, without his or her permission, to promote a business or product,” said Purvi Patel Albers, a partner at the US firm Haynes Boone.

Even if Johansson’s voice was not used directly, there is precedent in a case brought by the singer Bette Midler against the Ford Motor Company in the 1980s, after it used a Midler impersonator to replicate her singing voice in a commercial. Midler won in the US court of appeals.

“The Midler case confirms that it does not have to be an exact replica to be actionable,” Albers said.

Mark Humphrey, a partner at the law firm Mitchell Silberberg & Knupp, said Johansson had “some favourable facts” such as the “her” post and the fact OpenAI approached her again shortly before the launch.

“If everything OpenAI has claimed is true, and there was no intent for Sky to sound like Ms Johansson, why was OpenAI still trying to negotiate with her at the 11th hour?” However, Humphrey added that he had spoken to people who thought Sky did not sound like Johansson. The Washington Post reported a statement from the actor behind Sky, who said she had “never been compared” to Johansson by “the people who do know me closely”.

Daniel Gervais, a law professor and intellectual property expert at Vanderbilt University, said Johansson would face an “uphill battle” even though states such as Tennessee have recently expanded their right of publicity laws to protect an individual’s voice.

“There are a few state laws that protect voice in addition to name, image and likeness, but they have not been tested. They are being challenged on a variety of grounds, including the first amendment,” he said.

As the use and competence of generative AI grows, so will the legal battles around it.

