An AI app cloned Scarlett Johansson’s voice for an ad—but deepfakes aren’t just a problem for celebrities


Movie star Scarlett Johansson is taking legal action against an AI app that used her name and an AI-generated version of her voice in an advertisement without her permission, according to Variety.

The 22-second ad was posted to X, formerly Twitter, on Oct. 28 by AI image-generating app Lisa AI: 90s Yearbook & Avatar, according to Variety. The ad featured images of Johansson and an AI-generated voice similar to hers promoting the app. However, fine print displayed under the ad indicated the AI-generated content “has nothing to do with this person.”

Representatives for Johansson confirmed to Variety that she is not a spokesperson for the app, and her lawyer told the publication that legal action is being taken. The ad appears to have been taken down, and CNBC has not viewed it. Lisa AI and a representative for Johansson didn’t respond to CNBC Make It’s request for comment.

While celebrities are often the subjects of deepfakes, the technology can create problems for everyday people too. Here’s what to know.

What is a deepfake?

The word deepfake comes from the concept of “deep learning,” which falls under the broader umbrella of machine learning. In deep learning, algorithms are trained to identify patterns in large data sets, then apply that pattern recognition to new data or produce outputs that resemble the original data.

Here’s a simplified example: An AI model could be fed audio clips of a person talking and learn how to identify their speech patterns, tonality and other unique aspects of their voice. The AI model could then create a synthetic version of the voice.
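
To make that concrete, here is a deliberately toy sketch of the “learn a pattern, then generate similar output” idea in Python. It models a single invented voice feature (average pitch) with made-up numbers instead of real audio; actual voice-cloning systems learn far richer patterns with deep neural networks.

    # Toy illustration only: one invented voice feature (average pitch),
    # not real audio. Real systems use deep neural networks.
    import numpy as np

    rng = np.random.default_rng(seed=42)

    # Hypothetical "training data": pitch measurements (Hz) from many
    # short clips of one speaker.
    training_pitches = rng.normal(loc=210.0, scale=12.0, size=500)

    # "Training": learn the statistical pattern of the speaker's pitch.
    learned_mean = training_pitches.mean()
    learned_std = training_pitches.std()

    # "Generation": produce new values that mimic the learned pattern.
    synthetic_pitches = rng.normal(loc=learned_mean, scale=learned_std, size=5)

    print(f"Learned pitch pattern: {learned_mean:.1f} Hz +/- {learned_std:.1f} Hz")
    print("Synthetic samples (Hz):", np.round(synthetic_pitches, 1))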

The problem is the technology can be used in harmful ways, says Jamyn Edis, an adjunct professor at New York University with over 25 years of experience in the technology and media industries.

“Deepfakes are simply a new vector for impersonation and fraud, and as such can be used in similar malicious ways, whether or not one is a celebrity,” he tells CNBC Make It. “Examples could be of your likeness — or those of your loved ones — being used to generate pornography or utilized for extortion or to circumvent security by hijacking your identity.”

What’s even more concerning is that it’s becoming harder to tell the difference between what’s real and what’s fake as deepfake technology rapidly evolves, Edis says.

How to protect yourself

There are a few things you can do if you find yourself wondering whether something you’re viewing may be a deepfake.

For one, ask yourself whether the images you’re seeing seem to align with reality, Edis says. Celebrities are required to disclose when they’re being paid to promote products, so if you see an ad featuring a celebrity pushing something obscure, it’s a good idea to check their other social media accounts for a disclosure.

Large tech companies, including Meta, Google and Microsoft, are also developing tools to help people spot deepfakes.

President Biden recently signed the first executive order on AI, which calls for watermarking to clearly label AI-generated content, among other safety measures.
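
The order doesn’t spell out an implementation, but as a rough illustration of the simplest version of the idea, the Python sketch below stamps a visible “AI-generated” label onto an image using the Pillow library. The image and filename are hypothetical stand-ins, and the watermarking schemes regulators envision are typically invisible and far harder to strip than this.

    # Simplified illustration: a visible label, not the robust or invisible
    # watermarks regulators contemplate. The image and filename are stand-ins.
    from PIL import Image, ImageDraw

    # Stand-in for an AI-generated image.
    image = Image.new("RGB", (640, 360), color=(20, 20, 20))
    draw = ImageDraw.Draw(image)

    # Draw a label strip along the bottom edge, then write the disclosure text.
    draw.rectangle([(0, 330), (640, 360)], fill=(0, 0, 0))
    draw.text((10, 338), "AI-generated content", fill=(255, 255, 255))

    image.save("labeled_output.png")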

However, technology has historically stayed one step ahead of regulations or attempts to guardrail it, says Edis.

“With time, social norms and legal regulations typically correct humanity’s worst instincts,” he says. “Until then, we will continue to see the weaponization of deepfake technology for negative outcomes.”
