WASHINGTON, D.C. (7News) — With primary elections coming up next year, experts are concerned that AI-generated deepfake ads will mislead voters.
Deepfakes can change video to make it appear that candidates said or did something they never actually did.
Meta, which operates Facebook and Instagram, will require political advertisers around the world to disclose whether they used artificial intelligence in their ads, starting in 2024.
The idea is that the disclosure could help limit the use of deceptive and misleading deepfakes.
But the rules don’t cover everything. Minor AI-assisted edits, such as color correction, won’t be subject to the disclosure requirement.
Even as Meta institutes the rules, it’s also unveiling new AI tools of its own that can do things like change backgrounds or let people create their own features.