Photos of Hamas leadership enjoying “luxurious lives” went viral on Friday, but many internet users noticed something weird about the images. The photos looked fake, with all the hallmarks of being created with artificial intelligence software. But they’re real images; they were simply run through an image “upscaler” under the mistaken belief that it would give them a higher resolution. The result? They look like they were made with AI.
The images of Hamas leadership went viral on Friday after they were published to the social media platform X by Hananya Naftali. The images have been seen by over 13 million people as of this writing.
“Dear Palestinians, While the leaders of Hamas are living luxurious lives enjoying good lives, they ask you to sacrifice yourselves and your children,” Naftali tweeted.
But, as I suspected, Naftali confirmed to me over email on Saturday that the images had been run through an online image “upscaler” in the mistaken belief that it would make them look better.
“I took the images from this 2014 articles by Ynetnews. The images were also aired on television back then. Since the images are low quality, I increased the resolution but did not apply any digital filter to them,” Naftali told me by email on Saturday.
Websites that promote the ability to “upscale” images actually use artificial intelligence tools in an effort to make them look better, and that’s a problem for people who don’t understand the technology they’re using. Even good online tools can make low-resolution images like these appear fake. And with all the new tools that have cropped up, like DALL-E and Stable Diffusion, which allow users to create incredibly lifelike images using nothing but text prompts, many people have their guard up about fake images.
I tried it myself using the first Google result for “upscale images,” a website called upscale.media. I ran the low-resolution images from Ynetnews through the upscaler and got similar results, as you can see below. It makes them look fake.
You can’t really “increase the resolution” of a photo in a meaningful way. What’s actually happening, in simple terms, is that a computer program is trying to fill in gaps in its knowledge with what it thinks should be there. I’ve explained this problem before when it comes to old, deteriorated films from the 19th century. People assume that we can actually “fix” those old films, and in some cases they do look better. But in other cases, they can actually look worse. And even when they look better, they can contain new details that simply didn’t exist in the real world, because that’s what the computer thinks looks best.
One way to picture what’s happening with the pixels is to imagine two dots on a piece of paper. The computer program wants to fill in the line between those dots but doesn’t know whether it’s a squiggly line or a straight line, what color the line is, and so on. So the program fills in what it thinks the line should look like, and in some cases that makes the bigger picture look better. But if there isn’t enough information for the program to infer what the line looks like, the result can be a muddled mess. And that’s what happened here.
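To make that analogy concrete, here’s a minimal sketch in Python using the Pillow library. The filenames are hypothetical, and this uses plain bicubic interpolation rather than whatever AI model upscale.media runs. Even at this basic level, every new pixel is a weighted guess based on its neighbors, the “line between two dots” being filled in; AI upscalers go a step further and invent texture and detail that were never in the original frame.

```python
# Minimal sketch of upscaling by interpolation (not the tool Naftali used).
# Filenames are hypothetical; requires the Pillow library (pip install Pillow).
from PIL import Image

low_res = Image.open("hamas_leaders_2014.jpg")   # low-resolution source image
width, height = low_res.size

# Resize to 4x the original dimensions. Bicubic interpolation computes each
# new pixel as a weighted average of nearby known pixels -- the program is
# guessing at values the camera never recorded.
upscaled = low_res.resize((width * 4, height * 4), Image.BICUBIC)
upscaled.save("hamas_leaders_4x.jpg")
```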
After I explained the situation to Naftali and why people thought the images were AI-generated, he stopped replying to my questions, including which website he used in an effort to make the images look better. I’ll update this post if I hear back.
It’s no surprise that people are on high alert about fake images, given everything we’ve seen this year. There was the viral image of the Pope wearing a gigantic white puffer jacket, the viral photo of Rand Paul wearing a bathrobe, and those fake images of Donald Trump with Martin Luther King Jr. The Republican National Committee even aired an ad using AI-generated images to attack President Joe Biden back in April.
And as I wrote at the start of this war between Israel and Hamas on October 7, there were already images being passed around that didn’t show what they purported to show.
The long and the short of it: The images published by Naftali are real and are from 2014 or earlier. They’ve just been run through a computer program that too many people think can actually make images higher resolution.
Update, 12:05 p.m. ET: It appears a Community Note has now been added to explain the reporting from the article you’re reading right now, but without any credit to Forbes.