Selected Articles

Viral ‘All Eyes on Rafah’ image inspires wave of AI-generated Instagram posts about Israel-Hamas war


Dozens of images related to the war in the Gaza Strip that appear to have been made using artificial intelligence have spread across Instagram in recent days following the viral success of a post calling for “All Eyes on Rafah,” which has now been shared more than 47 million times.

The images, a mix of pro-Israel and pro-Palestinian posts, include imitations of the original Rafah post as well as more graphic depictions, including a bloodied Israeli Prime Minister Benjamin Netanyahu and an Israeli child confronted by a Hamas fighter. One seemingly AI-generated image shows a large crowd gathered in a town square with giant block letters spelling out “bring them home now,” a reference to the 125 Israeli hostages who remain in captivity in Gaza. It’s been shared over 134,000 times on Instagram.

A seemingly AI-generated pro-Israel image responding to the viral “All Eyes on Rafah” post.

The sudden and rapid spread of the images comes as international attention has been refocused on Israel’s push into Rafah following an Israeli airstrike that local officials said killed at least 45 civilians Sunday.

The proliferation of the images adds to what has been an ongoing battle for attention on social media between voices supporting Israel and its campaign in Gaza and those supporting Palestinians. And while AI images have increasingly become common across the internet, their use on Instagram — a platform that has at times eschewed news while remaining a crucial outlet for Palestinian journalists — underscores how the technology is already beginning to influence political speech online.

Shortly after the “All Eyes on Rafah” image began to go viral, pro-Israel images began to circulate. The images bear many of the hallmarks of AI-created content, including repeated or blurred visual elements. Some accounts and people who have posted the images have been explicit about their use of AI to create them.

The images have posed a challenge for Meta, particularly around how to enforce its policies against AI-generated content and depictions of violence.

On Wednesday, several Israeli media outlets reported that a pro-Israel Instagram template responding to the “Rafah” image was removed from Instagram by the platform. The AI-generated image showed a Hamas gunman standing over a baby in a puddle of blood and a burning Israeli flag with text reading, “Where were your eyes on October 7?”

An AI-generated image posted to Instagram imitating the style of the original “All Eyes on Rafah” post.

Israel’s official Instagram account published several responses about the removal of the image, complaining in one now-deleted post, “Instagram decided to take down the template, intentionally silencing people from sharing what happened on October 7th.”

The Times of Israel has reported that the original post had been reinstated. Meta said that the image did not violate its terms of use and was mistakenly removed.

Many of the images have been uploaded to Instagram’s “template” feature that lets users quickly share posts to their own accounts.

Some viral templates show graphic images that appear to be made by AI, including one that falsely depicts Netanyahu in different contexts, including with blood on his hands and body. One image, which was shared 5 million times on Instagram, depicted Netanyahu in a waist-high pool of blood, with accompanying text saying “terrorist baby killer!”

Meta has said that it aims not to amplify political content heading into the 2024 presidential election, but the spread of such images suggests that the platform may be struggling to accomplish that goal in the age of AI.

Despite a new initiative from Meta to more strictly label AI-generated content on its platforms, none of the templates seen by NBC News carried labels indicating that the images were made with AI.

Following the virality of the original “Rafah” image, many pro-Palestinian activists criticized the use of AI in sharing their message, arguing that it was sanitizing the real horrors of the war.
