If the Vietnam War was the first major televised conflict and the Gulf War was launched alongside burgeoning cable networks, then Donald Trump and Benjamin Netanyahu’s strikes on Iran are landing in the early days of the AI media age.
That is having strange consequences: real footage of mass civilian graves in Iran has been dismissed by AI chatbots as fake, while Instagram users have been fooled into following an account ostensibly belonging to a beautiful US soldier who happens to also post sexualised pictures of her feet.
With Iran essentially a black box for reliable journalism and Trump’s reputation with the international community and domestic voters on the line ahead of this year’s midterm elections, the stakes are high in the media war.
The widespread use of artificial intelligence in war propaganda has been expected for some time, says David Wroe, resident senior fellow at Canberra-based think tank the Australian Strategic Policy Institute. But the speed at which this content is now being created and pushed out on social media is unprecedented.
“You can do that quickly. You can do it essentially free of charge. You can do it at a massive scale. But you can also use AI to coordinate inauthentic accounts too,” Wroe says.
Tom Sulston, head of policy at Australian not-for-profit Digital Rights Watch, says the basic concept is to “flood the zone”, a strategy pioneered by MAGA loyalist and conservative political strategist Steve Bannon. Propagandists build a haystack around the needle of truth and pump out so much information – both true and false – that it is impossible to discern fact from fiction.
On March 2, a fake account using the name of Iran’s newly appointed Supreme Leader Mojtaba Khamenei posted a video of the world’s tallest building, the Burj Khalifa in Dubai, on fire. Iran has attacked the United Arab Emirates in retaliation for Israeli and American strikes, but has not struck the iconic tower. Still, the video was reshared by more than 300 accounts, says digital investigator Benjamin Strick, gathering thousands of likes before being fact-checked by X.
Exactly who was behind the fake video is unclear. But social media platforms’ business models depend on keeping users engaged, and they pay creators handsomely for clicks. There is a lot of easy money to be made, says Strick, because on Elon Musk’s X – and to a lesser degree on other platforms – shareable content is accelerated regardless of whether it is true or false, while fact-checking is a slower, more laborious process.
Some of that material is coming from “useful idiots”, essentially everyday news creators trying to make “a quick buck out of a hot news item” on social media, Strick says.
But there are also proxy state actors such as the Tehran Times, an Iranian news organisation aligned with the hard-line regime that has posted a litany of AI-produced disinformation, as well as Russian state-backed broadcaster Russia Today, which regularly republishes this content.
The Tehran Times was responsible for a recent viral post depicting before-and-after satellite images of a US military site in Bahrain supposedly destroyed by Iranian strikes. Again, these strikes never happened.
“Then we’d have these proxy actors that we don’t know whether they’re paid by Iran or their links to Iran because they’re pretty vague; fake accounts that have been set up to be or pretend to be someone that might appear American or British, for example,” Strick says.
Both Strick and Sulston say there have been numerous examples of social media accounts pivoting to the Middle East conflict after building significant followings by posting on divisive Western issues such as Scottish independence or immigration.
Their output includes content purporting to show Iranian strikes piercing Israel’s famous Iron Dome defensive system, resulting in burning buildings in Tel Aviv, and a successful strike on the USS Abraham Lincoln.
A recent study from Clemson University’s Media Forensics Hub identified at least 62 accounts linked to Iran’s Islamic Revolutionary Guard Corps (IRGC) purporting to be based in the Americas and the British Isles, set up to amplify politically divisive content and disinformation aligned with IRGC narratives.
“They are designed to exploit regional fault lines to advance Iranian regime interests,” reads the report, which was based on an analysis of 60,000 posts on X alone.
Wroe says Iran’s online campaign has been “heavily supported” by both Russia and China-linked accounts, including via state outlets such as Russia Today.
At the same time, users attempting to separate fact from fiction with AI tools are being given unreliable answers. Google’s Gemini chatbot and Elon Musk’s Grok both confidently stated that images of mass graves being dug in the Iranian town of Minab, after dozens of girls were killed in an airstrike, were actually taken in a different country at a different time. That, The Guardian reported, was incorrect.
But given the “nebulous and often self-contradictory” rationale behind the US-Israeli declaration of war, democratic norms of dignity in war have been abandoned, and the US administration has been equally vague about its strategic progress in the conflict, says Sulston. The result is a vacuum that Iran can fill to discredit America’s war effort and aims.
Instead, the Trump administration’s approach online is consistent with the press conferences that President Donald Trump and War Secretary Pete Hegseth have held, says Wroe, with a focus on tactical and operational achievements.
“They’re very focused on the military successes measured in how much they’ve destroyed, how many bombs they’ve dropped and how many Iranian missile sites have been destroyed.
“They push back very hard against questions and doubts about the strategic achievements like: Have you actually toppled the regime? Have you ended the nuclear program? Have you ensured energy security by figuring out some kind of solution for the Strait of Hormuz?” says Wroe.
In another video posted by the White House on X, a cartoon character asks, “Do you want to see me do it again?” over unclassified footage of US missiles blowing up more Iranian jets and trucks. The caption reads: “Will not stop until the objectives are met. Unrelenting. Unapologetic.”
“I think the conclusion you have to draw is that it’s just very domestically orientated and specifically aimed at existing supporters [...] this is pretty much in keeping with their usual messaging on foreign and security policy, and in this case, it’s just counterproductive to their strategic aims,” says Wroe.
Opportunists have joined in, including the operator of an Instagram account for a purported United States military officer, “Jessica Foster”, whose pro-Trump posts and pictures of her feet attracted more than one million followers before funnelling them to a paid OnlyFans account. The account – entirely AI-generated – featured fake images of Foster with Trump, other prominent military figures and powerful artillery. Meta eventually removed it, but only after being contacted by The Washington Post.
Other official American communications have so far focused on the success of US military operations, using memes, popular culture and Call of Duty-style video footage to dehumanise the dead.
One video from the official White House account shows a fake Nintendo Wii Sports-style game titled “Operation Epic Fury” with footage of real missile strikes shown when a player hits a target.
Sydney University media professor Catharine Lumby says the strategy is shocking. “To see this war turned into a memification and gamification kind of contest takes my breath away,” Lumby says.
The strategy also risks backfiring.
“[The Trump administration is] making their allies’ stomachs turn,” says Wroe. “They’re only going to make it harder for NATO allies and allies like Australia to support the war and support operations like keeping the Strait of Hormuz open.”
Iran has co-opted America’s meme-based strategy for its own official communications, mass-producing content such as the now-infamous “Lego Trump” videos, which play up Trump’s closeness to Israeli Prime Minister Benjamin Netanyahu, Iranian military successes and alleged attempts to cover up the Epstein files in America.
The simple reason behind this content? It is shareable.
“The White House account is putting out a lot of this stuff [...] but it’s pretty awful, because it almost sets the standard for leaders of certain countries to engage in AI content in a way that politically benefits them,” Strick says.
In 2016, then-secretary of state and Vietnam veteran John Kerry reflected on what the United States had learned in that conflict. Soldiers, he said, should always be treated with dignity. The healing and reconciliation that had allowed servicemen’s bodies to be recovered from Vietnam was a tribute to both nations.
And “we were right to think about what had gone wrong and to enact laws that shed greater light on how our government goes about its business,” Kerry said.
Those hard-earned lessons now lie forgotten somewhere under a welter of juvenile memes and AI generated war porn.