Why ask who owns the algorithms in 2025?
In 2025, the front line of the Middle East isn’t only outside Gaza or around Natanz. It is also inside your phone.
Your feed fills with Gaza footage, anti-Israel protests, “free Palestine” slogans, angry threads about Iran’s regime, and rumours about the next strike between Israel and Iran. You can feel that something – or someone – is pushing these emotional waves harder than before.
Nick Berg has lived inside that world: an Iranian-American caught between Tehran and the United States, the son of a man operating in the shadows of the Islamic Republic, and later a Special Operations soldier and veteran.
He has seen hybrid warfare from the ground up. His novel Shadows of Tehran is a military thriller born from that experience: covert networks, proxy battles, psychological warfare, and the invisible pressures on civilians who are just trying to survive.
This article asks the question almost nobody wants to say out loud:
How is it possible that Iran, China, and Russia now have more practical control over the algorithms – and the populations they shape – than the USA, Europe, and the UK, especially in the context of hybrid warfare in the Israel–Iran conflict?
What does hybrid warfare in the Israel–Iran conflict mean in 2025?
How Iran fights through proxies, terror finance and the shadow economy
Hybrid warfare in the Israel–Iran conflict describes how the two enemies fight without facing each other head-on.
For Iran, hybrid warfare means working through proxy networks in Gaza, Lebanon, Syria, and Iraq, using terror finance and a shadow economy to move money and weapons around sanctions, running cyber operations and information campaigns, and wrapping it all in the language of “resistance.”
How Israel and its allies respond in the hybrid war
For Israel, the United States and other partners, hybrid warfare means special operations, counterterrorism campaigns, air strikes, cyberattacks, and diplomatic pressure.
On the hard side, that includes years of covert and overt strikes against Iranian-linked targets in Syria, Iraq and sometimes inside Iran itself: suspected weapons convoys, IRGC facilities, and proxy command centres.
Open-source investigations and defence analyses describe a long pattern of Israeli “campaign between wars” operations in Syria designed to disrupt Iranian arms transfers and infrastructure without sliding into full-scale war. (source Washington Institute)
Other work on the “shadow war” between Israel and Iran highlights sabotage and targeted killings aimed at Iran’s nuclear programme and missile forces. (source Nordic Defense Review)
Layered onto this are cyberattacks and diplomatic pressure
From the Stuxnet worm—widely reported as a joint US-Israeli operation that damaged centrifuges at Natanz—to more recent campaigns like the “Predatory Sparrow” hacks on Iranian gas stations, banks and crypto infrastructure, cyber tools let Israel and its partners impose costs on Tehran with a degree of deniability. (source EBSC)
At the same time, US and European policy papers stress the role of sanctions, UN diplomacy and regional security coordination as part of a broader strategy to constrain Iran’s nuclear ambitions and its proxy network—essentially the mirror image of Iran’s own hybrid approach. (source Council of Foreign Affairs)
From rockets and assassinations to algorithmic battles
For years, that mix of rockets, assassinations, sabotage, and proxies defined hybrid warfare in the Israel–Iran conflict. What has changed is that the decisive phase of the contest now happens inside algorithmic systems that were never designed for war.
TikTok, Instagram Reels, YouTube, X and Telegram are no longer just places where people talk about conflict. They are active terrain in the conflict.
They decide which images of Gaza dominate global consciousness, which versions of “free Palestine” become normal, how anti-Israel protests are framed, and how much space the Iranian opposition gets before the regime’s narrative crowds them out.
How Shadows of Tehran already lives in this hybrid world
In Shadows of Tehran, hybrid warfare still looks mainly analog: whispers in Tehran cafés, radio intercepts, Special Forces teams trying to see through deliberate fog.
But the logic author Nick Berg describes – control perception, and you control the battlefield – is exactly the logic that now runs through our feeds.
How did Iran, China, and Russia gain more control over algorithms than democracies?
They did not invent better AI. They fused state power with platform control, while democracies left most algorithmic power in private hands.
How did China turn algorithms into a governing tool?
China did it the most openly. In 2022, its Administrative Provisions on Algorithm Recommendation came fully into force, putting recommendation systems under the supervision of the Cyberspace Administration of China. (source DIGICHINA)
Large platforms must register their algorithms, ensure they promote “positive energy” and avoid content that threatens national security or social stability. (source The State Council The People’s Republic China)
In practice, the algorithms behind Douyin, WeChat, Weibo and other giants operate inside a political frame laid down by the Party.
After the new rules took effect, major apps adjusted their settings to comply, including options to switch off certain recommendation modes and changes to how trending topics are curated. (source ChinaEU)
Research on “positive energy” campaigns and Douyin’s separate “Positive Energy” section shows how recommendation features are explicitly used to promote state-approved narratives and filter out content that clashes with official ideology. (source SCISPACE)
Studies of Weibo’s “digital architecture” likewise find that platform governance and algorithms are designed to make some political forces flourish and constrain others, turning curation into a tool of information control rather than neutral business logic. (source Vincent Brussee, Leiden University)
When Beijing wants to downplay protests, bury sympathy for Hong Kong or foreground patriotic narratives, it is not pleading with platforms from the outside – it is adjusting its own nervous system through this regulatory and algorithmic stack. (source Global Times)
How did Russia learn to hijack Western feeds?
Russia took a different route. It never built its own global TikTok. Instead, it specialised in hijacking the openness of Western systems.
In 2024, a joint operation by the US, Canada, the Netherlands and others exposed and dismantled a Russian AI-driven bot farm that had created large numbers of realistic fake accounts on X. (source Ground News)
These accounts were powered by software known as Meliorator, developed within the orbit of state-backed broadcaster RT, and were designed to spread pro-Russian narratives and sow discord in Western democracies. (Source Joint Cyber Advisory)
Russia’s strategy is blunt and effective: you don’t need to own the algorithms of the West if you can systematically seed them with polarising content and let engagement-driven systems do the rest. (source Council on Foreign Relations)
How did Iran turn the internet into a switch?
Iran’s path is more inward but deeply tied to hybrid warfare in the Israel–Iran conflict. Over the last decade, Tehran has built the National Information Network (NIN), a state-controlled intranet that keeps domestic services running even when the global internet is throttled or switched off. (source Freedom House) (source Article 19)
Analysts have noted that without NIN a full shutdown would cost Iran hundreds of millions of dollars per day; with NIN, the regime can impose targeted blackouts and still keep banks, government services and regime-friendly platforms alive. (source Atlantic Council)
We saw this during the Mahsa Amini/Woman, Life, Freedom protests in 2022, when nationwide and regional shutdowns hit major platforms. (source Aljazeera)
And we saw it again in June 2025, when Iran drastically restricted access to the global internet after Israeli strikes, leaving citizens with NIN-hosted services and heavily filtered access to the outside world. (source Wired)
Inside that controlled environment, the state has a crude but powerful advantage: people are nudged toward platforms it can monitor and curate. Dissent about Woman, Life, Freedom protests or criticism of the regime’s role in the Israel–Iran conflict is pushed down; loyalist narratives are pushed up.
When did hybrid warfare in the Israel–Iran conflict move from airstrikes to feeds?
From Gaza 2013 to a new phase of hybrid warfare
For someone who remembers Gaza in 2013, the shift feels abrupt. The rockets, blockades, and periodic bombardments have been part of the story for years. Yet the global emotional reaction to the 2023–25 phase of the war – and to hybrid warfare in the Israel–Iran conflict more broadly – feels like a step change.
How smartphones turned Gaza and Tehran into live war cameras
Part of that is connectivity. In the early 2010s, many people in Gaza or provincial Iran only “sort of knew” what was happening in real time. Internet access was patchy, smartphones were still climbing toward mass adoption, and television was still dominant. (source Palestinian Central Bureau of Statistics)
Over the following decade, smartphone ownership and internet penetration climbed dramatically, including in Gaza and urban Iran, despite sanctions and repression. (source Palestinian Central Bureau of Statistics)
That produced a new kind of witness. A strike in Gaza is now recorded from multiple angles, uploaded within minutes, shared with relatives in Europe, the Gulf or the US, then remixed into TikToks and Reels that splice trauma with music, slogans and calls to action. (source Princeton Political Review)
The same is true when security forces in Tehran beat protesters or when missiles hit a site linked to Iran’s programme or proxies: there is almost always a camera. (source Amnesty)
Why TikTok-style feeds amplify hybrid warfare in the Israel–Iran conflict
The second part of the shift is architecture. TikTok-style feeds are engineered to maximise attention. (source Cornell University)
Studies of news content on TikTok around the Israel–Gaza war found that emotionally intense war narratives – especially those centred on civilian suffering – consistently attract more engagement than nuanced or peace-oriented reporting. (source Taylor & Francis Online)
A 2025 study involving Northeastern University and NYU’s Cybersecurity for Democracy project went further, showing that in the US, TikTok posts about the Israel–Gaza conflict skewed heavily toward pro-Palestinian content; in one analysis, roughly seventeen pro-Palestinian videos appeared for every pro-Israel video, and the pattern persisted over time. (source Cybersecurity For Democracy)
Those numbers don’t prove deliberate bias in the code, but they show that the emotional and demographic structure of the platform favours one framing of the conflict.
For hybrid warfare in the Israel–Iran conflict, Tehran doesn’t need to “own TikTok” to benefit from the way TikTok behaves.
How Iran and Russia now treat your feed as operational space
The third layer is intent. Governments now treat these feeds as operational space. (source NATO StratCom COE)
Iran, Russia, and their partners pour resources into online campaigns around Gaza (source Office Of The Director Of National Intelligence), Lebanon, Ukraine, and domestic unrest, often using AI-assisted content and botnets. (source U.S. Department of Justice)
Israel and Western governments run their own influence efforts. The difference is that autocracies can coordinate these efforts with tighter control and far less public scrutiny. (source Army University Press)
What are the algorithms actually doing to our view of the conflict?
Engagement first, truth later
At the heart of this sits a simple logic: most big platforms still optimise for engagement. (source TikTok) (source YouTube) (source PubMed)
If a video of bombed-out high-rises in Gaza, tagged with “genocide,” generates more watch-time and comments than a sober explainer on hybrid warfare in the Israel–Iran conflict, the system will push the former.
If a thread blaming a shadowy cabal for every aspect of the Israel–Iran confrontation gets more shares than a careful breakdown of proxy networks and terror finance, the system will favour the conspiracy. (source Science.org)
The algorithms are not moral actors. They chase patterns. But design choices – what counts as meaningful engagement, how quickly they reinforce user preferences, how easily coordinated networks can game their signals – have political consequences. (source Oxford Academic)
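To make that logic concrete, here is a deliberately toy sketch in Python of what an engagement-only ranker looks like. The field names, weights and scoring formula are invented for illustration (no platform publishes its real ranking code), but the sketch captures the structural problem described above: if accuracy never enters the score, the most inflammatory clip wins by default.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    watch_time_sec: float  # average seconds watched per view
    shares: int
    comments: int
    accuracy_score: float  # 0..1, e.g. a fact-check rating (never used below)

def engagement_score(post: Post) -> float:
    """Toy engagement-only ranking; the weights are invented for illustration."""
    return (
        1.0 * post.watch_time_sec
        + 5.0 * post.shares
        + 3.0 * post.comments
        # accuracy_score never enters the formula: "engagement first, truth later"
    )

feed = [
    Post("Sober explainer on hybrid warfare", 18, 4, 7, accuracy_score=0.9),
    Post("Raw clip of bombed-out high-rises", 42, 310, 520, accuracy_score=0.5),
]

# The emotionally intense clip outranks the careful explainer on engagement alone.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):8.1f}  {post.title}")
```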
How democratic rules try to tame the feed
In open societies, this runs on top of regulatory frameworks aimed at transparency and risk, not narrative control.
The EU’s Digital Services Act (DSA) is the clearest example: it obliges very large platforms to be transparent about how their recommender systems work, include them in systemic-risk reports, and publish detailed transparency reports about their moderation practices. (source Internet Policy Review DSA)
These are serious obligations with real fines attached. But they are not an attempt to centrally script what people should believe about Israel, Iran or Gaza. (source EPThinkTank)
Why authoritarian control of algorithms looks very different
In Iran and China, the distance between state and algorithm is far smaller. The goal is not open debate; it is regime survival. (source China Law Translate) (source American Foreign Policy Council)
That is why hybrid warfare in the Israel–Iran conflict looks so different depending on which side of the firewall you stand.
Inside Iran, NIN and layered censorship limit what citizens see about Israeli strikes, corruption, or the Woman, Life, Freedom movement; outside, an ocean of content about Gaza, Iran’s proxies and the West’s role runs through systems tuned for emotion and profit. (source GOV.UK)
How Russia, Iran and China exploit the asymmetry
Russia’s state-linked “bot farms” are well documented: the Mueller indictment of the Internet Research Agency describes a Kremlin-backed operation that created fake accounts on U.S. social platforms with the explicit goal to “sow discord” in American politics by pushing divisive content into open, speech-protective environments. (source Justice.gov)
In 2024, the U.S. Justice Department and independent analysts at CSIS exposed and disrupted a newer, AI-enabled Russian bot farm that ran about a thousand inauthentic accounts to spread pro-Kremlin narratives across major platforms. (source U.S. Department of Justice)
Think-tank and intelligence reporting now treats this as a “playbook” that others study: the Atlantic Council’s work on Iranian digital influence efforts shows Tehran building networks of covert websites and social-media personas to amplify pro-Iran lines abroad. (source Atlantic Council)
RAND, IISS and recent Meta takedown reports, meanwhile, describe Chinese campaigns that deploy coordinated fake accounts and cross-platform “Spamouflage” networks to shape foreign opinion—often around elections or security issues—using more targeted, goal-driven operations than Russia’s original mass-flooding approach. (source RAND)
Why is algorithmic hybrid warfare a threat to democracy and diaspora?
How algorithmic warfare erodes democratic trust
The clearest risk is to democratic trust. When hostile states use algorithmic tools to inflame culture wars, question election results and shape perceptions of allies’ policies in the Middle East, they don’t have to convert everyone.
They only need to deepen confusion and polarisation so that elections look rigged no matter who wins, institutions feel illegitimate to large chunks of the population, and “truth” becomes just another team jersey. (source Brookings) (source MDPI)
When content moderation looks like cancel culture
At the same time, platforms’ attempts to remove disinformation can look like cancel culture or political censorship, especially when moderation mistakes hit genuine activists or journalists. That feeds a second-order crisis around freedom of speech – exactly the kind of crisis authoritarian regimes want inside democracies. (source FIBGAR) (source PubMed)
What this does to diaspora communities
There is also a more intimate risk, particularly for diasporas and veterans. Iranian-Americans, Jewish communities, Palestinians, and other Middle Eastern diasporas live inside overlapping storylines. (source European Parliament)
A single clip from Gaza or Tehran can split a family WhatsApp group. A protest in a European capital can be read as solidarity, betrayal, or naïve manipulation depending on who is watching. For many, hybrid warfare in the Israel–Iran conflict is not abstract; it is something that reverberates through cousins, parents, old classmates, and childhood neighbourhoods. (source Congress.GOV)
Why veterans feel the shock first
Veterans add another layer. People who have spent time in the military – especially in Special Operations or counterterrorism units – have already lived inside hybrid warfare. (source SOFSupport)
They know how rumours, psychological operations and selective leaks are used to shape a battlefield long before a public statement is made.
Many come home with PTSD or moral injury that makes the constant barrage of online conflict especially hard to process.
Nick Berg’s own journey – Tehran childhood, Special Operations deployments, veteran life – sits right at that intersection of war, identity and mental health.
His work on resilience, veteran mental health and veterans helping veterans is about surviving the long tail of conflict. Today, civilians scrolling endlessly through war content need a softer version of the same resilience.
Where do terror finance and proxy networks meet the algorithm war?
Why hybrid warfare is more than memes and hashtags
Hybrid warfare in the Israel–Iran conflict is not just memes and hashtags. The online layer sits on top of money and guns.
How terror finance feeds proxy networks and media machines
Terror finance and the broader shadow economy move funds from Iran’s sanction-ridden system into proxy networks in Gaza, Lebanon, Syria, Iraq and beyond. (source FinCEN Advisory)
Those channels pay for rockets, training camps and encrypted communications – and for media operations: video production teams, social accounts, “charities” and “news outlets” that double as narrative pipelines. (source The Washington Institute)
When propaganda clips become fuel for Western algorithms
When a militia or armed group posts slick, high-production footage of rocket launches or drone strikes—like the recent Hezbollah “military media” videos that showcase weapons systems and underground launch sites—that material is crafted as both intimidation and recruitment. (source DFRLab)
Front organisations and propaganda channels tied to Hamas, Hezbollah and their supporters likewise flood Telegram, X and other platforms with graphic images of dead or injured children from Gaza, often framed with language about “genocide” and “martyrs,” explicitly designed to shock, mobilise and radicalise sympathetic audiences. (source New York Post)
Congressional hearings, UN and RAND research all describe how such content functions simultaneously as online recruitment material and as propaganda in the broader information war. (source Congress.GOV) (source RAND)
Once bigger accounts, activist networks or sympathetic influencers in the West pick up those clips and images and start resharing them, they are fed into engagement-driven recommendation systems, where algorithmic amplification can carry them far beyond the original channels and into the emotional layer of hybrid warfare around the Israel–Iran conflict. (source UK Parliament Committees)
How Shadows of Tehran captures the shadow economy behind the feeds
Shadows of Tehran (order here!) doesn’t read like a finance manual, but it evokes the feeling of living in a world where embassies, smugglers, charities, militias and intelligence services are linked by invisible money flows.
That is the same underworld from which much of the online narrative war is funded.
How does Shadows of Tehran help us understand this new phase of war?
Why tie algorithmic hybrid warfare to a military thriller?
Why anchor a blog about hybrid warfare in the Israel–Iran conflict and algorithmic control to a military thriller? Because stories are often the only way people can actually absorb this level of complexity.
Shadows of Tehran as a resilience story, not propaganda
Nick Berg’s Shadows of Tehran is not propaganda for any state or faction. It is a resilience story about a man pulled between Tehran and the US, between loyalty and doubt, between the demands of Special Operations and the psychological costs those demands impose.
It blends elements of military memoir and military thriller, grounded in modern special forces warfare and counterterrorism tactics, but it also grapples with culture, mysticism, women’s rights, dual identity and what “resistance” really means.
What the novel reveals about hybrid warfare in the Israel–Iran conflict
The book’s focus on psychological warfare, proxy networks and lived experience makes it a useful lens on hybrid warfare in the Israel–Iran conflict today.
It reminds readers that behind every narrative about Israel and Iran are real people making impossible choices; behind each algorithmically boosted protest chant are older battles over meaning in Tehran’s alleys, on US bases and in diaspora apartments; and that resilience is not a slogan but a discipline, built over years of facing fear without becoming what you hate.
Turning pnberg.com into more than just a news feed
Reading Shadows of Tehran alongside analytical pieces on hybrid warfare, terror finance, veterans, women’s rights and free speech turns the site into more than just a news feed.
It becomes a place where a Special Operations veteran and Iranian-American author invites readers to think about war as a human, cultural and moral problem – not just a military one.
What can ordinary people do against algorithmic control?
Refuse to be a passive target in hybrid warfare
You can’t personally rewrite TikTok’s code or dismantle NIN. But you can choose not to be a passive target in hybrid warfare in the Israel–Iran conflict.
You can recognise when content is crafted purely to enrage or shame you. You can notice when dozens of accounts suddenly share the same wording or the same clip – often a sign of coordination rather than coincidence.
You can slow down before reposting, especially when something confirms all your worst assumptions about “the other side.”
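For readers who want to see what the “same wording from dozens of accounts” pattern looks like in practice, here is a minimal, hypothetical sketch of the kind of check open-source researchers run. The sample posts and the threshold are invented, and real coordination analysis relies on far richer signals (timing, account age, network structure), but the core idea is the same: many distinct accounts pushing near-identical text is a red flag worth noticing before you reshare.

```python
from collections import defaultdict

# Hypothetical sample data: (account, posted text). In real research this would
# come from a platform API or a collected dataset, not a hard-coded list.
posts = [
    ("user_a", "Share this before it gets deleted!! The truth about last night's strike"),
    ("user_b", "Share this before it gets deleted!! The truth about last night's strike"),
    ("user_c", "My honest thoughts on the ceasefire talks"),
    ("user_d", "share this before it gets deleted!!  the truth about last night's strike"),
]

def normalise(text: str) -> str:
    """Lowercase and collapse whitespace so trivially edited copies still match."""
    return " ".join(text.lower().split())

# Group the accounts behind each normalised message.
accounts_by_text = defaultdict(set)
for account, text in posts:
    accounts_by_text[normalise(text)].add(account)

# Flag any message pushed by several distinct accounts (the threshold is arbitrary).
COORDINATION_THRESHOLD = 3
for text, accounts in accounts_by_text.items():
    if len(accounts) >= COORDINATION_THRESHOLD:
        print(f"Possible coordination: {len(accounts)} accounts posted {text!r}")
```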
Diversify your information diet
You can also diversify your information diet. Let no single app define the conflict. Mix frontline testimonies with solid journalism, policy analysis and voices from different parts of the diaspora.
Seek out Iranian opposition figures, Israeli citizens and pro-Israel advocates facing online campaigns from Hamas-linked and Iranian-aligned networks, Palestinian activists, and independent researchers who are willing to document abuses by all actors, not just one.
Build your own resilience against the feed
Finally, you can work on your own resilience. That is where Nick Berg’s broader world – veteran mental health, resilience therapy, veterans helping veterans, service dogs, women’s rights stories – intersects with your scroll and turns it into something more than passive witnessing.
The same tools he uses to help people who have lived through actual battlefields – naming what you feel, building routines, leaning on community, learning when to disengage – are tools civilians can use when their battlefield is a phone screen.
Living in permanent crisis mode is exactly what hostile regimes and engagement-hungry platforms want: a population that is exhausted, jumpy, and easier to steer with the next shocking clip.
Choosing small boundaries – no doomscrolling before bed, time-boxing news, muting accounts that only inflame – is already a political act.
Reaching out to real people instead of arguing with avatars, and putting your energy into constructive action – supporting credible charities, backing voices of freedom, helping someone in your own town – are quiet ways of refusing to let them script your inner life.
Resilience here doesn’t mean looking away from suffering; it means staying human enough that you can keep caring without letting the war live rent-free in your head.
So who really owns the algorithms in this conflict?
Who builds the tech and who uses it better?
On paper, the most advanced AI labs and the largest social platforms are still based in the United States or its allies.
Yet in the specific arena of hybrid warfare in the Israel–Iran conflict, the uncomfortable reality is that Iran, China and Russia have more direct, coherent strategies for using algorithms to protect their regimes and project influence than the US, EU or UK have for defending open societies.
How authoritarian regimes weaponise algorithms
China can order its platforms to boost or bury narratives at will.
Russia has shown it is willing to deploy AI-driven bot farms and disinformation networks on an industrial scale.
Iran has built a digital cage with NIN that lets it cut off the world and steer its citizens towards tightly monitored spaces.
Why democracies are slower to steer the feeds
Democracies have stronger protections for free speech, pluralism, and the rule of law, but those protections also slow their response and limit their ability to steer feeds for political goals.
The EU’s DSA is an important step toward transparency and accountability, yet it leaves the fundamental logic of engagement-based algorithms intact.
The sharp answer – and a warning from Shadows of Tehran
The answer to the original question is therefore sharp: in the realm of hybrid warfare in the Israel–Iran conflict, authoritarian regimes currently exercise more practical control over the algorithms that govern their own citizens, and enjoy more freedom to exploit Western platforms, than Western governments exercise over the systems inside their own societies.
Nick Berg’s Shadows of Tehran is, among other things, a warning against leaving that imbalance untouched. The code may be written in Silicon Valley, but if citizens and leaders in democracies do not insist that algorithms ultimately answer to human rights and open debate rather than to regimes or shareholders, the next phase of this conflict will be fought on terms set by Tehran, Beijing and Moscow – not by us.