Russian Bots, Kremlin Agitprop, and Meta’s Blind Eye: The Disinformation War Over Hungary’s 2026 Election

By Guerin Lee Green
Published: April 8, 2026

One hundred and sixty-two illegal political advertisements. That’s how many ads Fidesz candidates ran on Facebook in a single month after Meta banned political ads in the EU. Meta flagged just 19 of them, fewer than 12%. This investigation documents how the Kremlin and Hungary’s ruling party built an integrated disinformation machine on Facebook, and how Zuckerberg’s platform let them.


The Stakes: Hungary’s Most Contested Election in 16 Years

Hungary heads to the polls on April 12 facing a watershed moment. Prime Minister Viktor Orbán, who has held power continuously since 2010, faces the strongest electoral challenge of his political career from Péter Magyar, the telegenic leader of the center-right Tisza party.

Independent polls tell a striking story. The latest projection from polling agency Median, based on five surveys totaling 5,000 respondents, shows Tisza on track to win a two-thirds parliamentary majority — the supermajority threshold needed to amend Hungary’s constitution and unlock frozen EU funds. A poll by 21 Research Centre published April 1 showed Tisza capturing 56% of decided voters, compared to just 37% for Fidesz. The BBC reported the latest tracking poll showing Tisza at 58% versus Orbán’s 35%.

For Orbán — and for Vladimir Putin — the stakes could not be higher. A Tisza victory would shatter the Kremlin’s most important European alliance and potentially end Hungary’s years-long blockade of EU support for Ukraine. That geopolitical calculus explains what happened next.

The Kremlin’s Blueprint: A Multi-Vector Interference Campaign

The Social Design Agency Plan

The first confirmation of an organized Kremlin campaign came in early March, when the Financial Times obtained documents showing the Russian presidential administration had approved a covert social media plan drafted by the Social Design Agency (SDA), a Kremlin-linked media consultancy already under Western sanctions.

The plan called for flooding Hungarian social media with pro-Orbán messages, memes, infographics, and short videos — disguised as content from “local” Hungarian users and influential figures. The campaign’s core narrative: Orbán is Hungary’s only authentic sovereign defender; Magyar is a “Brussels puppet with no outside support.” According to sources familiar with the matter cited by the FT, the SDA deliberately avoided direct coordination with the Hungarian government to provide Fidesz plausible deniability.

The SDA’s involvement is not trivial history. In 2024, the U.S. Justice Department accused the same agency of running “Doppelgänger,” an operation that built fake regional news websites to spread pro-Russian and anti-Ukrainian propaganda across multiple countries. The U.S., U.K., and several other Western nations have placed the SDA and its leadership on formal sanctions lists.

GRU Operatives in Budapest

Even more alarming was a parallel covert ground operation. In early March 2026, VSquare — the Central European investigative outlet — reported, citing multiple European national security sources, that Moscow had dispatched a three-person team of GRU officers to the Russian Embassy in Budapest. The operation is overseen by Sergei Kiriyenko, Putin’s First Deputy Chief of Staff and the architect of Russia’s foreign election interference apparatus, who deployed the same blueprint in Moldova.

The team was provided with diplomatic or service passports, shielding them from expulsion — mirroring exactly the tactics used in Moldova, where Russian embassy personnel were later found to have coordinated subversive political activities. Western intelligence agencies have established the identities of the operatives. The Washington Post independently corroborated the VSquare reporting.

Magyar publicly demanded Orbán expel the Russian intelligence officers and convene Hungary’s National Security Committee. The Orbán government’s response? It filed an espionage complaint against Szabolcs Panyi, the investigative journalist who broke the story.


Fake Profiles on Facebook: The Domestic Amplification Layer

The Russian operations do not operate in isolation — they are amplified by a sophisticated domestic network of inauthentic Facebook profiles tied to the Orbán ecosystem.

In February 2026, EDMO-affiliated Hungarian investigators published one of the most detailed analyses of a fake-profile network yet produced in Europe, documenting what they described as “the most sophisticated pro-Fidesz fake-profile network on Facebook.” The profiles displayed telltale signs of inauthenticity: stock photos and AI-generated faces, near-simultaneous coordinated posting, and artificial social credibility built by interconnecting with one another before targeting real users. Investigators identified one network member who appeared to be simultaneously operating as multiple distinct Facebook personas — “Cintia,” “Zoltán,” and “Kiara” — all sharing pro-government content.

This follows a pattern documented as far back as 2023, when Political Capital identified over 500 fake Facebook profiles that had infiltrated more than 450 Facebook groups nationwide, potentially reaching tens of thousands of Hungarian users. The profiles’ cover and profile images were traced back to Russian social media sites.

The disinformation infrastructure also runs through Hungarian proxy organizations with opaque funding. Megafon, a centrally organized influencer hub tied to the Fidesz ecosystem, spent more than €1.7 million on Facebook promotional content in 2024 alone. The National Resistance Movement — another pro-Fidesz front group with documented ties to Megafon — ran AI-generated attack videos against Tisza on Facebook, some reaching millions of views. Across 2024 as a whole, Political Capital calculated, Fidesz and its proxies outspent all 13 opposition parties combined on social media advertising by 2.5 to 1, and outspent independent media 11 to 1 on Facebook ads.

The Bot Armies: Matryoshka and Storm-1516

How the Matryoshka Network Works

The most active bot network targeting Hungary’s election is known as Matryoshka — named after the Russian nesting dolls its layered structure resembles. Investigators from the Russian opposition outlet The Insider and the Bot Blocker project identified the network spreading fabricated stories and manipulated videos across social media platforms bearing the stolen logos of trusted Western outlets including Reuters, DW, and Euronews.

The network operates through layered bots, trolls, and anonymous profiles that simultaneously amplify identical narratives across multiple platforms. Its core methodology: manufacture a fake video using AI deepfake technology, dress it in the visual identity of a credible news organization, and then cascade it through the layered bot network to manufacture the appearance of organic virality.

Among the fabricated incidents spread by Matryoshka:

  • False videos claiming Ukrainian refugees were plotting to assassinate Viktor Orbán, falsely attributed to DW’s Hungarian service
  • A fabricated Kyiv Independent report claiming a French MEP said Ukraine was preparing “provocations” and a coup against Hungary
  • Anti-Ukrainian videos that gathered over 290,000 combined views, according to monitoring group Antibot4Navalny
  • Fake posts claiming Hungarians were being urged to “take up arms and kill Viktor Orbán” — attributed to Ukrainian sources

A single Facebook post featuring manipulated images from pro-Orbán tabloid Ripost.hu received 130,000 reactions in a few days, mostly from foreign users — an engagement rate described by researchers as “rare for the Hungarian segment,” pointing to coordinated foreign amplification.

Storm-1516 Targets Magyar Directly

A second Russian operation, Storm-1516, took aim at Péter Magyar personally. Researchers at the Gnida Project, an investigative collective tracking disinformation, identified a fake Euronews-style article circulated on a counterfeit Euronews website claiming Magyar had delivered a “blistering critique of Trump” at a campaign rally — a fabrication calculated to alienate Hungary’s electorate given Trump’s popularity there.

Storm-1516 is a well-documented Russian propaganda group originating as an offshoot of the Internet Research Agency. It has previously targeted Kamala Harris and Tim Walz during the 2024 U.S. presidential election and spread disinformation during Germany’s February 2025 elections — earning a formal diplomatic protest from the German Foreign Ministry.

The AI Weaponization of Facebook

Artificial intelligence has become the production engine of the disinformation campaign. Investigators at EDMO and the Hungarian fact-checking site Lakmusz identified 17 high-reach TikTok channels, all launched in March 2026, using AI-generated personas — a young woman, an elderly professor, a soccer fan — to convey coordinated anti-Tisza messages. These channels are assessed as part of the Matryoshka operational framework.

On Facebook, Fidesz proxy organizations posted AI-generated videos attacking Tisza. One video — which ran as an active paid advertisement on Meta’s platform — featured family members unwrapping gifts labeled “TISZA” containing political attack messages, reaching millions of views before researchers flagged it. Fidesz allies also produced 14 AI-generated video clips designed to discredit Magyar and members of his family, according to Magyar himself, who publicly exposed the campaign.

The European Digital Media Observatory (EDMO) characterized the overall environment starkly: “Entire AI-propelled ecosystems are created on social media platforms to spread pro-government messages, and social media platforms’ rules about political advertisement are consistently circumvented and violated.”

Meta’s Abdication: The Zuckerberg Factor

The January 2025 Pivot

To understand Facebook’s inaction in Hungary, one must start on January 7, 2025, when Mark Zuckerberg announced a sweeping transformation of Meta’s content moderation architecture. In a video statement promoted that morning on Fox News — a venue choice that signaled its intended audience — Zuckerberg announced the elimination of Meta’s third-party fact-checking program, replacing it with a crowd-sourced “community notes” model patterned on Elon Musk’s X.

Zuckerberg framed the move as a free speech correction, saying fact-checkers had become “too politically biased” and had “destroyed more trust than they’ve created.” He announced that Meta’s trust and safety moderation team would be relocated from California, and that Meta would “work with President Trump to push back on governments around the world” — naming European authorities specifically. Trump told a press conference he was “impressed” and that Zuckerberg was “probably” responding to threats Trump had made against him.

The timing and framing of this announcement left little doubt about its intent. As tech policy analyst Joan Donovan wrote at the time, Zuckerberg was not returning Meta to free-speech roots — he was “preparing for an autocratic future” by aligning with the world’s most powerful authoritarian-friendly political actor. Meta made the announcement while pledging $1 million to Trump’s inaugural fund and while Zuckerberg sought a meeting at the White House to “discuss how Meta can assist the administration.”

The Fact-Checker Dismantlement: What It Costs

The research consequences of Meta’s pivot are concrete and measurable. A peer-reviewed study published in the ACL Anthology in July 2025 analyzed the relationship between community notes and professional fact-checking and found that community notes cite professional fact-checking sources up to five times more than previously reported — meaning community notes cannot function without the professional infrastructure Zuckerberg just dismantled. Researchers from the University of Copenhagen concluded that “the pressure on fact-checkers exerted by platforms and politicians by defunding and discrediting fact-checking organisations will have corrosive effects on the quality of notes and destructive implications for information integrity more widely.”

Tufts University’s Fletcher School, in a January 2025 assessment, found the community notes model had “failed completely” at X, where the community interventions “often happen too slowly and miss the window when a problematic post is most viral.”

Meta officially retains third-party fact-checkers in Europe, citing the EU’s Digital Services Act. But its commitment to enforcing those standards has proven hollow in practice.

The Advertising Rule Bypass: 162 Violations in One Month

Meta banned political advertising on its EU platforms in October 2025, citing “unworkable requirements and legal uncertainties” around the EU’s new political advertising transparency regulation. The announcement was widely read as compliance theater.

The reality, documented by Political Capital and independently confirmed by EDMO’s Hungarian hub, is that 14 Fidesz parliamentary candidates successfully ran 162 political ads on Facebook in January 2026 alone — and Meta caught only 19 of them. The ads that slipped through were not ambiguous cases. Political Capital noted that for registered parliamentary candidates, “there can be no question of political classification.”

The proxy workaround is equally brazen. Pro-Fidesz front organizations that maintain no formal legal relationship with the party simply purchase the ads and run identical messaging. Because Meta’s filters target registered political parties and not their proxy networks, the pipeline remains fully operational. Researchers from the Hungarian hub warn there is “a real risk that the Hungarian election campaign and other upcoming campaigns will be influenced by a significant number of illicit political ads — including deepfakes — with a huge reach.”

Meta’s response to these documented violations? A spokesperson told Euronews’ fact-checking team, The Cube, that it is against company policy “for advertisers to run ads about social issues, elections and politics in the EU” — and provided no explanation for why the 162 violations went uncaught or unaddressed.

The Disinformation Whitewash

Throughout the campaign, Meta has consistently been more responsive to Orbán’s political complaints about alleged censorship than to documented evidence of disinformation. When Mario Nawfal, a Lebanon-born Australian influencer aligned with populist figures who had conducted a paid interview with Orbán, alleged that Meta was restricting Fidesz posts, the claim instantly dominated international right-wing media ecosystems.

Euronews’ fact-checking team investigated and found no valid evidence that Meta had restricted Orbán’s posts. Meta confirmed none of Orbán’s posts were deleted. The claims were traced to mass-reporting campaigns coordinated by opposition activists — a legitimate use of Meta’s own Community Standards reporting mechanisms — not to any Meta policy decision.

Yet while Meta devoted press resources to rebutting the false censorship allegation, documented evidence of 162 unlawful political ads running on its own platform in a single month generated no comparable response.

The European Response: DSA Mechanisms and Their Limits

The European Commission activated its DSA “rapid response” mechanism ahead of the Hungarian elections, bringing together platforms, fact-checkers, and civil society organizations to identify and flag suspected foreign interference. Forty-four signatories, including Meta and TikTok, agreed to operate under the system until one week after the vote.

The limits of this mechanism are structural. The DSA’s enforcement depends substantially on platform cooperation and researcher data access. A German appellate court ruling in February 2026 ordered X to grant researchers access to public data relevant to the Hungarian election — but only after Democracy Reporting International had been forced to litigate to obtain what should have been a matter of regulatory compliance. EDMO-affiliated researchers described being functionally “left in the dark” without timely access to platform data.

The European Parliament has pressed the Commission to enforce the DSA more aggressively and impose sanctions on platforms in violation. To date, formal enforcement proceedings against Meta for election-related violations have not moved at the speed the Hungarian situation demands.

The Orbán Countermove: Weaponizing the Narrative

One of the more cynical elements of Orbán’s campaign is the systematic effort to flip the disinformation narrative against Meta and the opposition. By amplifying false claims that Facebook was “suppressing” Orbán’s posts — claims the platform itself debunked — the government established a preemptive alibi: if Orbán loses, it was because of foreign Big Tech interference, not because Hungarian voters chose change.

This narrative has also been deployed against journalists. Hungary’s intelligence apparatus launched investigations against two Tisza IT specialists using a child exploitation tip-off as a pretext — no illegal material was found. The Orbán government filed espionage charges against journalist Szabolcs Panyi for reporting the GRU operation in Budapest. These are not coincidences. They are the architecture of a government using state power to criminalize accountability journalism.

What This Means for Democracy — and Big Tech’s Role in It

Hungary’s disinformation crisis is a preview, not an anomaly. The same Russian operations — Storm-1516 and Matryoshka — have been deployed against elections in Moldova, Germany, Romania, and the United States. The Social Design Agency’s Doppelgänger operation ran across dozens of countries. What makes the Hungarian case distinctive is the combination of Russian foreign interference with a domestic authoritarian infrastructure that has fully merged propaganda, proxy spending, fake-profile networks, and AI content production into a single integrated machine — all running on Facebook.

Meta’s abdication of its fact-checking infrastructure, timed precisely with Trump’s inauguration and justified with language echoing Trump’s attacks on “biased” fact-checkers, has made that machine significantly more effective. The community notes model that replaced professional fact-checking has demonstrably failed on X, where it was pioneered, and academic research shows it cannot function without the professional fact-checking infrastructure that Zuckerberg dismantled.

When a 130-country global fact-checking infrastructure is dismantled to curry favor with one incoming American president, the collateral damage falls disproportionately on smaller democracies without robust independent media ecosystems. Hungary — where Orbán has systematically captured or closed independent outlets, where pro-government media spent 11 times as much on Facebook advertising as independent media in 2024 — is exactly the type of country that depended most heavily on Meta’s external guardrails.

Conclusion: Zuckerberg’s Silence Has a Cost

The record heading into Hungary’s April 12 vote is unambiguous:

  • Russian GRU operatives were physically deployed to Budapest under diplomatic cover to run election interference
  • The Kremlin-linked Social Design Agency, already under Western sanctions for election interference, prepared a covert Hungarian social media disinformation plan
  • Two documented Russian disinformation networks — Matryoshka and Storm-1516 — have been actively operating on Hungarian social media for weeks
  • Pro-Fidesz fake-profile networks with images sourced from Russian social media have infiltrated hundreds of Facebook groups
  • 162 illegal political ads ran on Facebook in a single month, with Meta catching fewer than 12% of them
  • AI-generated deepfake attack videos ran as active paid advertisements on Meta’s platform

And throughout all of this, Meta has found more bandwidth to rebut a debunked claim that it suppressed Orbán’s posts than to explain how 162 documented violations of its own advertising rules occurred on its watch.

The question that Hungary forces onto the table is not whether disinformation can be stopped entirely — it cannot. The question is whether the most powerful information distribution infrastructure in human history chooses to try. Under Zuckerberg’s current political alignment, the answer from Meta has been a resounding, consequential silence.


Sources for this report include VSquare, Direkt36, EDMO, the Hungarian Digital Media Observatory (HDMO), Political Capital Institute, Lakmusz, the Kyiv Independent, the Financial Times, Reuters, Bloomberg, DW, Euronews, the BBC, Chatham House, the Heinrich Böll Foundation, and peer-reviewed research from the University of Copenhagen and Tufts University’s Fletcher School.



