What AfD’s Dark Campaign in Germany Tells Us About Disinformation

Felix Kartte / Oct 3, 2024

January 20, 2024: A sign held up at a protest against the AfD (Alternative for Germany) in Frankfurt am Main, Germany. Shutterstock

In Brandenburg—the state surrounding Germany’s capital, Berlin—the far-right Alternative for Germany (AfD) won 30% of the vote in September’s election, just short of its first-place finish in the recent state election in nearby Thuringia. What political message attracted voters in such unprecedented numbers? Xenophobia. One of the most troubling recent proposals from the AfD is to ban refugees from public events—allegedly to reduce terrorism risks. This is Germany, 90 years after the Nuremberg Laws. It is a grim reminder of Hitler’s first legislative actions in 1933 to ban Jews from public life.

Ann-Katrin Müller regularly attends far-right rallies—as a correspondent for SPIEGEL, she covered the recent election. Disinformation is often treated as an abstract issue, so I joined Müller at the AfD’s final campaign event to experience firsthand the extremists who have weaponized disinformation as a core strategy in their push for an authoritarian Germany.

What I witnessed from AfD politicians on-site was far more extreme and inflammatory than their already toxic online presence. Campaign merch included Kubotans—hand weapons that are illegal in many countries—sold as supposed “self-defense” against migrants. Children were given balloons shaped like “deportation airplanes”; the AfD has been using imagery of aircraft to trivialize its proposal to essentially expel non-white people from the country. The rhetoric fused Kremlin propaganda with virulent attacks on refugees, journalists, churches, trade unions, LGBTQ+ individuals, and climate activists. This wasn’t fringe talk—it was the core of the party’s platform.

This hateful ideology is no longer confined to the margins of society; it's creeping into mainstream discourse. We, as democratic citizens, are complicit in normalizing it. Extremists are gaslighting us—boldly lying and then claiming that legitimate journalism is deceiving the public by reporting on these lies. They want to ban rainbow flags and attack school curricula, but allege it is democratic governments who seek to censor. As a result of these tactics, the public increasingly treats the extremist agenda as just "the other side" of the debate, falling into the trap of false equivalency.

The most troubling effect of this organized deception is on how we address the root problem: disinformation itself. Governments, academics, and civic groups have developed programs to fight disinformation, only to have their efforts attacked as censorship by Kremlin operatives, by extremist movements like the AfD in Germany, and by MAGA supporters in America. Public institutions and platforms have largely stopped using the term "disinformation" to avoid controversy, opting instead for vague labels like "information manipulation" or "digital threats." But disinformation is not a technical issue—it’s a political strategy, historically rooted in Soviet-era KGB operations designed to destabilize democratic societies.

Democracies need to confront this authoritarian capture of public discourse. I left the AfD event with some early thoughts about how different stakeholders should respond.

1. Politicians

The mainstreaming of divisive rhetoric by centrist politicians is alarming. Many mistake the loud, organized minority on social media for "the public" and tailor their messaging accordingly. A more aggressive tone is then "rewarded" with high engagement, reinforcing the cycle. Centrist politicians need strategies for communicating effectively without fueling identity politics and polarization. Unfortunately, social media platforms incentivize exactly the kind of divisive rhetoric that erodes democratic discourse.

2. The Expert Community

The counter-disinformation community has invested heavily in Open Source Intelligence (OSINT) and forensic methods to trace covert disinformation on the internet's fringes. But today, authoritarian propaganda has left the shadows. It increasingly permeates mainstream discourse, easily accessible via social media and traditional broadcast channels. To find it, you merely need to open TikTok, or switch on your TV. Platforms amplify this propaganda, and traditional media often contributes by giving extremists unfiltered airtime. As disinformation shifts from the margins to the center, it’s reshaping political debate across the spectrum. Toxic discourse is becoming normalized.

The current focus on fact-checking and generative AI in expert discourse also feels disconnected from the reality of the threat. We shouldn’t slice disinformation into isolated phenomena, but should see it as part of a coordinated, well-funded, transnational campaign to install authoritarianism in Europe. The fascist takeover of broadcast airwaves was central to cementing Nazi rule in Germany in the 1930s. Today, extremists still understand the power of mass communication, and they use it relentlessly.

3. The European Union

The EU has made progress in identifying authoritarian influence through frameworks like FIMI (Foreign Information Manipulation and Interference), but a gap is growing between Brussels' policy debates and reality. This isn’t an academic exercise—it’s an information war. As the EU regulates online platforms and AI, policymakers must integrate real-world insights from those affected by digital harms. The distinctions between "domestic" and "foreign" disinformation, and between regulatory and communication strategies, feel increasingly obsolete. Authoritarian movements are transnational and supported by a loose alliance of illiberal governments, radicals, and tech magnates like Elon Musk.

4. Tech firms

Tech companies have long shaped how governments and experts talk about disinformation, despite having no genuine incentive to tackle it. When I last checked, no tech company had a policy explicitly addressing disinformation. Meta, for instance, has reframed the problem as "coordinated inauthentic behavior," a deceptively technical term focused on metrics and spammy behavior rather than on the political intent behind these campaigns. In this way, the tech lobby has led us into a "complexity trap"—a known lobbying tactic in which issues are portrayed as so intricate that no clear solution seems possible, thus justifying inaction. But disinformation is not a technical glitch—it is a deliberate strategy by authoritarians to undermine democracy.

A recent study by University of Potsdam researchers revealed that young German TikTok users were shown AfD content nine times more often than that of any other political party. Other studies similarly suggest that tech platforms amplify extremist content at the expense of democratic discourse. It’s difficult to justify why actors aiming to "abolish" democracy are given such amplified reach. These aren’t inevitable outcomes; they are corporate decisions shaped by profit motives.

This is a systemic “bait and switch” perpetrated on the media-consuming public. Our default assumption is that the media reflects public tastes and opinions, and that what you see when you open your phone is therefore a mirror of society. But that is a fallacy. What we see on our screens is a fun-house mirror, a distortion of public opinion engineered to provoke the emotions that hold our attention, which is then sold to advertisers. The tech industry rejects even the implication that, as steward of information markets, it has a social responsibility to respect democratic values or even basic human decency. Under the banner of the free marketplace of ideas, we are all slowly poisoned and left vulnerable to exploitation. The solution isn’t mass censorship. It is the removal of the amplification algorithms that distort public discourse and normalize extremists by dragging them into the mainstream.

5. The News Media

One chilling observation I made at the AfD event was the sustained attacks on journalists and public broadcasters—a tactic straight out of the authoritarian playbook, from Russia to the US. These attacks aim to destroy trust in credible journalism and silence criticism. Even more disturbing, media outlets often allow extremists to spread propaganda under the guise of "balance." It was disheartening to watch a public broadcast TV crew give airtime to the very people who were publicly insulting them. All too often, the hate and anger of the far right are portrayed as virulent strength rather than as what they are: weakness and fear, kicking down at the most vulnerable. Media decision-makers need to review their policies to avoid normalizing authoritarian propaganda. And journalists covering the far right face growing personal dangers, yet few media outlets offer sufficient support to protect them.

6. Philanthropy

The only civil society presence at the AfD event came from the local church, which rang its bells to drown out the inflammatory speeches. More financial and structural support is needed for grassroots initiatives battling extremism. Philanthropic investments should also focus on media production, audience building, and developing alternative distribution platforms that don’t thrive on the amplification of hatred and division.

Disinformation, not AI, poses the most immediate threat to global human rights. It is the political strategy of those seeking to limit—or abolish—fundamental rights and freedoms, and they are increasingly succeeding. From my vantage, the funder community appears eager to move on from the disinformation fight to the next frontier of AI. This is a major mistake.

Of course, there is value in funding "big picture" AI projects and working to set safeguards for new and powerful technologies. But in the short term, AI will exacerbate the information crisis and make it harder to solve any large public problem—AI governance included. The AI conversation often seems stuck in progressive bubbles, disconnected from the realities of places like the East German towns and villages where the AfD is taking control. Funding for AI governance frameworks should not come at the expense of support for local projects that do the everyday work of democracy—addressing the worst manifestations of hate and polarization in real-world communities.

Disinformation is not just misleading social media content or AI-generated images—it’s a deliberate political strategy. It is an action, a political performance designed to infiltrate both online and offline spaces. A recent example of this occurred in the German state of Thuringia, where the AfD used its strengthened parliamentary position to block the election of a new speaker. The constitutional court eventually intervened to resolve the issue, but the AfD’s intention was never to obstruct the parliamentary process permanently. Instead, they sought to stall it, casting themselves as victims of an "oppressive" political majority, all while attacking the very democratic procedures they claim to defend. It is gaslighting of the most vicious type, and as democratic publics we cannot fall for it.

Authors

Felix Kartte
Felix Kartte is a Senior Fellow at Stiftung Mercator, a policy advisor and technology expert. He started his career as a researcher and journalist and has covered digital democracy for outlets such as Süddeutsche Zeitung and Politico. During his time at the European Commission and European External ...
