Perspective

Scholars Must Recognize the Role of Affect and Emotion in Disinformation

Calum Matheson / Nov 11, 2025

This perspective is part of a series of provocations published on Tech Policy Press in advance of a symposium at the University of Pittsburgh's Communication Technology Research Lab (CTRL) on threats to knowledge and US democracy.

In a “post-truth” era defined by “alternative facts” and “fake news,” calls for media literacy have gained new urgency but face technological, social, and institutional headwinds. Meanwhile, the sources of disinformation adapt and multiply. Interventions designed to help the public process information and better evaluate logic are necessary but not sufficient because the issues we face are not confined to information or logic problems. Modern thought distortion—targeted propaganda, misinformation, conspiracism, and so on—is, above all, a problem of affect. “Affect” refers to the predispositions, intensities, and attachments that condition how we respond emotionally to stimuli.

Although affect by definition resists direct observation, actors such as Cambridge Analytica have long recognized that motivated reasoning, cognitive heuristics, and confirmation bias are all magnified by feeling. We—academics, educators, journalists, and activists—must help cultivate what literary theorist Kenneth Burke called “equipment for living,” making it accessible to anyone targeted by persuasion at the intersection of emotion and media, whether speech, text, images, or video. Doing so requires that we formulate responses even when our objects of study are difficult to pin down with certainty.

The tactics of digital emotional manipulation are varied, but many of the most prominent exploit frustration and rage. Social media algorithms may reward engagement indiscriminately—traffic driven by anger still seems to generate at least as much attention as traffic driven by pleasure, and companies have shown little willingness to police themselves in the past. Trolling culture is now generalized online and deployed at the highest levels of government in the United States. Trolling exploits sincerity through provocations that spark either an emotional reaction, which is then mocked and cast as evidence of instability or stupidity, or a refusal to “feed the trolls,” which lets the attack stand and may shift the Overton window.

This tactic is used by some who have been identified as neo-Nazi online trolls, like Andrew Auernheimer, but also by prominent figures such as Steve Bannon. The ambiguous nature of the message always allows the troll to dance around condemnation, claiming selectively that they are “just joking” with irony, satire, or hyperbole. Enraging a target group may cause it to over-focus and miss other serious issues, a goal Bannon describes with his strategy of “flooding the zone.” The tactic produces exhaustion, shame, or a sense of hopelessness. Audiences friendly to the troll can latch on to their enemy’s distress as a source of sadistic pleasure and in-group bonding.

The Trump administration has used this strategy to great effect. For example, a series of AI-generated videos has caused massive uproar. These include shorts depicting a “Trump Gaza,” Russell Vought as the Grim Reaper, and Trump himself wearing a crown and dropping feces on American protesters from a fighter jet. Bracketing the psychoanalytic implications of these videos, none had a tangible effect on policy. However, they may cause “resistance fatigue.” Those who respond are cast as unhinged victims of “Trump Derangement Syndrome,” a term used by Trump’s supporters to counter accusations of their own cult-like behavior. Mocking discomfiture on the left is perhaps the only universal binding agent within the GOP. Meanwhile, those who do not feel strongly invested in politics may be prompted by divisive rhetoric to disengage further, a result that could benefit Trump even if his trolling otherwise fails.

How can educators, broadly defined, respond to the dangers of affective propaganda without falling into its traps? We must not abandon our defense of truth, expertise, and critical thinking, but we must grow more comfortable with concepts like affect, desire, and the unconscious, which are meant to help us speculate about what we cannot directly observe. We must accept that better information processing skills are insufficient without new qualitative ways of being in the world. Alongside better evaluation of logic and evidence, we must support emotional mindfulness, critical reflection on affect, and attitudes that accept uncertainty and ambiguity, both for those exposed to disinformation and for those who study it. Journalists and academics base public outreach primarily on quantifiable data, as evidenced by fact-checking, education about logical fallacies, and media literacy campaigns focused on identifying news quality. But agents of disinformation count on “reality-based communities” bogging themselves down in the meticulous study of data while the pace of lies overwhelms them.

The ancient discipline of rhetoric was developed for precisely this purpose: as a means to understand persuasion in situations where action is necessary but definitive answers are elusive. Rhetoricians argue that we frequently act on limited evidence and therefore should be reflective about contingency, ambiguity, and multiple meanings, not to mention the passions that mislead us. Critical rhetoric’s equipment for living is an attitude, not just an applied skill, according to Burke. Drawing on ancient rhetorical traditions, French psychoanalyst and psychiatrist Jacques Lacan argued that affect is inextricable from the cultural forces that condition what we desire and the language and symbols we use, a point supported by contemporary scholarship.

While there are some promising starts, more of our efforts to inure vulnerable people (read: everyone) to manipulation must concentrate on how symbols activate emotion. We must continue to ask what content is meant to achieve, who makes it, and to what ends. But we must also reflect more on our susceptibility to affective mechanisms, and perhaps look anew at concepts such as desire and the unconscious, which are meant to help us speculate about their operation when solid data eludes us. Propaganda does not rely entirely on unassailable evidence and impeccable logic to discipline its targets. Neither can the response.

Authors

Calum Matheson
Calum Matheson is Associate Professor and Chair of the Department of Communication at the University of Pittsburgh and serves on the faculty of the Pittsburgh Psychoanalytic Center. His research and teaching interests include disinformation, fringe communities, conspiracy theories, and Internet cult...
