
Russian Indictments Highlight US Limitations in Fighting Disinformation

Dean Jackson / Sep 13, 2024

Dean Jackson is a fellow at Tech Policy Press.

White people. Black people. Civil war. Free speech. Secret Service. Illegal immigrants. Second Amendment. Elon Musk.

According to Wired, these were among the top phrases appearing in YouTube videos posted by Tenet Media, a company that acted as the middleman in a Russian effort to influence American audiences in the run-up to the 2024 elections. An indictment from the US Department of Justice (DOJ) lays out a scheme in which two employees of RT, a Russian state-owned news agency, engaged Tenet to pay prominent Trump-aligned social media influencers—including Tim Pool, Benny Johnson, Lauren Southern, Dave Rubin, and others—millions of dollars to create content.

There is more to this story than espionage and schadenfreude. The limits of DOJ’s response say a lot about the continuing fragility of US democracy and the evolution of “counter-disinformation” as a field of practice. Tackling foreign influence remains the politically easiest thing to do. But confronting the deeper, domestic sickness of the US media ecosystem is more difficult than ever.

Until its abrupt collapse after this fiasco, Tenet Media was owned by Lauren Chen (a contributor to Glenn Beck’s Blaze Media and a former contributor to RT) and her husband, Liam Donovan. The indictment is against two RT employees; Chen and Donovan are not named, though it is possible they are also in violation of the Foreign Agents Registration Act.

As for the influencers, they claim to have been unaware that the lucrative deals they received came from the Russian government—though they also do not seem to have worked very hard to vet their benefactors, who approached them claiming to represent a fictional Brussels-born businessman named Eduard Grigoriann. As others have written, it was “nearly effortless” for Russia to “dupe” them. They would have done well to remember the old Russian idiom, “the only free cheese is in the mousetrap.”

There’s no evidence contradicting Tim Pool’s protest that “Never at any point did anyone other than I have full editorial control of the show.” (Other influencers made similar denials.) That’s the point—the stream of racial grievances, hyperbolic partisanship, conspiracy theories, transphobic diatribes, and open catastrophizing fit so organically with Moscow’s strategy to weaken US democracy and stoke political turmoil that no marching orders were needed.

This is not the only Russian influence operation targeting the 2024 election, or the only recent effort targeted by the DOJ. But it is probably one of the more successful in terms of reach: consider, for instance, another operation, “Doppelganger,” which set up a network of fake accounts and pages to deceive US and European audiences. In more than a year, links to those pages collectively received fewer than a million clicks. According to the DOJ, Tenet’s videos were viewed sixteen million times.

Paid agitprop as lucrative free expression

The force of DOJ’s response to Doppelganger—the government seized 32 web domains associated with the operation—contrasts with its limited power in the Tenet case. Put simply: when it comes to information integrity, the US government is mightiest where it matters least. The influencers paid by Russia will continue to push the same messages, as will dozens or hundreds of their peers. The foreign aspect of the operation was less important and less enduring than its domestic elements.

There’s a compelling reason that DOJ and other parts of the US government have not addressed this problem. Even if those influencers are best understood as something different from journalists, and even though they became “useful idiots” for Russian intelligence, they were engaged in protected speech under the First Amendment. It doesn’t really matter if, as media commentator Taylor Lorenz rightly observes, these influencers “are often propped up financially because they defend the interests of the rich and powerful… cosplay[ing] as bold truth tellers” while they receive “millions of dollars to produce content that aligns with certain political agendas.” The United States has a legal tradition and a prevailing understanding of free expression that holds the speech of people who are professionally angry on YouTube equal in standing to measured consideration or thorough journalism.

Maybe that’s for the best—it would be terribly dangerous to hand the state the power to decide whose opinions are constructive, especially when voters may be on the verge of electing a President who has promised legal retribution for his political enemies and a “bloody” ethnic cleansing of immigrants. But these well-meaning concerns about free speech have blocked any solution to the rapid and profitable spread of angry exhortations and paranoia, which far outperform other forms of speech. Even if this is an old problem, new technologies have made the dividends extraordinary. Glenn Beck had to get a coveted spot on Fox News to peddle conspiracy theories to a mass audience. Now, anyone—including Tim Pool and Benny Johnson—can do it for a couple hundred bucks at Best Buy.

Supreme Court Justice Louis Brandeis famously argued that more speech is the remedy for bad speech, but those who recite him usually omit that Brandeis believed this corrective required adequate time for discussion and education. Absent that, irresponsible rhetoric runs roughshod over objections. If there is one thing the twenty-four-hour, ultra-online, engagement-obsessed media landscape does not give us in abundance, it is time for deliberation and reflection. The consequence is an America that is among the most polarized democracies in the world, concurrent with a political crisis of institutions that runs deeper than polarization.

Stalled reform efforts leave the field at risk of securitization

Disinformation researchers and practitioners of counter-disinformation find themselves in a bind. The 2016 election was a cold splash of water to policymakers who believed foreign influence and fringe media were minor problems. Russian interference in that election made disinformation a national security priority, but it quickly grew to encompass broader concerns about the health of the information ecosystem in the social media era.

Since then, efforts to improve that ecosystem have largely stalled. That’s unfortunate, because the scale of the problem practically demands regulation and legislation: as others have written, the backchannel, offline financial arrangements involved in the Tenet Media operation are very difficult for platforms to detect. The same is true of off-platform arrangements between influencers and political parties, campaigns, or action committees, an area where the Federal Election Commission (FEC) has unilaterally disarmed.

The stumbling blocks to reform are many. Proposals to rein in the attention economy have failed to overcome free speech objections, the social media industry has arguably become more opaque than at any point in the last decade, and powerful conservative figures have bullied and frightened watchdogs in academia and government who previously stood up for election integrity and other casualties of disinformation.

The result is a government unwilling and seemingly unable to tackle the larger, domestic elements of this challenge. In public remarks earlier this month, Jen Easterly, Director of the Cybersecurity and Infrastructure Security Agency, said it was not her agency’s role to discuss “content that is to be removed,” months after it stopped communicating with social media companies about election-related threats. It didn’t seem to matter that during those months, the Supreme Court rejected claims that previous engagement between government and platforms had been unlawful. The chilling effect prevailed.

As a result, the United States still has no ready response to a media environment that now regularly incubates political violence. Projecting this trend into the future, it is possible that many practitioners will follow the flow of government funding and policy attention away from broader information integrity issues toward a narrower, more securitized focus on foreign influence, centered on great power competition and devoid of real concern for democracy.

The consequences of that paradigm shift are already within view. The bipartisan federal legislation that could result in a ban on TikTok is the only significant piece of tech regulation Congress has passed since before 2016. Meanwhile, the United States has been caught running influence operations of its own—including a campaign in the Philippines to promote vaccine skepticism as a means of undermining Beijing’s “vaccine diplomacy.” This operation risked Philippine lives and damaged US credibility on information integrity issues.

Concerns about operations like that one led to an internal Pentagon review of US psychological operations in 2022. As part of its reporting on the Philippines operation, Reuters revealed that the review found the operation “employed sloppy tradecraft” and that “military leaders didn’t maintain enough control over its psyop contractors.” But the hunger for improved psychological operations continues, driving billions of dollars in military and other government spending.

Absent greater strides toward a healthier public square, the realpolitik mindset behind that operation could become the path of least resistance, and therefore of greatest growth, for counter-disinformation. The field would become the kind of competition between propagandists that Moscow always alleged, and that Washington insisted it was not.

Counter-disinformation as a pro-democracy agenda

Even in this scenario, some scholars and activists would continue working to reform the media environment. But they would find themselves in a smaller field, with fewer resources and less attention from media and policymakers. The zeitgeist would move on, and the window of opportunity for reform would close.

One way to escape this policy impasse, fix the counter-disinformation paradigm, and avoid that bleak future is to widen the aperture beyond technology and tech accountability. Stakeholders across this field—researchers, activists, philanthropists, policymakers, and the like—should and usually do recognize that the political trends often attributed to social media are not the result of tech alone. Increases in political polarization, for instance, predate social media by decades, even though tech appears to act as an accelerant. Likewise, to place the blame for growing illiberalism and democratic backsliding primarily on social media ignores the likely significant roles played by the “War on Terror,” the 2008 financial crisis, racism, and other world-historic trends and events.

Had the disinformation field fully internalized this early on, millions of philanthropic dollars that flowed into social media monitoring, fact-checking, and media education might instead have been spent on less rationalist pro-democracy strategies: block grants for community organizing; support for electoral reforms that encourage dialogue and moderation instead of extremism; support for labor organizers and union drives; investment in independent public-interest journalism; and support for authentic media voices speaking to the audiences courted by the Tim Pools and Benny Johnsons of the world—often disaffected young men—and offering them alternatives with which to identify. It will probably be philanthropists and foundations who pursue these strategies; for all the reasons described above, government hands are often tied.

Essentially, anything that reduces the incentives and appetite for the kind of media content supported by Tenet Media should be considered a win for democracy and a successful countermeasure against disinformation. Anything that reduces affective polarization—and thus the appetite for political victory at all costs, even the embrace of violence and authoritarianism—should merit consideration.

This leaves decision-makers with a huge array of options, which can be paralyzing. But observers have already begun offering ways to sort through them. For instance, Daniel Stid asks decision-makers to consider where (locally? nationally?) and when (this election cycle? over the next decade?) they would like to have impact. Narrowing the menu further and helping decision-makers order from it will probably take more research: as Andrew Seligsohn of Public Agenda recently wrote, “how do foundations know if they’re investing in promising ideas? And how do they know if these initiatives are working? The short answer is, they don’t.”

As philanthropists navigate this strategic landscape, the technology space should not be fully set aside. One positive trend is the growing focus on platform design codes, which could alter the attention economy’s mercenary incentives by limiting the viral spread of content, penalizing users who engage in deceptive or manipulative behavior, attempting to differentiate news content by quality, and otherwise adding friction to the spread of information online. Advocates and, yes, government policymakers should continue searching for constitutionally permissible ways to encourage pro-social design choices and explore novel solutions like bridging algorithms. Campaign finance and political advertising rules also desperately need revamping for the modern digital media environment.

The current disinformation paradigm is unsustainable, but that does not mean counter-disinformation is a doomed enterprise. The imperative to reform America’s poisoned media landscape, a proximate cause of its democratic crisis, grows stronger every day. Exposing Russian influence operations is an important activity for the Department of Justice, but the more pressing issue for all of American society is to confront the challenges at home.
