Podcast: Don't Hype Disinfo, Say Disinfo Experts

Justin Hendrix / May 5, 2024

Audio of this conversation is available via your favorite podcast service.

One topic we come back to again and again on this podcast is disinformation. In many episodes, we’ve discussed various phenomena related to this ambiguous term, and we’ve tried to use science to guide the way.

But the guests in this episode suggest that in the broader political discourse, the term is more than overused. Often, lawmakers and other elites who employ it cross the line into hyping the effects of disinformation, which the guests say only helps propagandists and diminishes trust in society. To learn more, Justin Hendrix spoke with:

  • Gavin Wilde, a senior fellow in the Technology and International Affairs program at the Carnegie Endowment for International Peace and an adjunct lecturer at the Alperovitch Institute for Cybersecurity Studies at Johns Hopkins University.
  • Thomas Rid, a professor of strategic studies at Johns Hopkins University SAIS, founding director of the Alperovitch Institute for Cybersecurity Studies at Johns Hopkins University’s School of Advanced International Studies, and the author of Active Measures: The Secret History of Disinformation and Political Warfare.
  • Olga Belogolova, the director of the Emerging Technologies Initiative at Johns Hopkins SAIS and a lecturer at the Alperovitch Institute for Cybersecurity Studies. Belogolova previously led policy work on countering influence operations at Meta.

With Lee Foster, they are the authors of a new essay in the publication Foreign Affairs titled "Don’t Hype the Disinformation Threat: Downplaying the Risk Helps Foreign Propagandists, But So Does Exaggerating It."

A lightly edited transcript is below.

Gavin Wilde:

My name is Gavin Wilde. I'm a senior fellow in the Technology and International Affairs program at the Carnegie Endowment for International Peace, and I'm also an adjunct lecturer at the Alperovitch Institute for Cybersecurity Studies at Johns Hopkins University.

Justin Hendrix:

Thomas.

Thomas Rid:

I'm a professor of strategic studies at Johns Hopkins University SAIS and founding director of said institute, and, more importantly, I'm the author of Active Measures: The Secret History of Disinformation and Political Warfare.

Justin Hendrix:

And Olga.

Olga Belogolova:

Hi, I'm Olga Belogolova. I am the director of the Emerging Technologies Initiative at Johns Hopkins SAIS. I'm also a lecturer at the Alperovitch Institute of Cybersecurity Studies where I teach a course on disinformation and influence and I formerly led the policy work on countering influence operations at the company called Meta.

Justin Hendrix:

So the three of you are among the authors of a piece that is out in Foreign Affairs just this morning. I have the opportunity, I hope, of being the first person to interview you about it for a podcast. You're calling for a right-sizing of the scale of the threat when it comes to foreign disinformation. Why do you think right now is the moment to try to calibrate concern among experts and elites about foreign disinformation? We've just had, of course, this bill pass in Congress and get signed into law that could effectively lead to a ban of TikTok, in part based on foreign disinformation. We've got tons of experts out there talking about potential risks in the 2024 election cycle. Why is now the moment to try to reduce the extent to which the threat is considered acute?

Olga Belogolova:

I think this has always been a problem and a challenge in this space in particular, because influence operations are about what's happening in our information environment, and the more we talk about it, the more we amplify the problem set. But as for why now: we are in a moment where there are numerous elections around the world in 2024, and a lot of the headlines we see say that there's an information apocalypse or a disinformation apocalypse coming, or that there's a wave of disinformation coming our way, particularly because of AI or because of certain threat actors, Chinese or Russian threat actors in particular. And we're seeing a confluence of these types of stories and commentary from researchers and journalists on this topic. The group of us who have written this piece share the concern that these conversations are getting to a point where they are dangerous and are perhaps actually aiding and abetting the threat actors who are trying to sow discord in our public discourse.

Justin Hendrix:

So we're seeing a degree of threat inflation, which goes beyond the actual threat itself.

Thomas Rid:

So we went from ignoring the threat in 2016, and all three of us had personal skin in the game because we were quite early in watching the Russian election interference in 2016, before it was even publicly disclosed. We saw the public conversation on active measures, disinformation, and influence operations go from effectively ignoring the threat almost completely in the run-up to November 2016 to doing a 180-degree turn and overstating the problem within the space of approximately one year, a little less. I think by late 2017, with the Mueller investigation underway and, at some point, the first indictment issued, the Internet Research Agency was effectively blamed for successfully shaping the outcome of the election; at least that was how many people interpreted the indictment and, later, the Mueller report. That, I think, was the turning point where we flipped from understating to overstating the threat. And as we say in the piece, both help the adversary. The only viable and right thing to do is to be led by the hard evidence, by the facts, as closely as you possibly can, falling into neither of those two traps, neither underestimating nor overestimating the threat.

Gavin Wilde:

I think all three of us can be considered Russia nerds. For me, and I think for all of us, part of the motivation was also the sense that we understand how Russian leaders have historically thought about information as a threat, and how they have used information, propaganda, and active measures often as an excuse for their own governing failures: the degree of paranoia and conspiracism that tends to prevail among the ruling elite in Moscow, the kind of magical thinking that assumes orchestration behind any sway in public sentiment, or assumes that public sentiment can't really happen organically in the first place. All of those qualities, I think, we start to see crop up here in DC more and more. And that's part of the concern: if we're undertaking this work under the banner of protecting democracy, it's important that we don't try to fight fire with fire, and that in our attempts to do counter-propaganda, so to speak, we don't simply adopt the very mindset that the propagandists hold in the first place.

Justin Hendrix:

So I might ask each of you for a data point, something that you could point to in the world that would support the idea that the threat analysis is out of whack among policymakers, leaders, folks there in Washington that you are around, and then perhaps the public more generally, a couple more examples that essentially bring that to the fore.

Olga Belogolova:

So one thing that I think of, and Thomas was framing how this really came to a head over the course of 2017 and leading into 2018: I can recall at the time I was a threat analyst looking specifically at Russian influence operations at Facebook. My job was to be on the receiving end of a whole bunch of policymakers and others expressing concerns about what they believed to be Russian influence operation narratives. Most of the time, a lot of what I got was domestic conversations that were believed to be shaped and led by Russian influence operations. I would be looking into some hashtag that someone claimed was a Russian influence campaign and then finding that it was actually not at all controlled by any sort of clandestine foreign government or threat actor. Then we get to the midterm elections in 2018, and you actually see what the Internet Research Agency did.

I think in some ways it was in response to that. We can't ascribe intent or know exactly what they were thinking at the time, but they put out this website right before the 2018 midterm elections that was easily attributable to the Internet Research Agency. In fact, the URL was usaira.ru, right? They wanted everybody to know it was them. And the website registration information was also easily attributable to one of the indicted companies, Azimut LLC. So it was screaming, please come find us and see that we are doing this. And in fact, in terms of screaming, their entire website said in all caps, "Citizens of America, you are powerless. The companies and governments are powerless. They cannot stop this disinformation threat." In some ways, they were toying with us and with our own perceptions of this threat, saying, "We're going to interfere, there's nothing you can do. You can't trust anything. You can't trust elections." And fast-forward from that to 2020, and in fact there are a number of people that don't trust elections and electoral institutions.

Thomas Rid:

Effectively, a different way of putting this is that it's really unclear who makes more money from influence operations: firms in and around Moscow exaggerating their effect to their sponsors, or firms in and around Washington exaggerating their effect to their sponsors. The point here is that we know from history that operators, but even more so contractors, have an interest in showing off how good they are. You want promotions, you want funding, contracts, whatever. So they have an incentive structure to exaggerate the effects of what they do. This is not unknown. Many of us may have personally encountered situations in our professional lives where people act according to these incentives; even journalists and academics are not immune. But here, we have to be very cautious not to help them by exaggerating their effect as well, because ultimately we also have a self-interest in it. So sometimes, to be really blunt, sunlight is actually not the best disinfectant. Sunlight, so to speak, makes the weeds grow stronger and spread. So sometimes it's actually best to ignore an operation, especially if it's really unclear whether it had any impact at all.

Olga Belogolova:

And if I may add here, when I was working at Meta, we published these reports and we tried to do them consistently, and one of the goals was actually to make it boring, if that makes sense: you do this so consistently and regularly that people are used to it, and they understand that people are regularly on top of it. But the other thing you can do is make fun of the threat actors. There were times when they were not so good at what they did, and in fact, instead of saying, look, we found this other Russian operation, what we said was, we found it but they actually didn't reach that many people, or, we found it and they're bad at their jobs.

Gavin Wilde:

I think a lot of my concerns started last year around this time, particularly around the tone and tenor and content of the debates around TikTok. Those of us in the community that have studied this phenomenon for many years are used to talking about how the ultimate aim of some of these propagandists is to get Americans to lose faith in their institutions and their political leadership. I think the flip side that we're now dealing with, and that is sparking so much worry, is that a lot of the work we are doing, if we're not careful, goes an awful long way toward making those institutions and those political leaders lose faith in the American public, and start to think about the public as sheep or empty vessels or blank canvases upon which anyone with access to a social media platform or some puppet accounts can simply dump some kind of narrative into the brains of a broad swath of Americans and make them act or believe or adopt attitudes that they wouldn't have otherwise adopted.

And that kind of approach, this idea of direct effects, the hypodermic needle or the magic bullet of media influence, is something we've seen for at least over a century in the United States, and it is again rearing its head. I think it culminated in the legislation that we saw passed about TikTok, which we can debate the merits of, but the tone and tenor of that conversation seems to completely discount the idea that the American public is discerning, has agency, or is capable of dealing with a lot of ideas, a lot of bad ideas, a lot of misinformation and disinformation. They very well may not be, but that's not necessarily something that's being done to them from the outside. And it's not necessarily something that's purely foreign in origin.

Justin Hendrix:

In this country over the last couple of years, it seems to me there have been voices like yours, others that we've had on this podcast recently, Dean Jackson, Jon Bateman, who I think wrote an excellent report on countering disinformation and using science to justify the types of interventions that they were recommending. I think they had in mind the same kind of calibration of the threat that you have in mind here. But there's also another kind of effort that's underway, of course, one that seems to be a confluence of actors on the right and on the left, perhaps personified in the swirl of activity around Jim Jordan's select committee on the weaponization of the federal government. How would you distinguish your diagnosis of this problem from the diagnosis that seems to be popular amongst that set, for instance, the authors of the Twitter files or Elon Musk or other critics of the disinformation narrative who regard it essentially as a vessel for liberal intervention in the information ecosystem?

Olga Belogolova:

I would say it's important that we made a point in our piece and in our conversations of drawing a distinction: we do believe that there is value in research into these influence campaigns. In fact, all of us who wrote this piece have dedicated a good portion of our careers to exposing and disrupting these influence campaigns. But we also want to make sure that there's rigor in that analysis, that we are not attributing to the Kremlin things that are not attributable to the Kremlin, and that we are making a distinction between things like misinformation and disinformation, which are importantly separate problems. That is what makes our argument, I think, distinct from what is happening there, because at least for me personally, I think it's quite dangerous that there are individuals coming after these researchers, who in many ways are doing really important work. What we're saying is that we just need to make sure that there is rigor in that work and in the coverage of it by both politicians and journalists.

Thomas Rid:

We try to be very clear, both in the piece and already in this conversation, that understating the threat, essentially saying disinformation is a hoax, is a version of ignoring the threat, and that helps adversaries, because it's easier to influence you if you're completely unaware. But the opposite, overstating the threat, saying everything is a Russian influence campaign and a successful one at that, also makes it easier for adversaries. We are an open society where science and investigative institutions, be it investigative journalism or even law enforcement and the criminal justice system, can only exist if we all collectively agree that we can trust certain facts and take action based on facts.

So that is essentially what we're saying: let's just be old school and maintain that basis, and not jump to conclusions without evidence either way. That means not saying there is no interference or no disinformation going on, which is what some of these radical, somewhat unhinged voices that you alluded to are doing, and not doing the opposite: saying that the Israel-Gaza war protests are inspired or somehow remotely controlled by Russian operators without any evidence, which is, for example, something that Nancy Pelosi has done, or saying that a certain obscure operation on a social media platform that didn't get a lot of traction, clicks, and impressions in the first place somehow had an impact on the outcome of an election, or triggered street protests, or something like that, without evidence. That also is a problem.

Gavin Wilde:

I think insofar as there's maybe a meta critique or a lesson to be learned from the unfortunate behavior and bad faith politics of a lot of the folks who are harassing good faith researchers, et cetera, it is important for folks in the national security space, the academic space, and the research space to recognize that these are ultimately political debates. They concern the way communication imbues power, and it's very easy to talk yourself into thinking, if you're a researcher or certainly if you're in the IC, that I'm an apolitical arbiter, I'm coming at this objectively. But when the work wades into political issues, certainly around elections, once you play referee, like it or not, you become a player on the field. And that's something worth bearing in mind, particularly as it wades into partisan issues or becomes a partisan football: some of that is going to come with the territory, for better or for worse. And being prepared for that in the first instance is going to be important.

Justin Hendrix:

There are a lot of experts raising alarms about the extent to which social media platforms are prepared for the various types of threats to the election in 2024, both here and elsewhere in the world. Just this week, we've seen the European Commission open an investigation into Meta, your former employer, Olga, over questions around disinformation, concerns about, I think, Russian disinformation in particular. We'll see what more details come out about that inquiry in the days ahead. And over here in the States, you mentioned the types of transparency reports that Facebook has filed over the years, partially as a result of the environment here. We know that some of the information sharing between the federal government and the platforms has been chilled to some extent. How confident are you that the social media platforms have calibrated this threat correctly going into this election cycle?

Olga Belogolova:

So I will say I've gotten asked this question quite a bit, and I think it's a mixture depending on the company you're talking about. We cannot generalize, and there are naturally a lot of concerns because the company formerly known as Twitter, currently known as X, no longer really has a trust and safety team. But that's not to say that the other companies do not. To my knowledge, and I am quite close to a lot of these people myself, there are still quite a few investigators and analysts, and I think it would be unfair and inaccurate to say that things are worse than they were in 2016 at any of these companies that have invested heavily in this over the years. There is a wealth of expertise that exists both on the intelligence analysis side and on the policy development side on these particular topics.

And they are constantly monitoring both the US election and other elections around the world, as well as these particular threat actors outside of elections. That's not to say that there aren't gaps in the work being done, or other incentive problems that these companies might face. That's always been the case. These are profit-driven companies and they will always have profit motives, just like any of the other players in the field that we've discussed, whether it's researchers or politicians. I'm not concerned about that. What I am concerned about is what you mentioned around the chilling of relationships between governments and these companies, because in the past, those relationships have actually proven to be quite helpful.

Now, there are some really good questions to be asked about the relationships between governments and social media companies and the pressures that they face. In particular, there were some European governments that I would meet with on a regular basis that were asking for things that did violate freedom of expression laws in different countries, asking for individual accounts to be removed because they didn't like what they were saying. But outside of those very legitimate questions, there is an important relationship that exists and should exist between these companies and government agencies that can give them good intelligence that they themselves cannot find. Those relationships have proven useful in previous election cycles and during conflict.

Thomas Rid:

So I may just jump in with a sort of left-field response. The very fact that you asked this question about social media companies relatively early in the conversation points to something: a lot of people interested in influence operations think of social media platforms first. When they hear disinformation, they think Facebook and Twitter and TikTok and whatnot. That is in fact a distraction. That is probably exactly what we're criticizing here, because if we look historically as well as at 2016, the more important type of operation was hacking and leaking, collecting information and then leaking it to the public. That was the Guccifer 2.0 leaks, the DNC leaks, the Podesta leaks later in the cycle. And of course, there are many other examples, even in a corporate context. Let me just give you one data point which I personally find extraordinary. One of our students here is Chris Bing.

He's an investigative reporter at Reuters. Reuters did a large story on November 16th, 2023, about an Indian hacking company, Appin, that actually used this tactic of hacking and leaking as an influence operations tactic; you really should look at it. In the context of lawsuits, the goal oftentimes was to influence the outcome of the proceedings, of the trials or the civil proceedings. And the story was pulled, because Appin, through an Indian judge, threatened Reuters with a defamation lawsuit. It was actually pulled; I believe it was the first time in history that Reuters pulled a major investigative story that they had high confidence was factually correct. We have it all here: we have hack and leak, we have censorship. But for some reason, that story doesn't get the amount of political traction that some social media-related disinformation aspects receive in Washington, which I find very frustrating, because really, we're looking at some of the worst tactics here, tactics that are very effective, and we're really seeing American reporting censored because an Indian court interfered in domestic debate.

Olga Belogolova:

One of the things that we often ignore is the broader information ecosystem in which these influence operations can live and thrive. Another example is the 2016 hack and leak. There was a Stanford study that came out researching that, and it found that the social media portion of the operation actually didn't get much traction. It was only when journalists picked up the leaked information that it actually got anywhere. And you can see a big distinction between that and what happened in 2017 in France with the Macron leaks campaign, because there was a media blackout in France during that time period. As a result, the stories that were written were the second-day story, which is who was hacking and leaking.

It wasn't, here are the contents of the Macron leaks; it was a very different sort of dissemination mechanism because of that media blackout. So we often ignore the broader ecosystem in which people consume and produce information. But those of us, as Gavin said, who are all history nerds and Russia nerds, can look back to a time period when these influence operations were run without social media platforms. And to say that if you just got rid of the social media companies, then political polarization would be gone and all of the members of Congress would be supportive of aid to Ukraine is just a complete lie.

Gavin Wilde:

The paper that I sent you earlier, which just went up in the Texas National Security Review, touches on this streetlight effect, if you will: the fact that media more broadly nowadays, but certainly social media, just provides all of us with a ready-made, vast pool of data to put under a microscope and say, "Hey, we've cracked how human persuasion works. We've cracked how influence works, and we can monitor and track the colorful clusters of narratives and likes and shares and tweets, and convince ourselves that's what constitutes influence." To Thomas and Olga's point, that approach tends to sweep under the rug a whole host of other factors, many of which have nothing to do with media consumption whatsoever.

So whether you're talking about your lived experience, your socioeconomic background, your culture, your religion, or the conversations you have with the baker or the barber every day, those kinds of things also reinforce or help create this constellation of our beliefs and attitudes and behaviors. But those things are much harder to datafy, and we don't have ready-made pools of them to jam into an algorithm and say, look, this is influence. So I think the social media aspect of the conversation does suck up an oversized amount of the focus when we talk about propaganda at large.

Justin Hendrix:

You've already brought up a geographical component of this, the differences in the way that mis- and disinformation work as phenomena outside of the Western context. Talking about India, I'm wondering, is there anywhere in the world where you feel like this calibration is actually wrong in the other direction, where we're not as concerned about mis- and disinformation as we should be?

Gavin Wilde:

I can offer an anecdote from a recent panel that Olga and I were participating in. We had a lot of foreign-born investigative journalists from countries they grew up in that had very constrained media environments, where it was state-backed media exclusively. And I would say even in those environments, we tend to underestimate the power of human beings to discern when they're being lied to and to see through the garbage. So I think even in those situations, it's easy for us to convince ourselves that a lack of pluralism in our media diets is somehow itself decisive, when again, that discounts so much of the human factor. But I'll defer to Thomas and Olga for their thoughts.

Thomas Rid:

I think the word disinformation has become effectively useless, because it's so often identified today with just lying or saying things that are factually incorrect that it has lost the sharper meaning it once had, namely describing foreign influence operations. And we study foreign influence operations. We don't study dirty tricks used here, or people getting something wrong, or straight-up lying, or saying the wrong thing for self-interested reasons, because if we study that, then where does it stop? We can't be experts on lying. That wouldn't make any sense to me. That, I think, is a really important consideration here, because ultimately we have to clearly draw that line. We're not studying stuff that is simply wrong; we're studying systematic, organized, professional attempts by foreign entities, sometimes contractors, sometimes intelligence agencies, to influence something that is happening in another country, not disinformation, that vague thing.

Olga Belogolova:

So actually, the class I used to teach at Georgetown University was called Lies, Damn Lies, and Disinformation. The reason for that was this play on words, because so much of what we are discussing often ends up just being people lying, and they've been lying forever. There are so many iterations of that. There are satirical websites; one of the examples I show my students is an Onion story about Ahmadinejad getting more support from Americans than President Obama, which was picked up by Iranian state media because they thought it was a real story. This was many years ago, I think 2010 or something like that. And there are a lot of funny examples like that. Is Red Bull telling you that it gives you wings a disinformation campaign on behalf of Red Bull, because you don't actually gain wings when you drink Red Bull?

If someone tells you that you can charge your phone by plugging it into an onion, is that some sort of evil disinformation campaign, or is that just someone doing a hoax online because they want to watch people do really stupid things on the internet, right? There are so many times and situations in which people lie, and then there are misinformed individuals who are just sharing information because they genuinely believe it to be true. And then, to drill this point even further, a lot of what we study when we study Russian influence campaigns, and Chinese and Iranian and other threat actors, is not actually a lie. Most of what these threat actors have done is amplify political divisions that exist, and other particular issues. They create pages focused on those issue areas on different social media platforms, and they're not necessarily always using lies in order... They're living off the land and finding divisive issues that they just amplify.

Thomas Rid:

Any country, it doesn't matter where it is, even here in the United States, has some journalists or scholars or intelligence analysts who are discerning and professional, who pay attention to detail, and who are creative, and it has journalists and other analysts and scholars who are lazy. You have it here and you have it elsewhere. And of course, if you're in a country that doesn't have a tradition and a profession of proper investigative journalism, or where the media and the press as such are under more pressure and under threat from the government itself, then of course you're more vulnerable to effectively all forms of disinformation, especially foreign influence operations. But the problem starts here at home. And I remember, and I shall not name the journalist or the outlet in this context, but I remember recently speaking with a journalist for a long time about the alleged effectiveness of some Russian influence operations.

This was related to this Marjorie Taylor Greene tweet about the alleged story that some of Volodymyr Zelensky's advisors had bought two yachts for $75 million, which is false. And it was quoted to me, including by that journalist, as a success story; we discuss the details in our Foreign Affairs piece. That journalist kept pushing me and said, "Well, but isn't that a clear example of success?" So the journalist wanted to believe in the success of Russian disinformation. Why? I don't know; let's just not speculate about why that might've been the case. It was a long conversation. I said to the journalist, listen, I didn't trace that story myself. I will actually do so now that you mention it to me, but I just don't want to speculate that this was even a Russian influence operation, because we have to trace it first. A couple of days later, the story comes out and the journalist repeats the claim, of course, that this was a successful Russian influence operation.

Yet again, it took me exactly two minutes, one search on Twitter. One search on Twitter, simply searching for Zelensky yacht and looking only at dates before Russian influence operators started peddling that myth, meaning I excluded any resurfacing that was Russia-driven and only looked at what happened before that date. And I had a bunch of hits. Why did no journalist writing about that story do this? It's really not hard. It requires very little creativity in terms of formulating a search. On X, it's like the lowest-hanging fruit you can imagine. Either they're not qualified or they're simply lazy, I don't know which; either way, it's really disappointing. I think we should start calling people out more clearly, and we should raise the bar and not have a lot of journalists, who sometimes call themselves investigative journalists, simply repeat the claims put forward by political actors or by investigative outlets that don't do proper, serious work. You can't just repeat that. You have to check your facts.

Olga Belogolova:

I think there are a lot of places in the world where there is what one might call digital authoritarianism, and those are the places where domestic influence campaigns from governments or politicians aimed at their own populations are a problem and are perhaps not taken seriously, because those governments are themselves the perpetrators of these operations. Some examples I can think of are, obviously, Hungary; what the military in Myanmar has done over the years; and what Duterte's government did in the Philippines. These are governments that targeted their own populations or political parties, and they, of course, didn't want to take this threat seriously, because then they themselves would've been the ones on the receiving end of the punishment.

Justin Hendrix:

Maybe on this last front, I did want to bring up Ukraine a little more closely. This is something else that has occurred in this country just in the last couple of weeks: alongside the measure that we've already referenced, which put in place the forced divestiture of TikTok, was the passage of foreign aid to Ukraine, much needed foreign aid, I think. One of the articles I noted this week that was interesting to me was about the relative lack of backlash from Republican voters, the grassroots rank and file across the country, over that measure passing.

I think there had been an expectation that Republican lawmakers might face some kind of revolt over it, since the idea of continuing to support the Ukrainian army with so many billions appeared to be unpopular. But I don't know, is this almost an example of what you're talking about here, that there's a kind of disconnect between the media narrative, about, to some extent, Russian disinformation, MAGA, and the lack of support for Ukraine, and what appears to be maybe a broader consensus in the States about what to do on Ukraine? I don't know if that question makes sense or if it ties to what you're thinking, but I'll put it out there.

Gavin Wilde:

I would characterize it this way: in this debate, certainly about aid to Ukraine, there's a need to distinguish between groups capitalizing on each other's activities, whose interests for different reasons may overlap somewhat, and there being some kind of causal chain between two groups sharing the same sentiment. So for instance, whether it's MAGA Republicans or right-leaning folks, at its core, bad politics aside, there is a necessary and legitimate debate to be had about the degree of US support to any of its allies right now. As American citizens, we ought to want that. But to blame Russia or Russian propaganda for the tenor of that debate absolves our political leaders of their responsibility to keep its content and tone within bounds. And I think it lends Russia way too much credit for essentially pursuing its own terrible interests.

And in the meanwhile, we stop having that good faith, necessary debate about what is in our own American interest, which, I certainly agree, includes giving aid and support to Ukraine. The other part of this is that by making this about Russian disinformation and propaganda, we absolve our leaders of their responsibility to communicate strategically and lay out a coherent case, which in my mind is relatively easy to make, about the necessity of this aid and how it overlaps with American interests, et cetera. So from both perspectives, you've absolved the American policymaker of the responsibility they were elected to fulfill by making this a conversation about, look what Russia is doing to our national discussion.

Olga Belogolova:

And perhaps, as Gavin was saying earlier, this lack of backlash is indicative of the fact that Americans are more discerning than we think they are. We hear from these very loud voices, but that doesn't necessarily mean they're representative of what the general public truly believes. People are perhaps reading stories about what's happening in Ukraine and have made up their own minds about what's happening in Russia and in Ukraine and what is being perpetrated by Russia.

Thomas Rid:

It's to be expected, obviously, that public support for a war that is dragging on, and I think that's just a factual description, will at some point begin to go down. And as Gavin and Olga said, we expect that to be the subject of a debate. Of course, that's exactly what you would expect in any democracy, and it's happening in other countries as well, not just here. Let's be very clear: Russian influence operators, both in the private sector and in government and the intelligence community, don't usually invent social conflict, polarizing issues, or wedge issues. They simply amplify, and actually most of the time they amplify unsuccessfully. Sometimes they get lucky, but they don't invent; they only amplify existing problems. So let's not get hung up on them. But let me give you an additional, sort of interesting take. There are some, but only very few, historical cases where we see leak operations: intelligence agencies stealing interesting data, say from US government sources, and then leaking it into the public domain.

We have some historical cases where that happened, and there is an example in my book involving very good investigative journalists in Germany who knew, okay, this information is actually from a hostile intelligence agency, but we also know that it's highly likely factually correct. In this case, it was about American nuclear targeting plans in West Germany, in West Germany, actually, not East Germany. So they said, okay, it comes from the KGB, we have near certainty that this is coming from the KGB, but we also know it's factually correct, so it is newsworthy. We should report on it. We should also make clear where it comes from, but we should totally report on it.

So there are certain situations where you have "disinformation" that is factually correct, and you can sometimes even prove it, and it's still newsworthy. So we should be intellectually capable of understanding that yes, something may be a foreign influence operation, but it may still be factually correct and still be newsworthy. I think that's a really important baseline insight to have in order to inoculate yourself intellectually against the cliche that all disinformation is bad and effective.

Justin Hendrix:

I appreciate this call for specificity, for care, for professionalism in dealing with what is of course a complicated topic. I would commend to my readers this essay in Foreign Affairs, "Don't Hype the Disinformation Threat: Downplaying the Risk Helps Foreign Propagandists, But So Does Exaggerating It." Gavin, Thomas, Olga, thank you very much.

Gavin Wilde:

Thanks so much, Justin.

Olga Belogolova:

Thanks for having us.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Inno...
