An interview with Heidi Tworek and Emerson Brooking
This piece is cross-published with Just Security.
Since Russia launched its invasion of Ukraine on Feb. 24, moves by governments and private companies to limit or ban Russian state media have rapidly spread from the European Union to the United States, South Africa, Australia, and elsewhere. We have seen similar efforts before – during the Allies’ war with Germany and Italy in WWII, and there is much to learn from that history.
This week, the EU issued a regulatory amendment banning RT (Russia Today) and Sputnik, declaring that the “Russian Federation has engaged in a systematic, international campaign of media manipulation and distortion of facts in order to enhance its strategy of destabilization of its neighboring countries and of the Union and its Member States.” Companies such as Meta, which operates Facebook and Instagram, have recently restricted or demoted access to Russian state media, while the UK regulator Ofcom has started 27 investigations into RT.
Meanwhile, Russia moved quickly to crack down on its remaining independent news sites inside the country and introduced harsh new penalties for journalists who do not toe the Kremlin line. On Friday, Russia officially banned Facebook and restricted access to Twitter. An “army” of volunteers is now working on behalf of Ukraine to reach Russian citizens with news of the war, making phone calls, sending text messages and posting to available social media platforms to try to enter the information environment and inform public opinion about the war, according to Ukrainian officials.
There is historical precedent for such events. During the World War II period, American and allied governments regarded German propaganda as a weapon of war and used tools such as shortwave radio to reach citizens behind enemy lines and penetrate the Axis powers’ internal information environment. (The BBC, as if to underscore this point, announced Thursday it would resurrect the use of shortwave radio to broadcast news into Ukraine and parts of Russia.)
In order to put these new developments in historical context, I spoke to two experts on the role of information and media in war:
- Heidi Tworek, Canada Research Chair and Associate Professor of History and Public Policy at the University of British Columbia and author of News from Germany: The Competition to Control World Communications, 1900–1945, a book that details how the Nazis used news and information to advance their agenda; and
- Emerson Brooking, Resident Senior fellow at the Digital Forensic Research Lab of the Atlantic Council and author of LikeWar: The Weaponization of Social Media, which considers how social media is changing the nature of war and conflict.
Our discussion touched on interventions that the United States and Britain conducted to counter the Nazi media and propaganda apparatus. That included investments in public media intended to provide a compelling alternative. And while some observers have suggested that Russian disinformation – with respect to its domestic and international audiences – appears to be less effective in the face of the gruesome spectacle of the invasion of Ukraine, Tworek and Brooking suggest it is too early to assess what impact it will have in this war.
There are also unique aspects to how social media platforms function that make them difficult to compare to earlier forms of communications technology. Those differences include the capacity of social media platforms to serve as a conduit for the broadcast of messages as well as a medium for coordination, and the algorithmic and monetization systems that create secondary effects including by other actors who seek to exploit the public sphere. Understanding these social and economic dynamics in wartime may provide some clarity as we consider the next phases of the war raging in Ukraine and how these platforms should be governed in the future.
- For more on how the Russian public’s view of the war may affect Putin: see Influencing Putin’s Calculus: The Information War and the Russian Public by Viola Gienger at Just Security.
- For more on the different target audiences for Russian disinformation: see It Is (Often) Not About You: Russia’s 4 Target Audiences for Disinformation at Tech Policy Press (Clint Watts interview with Justin Hendrix)
What follows is a lightly edited transcript of the discussion with Tworek and Brooking.
I’m very grateful the two of you can join me today. Heidi, in your book, there’s a phrase in the introduction, “Patterns that often seem new, are actually quite old.” That’s really what I wanted to talk about today– the extent to which there are things we might learn about the current moment in the invasion of Ukraine by Russia, and the way we’re thinking about information, disinformation, media, from the past and from past conflicts.
In your book, you pointed to a British intelligence report in 1936 lamenting that Germany was “now daily flooding the world with news” while the Soviets condemned German news as a dangerous element for the interests of peace. You point to German investment in international news networks, dating back to as early as the turn of the 20th century. With a broad brush, can you explain what the German state did to invest in its news and propaganda effort, in the early part of the century?
So basically around 1900, many German elites came to believe that Germany should be a global power. And we’ve often thought about this in political or economic terms– acquiring colonies or amounts of foreign trade. But what I show in this book is they also come to believe that news is a fundamental pillar of global power, and this is across the political spectrum. It’s industrialists, it’s academics, it’s politicians who all come to believe that news is a fundamental part of the global power and that Germany is far behind.
So what they do, really from the turn of the 20th century onwards, is to invest in a new technology– wireless, which later becomes radio technology, to try to project German power around the world. What I show in this book is that this really continues all the way through until World War II.
Of course, the type of news that the Nazis are sending out is radically different than what came in the Weimar Republic, for example. But I think it’s really important for us to understand what happens in World War II and in the Nazi period as actually a longer 30-year story of belief that news is important for global power, investment in technology, and that the Nazis really are able to take advantage of a network that had been built over the previous 30 years.
So news agencies as means of influence, the ability to achieve geopolitical, geo-economic, cultural goals, means to build power, often preceding the use of military power.
Yeah, exactly. I think one other point to make is that one of the reasons that the Germans get really interested in news agencies specifically is because they’re actually interested in this power being hidden. They’re not interested in everybody knowing that this is news from Germany; they actually really want to influence news from a step removed, behind the scenes, and they find news agencies to be this really efficient bottleneck.
So at the time, you have a news agency cartel, there are just a few news agencies who have global reach, similar to today, and so Germans identify news agencies as the most efficient part of this bottleneck to try to control. Then the second corollary is then when people are reading news in their newspapers, they don’t realize it’s coming from Germany at all, and that seems to be the most effective way to try to change people’s minds.
You talk about, in your chapter on the limits of communications, the efforts the Nazis made to also limit independent media in Germany, to crack down on journalists and control public opinion. And eventually all of that effort comes under the thrall of the propaganda ministry.
Yes. When the Nazis come to power, one of the first things they’re interested in doing is really trying to be able to control domestic media, and they do this on multiple fronts, really quite early. One is to try to use companies to buy up all of the independent media, so they do that through a whole host of things. For example, they forbid people from owning more than one newspaper, and they use a shell company to purchase as many newspapers as possible.
The second thing they do is they promulgate laws– one called the Editors’ Law, which removes Jewish people from being journalists, but also places certain requirements on who can be journalists. So you effectively remove all of the more left wing people outside of the journalistic space, and the third thing they do is they merge two news agencies that existed before, into one news agency called the Deutsches Nachrichtenbüro, or the DNB.
They restructure it so that actually most of the news they’re collecting doesn’t necessarily go out to the public at all, but is actually for internal party use. They basically start to create a color coded system, where some news is sent to Hitler, some is sent to lower echelons in the party, and then some of it only is sent out to newspapers. So we see multiple ways in which the Nazis try really quite quickly, to control media. And maybe one final thing, I’ll say Justin, is that one thing that the Nazis actually controlled from the very beginning, is radio.
This is for a deeply ironic reason, which is that during the Weimar Republic, radio was created. So from the early 1920s, you start to have public radio, and the man who’s in charge of it is a man called Hans Bredow, who believes that radio should be used to bring together the population after the terrible ravages of World War I, so you should use entertainment and education to bring people together.
Bredow is concerned as the ’20s roll on, that there are considerable divisions within German society, and what he thinks you need to do to preserve democracy is actually to have state supervision of content on the radio. There were multiple reforms in the late 1920s and early 1930s that created state supervision of radio content. Bredow does this with the idea that, this will protect democracy, but of course, ironically, when the Nazis come to power, it means, bing! They have immediate supervision of radio content. So we have– actually from the very beginning, and even in August of ’33– Joseph Goebbels gives a speech where he says, “The Nazis couldn’t have come to power and seized it and kept it as they did, without the airplane and the radio.”
And this quotation is often used but misunderstood, because what Goebbels is saying is the airplane helps the Nazis come to power, makes Hitler appear like this modern leader who can go and give speeches all around Germany in one day, but it’s radio that helps them stay in power. So I think that’s an important bit to understand, the difference between the amount of time it took to control newspapers, and how they could really control radio from the very beginning.
So Emerson, your book, LikeWar, chronicles social media and the internet and its role in warfare and conflict, but you start much further back, with Clausewitz. How has information propaganda played a role in warfare over these centuries?
First, it’s excellent to get Heidi’s perspective on this. When my co-author and I started writing LikeWar, we knew the first thing we needed to do was a review of communications history– even as we wrote about the effects of the internet on war, to pierce through some of the more vacuous claims that social media has changed everything, or that every communications medium is different. There were some clear through lines we wanted to get at, but in our book, we do focus on Clausewitz’s famous maxim that war is a continuation of politics or policy by other means.
Clausewitz was an early 19th century Prussian military theorist. He was essentially saying that the rules of politics aren’t thrown out the window as one turns to political violence. Political violence was just another tool in the toolbox, after economic coercion or diplomatic pressure had been exhausted. Clausewitz tells us when you’re in a state of war, you realize your objective by focusing on the enemy’s center of gravity. In Clausewitz’s time, right after the Napoleonic Wars, the enemy’s center of gravity was clearly their military, their ability to resist what you were imposing by force.
As we go into the early 20th century, we see the rise of military aviation, and there’s a new school of thought in strategic bombing– that maybe the adversary’s center of gravity is their civilian industry, that you could fly over an enemy’s armies and directly attack the civilian population, decimate their industry, remove their ability to resist that way.
But also, really starting in the ’20s, is this other competing school of thought, which focuses much more on propaganda, and this notion that maybe even in a state of violence, you could focus on shaping perceptions, and targeting adversaries’ populations in order to remove their support for a given conflict. That through a process of persuasion and psychological dislocation, you could still reduce the enemy’s ability to fight– their true center of gravity– and realize your objective in that fashion, without ever firing a shot. Briefly, though: for these early students of war propaganda, their ambitions often outpaced what was possible with the technologies of the day.
In our book– and Heidi you know a lot more about this– we talk about some of the Nazi efforts to reach Irish nationalists. The way that they had hours of Gaelic-language programming, and were trying to open up a new front against Britain.
But even if there were individuals who were sympathetic to that broadcast, they had to have a radio. They had to tune to the right frequency. They needed mechanisms by which they could find each other and organize. That just wasn’t possible with that sort of technology. But the internet enables individual interaction– and of course, mass transmission– at incredible speed. So in that sense, it really does change the dynamics of warfare, and the importance of information manipulation toward realizing your objectives.
I don’t want to turn entirely away from the historical perspective, so please Heidi, feel free to contextualize this as we go forward– but looking at the apparatus that Vladimir Putin has built over the last two decades or more, with Russia Today, Sputnik, TASS, and then a whole range of other outlets that are more loosely affiliated with the Kremlin, Russian influencers who are aligned with the Kremlin, and all of this covert activity that has come to light over the last few years, particularly since 2016, perpetrated on social media– how do you think of that in comparison to what the Nazi regime was able to build before World War II?
One way that I think about it, is exactly this point of how it takes two decades to get there. So that’s one really important context that I think we need to keep thinking about, is how do we get to this point? So that’s one. Two is that it’s usually a good indicator when a country is turning to this kind of information warfare, that they have other ambitions. It doesn’t usually stay in the information space, and we see Nazi Germany is obviously one example of that, but I think that’s also important to bear in mind.
Maybe the third part that’s quite interesting is what we saw prior to this war in Ukraine. A few years ago, there were some suggestions about bringing back some of the types of laws and actions that were created precisely to go against Nazi information incursion into the US– namely the Foreign Agents Registration Act (FARA). So FARA was actually created in 1938, precisely because the US was worried about Nazi journalism, et cetera, so it required those who were foreign agents to register. I actually detail one case in my book about some employees of a German news agency who don’t register and then they’re arrested. J. Edgar Hoover gets very interested in this.
Then eventually what happens in 1941 is that the Nazis actually arrest a couple of American employees of United Press in retaliation, and there’s a prisoner swap. And then this Nazi news agency is put on trial, with no people present, and it’s convicted as an espionage agency. This law remained on the books, but it was really resurrected over the last few years as a method to try to deal with not only Russian outlets, but potentially Chinese ones as well. I think there’s a question then about why we stopped there. I think there are question marks over whether that’s particularly effective, but also as to why we didn’t really think about the dynamics of social media more when we were discussing FARA.
We have seen both RT, I believe, and also Sputnik, forced to register as foreign agents in the United States, just in the last couple of years after the Mueller investigation.
So something I find fascinating about the use of the internet in these contexts is that early thinking, even from nations, that the US would describe as adversaries today, like Russia or Iran, early thinking from these countries didn’t really regard the internet as a weapon, or something that’d be militarized in quite the same fashion as previous communications mediums. The military co-option of the telegraph or the radio, happened pretty early in the life cycle of these technologies. But in the case of the internet, it was developed out of US military research, but then the US military determined there wasn’t much utility in the project, so they commercialized it.
As it spread around the world, many countries that had extraordinarily harsh censorship regimes for other communications still maintained an open and free internet for decades, because censors who had grown up with radio and television as the means of political discourse didn’t see that much threat in this new medium. And that really only changed in the early 2000s. Specifically, when we think about Russia, it’s important to emphasize that RT was a channel for Russophiles, even in the mid 2000s. It was broadcasting Russian ballet. It was something you tuned into if you appreciated Russian culture, but it was high brow.
It was like an international NPR sort of situation, and that really started to change around the time of the 2008 invasion of Georgia by Russia. And then we do begin to see a much more deliberate effort to create an online influence apparatus, but this is a fairly recent development in the life cycle of the internet.
I’d love to just talk a little bit about both of your impressions of that apparatus and how it has changed, perhaps since 2008, or since 2016, when the klieg lights got turned on– when researchers and politicians and people concerned about the Russian effort in the 2016 election in the US and Brexit really started to pay very close attention. What do you make of this apparatus? How effective has it been? And maybe then we can get into what’s going on at the moment.
If I were to look at one turning point in the Russian approach to the internet, it might have been the 2009 Green Movement in Iran. Because the Green Movement– or the Green Revolution– was a precursor to the Arab Spring. It was seen as Western technology companies providing the platform for these pro-democracy activists. But if you weren’t in the pro-democracy camp, and if you were already suspicious of Western intentions and technologies, you saw this as an act of information warfare, and you see a lot of Russian military writing talking about this as an information attack. And of course, you see similar writing out of China and Iran. In the tail end of the Arab Spring, there was the most significant protest movement against Putin in his time in power, and he took that quite personally.
He took quite personally the fact that then Secretary of State Hillary Clinton had endorsed these protests against his rule. So not only was the US launching these information attacks abroad, but now it was specifically targeting Russia. So even before Maidan in Ukraine, Russia knew and was re-gearing for this information battlefield. And then events in 2014 basically cemented their course. When I think about the effectiveness of the Russian propaganda apparatus– in part, because events over the last week have shown some profound weaknesses in Russian propaganda and public messaging– it’s made me look back on how effective the apparatus was all along.
Because for years, we talked all the time about Russian bots and trolls. There was this suggestion that Russian actors were hiding in the shadows, manipulating all parts of our political discourse, but as time has passed, I really think that the Russian operation targeting the 2016 election was almost a Black Swan event. Russian military thinkers and the Russian government had spent years thinking about and preparing for this sort of information warfare.
The US and the West were blithely unaware. They had given virtually no thought to the way that their social media platforms could be weaponized in this fashion. So it was this mismatch of intention and preparedness, which made that moment of Russian influence and Russian information investment so powerful. But as I look at the tools they have available today, I’m pretty confident that the West can meet any of the propaganda that they’re churning out.
I’ll just add two points. One is that, of course, the ability for any of this relies upon a social media ecosystem as it currently exists. And I think it just spurs us once again to look at a system in which scale is everything, and in which monetization is key. That goes back to my point about FARA. You can make RT and Sputnik register as foreign agents, but they were still profiting from ads and they were being promoted by algorithms on YouTube, et cetera. And that’s what was so odd about that moment– the disconnect between the registration, and then what one would think would’ve followed as a broader discussion about how the ecosystem of social media was still enabling this stuff to spread.
So that’s one thing, but I think the other is this question about the effect that Emerson was pointing towards. I think that here, history is also helpful, because we often see a disconnect between elite concerns and beliefs about what news and information can do, and what is actually happening on the ground. This is a common disconnect, and I show in my book how the ways elites tried to measure what publics were thinking evolved over the first half of the 20th century.
In 2017, I wrote a piece which was really talking about how I think a lot of the fear around Russian disinformation was really drawing on these much older beliefs about crowd psychology from Gustave Le Bon, and that some of that was actually quite misplaced. I will say that I caught a lot of heat for that piece, but I think now in 2022, if we go back and reread it, I think a lot of people might agree.
I just want to make a final point, which is, just because it might not have a broad based effect, it doesn’t mean that it’s not important. So that was never what I was trying to say. The fact that it exists and it shows these weaknesses in a social media ecosystem is deeply important, because we’ve seen all sorts of other people– whether for economic reasons or political reasons– exploit these ecosystems. So even if it doesn’t necessarily have the effects that we feared, it doesn’t mean that it’s not important to understand.
So in the dialogue that we’re seeing happen now– as I’m talking to you, we’re one week into this war– there are some columnists and thinkers who are saying somewhat similar things. Farhad Manjoo in The Times has a piece today, “Putin no longer seems like a master of disinformation.” Do you think it’s right to come to these conclusions– and I’m being maybe slightly hyperbolic with the arguments– that we should count the role of Russian disinformation out from this point in this conflict?
Absolutely not. I think there’s a deep irony as we look at the course of Russian military operations now, because in 2014, with the Russian invasion of Crimea and Eastern Ukraine, the Russians showed how one could use disinformation and obfuscation to effectively cover a military operation.
They were the pioneers in this space. US military thought has been focused on hybrid warfare and the gray zone for the better part of a decade because of Russian activities. They were masters in this space. What’s truly remarkable is that they had this established playbook, which still might have worked quite well.
Economic coercion, the continual massing of forces at the border, political infiltration, propaganda– these things could have served over time to fulfill Russia’s strategic interest with regard to Ukraine, but instead an invasion with 190,000 soldiers, tank columns rolling over the border, the largest land war in Europe since World War II. You can’t use disinformation and bots to mask something like this. So instead, what I think we see is a profound mismatch between the strategy that Russia adopted in 2014 and the strategy that Russia is adopting now.
That being said, there’s been quite a lot of social media euphoria in the last few days among Western observers with regard to the war. Ukrainians have been extraordinarily brave, but there’s also a conventional mismatch in military capability that will be very hard for Ukrainians to overcome.
As the fighting turns increasingly to urban warfare, as the atrocities mount and as Ukrainians in harm’s way grow understandably exhausted, I think the current situation will become much more complicated: as Russia begins to endorse sham peace deals, as the Ukrainian army and Ukrainian partisan resistance potentially split into different factions– particularly if President Zelensky is assassinated, as is the Russian intention– and as the realities of war really hit home.
I think Russia will again turn its focus more to disinformation, the sowing of discord and its propaganda messaging, and at that juncture, I think it will be much more effective.
Yes. I agree that one cannot know what will happen after the first week of a conflict, and we are only at the very beginning of this. It’s extremely hard to predict how it will unfold, but maybe a couple of thoughts. I agree that there’s been a knee-jerk reaction to move to the exact opposite– that Russian disinformation is completely ineffective. But I think we’re going to have to take a step back and really assess whether that’s true, and we’re also going to have to think about this on a global scale.
We’re going to have to think about what is happening in those 35 countries that abstained from the UN General Assembly resolution condemning this invasion, so I think that’s one point. We do not have enough information on that. “What is the thinking in India?” for example seems pretty crucial at a certain point.
And then the other part of it is, of course, we did see in the first few days some major figures– Trump included– coming out somewhat in support of Vladimir Putin. And how that unfolds over the next months I think is complicated. So that I think is potentially worrying for the US domestic scene. I mean, as Rasmus Kleis Nielsen from Oxford always says, we have to remember that disinformation also comes from the top, and so I think that may be something that could change. Trump has changed his tune now and praised Zelensky but I don’t think we know where that goes and that’s potentially very concerning too.
Is there any sense in which perhaps Putin believed his own propaganda in the run-up to this? Did the image he was trying to build, either of himself or of Russia, play into his decision making? Is that a possibility in your mind– that there is a media effect, but more on the purveyor than perhaps on the receiver?
I mean, Putin has, at this point, unitary control over the Russian state. Many people who might have checked him have either been neutralized or are fully in his camp. The images we’ve seen of Putin’s government meetings, of him in consultation with ministers, show him sitting 30 or 40 feet across from them. You can’t get a better metaphor for his remove from reality.
Actually, in our book, we did dwell– in the context of President Trump– on how the modern information environment was potentially dangerous for international relations, because a leader who was, say, connected to online spaces– we were thinking of Trump– a leader scrolling through just the adulation of his admirers and the broadcasts of friendly news networks could be led to believe a very different frame of reality.
I think that’s clearly a component here. It seems that Putin’s intention– and something which some lackeys probably told him was possible– was that the Russian army would go into Ukraine and, within a week, the government of Ukraine would be replaced and the Russian military could step back out, having declared a fait accompli before the West could organize any sort of reaction.
Obviously, none of this has come true, but I think there is so much to blame on Putin himself and his isolated decision making, and I think the modern information environment plays a factor in that.
Don’t get high on your own supply, right? But I think more seriously, Fiona Hill has talked about this and emphasized that there are things like Putin going back and having his own version of Russian and Soviet history, looking at older maps, talking about how borders in Europe have constantly changed. So how much the information environment is merely solidifying beliefs that he already had. And I do think here, his biography is tremendously important.
The fact that he’s a KGB agent in East Germany in 1989 is such a formative experience that I think we need to account for that as well. I’m not sure that this environment is really changing his opinion. It might just be solidifying it, because historians and others have certainly traced how far back these kinds of revanchist beliefs go.
In the 10 or so minutes we’ve got, I want to bring it back slightly to some of the policy questions here. We’ve seen Europe now essentially ban Russian state media, particularly RT and Sputnik. We’ve seen the social media platforms take some action on their own to limit the distribution of Russian propaganda and state media channels, particularly in Europe and in and around the war zone in Ukraine. What do you make of these policy interventions? Are they enough? Are they too much, and what should remain after this conflict is potentially over, whenever that might be? Heidi, you’ve mentioned FARA, you’ve mentioned that we took some actions in the 1930s to try to prevent what we saw as some dangerous intervention in our information space. Did we relax too much? Do we need to go back to a more power oriented conception of information, or do you think that ultimately that will not work?
I’m going to throw out a couple of points here, and I think one is that, of course our knee-jerk thing that feels good is to ban stuff, but it’s incumbent on us as people who are thinking about policy and the press to take a bit of a step back from our knee-jerk feeling that that’s a good thing to do, and really ask ourselves these broader questions about where we want to go.
In terms of thinking about the World War II context, what we see is there’s the FARA example, but there’s also the example of the UK, which actually does not ban, for example, the radio propagated by Lord Haw-Haw, who was a man pretending to be Irish and so on. We see that instead, what happens in Britain– and this also happens when the British Foreign Office discovers that German news agencies are propagating news around the world– is that their reaction is not so much to ban, but to strengthen public radio. It is to invest in the BBC.
I think we need to remember that that’s an alternative here: one of the things one can do, instead of talking about media bans, is to reflect on how we strengthen our own media. In Britain’s case, that was public media and the BBC. In the case of the US, World War II is the period of setting up things like Voice of America. So there are these alternatives that I think we need to keep talking and thinking about, ones that rely not so much on bans as on strengthening our own media environment.
Having said that, I think it is important to be having these conversations about monetization on, for example, social media platforms. You can have something available, but it doesn’t mean that you have to promote it.
And that’s the fundamental difference from the World War II moment. Remember when Emerson said, “You have to have a radio, you have to be able to tune into that station.” Here, what we’re seeing is sometimes something like RT or Sputnik being suggested to people through an algorithm. I think that’s where we need to have a bit more of a conversation about what kind of reach these outlets should have. We might say, “Listen, it’s actually helpful for people to be able to see what RT is saying, but that doesn’t mean we have to promote it, and we can label it in ways that make very clear what it is,” while simultaneously really thinking about how you strengthen a democratic media environment as a counter.
Maybe two other points to add. One, made by the Canadian scholar Vivek Krishnamurthy, is that under Article 20 of the ICCPR, the International Covenant on Civil and Political Rights, you may actually have grounds to ban things like RT and Sputnik for spreading war propaganda. That’s, I think, a really important and interesting point to throw into the mix.
And then one other thing, which David Kaye– the former UN Special Rapporteur on freedom of expression– has brought up, is that we also have to think about the potential retaliation within Russia. Obviously there’s a limit to how much what the US or UK does is going to affect what Russia does, but as far as one can, one should think about how to ensure there are some lines of communication with Russia, and that those who may be protesting or otherwise organizing aren’t completely cut off. I think that’s also a valid point to consider.
I understand the knee-jerk responses, but I think there are a lot of other policy points to bring into the mix as we think about this over the longer term.
I guess we have to distinguish too between bans by governments and a choice by a private platform to potentially limit, remove, or otherwise reduce the amplification of a particular venue. Emerson, I don’t know if you have a perspective on this question.
We’re in a moment of extraordinary turbulence right now. It reminds me of the 2016 US election and the series of actions the technology platforms were taking where it was a response to current events, and we’re still not quite sure where we’re going to land.
I think severe action against Russian media is justified, because RT and other outlets are an accessory to the Russian war effort. Right now that means the obfuscation of the extent of Russian activities in Ukraine, but pretty soon there will be more of a focus on the dehumanization of Ukrainians– the equating of the Ukrainian army and Ukrainian civilians with terrorists– to justify the likely executions and reprisals that Russian occupation forces are going to begin.
I think action against those voices is justified. What I do worry about is the manner in which the platforms describe the actions they’re taking and why. This is an area where small differences in phrasing matter a great deal for the precedent that we set. Earlier this week, Meta banned Russian state media properties in the European Union, following the European Union’s ban on those properties.
Meta’s president of global affairs, Nick Clegg, said in a tweet that after consultation with the European Union, Meta had decided to ban RT in Europe. That’s technically true, but that sort of statement sounds like Meta is simply responsive to governments that have ordered content bans.
And any number of authoritarian or undemocratic states around the world, which perpetually try to ban their own domestic opposition and frequently appeal to the social media platforms to do this work for them– I think they all took note of something like that, and they will cite that kind of statement repeatedly in the future as they try to engage in their own censorship activities.
What Clegg could have said, and what I think the reality is, is that platforms routinely receive legal requests to remove different accounts. As legal environments change, the strength of those requests changes. If the European Union has banned RT, it makes total sense for Meta to look at that recent legal decision, refer to its own terms of service, and then make a determination to geo-block RT in that particular region.
Geo-blocking is the common response from platforms to these sorts of legal requests, and you can view all of them in the platforms’ transparency reports. So that is the well-established precedent, but Meta made it sound like an extraordinary wartime response. I think the things being said now– which make sense maybe in the heat of the moment– will become much harder to defend as time passes, and we can go a long way toward preventing these future headaches and difficult content discussions.
We should ground decisions now in established precedent, in terms of service, and in the “Supreme Court” for content that Meta set up not long ago, the Facebook Oversight Board. These institutions haven’t gone away, and we need to reference them.
One last point I’ll make: there’s always a negotiation between platforms and unfree countries, because those countries often make odious requests for content removal. The platforms fulfill some of these requests, but they do so partly because they can then stay open in these countries and provide a relatively free medium for political expression, one that’s relatively free of state surveillance.
The longer that social media companies can provide some service in Russia, the better I think it is for everyone. On the first day of the war, the most prominent platform for elite dissent in Russia against the conflict was Instagram. These platforms play a valuable function in the country, and I worry that the more aggressively their actions are phrased– say, with regard to RT in Europe– the faster we’ll see these platforms removed from Russia entirely.
I’ll try to ask one last question, and we may well end right there. Emerson, when the dust settles and we look back on the last 15 years or so of Russia building its presence on social media– broadcasting its state media there, using it for covert operations, and the rest– where do you think history may net out? What will we regard as the platforms’ role in leading us to this point, or in playing a role in leading us to this point?
Social media platforms are a foundational part of modern political life. You can’t separate their role from really any event that takes place today. But as we look back and try to understand the forces that led to this current horrible war, I don’t think the role of the social media platforms will be front and center, as it has been, for instance, with the genocide in Myanmar or the ongoing ethnic cleansing in Ethiopia.
If you had asked most Russians a week ago what their thoughts were on Ukrainians, they viewed them as brothers and sisters. Maybe misguided, but certainly not all Nazis, not all genocidal maniacs, not people deserving of the murderous force which Russia is using now. I think that when we look back on the origins of this war, it will rest almost entirely in the head of Vladimir Putin.
The platforms are one battlefield in which this war is playing out, but I don’t think they bear particular responsibility here.
Maybe I’ll just add that when we look back on this, we will also have to contemplate bigger forces, and that is oligarchy– the role of money. That will be, I think, a much longer question, but the social media platforms are part of it, because it’s a question of who is invested in them.
If we take a step back from what we’ve been talking about the whole time, which is content, and really look at who is investing, that is part of this broader story of oligarchy, which runs all the way from Chelsea Football Club to properties to yachts. I think that is a story of taxes and transfers of money that we will really have to grapple with in a meaningful way.
The final thing I’ll say, as a self-serving scholar, is that of course we’re going to need more research to really get a sense of how attitudes toward Russia have changed and whether social media platforms have played a role in that. But I agree when it comes to thinking about the origins of this war– less so, maybe, when it comes to how people now understand Ukraine.
And perhaps I’ll say one final thing, which is of course that Putin has been very brutal before, and our lack of understanding and knowledge of that has nothing to do with social media platforms. It has to do with broader issues around education, what we pay attention to, and what the media focuses on, and that I think requires deep reflection. Putin began his career with a terrible campaign of oppression against Chechnya, flattening Grozny.
That was a long time ago, and yet we see a lack of understanding of it. We also see a deep lack of understanding of the history of Ukraine, something that historians like Serhii Plokhy have written about for some time. These are some of the much broader factors we need to think about that go far beyond social media platforms.
Heidi, Emerson, thank you very much.
Thanks so much.
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Innovation. He is an associate research scientist and adjunct professor at NYU Tandon School of Engineering. Opinions expressed here are his own.