The Sunday Show: State Media, Social Media, and the Conflict in Ukraine

Justin Hendrix / Mar 6, 2022

Audio of this conversation is available via your favorite podcast service.

In this podcast, we listen in on a panel discussion hosted by the Stanford Cyber Policy Center on state media, social media, and the conflict in Ukraine.

Convened by Nate Persily, Co-director of the Cyber Policy Center and James B. McClatchy Professor of Law at Stanford Law School, the panel considers the moves taken in recent days by governments and technology platforms, and the implications for the ways state-sponsored media and information will be regulated in the future. Guests include:

  • Nathaniel Gleicher, Head of Security Policy at Meta, which operates Facebook, Instagram and WhatsApp
  • Yoel Roth, Head of Site Integrity at Twitter
  • Marietje Schaake, International Policy Director at the Cyber Policy Center and former Member of European Parliament
  • Renée DiResta, Research Manager at the Stanford Internet Observatory
  • Alex Stamos, Director of the Stanford Internet Observatory and former Chief Security Officer of Facebook
  • Alicia Wanless, Director of the Partnership for Countering Influence Operations at the Carnegie Endowment for International Peace
  • Mike McFaul, Director of the Freeman Spogli Institute for International Studies, and former U.S. Ambassador to the Russian Federation.

What follows is a lightly edited transcript of the discussion.

For a historical perspective on state media in times of war, check out Moves To Ban Kremlin Propaganda Outlets Evoke WWII Anti-Nazi Efforts, an interview with Heidi Tworek and Emerson Brooking. And earlier last week, Courtney Radsch outlined some policy considerations related to this topic in Tech Firms Caught in the Middle of Russia’s War on Ukraine.

Nate Persily:

Welcome everybody. I'm Nate Persily, I'm the Director of the Cyber Policy Center here at Stanford. And this is an impromptu, hastily organized conversation on the role of state media and social media in the crisis in Ukraine. The goal of this session is to really just find out what is happening, what's happening on the platforms, what those who are informed about the region are thinking about, and to really sort of try to keep pace with what are very fast moving events. As many of you know just before this call, Russia has shut down Facebook in Russia.

So the all-star panel that we have today, in the order that they'll speak: Nathaniel Gleicher, Head of Security Policy at Meta, then Yoel Roth, Head of Site Integrity at Twitter. Following him will be Marietje Schaake, our International Policy Fellow at HAI and International Policy Director at the Cyber Policy Center, and former member of the European Parliament; Renée DiResta, our own Research Manager at the Stanford Internet Observatory; Alex Stamos from the Stanford Internet Observatory; Alicia Wanless of the Carnegie Endowment for International Peace; and Mike McFaul, former Ambassador to Russia and Head of the Freeman Spogli Institute, who will probably jump in between hits with MSNBC or wherever else.

So I thought what we'll do is, we will just start with Nathaniel and Yoel telling us what's happening, what measures they've taken so that everybody understands how the platforms have been dealing with state media in this crisis. Because every day there seems to be a new intervention and so if you could just sort of level set for people, what exactly is going on, I think that would be the sort of most productive use of our first 10 minutes or so. And then we can talk a little bit, get the European perspective from Marietje and then have a larger discussion.

So Nathaniel, let me turn it over to you and then to Yoel.

Nathaniel Gleicher:

Thanks Nate. Thanks everybody for joining. As Nate said, I’m Nathaniel Gleicher, Head of Security Policy at Facebook. I'll just give a bit of an overview of some of the things we're doing around state media and in the region generally, and then look forward to the conversation. In terms of just anchoring the actions we've taken: we've demonetized Russian state media around the world. We have also taken steps to downrank and demote Russian state media entities and removed them from recommendations. We have also downranked and demoted links to Russian state media websites. In each case, the goal is to reduce the amplification and visibility of these outlets. We continue to identify and label new Russian state media entities; the teams have actually identified several dozen in the last few weeks, and as they continue the investigation I expect we'll find more.

We also have our teams of third party fact checkers that are fact checking content on Russian state media. I think as many people know, not that long ago the Russian government asked us to remove a number of these fact checks from content posted by Russian state media. We declined to do so, and as a result they began throttling our services in Russia. And as Nate mentioned, as of earlier today, I understand they've made a decision to fully block Facebook within Russia, as they're also doing to other social media. We'll focus on actions around state media in particular in this conversation, but I do think it's worth situating this in the broader conversation about what's happening in Ukraine and Russia. So just to say a couple of things about some other steps that are happening there: we've taken steps to help people in Ukraine and Russia lock down their accounts or easily increase privacy settings on their accounts to protect people on the ground, both because of risk to individuals in Ukraine and because of increasing reports of protesters against the war being targeted in Russia.

We've also reported on some sophisticated targeting efforts aimed at public debate in Ukraine– in particular, an influence operation with links to Russia-linked actors– and also ongoing targeting of prominent voices in Ukraine. And I wanted to call this out here as we're starting the conversation: on Sunday we mentioned there had been ongoing targeting of Ukrainian military personnel and other prominent voices, attempting to compromise their accounts on our platforms and elsewhere. Our teams are monitoring and responding to this, but there are also some steps that everyone can take. So I just wanted to call out that certainly if you are in the region and listening to this, but also if you have friends or colleagues in the region that you're regularly in contact with, I think it's really important for everyone to turn on two-factor authentication– in particular, app-based two-factor authentication– for not just your social media accounts but your personal email and related accounts.

Don't reuse passwords across different services, and in particular, if you're speaking with colleagues in the region, use end-to-end encrypted platforms. Obviously, WhatsApp is end-to-end encrypted by default. Signal is also end-to-end encrypted, and it's another good platform to use. For Facebook and Instagram, if you're in the region, you can toggle direct communications on both of those platforms into end-to-end encrypted mode so that people can address risks in the region. That's sort of some of the facts that are on the ground. I'm sure we'll talk a lot about the trade-offs here and the balances.

The other thing that's happening is a lot of engagement across social media into Russia and between Russia and Ukraine, as people are trying to get accurate information about what's happening in Ukraine. In fact, we've seen President Zelensky and others sort of directly appeal to people in Russia, in Russian through social media platforms to get accurate information out and try to drive towards a more peaceful resolution. And so as we all think about the steps that we need to take on state media and other steps, we're also trying to think about how we can make sure that those types of exchanges can continue to happen and we have teams that are working on that as well. So I'll pause there and I'll just say thank you to everyone who's on the call and to everyone who's listening, who I know are working around the clock on helping refugees, on protecting people in the region and doing everything that you can. I look forward to the conversation.

Nate Persily:

Thank you. Yoel, why don't you tell us Twitter's perspective here.

Yoel Roth:

Thank you, Nate and thank you to everybody for organizing this so quickly. I know this is a situation where there's a lot changing from day to day and seemingly minute to minute, including sort of as yet unconfirmed rumors that Twitter has been blocked in Russia. We're seeing the same news that all of y'all are, although we can't yet confirm it, but I actually want to step backwards in time a little bit and start the story of Twitter's handling of state media in 2017 when we took our first actions on RT and Sputnik to block them from advertising on the Twitter platform. We took this action after we observed their role in interference in the 2016 elections in the United States and it was at that moment that we took the global action of blocking them from running any advertisements and monetizing on our services. We donated the proceeds from their historical advertisements to supporting media literacy and pro-democracy efforts worldwide and that's been a policy that we've had in place since 2017.

If you fast forward to 2020, we rolled out labels on the accounts of state media outlets in Russia, China and elsewhere to provide direct transparency about the specific accounts in question. To be clear, this isn't just about RT and Sputnik. Although that's been where the EU directive and other regulations have focused, we've labeled more than 100 state media outlets in Russia alone, and that's the specific accounts of the outlets themselves. Of course, RT and Sputnik are some of the most prominent and well known, but we've labeled state controlled media that operate primarily domestically, as well as a number of emergent outlets that have been more active on social media in the West, including Ruptly and RedFish. Some of these accounts, which we see very frequently active in moments of crisis and strife, have been labeled under our policies as well. And we've been rolling out state media labels to a number of countries.

What do these labels do? The first thing they do is provide transparency. In a lot of instances, it's fairly clear that something is a state backed media outlet– the R in RT is Russia, right? We generally know that, but especially for some of these novel outlets like Ruptly and RedFish, it can sometimes be a little bit more challenging to know who's behind it. And so by providing that direct context, we're helping people understand what it is that they're dealing with. But the second, and I would argue more important intervention here, is we substantially reduce the distribution of these outlets across Twitter, and that's been in place since 2020. Whenever we label an account as being a state affiliated media outlet, we remove that account from our top search results.

So if you click on a trend and you see the resulting tweets, they'll never show up there, and we'd never recommend these accounts or amplify them through any of our product features. And again, that's functionality that predates any of the present directives and the present crisis in Ukraine. This week, we took an additional step. We observed, by looking at some of the data on Twitter, that the vast majority of the reach that Russian state media were getting was organic rather than driven by their owned and operated accounts. Certainly, RT and Sputnik operated accounts on Twitter that had a sizable following; they would get a number of retweets, and they'd been labeled since 2020. But if you look at the overall volume of the conversation about state media, most of it is driven organically by real people who are sharing that content of their own volition. And we wanted to make sure that the same context and labeling, and also non-amplification interventions that we have for the owned and operated accounts, are available for those tweets as well.

Since the start of the conflict in Ukraine, we've seen the volume of tweets sharing Russian state media skyrocket, from an average of about 20,000 tweets a day to peaking at more than 65,000 tweets per day sharing links to RT, Sputnik and other outlets. On that basis, we made the decision to roll out tweet level labels for all Russian state media outlets. Those labels have the same interventions as the labels that we apply to the accounts of the media outlets themselves. We'll never amplify them, we'll never recommend them. They don't appear in our top search results. The data is still emerging, but we have reason to believe that these interventions, on average, reduce the distribution of this content by more than 80%. So there's an 80% reduction in the number of impressions that these tweets and accounts receive on the basis of the interventions that we make here.

These are not Russia specific policies for us; as has been the case with our state media labeling, we're doing it globally, and we're expecting to continue to roll these labels out in the coming days and weeks to all of the other countries that we've designated and labeled state media for. Finally, I just want to note, echoing Nathaniel, state media is a key part of the interventions that we're making in this space, but they are by no means the only ones that we are making. We've offered localized security guidance to our users around the world in Ukrainian, in Russian, in English, and we're continuing to make available up-to-the-minute guidance about how to secure your account, how to have conversations on Twitter in a safe way, and are working to ensure that we are elevating credible voices covering the conflict in Ukraine through our curation products in our Explore tab, as well as prompts in our search experience.

All of these, holistically, add up to the interventions we're aiming to make to promote a healthy information environment, even during a conflict where the messiness of the information environment is perhaps its defining characteristic. But I'm glad that we have the chance to connect about the work that we're doing here, excited for this discussion. And again, thank you Nate, for convening us so quickly.

Nate Persily:

Well, thank you. I've got a few questions for you all, but I want to turn it over to Marietje to sort of give us what's happening in Europe. Among other things, the EU has banned RT and Sputnik from these services, as well as from TV. So Marietje, let me turn it over to you and give us the view from Amsterdam.

Marietje Schaake:

Thank you so much, Nate. Welcome everybody. I want to offer a little bit of context, with a few pointers, to give you a sense of the setting in Europe within which the whole question of the spreading of disinformation takes place. Obviously, I am devastated, and I think most people here are extremely worried about how much worse Russian violence against Ukrainians will still get. We've seen what I think are unprecedentedly heavy sanctions imposed on Russia, but it should not be underestimated that the pain of those sanctions will also be felt by average Europeans. And it's very debatable whether they will actually change Putin's course. But of course, it is now a matter of principle to support these sanctions, and a lot of companies are also going above and beyond to show where they stand in this conflict, to help refugees and so on and so forth.

The heavy sanctions, I think, are politically justifiable. There is a risk of overreach. We've seen Russian musicians and concerts canceled, Russians evicted from their apartments, collaborations between museums in Russia and Europe ended, and so I think that backlash is something to push back against, but clearly Russia is now a pariah state, and I think it will become increasingly isolated. I do expect that there will be growing pushback against the heavy sanctions and also more questioning of what the proper European reactions to this war might be, given the risk that people see of escalation. And certainly when the economic pain is felt by ordinary Europeans, notably because of energy prices, it will become more contested. The fallout of this war is very tangible here in Europe– think about the unprecedented number of refugees that are coming in, and I think that really the only silver lining this week has been the outpouring of support for these refugees, unlike what we saw with people fleeing Syria when Russia was bombarding civilians there.

I think the past week has made the impossible possible with regard to sanctions, but it also really made me wonder why it required this invasion of Ukraine to reach a moral limit about the role of Vladimir Putin, the aggressor that he is. I think a lot of introspection has been happening among political leaders in democracies, but I think it's also really important for social media companies. Neutrality doesn't work, if it ever did, and I think all over the world it is clear how much democracy, peace and the rule of law are under attack, and how much of a role disinformation, propaganda and lies are playing in fanning anything from polarization to direct violence.

The stance and the positions of European audiences are important for how the international community can continue to respond to the fallout of the war and hopefully push to end it. But the polarization of audiences and populations in Europe may well become a bigger issue, and it will be a very important point for Putin to be able to find more allies in Europe. It's a space to watch in France, where Marine Le Pen, a long-time ally of Vladimir Putin, is polling high. Obviously, she's trying to walk back a little bit from the close ties that she has with President Putin. But when you look across Europe, the relationships between the far right and Putin are warm. Another development of the last week is that prominent anti-vaxxers, anti-vax movement leaders, have morphed into pro-Putin mouthpieces.

Now, on the blocking of state media: it is one step in that very, very heavy sanctions package that basically cancels everything in relation to Russia. However, it is unusual to block media and it is also controversial. There are a lot of people who worry about backlashes, and it is not undisputed. Russia Today, RT, had very small percentages of viewers on television, but online it was obviously different. It was boasting about billions of views on YouTube, and it was among the best-viewed television channels on YouTube compared to Western outlets like BBC and CNN. I personally campaigned for a yes vote in the referendum around the Ukraine-EU Association Agreement, where I heard literal talking points coming from Russian state propaganda while I was campaigning. Similarly, in 2014 when Russia illegally annexed Crimea, we heard the same narrative that we hear today– that Nazis rule Ukraine– as if that is a justification for the unjust and unjustifiable invasion.

So information plays an important role and will continue to play an important role in Russia. I also think there is a broader question that will flow from decisions made in the context of this war around state propaganda, which is how social media companies will handle handing a megaphone to dictators. It seems surreal, almost, that the Olympics in China ended only recently and that there was so little discussion about human rights violations there. There are many, many propaganda channels and dictators that still really enjoy the benefits of having access to social media platforms. So I do think there should be a broader discussion about where moral limits lie, and I'll leave it there.

Nate Persily:

Thank you, Marietje. I'm sure we’ll return to some of those themes. Renée, I think just this week, published a piece on RT. Obviously, it wasn't just about this conflict, or were you republishing it and I just saw it on Twitter this week? I think it was new research on RT and state media. Is that right?

Renée DiResta:

Yes. This was new, this was published with Dr. Sam Bradshaw and Carly Miller, who previously were team members here at SIO. As academic publishing works, we submitted it a year ago.

Nate Persily:

Right, right. Now it will become relevant again.

Renée DiResta:

Now, it's out.

Nate Persily:

Well timed.

Renée DiResta:

Thank you, thank you. So influence operations and disinformation campaigns have been the subject of news cycles for several years now, particularly those involving social media and particularly those precipitated by Russia, given the very high-profile, multi-year, multi-actor operation in 2016. But those efforts are part of a broader strategy. And at SIO, we try to assess these campaigns in the context of what we call a full-spectrum propaganda model, arguing that states can run both broadcast and social media operations on a spectrum of attributability from overt to covert. So the kinds of social media bot and troll campaigns fall on the covert social end of the spectrum, and state media would, in the case of our social media conversation today, fall in the realm of overt social, particularly as these pages are labeled and attribution is clear.

So while the covert activity on social media gets the bulk of media coverage– there's something kind of captivating about digital agents of influence and AI fake faces– the reach of those efforts is actually now usually quite limited, and particularly since 2018, they come down a lot faster. The inauthentic activity policies and teams like Nathaniel's and Yoel's impact the state actor strategy by making it harder and more costly to run those campaigns. And even in the case of Ukraine, within 48 hours of the shooting war starting, Facebook and Twitter had taken down information war activity in the form of small networks linked to Russia targeting Ukrainians. But the state media properties, as some have alluded to, are different, and that's because they are permitted on platforms. There's a variety of reasons for this and trade-offs to this, which we've kind of begun to go into here and I think other speakers will continue to.

But over the years, these overt state properties have managed to attract audiences in the millions, tens of millions– in the case of China, hundreds of millions. And as our colleagues from the social platforms have noted, that was without so much as a label for a fairly long period of time. So as these pages have presences on social platforms, what we began to look at, in the context of 2020, was how they covered the Black Lives Matter protests. Now, this was a case study that used BLM because this was a very high-profile summer of protests. Ruptly, for example– one of the state-affiliated broadcasting entities– was constantly live streaming the protests, and what we observed was that that footage was taken by other state-linked entities associated with Russia and was spun into very, very different frames.

So there were some outlets that used a strongly pro-Blue Lives Matter frame, and there were some outlets that used a strongly pro-Black Lives Matter frame, and this is interesting. And we wanted to try to unpack the dynamics around these newer entities. As you all noted, RedFish is not immediately obvious to a viewer as state media. Some of the Maffick media properties– including the ones that actually sued Facebook when they were labeled as state media; they lost that lawsuit– were putting out very video-first content intended for millennial audiences that was strongly pro-Black Lives Matter, while RT and Sputnik were putting out strongly pro-Blue Lives Matter content. And so we assessed several thousand– three or four thousand– posts through this corpus of content that we obtained on CrowdTangle, to try to look at the ways in which this playing-both-sides dynamic in some ways hearkened back to what we used to see done by the covert social operations, by the Internet Research Agency, which ran some pages that were strongly pro-Black Lives Matter and in fact were pretending to be Black Lives Matter activists, while others pushed pro-Texas Secession, pro-Confederate content, and to look at the ways in which Russia had adapted its messaging to the modern information ecosystem.

So I want to now transfer the focus back to Ukraine. The thing that is relevant about the work that we put out in this context is primarily that state media is an important tool in public diplomacy, in propaganda and in influence operations. And so the substance is, in some ways, topic-agnostic. The focus should be on understanding the reach and the impact that these outlets can have and coming up with policies that recognize the unique role that they play in the ecosystem. I think my co-author Dr. Bradshaw put it as, "In today's digital ecosystem, propaganda posters have become memes and broadcasts have become live streams. And as the state avails itself of these tools, the dynamic of state media on social media is worth additional study and carefully crafted policy."

Nate Persily:

Wonderful, thank you so much. So let me turn it over to Alex Stamos, Head of the Stanford Internet Observatory, and then to Alicia Wanless, and then I want to have a larger conversation also integrating some of these questions. Alex, what's your reaction to what you've heard from the platforms and also where we are? You've been tweeting a bit about what you think might be the right... Well, this is not a unique time for tweeting on what the platforms should do, but you've been tweeting about what you think they should do with respect to state media, so have at it.

Alex Stamos:

So I want to keep it short because I do want to get to the conversation, which I think will be interesting. But a couple things. One, yes, I believe state media has always been kind of a unique challenge for how the companies should handle it, because there's this asymmetry, especially with state media from authoritarian states: they are using censorship and other means to control their population, and then we Americans are providing them with all of this amplification on our platforms because that's consistent with our view of freedom of expression. And so how to square that circle of, how do we stop authoritarian state media without using exactly the same techniques as these authoritarians themselves? I think it's a legitimately big challenge.

I think where the companies have ended up is a pretty good place; it took a while. I think what we've seen is that teams like Yoel's and Nathaniel's have been on this problem for a while, but state media hasn't actually been at the top of a lot of people's concerns. There's been a lot more discussion of the covert types of influence that Renée mentioned– the fake accounts and such, and the kinds of things that are related to what happened in the 2016 US election. And so overt state media has not been an area of a lot of discussion. I completely agree with Marietje– although I am going to point out that our European friends have been much more friendly with Russia than Americans, for the most part.

And so I do appreciate that there is a shift here in how governments are acting as well, when you had Germany, until the last second, trying to stand up for Nord Stream 2 and such. So the fact that there's now kind of a massive reshuffling of how people treat Russia– of not trying to pretend that things are normal and okay, and that we are going to treat Russia as an authoritarian state– I think is good on a lot of levels. So anyway, where I think the companies ended up is a pretty reasonable place, which is the isolation and quarantine of state media. So the state media outlets still exist, but everything that is posted by them and all their links are labeled, and it is significantly harder to share them. I would like to see quantitative data on that. I have offered to Nathaniel and Yoel that we would love to see a paper in the Journal of Online Trust and Safety on the quantitative analysis of what happened. But realistically, I'd like to see exactly what the impact has been before I really pass judgment. I do think that's a reasonable balance there, to say they can exist, but we're not going to give them the benefits of amplification.

I think for me, the big question now is… this was a totally exigent circumstance. The companies had to move quickly to deal with a real emergency in which all these lives were at risk. How does this policy get used in the future? And the country that's most interesting here is China, right? Because until this week, Russia has actually had a much more open internet than China, right? For the most part, Russian citizens had access to most or all of American social media. And so in China, this disparity is much more obvious, in that you have the spokesperson of the Ministry of Foreign Affairs able to troll and lie about COVID and such, who has this huge Twitter account where he engages American politicians, but then his own citizens can't get to Twitter to see the opposing views, right?

And so I think there's that imbalance: if you're a state that uses censorship to cut off your own citizens, you need to be treated differently. And so I think, as things calm down, we're going to have to do that, and I think that is the bigger context– that corporations all over the world, not just tech companies but all kinds of corporations, are going to have to reassess their activities in totalitarian states. And I think democracies are going to have to get to the point where we no longer think, "Oh, well, as long as our companies, our multinationals and our citizens are making money operating in these authoritarian states, we're going to turn our backs, because this is what happens."

And so I think, obviously, we have to focus on Ukraine and the humanitarian crisis there, but when things settle down a little bit– unfortunately, we might have a multi-year insurgency there, or battle between the Ukrainians and the Russians, so who knows how long this takes– eventually, we're going to have to think about how this changes our behavior outside of Russia. And I think China is the clear country, and it's also a much harder country to deal with, just because it's so much more economically powerful than Russia.

Nate Persily:

Great. Let me turn it over to Alicia to round us out. And then I'll start posing some questions, first Nathaniel and Yoel but then to the group as a whole.

Alicia Wanless:

Like Alex, I'm concerned about what the interventions made in the information environment as a result of this conflict are going to mean longer term. The idea of blocking state media– and ultimately, in retaliation, the Russians blocking access to Western websites and digital platforms– is going to essentially fragment the international information environment piece by piece. I think what we're seeing here is finally what Russia and China have been advocating for at the UN level for a long time, and that's digital sovereignty: having total control over their own information space. In the short term, the blocking of state media has encouraged a Russian response of blocking Western-headquartered digital platforms, removing a key channel through which prominent Russians have been expressing their dissatisfaction with the war. So that takes away a key form of protest for them. But these blocks also restrict the availability of direct-to-public appeals, like those of Zelensky to the Russian people, to try to erode their support for this invasion.

And indeed, we've already been seeing this happen, with Russia taking things down– I think they've been blocking radio... Radio Free Europe reported today that BBC is down, Deutsche Welle is down, and Facebook, Twitter, and Apple's and Google's app stores were all blocked as of today. In the longer term, if the authoritarian states block these platforms from operating in their countries, not only will it potentially diminish revenue for Western companies considerably, but it's also going to give China- and Russia-based firms a significant advantage and increased market share. And those will be the platforms their citizens turn to, which we will no longer have access to in order to communicate with them either.

So this will essentially create a splintered information ecosystem whereby the West has no ability to influence other countries at all. And indeed, among the five countries that supported Russia's war at the UN this past week was Eritrea, which was also one of the authoritarian co-sponsors, along with Russia, of a new resolution on disinformation that was passed at the UN General Assembly Third Committee in November. I think that can be a signal of what is more to come in terms of pushing digital sovereignty at an international level– something that many not-so-democratic states, and maybe some of the countries in between, will see as a model for controlling their own information space. So I really do think that even while this emerging crisis is very pressing and we have to respond, democracies have to keep an eye on geopolitics and the longer term game here and formulate a response at the UN level for what governing the information environment should actually mean under democratic principles.

Nate Persily:

Thank you. Thank you so much. So now, let's get into conversation, particularly about what's happening in this crisis to start. I want to pick a little bit at what Alex was saying and direct it to Nathaniel and Yoel: how do you know if these interventions are succeeding or not? I know you're not going to turn over data here, but I mean, literally, just something like labeling of state media– if you look before and after the imposition of labeling, is that in and of itself a tool that decreases engagement? Obviously you mentioned demotion, which of course is going to have that effect, because then people aren't going to see it. But I was just wondering on the labeling side, because there are sort of social scientists who kind of go in different directions on this– I was wondering whether you're seeing a lot less engagement once you label these things as state media?

Yoel Roth:

I can kick this off and then Nathaniel can go ahead. One of the things that happens whenever Twitter labels a tweet, whether it's state media or misinformation or COVID-19 or anything like that, is when you go to share the tweet or engage with it in some fashion, we pop up what we call a nudge. It's a gentle message that just says, "Hey, this thing might be misinformation or it might be state controlled media. Think about it before you share this content." And we don't block the actions. We give people the choice to behave organically, but we see that these nudges are incredibly effective. Filing this one away under additional data Alex would like us to publish in the Journal of Online Trust and Safety, TM. We see that in general, there is a 40%...

Nate Persily:

Nice plug.

Yoel Roth:

... decrease in the... I know, right? There's a 40% decrease in the engagement rate whenever we show these nudges on tweets. So that means without having to block somebody, without taking away their agency, we actually see that people voluntarily choose not to engage with labeled content, just because we pop up a small interstitial warning that adds one additional click of friction. Renée has written and spoken a ton about the value and importance of this type of friction in social media products. We have the data that substantiates that it works. And that's why we keep leaning into label based solutions because it takes you out of the conversation about censorship. It takes you out of this question of, "Are social media companies taking away entirely the distribution of content or not?" We're saying, "We're going to give people the context to make these decisions themselves." And we see that there's a 40% reduction in the frequency with which people share it.

The other question is long term effects, right? Like you then are asking, "Is there better awareness of what these informational dynamics look like? Do more people know that the Russian government is exercising editorial control over these outlets and that that's even a thing that's happening?" That's a much longer run social science question that I think we need to study and understand. But the hope, I think cumulatively, is that between social media labels, between the EU directive, between extensive press coverage of these dynamics, one of the things we'll start to see over time, is greater awareness of the fact that state media is a thing that exists, that it's an instrument of statecraft and that we see it deployed in times of conflict to try to advance the views of specific sides.

Nate Persily:

Right. Nathaniel, do you want to add a little bit about how you can see whether your interventions are effective?

Nathaniel Gleicher:

So there are a couple things that I would add, just to build on what Yoel said. The first is– it's interesting, because it's worth being precise– we're talking about different types of labels here, and there are a number of different steps in place, right? So there are state media labels, which exist on tweets on Twitter and on pages and posts on Facebook, that simply indicate that the entity behind a post is a state controlled media entity, right? And that provides context to users about what they're hearing. Then there are, as you all describe them, nudges– we call them 'reshare friction'– which is, if you choose to take an action to share something like this from Russian state media on Facebook, you get a sort of "are you sure you want to do this?" note. And there the impact, since now you're measuring impact, is a combination of both the label and the actual friction.

The simple fact is, it turns out that anytime you ask someone to click online, you impose friction on them continuing on their path and you slow down what they're doing. So you have both of those factors. Then in addition to that, on Facebook we have labels from third party fact checkers. So if a state media entity, like any other entity, posts something that is false or misleading and a third party fact checker reviews it and determines that it is false or misleading, then there are labels on the piece of content as well. So you have these many different tiers of labels. And what's important and interesting here is, it's worth remembering when we talk about state media entities that social media platforms are only one piece of the media environment that they operate in, and that they have many other mechanisms by which they can amplify their messages and by which people can find their content.

And so the strategy here that we use, which I think is similar to what Twitter uses, is: we don't want people to stumble across Russian state media information unintentionally, right? We don't want it to be amplified in their feed– as Alex described, those are quarantined, we're reducing it. But the truth is, if someone wants to go out and find content like this, they'll be able to find it on other platforms, other places on the internet, broadcast, et cetera. If they find it on one of our platforms, it will have context around it. It will have a label saying it's Russian state media. And it will also have, if the claim is false or misleading, a label articulating that and linking to a detailed analysis from a third party fact checker on why that claim is false and giving that context.

And so this combination means that you have the demotions and the labels up front that provide the context. And then it also means that if someone seeks it out, when they find it, it still has that context around it, as opposed to them going off to another medium and finding it without that context. And so that's part of the trade-off here that I think is important to think about, especially as we talk about blocking versus the sort of quarantining action that we've taken. It's worth noting, even in the EU, where there have been directives to block this, and in Ukraine, where we, and I think Twitter, have blocked Russian state media completely in response to specific government requests, that only covers certain state media entities. And then it becomes sort of a patchwork conversation about how you're covering it.

Nate Persily:

Well, I've seen Mike McFaul has joined us, between hits, I am assuming. I mean, he's probably the busiest man outside the region, I would think when it comes to this conflict. And so Mike, we've been talking a little bit about all of the different interventions that the platforms have taken, as well as some that the European governments have taken with respect to RT and Sputnik, but also disinformation generally, since Lord knows, you've researched these issues and certainly all the politics of the region more than almost anyone. What are your two cents that you can give us on your perspective of what's happening? I should say, right before this webinar, Russia banned Facebook from Russia. And so we spoke about that as well.

Mike McFaul:

Well, thanks, Nate. I apologize for being late. It was... I do have a TV hit in 12 minutes, so I'll drop when they call, when you hear the Skype chime in. But to be honest, I'm late because I had a choice of joining you on time or going to a rally with five or six hundred people here on campus with my Ukrainian students and they pleaded with me to come. And so that's why I made that choice. And I know I made-

Nate Persily:

Totally good reason.

Mike McFaul:

... I know I made the right choice, with no disrespect to everybody here. It was a very, very emotional event. We have quite a few Ukrainian students here on campus, as well as Russians, some of whom just spoke very bravely. I don't want to talk about... there's lots of expertise here. So I'm going to talk just for two minutes to flip it around, especially given the news. So Nate, it's now confirmed– I saw it last night– that Facebook was going to be banned. Was Twitter banned as well?

Nate Persily:

We don't know.

Yoel Roth:

Yep. There are rumors that we are blocked, but we haven't yet been able to confirm it.

Mike McFaul:

Confirm it, right.

Alex Stamos:

There are a couple of sensors that are showing that. There's a lot of people buying Russian VPSs with Bitcoin right now. Perhaps without IRB approval– I'm just going to throw that out there, because my bosses are on here.

Mike McFaul:

Of course. Well, I tweeted last night, late, to ask people, and people wrote to me and said those things. But I want to flip it around, and maybe it's just a question to chew on for another day, but this is happening to your platforms. In the last several days, media that I know well– Dozhd TV and Ekho Moskvy. Dozhd TV is a television station that was pushed off the air and has been web-based, but they just got closed down. Ekho Moskvy is the iconic radio station, but it's a media company that has been around for 20 years, and everybody thought they were always safe because their owner was Gazprom, by the way. And their director, Alexey Venediktov, always played this very complicated game between the regime and the opposition. They just got closed down. And I talked to one of their senior people last night, and we could go through all the other ones that have been closed down. I just ran into Roman Badanin– if anybody knows him, they've been closed down.

So the question is not, what do we do to stop disinformation? I think the question has to be, what do we do to promote information inside Russia? And I don't have the answer, but I know lots of people who are concerned, and how could we think more creatively about that, to flip it around? Obviously, Putin's shutting these people down because he's afraid, right? He wouldn't be shutting them down if everything was going peachy keen– this is an indicator of his state of mind. And I think that rather than just being passive... I'm not an expert like everybody here, but I've been to years and years of these meetings about, "Are we a platform? Are we a media company?" All those things.

I want to challenge those from the private sector to think more proactively about what you can do to support information. Let's be clear folks, this is a fork in the road in the history of the planet. This is a moment of good and evil. This is a moment that has giant implications, not just for Ukraine, but most certainly for Russia, most certainly for Europe. And so you got to get off the fence, there's just no more of this bullshit of, "That's not what we do and this is..." I really sincerely believe this, my friends are being bombed right now in Ukraine. My friends, well, some of my friends are already in jail in Russia, but they have relied on you all for a long time and now when that closes, we've got to figure out just something much more proactive. It's just not good enough to say it's not our problem and we're neutral or we're doing our…

And I want to be clear, I support all these things. I think what has happened in the last 10 days, what's happened over the last years– I'm a huge fan, and I'm looking at Nathaniel and Yoel right now. Every time you guys do something, I praise you, and I've got 700,000 Twitter followers, Yoel, so I'm praising you to a lot of people, including a lot of members of Congress. We were just talking about your companies with a very prominent member of Congress. I better not name her, because that might not be appropriate on this live call. But I just... My plea is, the disinformation fight is important, but now we need a more proactive information fight. And I'm telling you from the bottom of my heart, the Ukrainians want it and the Russians want it, and we’ve got to figure out how to change that conversation.

Nate Persily:

One thing that was mentioned before you joined– it does not directly respond to this, but I want to shift in this direction– is how Facebook has now allowed for greater use of encrypted messaging inside, was it both Russia and Ukraine, Nathaniel? I can't remember. And we did get some questions in the chat about what we think is going to happen with Telegram and Signal in these countries. Nathaniel, given what Mike said, do you want to maybe dive in? And Yoel, I'll give you a chance as well.

Nathaniel Gleicher:

I would just say, Mike, I actually completely agree with you on the importance of getting out as much accurate information as possible, in two dimensions. The first is, how can we maintain access, to the extent that it's possible, for people to use the platforms they're using? Some of the most impassioned and powerful information sharing we've seen in the conflict so far has been across social media platforms: people from Ukraine appealing to people in Russia, people in Russia speaking out against the war, and the reverse. And obviously, we're going to see increased efforts to contain this from Russia, more and more efforts to shut this down. The more that all of us can do to enable that access, I think, is very important. And I know that there's work being done for more encrypted communications, more access to the platforms. Alex mentioned VPSs, right? That's one piece of it.

And then the second piece of it is, what can we do to amplify authentic or accurate information, both on the platforms outside of Russia– and I think there's a lot of questions about making sure people understand what's happening in this region, which is critical– but also on other platforms, into the public debate in Russia? And of course, the third piece is the hardest piece, and in some ways it's the most critical. And I think it's something that all of us– certainly all the companies, I'm sure, are thinking about this, and I know we are– how can we support efforts that are off of our platforms to try to take appropriate and important steps in this space? We've seen a big pulling together of industry, civil society and others to get accurate information out around other crises. This is an unprecedented crisis; it's a unique and terrible situation. And my hope is we can learn from some of that and think about other steps we can take as well.

Nate Persily:

Yoel, is there anything you'd like to say in response to what Mike said?

Yoel Roth:

Sure. The one thing... I mean, Mike, I agree foundationally: elevating credible information and authoritative content to fill in the gaps that are left behind when platforms do what we have to do to remove the bad stuff is an absolutely essential part of this. I want to highlight one thing that has felt markedly different about this conflict, which is the role of the OSINT community and of people studying the evidence coming out of the conflict in real time and in public. The amount of media verification that we are seeing take place live on Twitter has been an incredible asset, not just for the folks on the ground but I think for everybody who's watching the conflict around the world. Encouraging those types of efforts and surfacing them to people so that they can use that information is a key part of what we're doing. For instance, if we see a piece of manipulated media– let's say it's a video from a video game that's being shared as if it's footage from battles on the ground, which has happened once or twice a minute-

Nate Persily:

Hypothetically.

Yoel Roth:

Right? Yeah. Hypothetically. In those situations, we see people debunking that content and pointing to the original sources– that's happening on Twitter. And so when you see a manipulated media tag on a tweet, if you click on that tag, that actually takes you to a curated collection of expert content that says, "Here's where this came from, here's the situation. Know this is not actually a paratrooper landing in Kyiv." And I think that's an incredibly important addition to just enforcement, to just combating disinformation, because you're giving people direct access to the authoritative voices that can help them understand, if that's not what's happening, then what actually is taking place on the ground.

Nate Persily:

Great, thank you. Can we talk a little bit about the other platforms, some that are owned by these companies, but I don't know whether Alicia, Alex or Renée, we've got some questions in the chat about Telegram and Signal and what we think that... So how easy would it be? I mean, are they going to be the next shoes to drop in these countries? And how easy would it be to do so? Obviously, they provide a particular service that we were talking about before. I don't know if you want to jump in on those.

Alex Stamos:

So I'm going to talk about Telegram real fast. For all of our talk about the American companies, it's a little bit chauvinist. The truth is, the most important platform in theater right now is Telegram, by far, in both Russia and Ukraine– and probably the most important from a disinformation as well as a true-information perspective in both countries. There are some real problems with Telegram. Telegram is not really end-to-end encrypted. They have created a belief among activists and other folks that they are as protected as they would be on Signal or WhatsApp, and they are not. Only private messages between two people, where you opt in, are end-to-end encrypted; people are mostly part of these big channels where they're having these discussions. In fact, the Russian military was dropping flyers in cities that said, "Join our official Telegram channel."

So the Russian military is running official Telegram channels. And in there, one, nothing is encrypted, and then we really don't know what is being done with both the content that is flowing through Telegram and, most importantly, the metadata, such as the phone numbers of these individuals and their GPS coordinates, which could of course, in war, be used to kill people. Telegram is this weird entity: it's privately held, it has a bunch of Russian money in it, and a bunch of the people who run it are Russian. Now, some of them have weird relationships and negative relationships with the Russian government, although most of that doesn't seem to be based upon politics but upon fights over who owns how much of certain things. So Telegram is an extremely sketchy thing for people to be using in Ukraine, to be frank. And it is a really... This is a big challenge, I think.

What they do around state media, to me, is not as relevant as the data security and data access issues, because so many people are using Telegram for their day to day communications and probably for even moving military units and doing things like giving updates inside the country. And what is happening to that content, that metadata, is a question that nobody really has a good answer for. Now, how do you get people on Signal or WhatsApp? This is one of the reasons why Russia probably blocked the app stores: I expect that they went to Google and Apple and said, "Block these encrypted messengers." Google and Apple said, "No." So Google and Apple are now shut down. Apple said yes to that in China, just to be frank, right? So you cannot download WhatsApp, you can't get Signal, you can't get VPNs in China on your iPhone, but Apple probably stood up to Russia in this case, and now the App Store is blocked.

If you have an Android phone and you're in Russia, you can sideload those apps, but you're going to have to go get them. And what I expect we're going to see is that on the major Russian search engines, like Yandex, you're going to end up with backdoored versions and watering hole attacks for people who are trying to get secure messengers and then sideload them onto their Android phone. So the fact that they don't have access to the trusted stores is actually a really big problem. Now, Russia probably can't keep that up forever, because your iPhone is effectively useless if it can't get to Apple services for a long period of time. And so I'm not sure what the long-term game plan is there.

Nate Persily:

Alicia, did you want to jump in here?

Alicia Wanless:

Yeah, absolutely. Two points. What Alex is raising is really important around cybersecurity and the devices. What's likely going to happen, as you see cities potentially fall and Ukrainians being captured, is that their phones will be taken by Russians and looked at to get more intelligence out of them. This is something that we saw in the Syrian conflict, and it's really crucial right now to start to get digital safety practices out in Ukrainian, to those audiences. And back to your original question, Nate, about other platforms that are operating there: we've got a lot of smaller platforms like Viber, which have been widely used by Ukrainians in the eastern part of the country to communicate. I'm not sure what the status of those platforms is in terms of the measures they're taking, but they're much smaller and will not have the resources of some of the bigger ones. So what's really key here is getting some standard operating practices out to Ukrainians so they can get their phones, devices and accounts secured immediately.

Nate Persily:

Marietje.

Marietje Schaake:

Well, I just wanted to come back to the notion that Nathaniel and Yoel mentioned, that they were going to push out more authoritative information. I really think the stakes are so high in this conflict. Who are going to be the experts involved? How is that going to go? I think we really need much more transparency, a broader set of people looking at this, and also a matching with the information environment that Alicia also mentioned, which of course is increasingly being fragmented due to different policies all over the world. But there was never one information ecosystem when you looked at the laws that already applied in country, and this is particularly true for Russia.

So the consequences of speech were always very different, even if the use of platforms may have technically been possible in different places. And I think that should not be forgotten, particularly now, with the severe crackdowns on speech in Russia on top of how bad it already was. The consequences for people using platforms, even if they are available, can be significant, and that needs to be taken into consideration. And I honestly think it would be really hard for private companies to make all these very, very high-stakes decisions in isolation and in the rooms of people that they always work with.

Nate Persily:

Nathaniel, did you say you wanted to respond to something that Alicia mentioned?

Nathaniel Gleicher:

Well, I think it connects to some of what Marietje was saying and what Alicia was saying, which is just this point that safety in region is so incredibly critical right now. One thing that really has been very good to see in recent weeks, to your point, Marietje, is that I don't see anyone making decisions sort of in isolation. I see a lot of conversation and collaboration and sharing of information between government partners, civil society, who can say what's happening on the ground, and platforms that are trying to assess how to take the right and the most protective action. For people who are in country, I would just say... I mentioned earlier, we have a program called ‘lock profile’ that is available for people in both Ukraine and Russia. It is by no means a silver bullet, and there is nothing that is a silver bullet against these types of threats.

But I do think it's important for people to take the steps they can take to protect themselves, which includes both enabling additional controls around your social media accounts– ensuring that your accounts have two-factor and other protections in place– and thinking very deliberately about the actions that you take and the implications they can have, right? This is a fast moving situation and there are very serious threats on the ground. All of us are doing everything we can to help with that, but the most important message to get out is the tools that are available for people to protect themselves, or to, for example, find resources or help if they need to get across Ukraine to a safer location, or if they're trying to figure out the situation where they are in Russia.

Nate Persily:

Feel free not to answer this question, but I'm just get... Just try to get a sense of the number-

Nathaniel Gleicher:

Love it when you start questions like that.

Nate Persily:

No, this is not like a-

Alex Stamos:

Do you feel like you're back in law school? When Nate starts to-

Nate Persily:

I'm not grading him yet, but I… This is really just an informational question about the scale of your company's operations in Russia. If you feel com... There may be security reasons you don't want to say something like that. But I'm sure someone knows this. I mean, does Facebook... I mean, are there a lot of Facebook employees in Russia, or relatively not? And for that matter, Ukraine?

Nathaniel Gleicher:

I don't want to talk about employees in region for obvious safety reasons.

Nate Persily:

Yeah. Yeah, okay. So that-

Nathaniel Gleicher:

Understand the question.

Nate Persily:

That’s why I was saying you wouldn't... I was just wondering if in the abstract, if there were people, because that's obviously one of the real concerns here. I mean, you see this in other conflict zones, that now that they've blocked Facebook, they can go after the people there. And so-

Nathaniel Gleicher:

I think it's worth saying for us, we don't have operations in Russia, and I can say that. I think it's worth saying that for any company, obviously one of the really important priorities is how do we keep our people safe and what can we do to make sure they're safe also. So that's a calculation that every company's going to have to be engaged in. You will also find that companies aren't going to talk about that in too much detail, because they could draw attention to the exact thing that they're trying to protect.

Nate Persily:

Right.

Alex Stamos:

As you say, in the abstract, everybody focuses on companies actually having offices. And that is a big deal for the companies that are blended, right? So the Microsofts, the Googles, the companies that have large enterprise and cloud product sides, have large offices. And I just wrote a blog post about how to shut those down. That's a big problem, and it's not really just a tech issue– Russia was deeply integrated in the world economy, right? Like every major Fortune 500 company– a huge chunk of them had a Moscow office because it was just the normal thing to do, just like having a Paris office and a Berlin office. And so now you have the whole Fortune 500 having to figure out, "How do we take care of our employees? How do we even pay them?"

So a company I was talking to– you can't get money in anymore. So the money that you have has now been massively devalued if it's in rubles, and if you're holding euros or dollars in a Russian bank, they won't let you touch those, and you can't convert them at a normal rate. How do you even pay your people in Russia? How do you pay them severance if you're going to shut down? That's a huge problem. So that's all an issue. The other thing you have to realize is, even if tech companies like Twitter and Facebook don't have operations in Russia, every company in Silicon Valley has a large number of people who are citizens of Russia and the People's Republic of China, and who still have families there. And I think something that has to be of concern for everybody who does this kind of work is them taking family members hostage. And that has certainly happened in the Chinese case.

I can't think of a Russian case, although it's possible. And I think that's something that governments really need to get on top of; companies can't handle that themselves, right? So making sure that the... It's a weird one, because you're talking about people who are nationals who are not your citizens, but they... If we want these companies to continue to act in a way that is aligned with democracies, then democracies are going to have to do what they can to protect families, so that you don't have CEOs making this horrible choice between somebody's mom in St. Petersburg and the company doing the right thing.

Nate Persily:

On that happy note, I think we'll end. But thank you all very much. This was fantastic, exactly what we hoped for and exactly what we try to produce here at the Cyber Policy Center. So thank you again to everybody who participated in this and thank you to the audience for joining. See you next time.
