“I report on internet disinformation. When Russia invaded Ukraine, it became very personal. There is more than one struggle. There is the war of bombs, the war that’s taking lives. And then there’s the battle over what can be done.”
Jane grew up in Kyiv. She moved to Canada at age eleven, but traveled back to visit her family and friends nearly every year, sometimes spending entire summers in Ukraine. Now– like nearly every Ukrainian, no matter how far from the land that is under assault in a brutal, illegal Russian invasion– she is part of the “battle over what can be done,” a battle of ideas, emotion and the way they are combined into political will.
I caught up with Jane to get her take on what’s happening in the information component of this war, including the role of the social media platforms and the news media in confronting disinformation, the role of myth making and the transmission of cultural information in this moment, and the role of citizen diplomacy.
What follows is a lightly edited transcript of our discussion.
The first thing I always do is ask people to state their name, their title, and their affiliation.
Sure. My name is Jane Lytvynenko. I’m a Senior Research Fellow on the Tech and Social Change Project at Harvard Kennedy School, Shorenstein Center. I’m also a freelance reporter, and I’ve been looking at disinformation for more years than I care to count at this point.
In your role there, you are thinking about how to create a curriculum for newsrooms, for academics to study disinformation and to report on it. What do you make so far of the news media’s handling of this conflict with regard to the question around disinformation?
I think that the investment in fact-checking teams and verification teams is largely paying off. We have seen some Russian disinformation flow through. We have seen some confusion in newsrooms, but I think by and large, newsrooms have been doing really good work verifying the sheer amount of information that is coming in. The reason I say that is because, particularly when you look at U.S. newsrooms at a national level– though I don’t want to limit it to just U.S. newsrooms; we’re seeing it with a lot of Western newsrooms– over the last five years or so, there has been a buildup of teams that are specifically dedicated to disinformation. Some of those teams started after the 2016 election. Some of them started in response to the 2020 election.
Some of them were created out of necessity during COVID, and some of them are these big, highly specialized visual investigations teams, like the New York Times Visual Investigations team, which of course is the primary example, but we see the same at the Washington Post, Sky News, the BBC, et cetera. What that means is that there’s now a body of well-trained professionals who don’t just verify or debunk disinformation, which they do, but also investigate it and understand the best way to present that information to their readers. As a result, we get these very high-quality reports, explainers, and debunks that pull apart both the junk that’s going around the web and the disinformation narratives that we see coming straight from the Kremlin.
I think so far so good, but as we speak right now, we are on day 34 of a war, which for a crisis of this scale is quite early. I think we need to be careful not to get comfortable with the success of the early efforts. I think we also need to understand the inequality between information environments in Western countries, primarily in the English language, and literally everywhere else, where social media companies have not invested as much into moderation efforts, where newsrooms don’t have as many resources to invest into disinformation efforts, where information environments are quite different, and where Russian disinformation might find riper ground.
I do want to talk a little bit about the role of the platforms. You are tracking takedowns and removals in your work there and keeping an eye on them as well. Of course, the platforms are funding a lot of that fact-checking apparatus; Sarah Wiley at the Tow Center has said that Meta, the owner of Facebook and Instagram, is possibly funding as much as 10% of the ecosystem of fact-checkers out there. What do you make of their performance so far?
I’ve been describing it as scattershot, and I think that still applies. There’s a lack of coherence in approach from these companies. Our team here, led by April Glaser and also with my colleague Jazilah Salam, has been essentially trying to understand the moderation and the removals that social media companies have publicly enacted since the beginning of the war. What you begin to see as you scroll through this timeline is essentially just a completely different approach from different platforms. Some of that approach is informed by EU regulations. One example: the European Union required social media companies to block overt Russian propaganda outlets like RT and Sputnik. Those are the main ones. Social media companies complied, but they did that initially only in the EU and not worldwide. YouTube then extended that ban worldwide, but if you go on the Twitter website, you will be able to see RT gleefully posting on its accounts.
Facebook made these, again, overt propaganda outlets more difficult to find on their platform, so if you were to search for RT on Facebook, it would be tricky for you to find it. But if you’re already a subscriber, of course, you still have access to it. To me, RT is an exemplar of Russian propaganda, but we need to be careful not to present it as the only outlet where Russia spreads propaganda outside of its borders. The reason why it’s useful to look at RT is because we start to see those inconsistencies in enforcement in the most obvious outlet, in the most obvious field. To me, this is an incorrect approach from social media companies, which is probably the most polite way I can put it. The reason why that approach is incorrect is because what we know from years of looking at disinformation is that it’s not platform agnostic.
Something that we saw during COVID, for example, is YouTube videos that went viral in Facebook groups, even though YouTube itself was not promoting those videos. There are different streams and different mechanisms to the way that information moves across social media platforms. Whatever kind of enforcement action or moderation action these platforms are announcing, they need to be cognizant of how that information moves from one platform to another, from one language to another. This is a trilingual war, if we want to be extremely generous, but it is a war that is being discussed globally, and so disinformation needs to be moderated across different languages as well, which we don’t particularly see. It’s very egregious because we’ve seen these problems in worldwide crises before, including in 2014 in Ukraine. To not see the problems and the gaps coming, to not work to prevent those problems and those gaps consistently and proactively, is creating a really dangerous environment.
One more thing I’ll say is that we will see disinformation begin to work. Undoubtedly, we will see disinformation begin to work. We’re already seeing the beginnings of that now. We’re already seeing willing information launderers spread the Kremlin’s talking points. But at the same time, after a month, after 34 days, we see positive public attention, or I don’t know if positive, but proactive public attention, recede a little bit, right. People begin to take interest in other topics. They’re still keeping an eye on Ukraine, but they’re not glued to their screens and only reading news about Ukraine 24/7, like we saw in the first two weeks and like we saw before February 24th. What that means is that as public attention withdraws, public accountability of social media companies frequently withdraws with it. We’ve seen this again in endless other global crises, but perhaps COVID is the most obvious one to use.
Because again, at the beginning of the pandemic, we saw huge enforcement, proactive enforcement, from social media companies, and then as attention began to spread to different news items, as political pressure began to be divided, that enforcement disappeared as well. With Ukraine, it is more than likely that we’ll see the same pattern. As Ukrainians, we need to be able to preempt that pattern and continue pressuring social media companies, but it is exhausting to have to keep telling social media companies, this is the pattern that plays out. One thing that I’m really, really worried about is for that cycle, for that pattern, to repeat itself, and for what we see of this scattershot proactive enforcement around Ukraine to disappear as the war continues.
One thing that was certainly true of COVID that may end up being true of this particular conflict is that the social media companies appeared to have some regrets about things they did early on in the fight around COVID disinformation. For instance, banning talk of the possibility that COVID had emerged from a lab in Wuhan, which was later reckoned to be perhaps a more worthwhile debate than it had originally seemed. Do you think there’s any action that the platforms have taken to date that they may regret later in this crisis, or that may be seen to have gone too far?
I think that’s an interesting question, and I haven’t thought about it from that angle, but as we were talking, what came to mind is how newsrooms understand news. For newsrooms, the initial story of the Wuhan lab was not a credible story. The reason why it was not a credible story is not because there was a team of scientists who came to reporters and said, this is a distinct possibility. We need to consider it carefully. Please report it out. The story was suspicious because it was initially laundered by Steve Bannon and his associates. It was not laundered in a nonpartisan way, of course, as we can expect from these political actors.
Instead, it was presented as Bill Gates is funding these laboratories. These laboratories either had an accident or were developing biological weapons, therefore COVID came from these laboratories. That story was provably false. Not just because of the source, but also because of the details. As more details emerged and more reporting emerged, newsrooms altered their understanding of what the Wuhan laboratories have been doing, how forthcoming China is with information, what is, or isn’t possible to investigate. Of course, all of this with the understanding that figuring out where a virus comes from is an incredibly difficult science.
We do not see the same kind of nuance from social media companies. I don’t want to overpraise the news media here; I think a lot of mistakes have been made on the news media side as well. But for social media companies to have this kind of nuance in their own moderation, again, it needs to be proactive. That is the danger with active, earnest enforcement early on in a war or in a crisis (I refuse to call it a conflict): not following up on those early reports. Because as reporters, as researchers, we understand that new information emerges, situations change, and particularly in a wartime environment, very few things are black and white. Which I guess is a very long way of not answering your question. But that’s why a lot of these decisions from social media companies can’t be black and white either.
They need to be dynamic and they need to be proactive. To go back to your original question, I don’t know whether these companies will regret some of the enforcement actions that they’ve taken. I think a particularly sticky question is whether the Russian ban on Facebook properties could have been avoided, because although many of the steps Meta has taken have been responses to Ukrainian demands and to global demands, I don’t think anybody thinks that the banning of Meta properties in Russia is a good thing. I think that will be one of the many questions that social media companies will have to grapple with after this war. Sorry, that was so winding.
No, it’s a complicated issue. I guess, I’m sorry to push you so much on the parallel, but it is an interesting one where you see that early on the platforms are trying to possibly limit the spread of disinformation, but also doing so in a context where people are perhaps being harmed. Of course, when the Wuhan lab leak theory was first being discussed, it was also in a context of hate and violence being directed against Asian people in the United States, Asian Americans. There were other, perhaps, contingencies on the platforms’ decisions with regard to that and I could see some parallel in this case as well.
But let’s switch gears a little bit to maybe another front in this information war. I’m really interested in how these various civilian corps of people have been organized to participate. We’ve got so many different types of examples. You’ve got this massive SMS campaign trying to reach Russian citizens with real news. You’ve got this Ukrainian “IT Army”, loosely coordinated with the government via a Telegram channel, that’s trying to encourage people to reply to Russians’ posts on social media, including on Russian social media sites, to share information. Then there’s news this week that a group of Ukrainian Mandarin speakers is trying to pierce the Chinese propaganda veil and reach Chinese people with ideas about the war and with information about the war. I’m just fascinated by what you might make of these types of efforts, whether you think that they work or have any influence in the situation, and how you might compare them to other phenomena.
I think that we have to look at this in the context of online citizen diplomacy overall, and Ukrainian online citizen diplomacy has undoubtedly changed global politics. Particularly in the first two weeks of the war, when support from any country was critical, Ukrainians mobilized online for campaigns whose effectiveness, even knowing the determination of my people, frankly staggered me and continues to stagger me. Because that online pressure, and also the offline pressure, particularly in Europe: the protests of thousands, if not tens of thousands, of people, the letter-writing campaigns to politicians. All of that is part of this mass effort to earn Ukraine allies and for those allies to contribute materially to Ukrainian efforts against Russia.
Most of them, yes, have been incredibly successful. Unfortunately, except for “close the sky,” which Ukrainians know is a big ask and which is now being made in conjunction with simply asking for equipment: asking for airplanes, asking for tanks, asking for more equipment, because it just seems like there’s no appetite to defend Ukrainian lives, even if it’s from the sky. But as part of those efforts, the outreach to countries that are not traditionally Ukrainian allies is incredibly important. Why is it important? I don’t want to overstate how effective it is. That’s not something we can quantitatively measure in this moment. I would argue it’s not something we can really fully understand, and I think it’s something we won’t ever fully understand, even when Ukraine wins the war.
But the reason why it’s important is because in many of the countries not allied with Ukraine, there’s not a clear understanding of what’s happening in Ukraine overall. Even if these efforts target only a handful of people and convince a handful of people, and even if those people don’t convince anybody else, because they are afraid to talk to their neighbors about it due to censorship laws and don’t know how to push this information further, even then, the logic is that there will be some spark, some ember of support for Ukraine. The reason why that ember is needed is because Ukrainians plan on winning the war.
I think aside from my very hopeful rhetoric, we have seen Ukraine make a lot of gains on the ground as well. That is a genuine reality. When Ukraine wins the war, some of these countries may want to ally with Ukraine and they will only be able to ally with Ukraine if there is a public appetite of some kind for that. It’s also really difficult, particularly for Russia to fight on two fronts. Russia has a long and storied history of getting into wars that it doesn’t win and then it sparks a revolution at home. Ukrainians understand this history, particularly with Russia and so that attempt to seed a rebellion speaks to that history.
I am talking to you on a morning when it feels as if what you are prophesying is, to some extent, really possible. We’re seeing, apparently, the Ukrainian army push back in the suburbs around Kyiv. We’re seeing, at least to some extent, more tangible conversation about terms in the peace talks going on right now in Istanbul. All of that could change tomorrow; it certainly could go the other way in an instant, so I don’t want to overstate it. But even to hear you talk about when Ukraine wins the war, or when these hostilities cease, I hear you projecting a positivity into it.
I want to ask you a little bit about that because that’s a piece of this. That’s something that we’re seeing happen. An effort to not only spread facts about what’s happening inside that country, but also to spread a positive energy, if you will, about the need to support and the need to get behind Ukrainians. What do you make of that? Is that at all in tension, do you think, with this more clinical, fact-driven way that you’ve analyzed the information ecosystem in the past?
Yeah. I mean, these myths, legends and, in many cases, real stories of bravery are crucial during a war. And they’re not just crucial now, if I can get a little bit nerdy. There are long Ukrainian traditions of supporting those who fight for Ukrainian freedom, because of course, the fight for Ukrainian freedom has been going on for hundreds of years. That legend-making, myth-making, and support for key historical figures is part of a long tradition.
In the online information environment, that is sometimes at odds with the sheer “only the facts, ma’am” approach to the war, but I do think it has a very important role to play. The important role is, first of all, keeping up the morale of the Ukrainian people. I think we’re all doing a great job at that ourselves, but we need some memes, we need some memes to share. We need to put the Ghost of Kyiv on a T-shirt, even if every Ukrainian knows that it’s a myth, right. Every Ukrainian knows that the Ghost of Kyiv is not one dude who’s sitting there in his airplane taking out Russians by the hundreds, right.
What it really is, it’s like white propaganda. I think it occupies this really interesting space. A space that helps outline the character of the Ukrainian people, even though it doesn’t always contribute to the facts as we understand them on the ground. And outlining the character of the Ukrainian people is actually a kind of education, because before the war, Ukrainians were not presented in a good light in most cultures. Right before this, we had that silly controversy with Emily in Paris presenting a Ukrainian woman as a thief and a liar. That was pretty standard for the understanding of Ukrainians before this. Now the understanding of Ukrainians is farmers who steal tractors (by the way, a verified phenomenon); it’s grandmas who throw their jars of pickled tomatoes at a drone.
There was a Ukrainian outlet that interviewed the grandma, who, although she did not provide any photo or video evidence, recounted the story in great detail. Now the understanding of Ukrainians is as a people who will do anything at any cost to defend the land that they live on. I don’t know if you can even call this misinformation, although I guess it is misinformation, but the memes, the cultural approach to all of this, really feed into the understanding of what Ukrainians are like as a nation. They also sometimes provide just pure relief, a small release valve from the genuinely disturbing, horrible things that we’re seeing, things that no Ukrainian will ever forget.
Yeah. I hear everything you’re saying. I find it fascinating and I would love to dig into it more and talk to you more about this at some point. You have made a career of figuring out what’s real and what isn’t, but this value in sometimes what isn’t necessarily true is something that I think we have to contend with as well as we think about questions around disinformation and misinformation. How people internalize the interests of their fellow citizens or of their nation-state. It’s a really interesting dynamic that I don’t think is well understood.
Yeah, I think you’re right. I also, I want to say that we need to be careful with it. We really need to be careful with it because Ukrainians are wielding this power of myth-making with good intentions, but if there’s anything that we know about the internet, it’s that everything will be perverted and turned upside down and misused. I think that as we try to understand the pro-Ukrainian memes, the pro-Ukrainian storytelling, the mix of genuine and not genuine stories that Ukrainians tell themselves, tell each other, and also tell the world, I think we need to very clearly separate that from reporting. We need to very clearly separate that from fact. We need to very clearly separate that from our understanding of what’s going on, on the ground. I would argue for the most part, it’s been pretty successful in doing that, but it’s a blurry line.
Absolutely. Of course, the United States is a good example of a country whose myths about itself have not necessarily led to a good place in the long-term, I suppose.
Let me ask you a last question. You are in touch with a lot of people in Ukraine and still have, as I understand it, a family, close friends there. What are you hearing from them about the experience they’re having? Are they also beginning to feel some optimism at the moment, or what is the emotional mood of the people that you’re speaking to in the country?
I pause now because it’s difficult to describe what people in the country are feeling all at once. The best way I can put it is righteous rage. I don’t want to extrapolate to all Ukrainians what I’m hearing from some Ukrainians, because I think one of the things that can get lost in these conversations lionizing Ukrainians, talking about how great we are, is our humanity: how difficult it is to pack a plastic bag and go to a country you’ve never been to and try to establish a new life, or how to completely reorient your business so that it does nothing but serve the army, or what it’s like to wake up one day and go from being a student to being a warrior. I think that there’s a lot of just human emotion in all of this. Human emotion that doesn’t come through in memes, human emotion that doesn’t come through in disinformation. But I think it does come through in the stories that Ukrainians are telling, just purely telling on social media: here’s what my family is doing, here’s what it’s like in their bomb shelter.
But at the same time, the one uniting understanding from everybody that I speak with is that Ukraine is ours and Ukraine will not fall. It’s difficult to describe it as optimism because it’s not optimism, it’s determination. Every news development is seen through that determination, but not just news developments. You have to understand that every Ukrainian is watching every blast, watching every tank, counting every soldier that they can see. We in the West have the luxury of being physically removed from the war. We are not in a war zone; Ukrainians are. We need to understand that when we ask what the Ukrainian mood is, because the mood changes with every development, but the mood also stays the same, which is: they will not take our country.
Jane, thank you very much.
Thank you for having me.
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Innovation. He is an associate research scientist and adjunct professor at NYU Tandon School of Engineering. Opinions expressed here are his own.