The Sunday Show: Social Media and Vaccine Misinformation

Justin Hendrix / Jun 5, 2022

Subscribe to the Tech Policy Press podcast with your favorite service.

Even as the COVID-19 pandemic continues to simmer, there is a good amount of science emerging about the relationship between the information environment and vaccine uptake. Today we’ll hear from two researchers from different disciplines about their work on social media and vaccine misinformation.

First up is John Bryden, Executive Director of the Observatory on Social Media at Indiana University, with whom I discuss the results of some recent research his team conducted on the problem. And second, I speak with Kolina Koltai, who, when I interviewed her at the end of April, was transitioning from her position as a postdoctoral fellow at the Center for an Informed Public at the University of Washington to a role at Twitter.

Below are lightly edited transcripts of the discussions.

Segment 1: John Bryden

Justin Hendrix:

The first thing I always do is just ask folks to state their name, their title, and their affiliation.

John Bryden:

So I'm John Bryden. I'm the Executive Director at the Observatory on Social Media at Indiana University.

Justin Hendrix:

Tell me what you get up to at this observatory in Indiana.

John Bryden:

We do a variety of things. Our main research goal is to study misinformation and how it spreads online. We've got three things in our mission. The first is to provide tools to study misinformation. The second is to actually do research on misinformation, and the third is to educate the public about how to detect and study misinformation.

Justin Hendrix:

And you have just released this new study, published in Nature, titled "Online Misinformation Is Linked to Early COVID-19 Vaccine Hesitancy and Refusal." So when did you kick off this work?

John Bryden:

This happened late 2020, early 2021. We started to gather vaccine tweets from Twitter. We were interested in understanding how misinformation flowed at geographical scales, and whether there were any patterns of localized areas of misinformation. And we realized that with the vaccines being released, it would be a great opportunity to study the way that vaccine misinformation is geolocated in particular counties and whether that correlates in some way with uptake, behavior, and attitudes around vaccines in those counties.

Justin Hendrix:

So you created your own dataset, is that right? This CoVaxxy dataset.

John Bryden:

That's correct. It's an online stream of Twitter data. What we've done is we've been searching Twitter for any Tweets that are related to vaccines, using a specific collection of search terms, and we've been gathering all the Tweets that match those search terms since the beginning of 2021. And we've now got nearly 18 months of data, hundreds of millions of Tweets.
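The keyword-matching step Bryden describes can be sketched in a few lines. To be clear, the search terms, the tweet structure, and the matching logic below are illustrative assumptions for this sketch, not the actual CoVaxxy configuration.

```python
# Illustrative sketch of keyword-based tweet filtering; the terms and the
# tweet format below are hypothetical, not the actual CoVaxxy setup.
import re

# Hypothetical vaccine-related search terms (the real list is larger).
SEARCH_TERMS = ["vaccine", "vaccination", "vaxx", "pfizer", "moderna"]
PATTERN = re.compile(r"\b(" + "|".join(SEARCH_TERMS) + r")", re.IGNORECASE)

def matches_vaccine_terms(tweet_text: str) -> bool:
    """Return True if the tweet text contains any tracked term."""
    return PATTERN.search(tweet_text) is not None

# A toy stand-in for the incoming tweet stream.
stream = [
    {"id": 1, "text": "Just got my second Moderna shot!"},
    {"id": 2, "text": "Great weather in Bloomington today."},
    {"id": 3, "text": "Why is everyone talking about vaccination mandates?"},
]
collected = [t for t in stream if matches_vaccine_terms(t["text"])]
print([t["id"] for t in collected])  # → [1, 3]
```

A production collector would apply a filter like this to a live streaming API rather than a fixed list, but the matching idea is the same.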

Justin Hendrix:

So I understand that you kind of took that data from Twitter and then used another set of information that was drawn from Facebook and kind of federated that then with some CDC data on vaccination uptake rates, is that right?

John Bryden:

That is correct. So at the same time as we were doing our Twitter search, the Delphi group at Carnegie Mellon University were doing surveys on Facebook. It actually happened to me once: they pop up a little question on your Facebook wall, asking about your attitudes towards vaccination, mask wearing, and other things. They were gathering data for particular counties and looking at the levels of vaccine hesitancy in those counties.

At the same time, we also incorporated a large amount of other data, not just CDC data and vaccine uptake rates, but demographic data, because we needed to control for things like income, education, religious attitudes, political bias, rurality, and all sorts of other factors.
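The idea of "controlling for" county-level covariates can be illustrated with a small regression sketch. The data here are synthetic and the variable names (`misinfo_share`, `gop_vote_share`, and so on) are hypothetical, not the study's actual variables.

```python
# Sketch of controlling for demographic covariates in a county-level
# regression; the data are synthetic and the variable names hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_counties = 500
df = pd.DataFrame({
    "misinfo_share": rng.uniform(0, 0.2, n_counties),  # share of low-credibility tweets
    "median_income": rng.normal(55_000, 12_000, n_counties),
    "pct_college": rng.uniform(0.1, 0.5, n_counties),
    "gop_vote_share": rng.uniform(0.2, 0.8, n_counties),
})
# Synthetic outcome: hesitancy rises with misinformation even after the
# demographic covariates are accounted for (true coefficient = 0.5).
df["hesitancy"] = (
    0.5 * df["misinfo_share"]
    + 0.1 * df["gop_vote_share"]
    - 0.05 * df["pct_college"]
    + rng.normal(scale=0.01, size=n_counties)
)

# Including the covariates in the formula "holds them fixed" when
# estimating the misinformation coefficient.
model = smf.ols(
    "hesitancy ~ misinfo_share + median_income + pct_college + gop_vote_share",
    data=df,
).fit()
print(model.params["misinfo_share"])  # estimate of the misinformation effect
```

The point of the sketch is only that the coefficient on `misinfo_share` is estimated while the other county characteristics are held constant, which is what "we needed to control for" means operationally.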

Justin Hendrix:

So you put all this data together, and you ended up finding a pretty strong association between misinformation about the COVID-19 vaccine and vaccine refusal and vaccine hesitancy. How would you characterize that association?

John Bryden:

That's a great question. Well, we did find strong evidence of a correlation between online misinformation and vaccine hesitancy rates. I think one of the striking things is that our model predicts about a 20% decrease in vaccine uptake between the states with the lowest levels of misinformation and the states with the highest levels, which is quite a sizable change.

And in Democratic counties with higher levels of misinformation, we saw around a 70% increase across the range, looking at all the counties in the United States. So we did see quite a large factor; it's not just some small correlation. A lot of studies talk about significance, how likely it is that we could have seen this pattern due to random noise across the counties. We had reasonable results on significance, and we think there's more to be investigated there, but we definitely found quite a strong effect, even after accounting for all the other variables.

Justin Hendrix:

So this was only looking at Twitter data. Do you imagine that the prevalence of vaccine misinformation, as expressed in Twitter data, is indicative of the general information environment around misinformation in those particular geographies?

John Bryden:

That's a great question. Twitter is actually quite biased toward more liberal, left-wing communities. And I think a lot of people still use things like WhatsApp and Facebook to spread a lot of information and misinformation. So I don't think Twitter is completely representative of the population as a whole, but what we would say is that our study looks at these changes in quite a relative way. So we do account for different factors, such as the internal biases of the data.

Justin Hendrix:

And when you say Twitter's biased to kind of a liberal population, you mean in terms of its user base?

John Bryden:

Right, yes, to make that clear: it's the user base that is biased. But studies have found, and we published one recently, that the content that is shown to users on Twitter and the types of users they are invited to follow are actually more likely to be right-wing. So if there is any algorithmic bias, it's possibly towards the right.

Justin Hendrix:

One of the things I just wanted to ask you more generally about this, is that there's been a hot debate in this country about whether misinformation plays any kind of causal role in behaviors, and whether that means we should do more about it with regard to moderating the social platforms. What does your research suggest about these questions?

John Bryden:

It's very hard to answer that in an absolute way. Proving a causal link between one thing and another is extremely difficult. It took many years for people to prove that smoking was linked to cancer, for example. So what we do have is this association across the counties, and we also did a bit of a study using what we call a Granger causality analysis. In that analysis, we noticed that when there was an increase in misinformation in one particular county, we would find a corresponding increase in vaccine hesitancy in the same county about two to six days afterwards.

So again, that's like a link between misinformation and vaccine hesitancy survey results. It's not entirely clear that's a causal link. We are in the process of studying this in more detail though and we will be looking at links between anti-vaccination Tweets and misinformation Tweets and actual vaccine uptake. So, that's the next step we're working on.

Justin Hendrix:

You do conclude with a note about better moderation of our information ecosystem. Is there something that you think a product manager or someone on trust and safety at Twitter should take from this study? Is there something they should be doing differently?

John Bryden:

I think there's a big open question about the way Twitter should be moderating its platform. Some of the studies that have been done out there, for example by David Rand's group, have shown that if somebody is given a warning that what they're posting might be misinformation, then they are more likely to reconsider and look carefully at what they are posting, and that can reduce the levels of misinformation spreading online. So there are a number of ways this can be approached, rather than a kind of blanket takedown approach. I'm not sure we would have endorsed that anyway.

Justin Hendrix:

What's next for your lab? What are the next big research products that we can expect?

John Bryden:

Well, as I was saying, we are doing a lot of work on this causality analysis, so we're developing that causality picture. We're also interested in who the main super spreaders of this vaccine misinformation are, and whether they tend to be verified accounts on Twitter, because we noticed that quite a few of these accounts are verified. We're also looking at general surveys of vaccine misinformation over time. And that's just a small subset of what the observatory is doing. We're also working on a study looking at how we can promote vaccine confidence among the Black community. We're doing a lot of work with our bot detection tools, developing and enhancing those. And there's other work looking at how to detect other misinformation sites. There's plenty going on.

Justin Hendrix:

John, thank you very much.

John Bryden:

Thank you. It was great talking to you.

Segment 2: Kolina Koltai

Kolina Koltai:

My name is Kolina Koltai. I also go by Koko. Up until a few days ago, I was a postdoctoral fellow at the Center for an Informed Public at the University of Washington, but I am currently transitioning into a role as an experienced researcher on the Birdwatch team at Twitter.

Justin Hendrix:

Exciting, and hopefully we'll get to come back around and talk a little bit about your upcoming or new experience at Twitter eventually. But let's talk a little bit about the work that you've been doing for the past few years, particularly around vaccine misinformation. I wrote to you after seeing you post a talk that you were giving on vaccine misinformation and COVID. Can you perhaps for my listeners just try to encapsulate your research career to date? What have been your key curiosities?

Kolina Koltai:

I'll start at the beginning of my vaccine intrigue, if you will, which is all the way back in 2015. I was working as a researcher at NASA Ames, looking at UAVs and designing cockpit systems for pilots of large aircraft. We were building and designing technologies there, so something completely different, working on aircraft.

Right when I was deciding to go pursue a PhD, there was a big measles outbreak at the Southern California theme parks; this is 2015. There was suddenly all this conversation about, should we mandate... Not only mandate. We have a vaccine that's already mandated, but should we remove the personal exemption option that we've had for childhood vaccines? And a lot of people were just talking like, "Oh man, these anti-vaxxers, they're just big idiots, or they're just granola moms."

There was all this conversation about it, and the question that really struck me was: why are people not vaccinating? I'd always grown up thinking that vaccines were good. They were safe, they're efficacious, they are necessary. So why weren't people doing it, despite all the information and resources? Because you can go online and look: here's all the research that says vaccines are safe.

I had actually emailed my advisor at the time. I hadn't even moved to Texas yet or gotten started; I was still packing up. I was like, "Hey, I know I told you I was going to study automation transparency and trust in automated technologies, but I am actually completely shifting, and I'm going to go on this vaccine thing." Luckily, he was very kind and said, "Sure, we can explore that as you're beginning the program."

Now, seven years later from that moment in 2015, I'm still intrigued by this question. Of course, there's been a lot to unpack, and obviously a lot has happened in the past few years when it comes to the question of vaccine hesitancy and its relationship with vaccine misinformation, particularly on social media.

A lot of what I've been interested in and focusing on is the ways that people use different tools or different sociotechnical platforms, like Facebook, or Instagram, or Twitter, TikTok, you name it. I've been interested in almost any sort of communication technology in this way, and the way people use these types of platforms to navigate a space, to make a decision about a scientific topic.

I particularly focus on vaccines, but a lot of this work is applicable to GMO products, to climate change: the way we try to navigate a topic where, in theory, there should be an answer. The way we think about science is usually like, "Oh, the scientists know the answer." This is what I found particularly interesting. I initially came at this with the idea that it was about what information people are being exposed to, or that maybe people aren't getting the right information, but that's a very old, outdated model. Then I thought about values and what is important to us.

Justin Hendrix:

I teach graduate students. A lot of times people come to this question of misinformation and the first instinct is to "get people correct information." I assume you've ultimately found that's not necessarily the solution.

Kolina Koltai:

Yeah. For anyone that's familiar with the information deficit model, that's basically that idea: people simply don't have the information, and therefore they're making bad decisions. This could be applicable in some ways if you think about the way we make other health-related decisions. Maybe you don't know everything about the best ways to take care of your heart, or about preventing cancer, or something like that. But in the case of vaccines, most people have at least heard the good things about vaccines at some point in their lives; it's a pretty mainstream thought, and because of the internet there's a heavy amount of access. Particularly in the past few years, there have been public health campaigns everywhere you look to encourage people to vaccinate.

I would say most literature nowadays says the information deficit model, the idea that people simply have not seen the information, that they're not getting exposed to it, is insufficient for understanding the reasons why people decide not to vaccinate, and other scientific controversies. If we accept that's not the case, then you're like, "Well, what the heck is going on?" A large chunk of my work then thought about values. When I say values, I mean the things we think are important to us, things that are more than just a belief: core components of who we are, of what we think is important. For myself, I really do hold a value of education; I always think education is highly important, and that's just one example.

There are a lot of different ways we can operationalize or categorize values. Think about people for whom family is so important that it's a high value for them. I thought, "Well, maybe values have a relationship with the way we think about science." There is some literature on that, and some of my work explored that relationship with what people think is important. I don't want to say that some people are going to be more predisposed to being vaccine hesitant; that's a really curt way of saying it. But I do think there is a trend with some people who value this idea of questioning, if you will, and that's a really controversial way to say it. But in a way, there is an appeal to wanting to buck convention, more so than we would imagine. How would I try to phrase this in a way that doesn't make me sound like I'm on Joe Rogan? There is a-

Justin Hendrix:

You're talking about the kind of "just asking questions" approach to this thing.

Kolina Koltai:

Yeah. One thing when we look at the rhetoric, because sometimes it's the thing behind the thing: you'll see "just asking questions," which I think does two things. One, it attracts people who like just asking questions, but two, it's also used as a guise to avoid content moderation. The "just asking questions" framing has multiple uses when you see it online. It's like the person who's playing devil's advocate, but you're also searching for the person who will play devil's advocate with you; it's the same flavor as "just a prank, bro," or "I'm just asking questions." So I do think there is a little bit of that. Some of the people I've done interviews with about why they became vaccine hesitant, how they came into those communities and those spaces online, none of them really identified themselves ahead of time as saying, "Oh yeah, I've always been a conspiracy theorist."

People say, "No, I haven't been a conspiracy theorist. That's not my bag." I think for a lot of people, it was just trying to find an answer to something that didn't make sense to them. When we talk about childhood vaccines, nearly every single person I talked to said that they were just trying to figure out an answer online to why their child was sick or why the child had a reaction that way, and it was often really centered on that. I think there is a component, at least when women talk about childhood vaccines, of not being satisfied with the answer they were given, like, "Oh, it's just bad luck," or, "It's something else." They wanted a different answer, they wanted something more, so they were questioning what they had been told.

Justin Hendrix:

Let me jump in and characterize the question. You recently gave a talk on what you call the socio-technical success of vaccine misinformation, which to a lay listener might sound a little odd: you're talking about the success of misinformation. I think you're starting to get into that now. What makes vaccine misinformation successful on social media? It's not just bots and AIs pushing the stuff, it's not just political interests. What is the set of robust reasons that it ultimately succeeds?

Kolina Koltai:

I titled the talk The Socio-Technical Success of Vaccine Misinformation because if I go onto nearly any social media platform right now, I can find misinformation. There are accounts that are big brand name, OG anti-vax activism leaders, the names that we know, who still have accounts on social media platforms. But there are also a lot of smaller micro-influencer accounts spreading vaccine misinformation. The way that we talk about vaccine misinformation, and the way that a lot of platforms have thought about content moderation in that space, has set it up to be continuously successful: not just prior to the pandemic, but during the pandemic, and probably well into after the pandemic.

I think this has been a combination of a variety of things. Part of it is the way that we talk, the rhetoric of trying to navigate the space. Sometimes people use the word dog whistling; there is a language to talk about it. Sometimes it's a language that is meant to entice either your current followers or bring in new followers, but also, I think, to avoid moderation on the platform, because we often think of moderation as a way to remove content and minimize the spread of misinformation. If you put up something that might hint at misinformation or imply something, or say you should question X, Y, Z, but then say, "Hey, I'm just asking questions," that content is not going to get removed.

Not only are you avoiding moderation from the platform, you are also continuously building out your fan base, your followers. And I think there are a lot of baked-in affordances, built in to increase engagement, that are not necessarily bad on the whole. I think that's the hard part. Any tool or affordance a social media platform has isn't bad on its surface; it's just the way that people end up using it. It can be used for good things, it can be used for bad things. We can even think about things like ephemeral content. When I say ephemeral content, I mean content that's temporary. You might be familiar with Instagram stories: you can post videos, images, links, all sorts of content that's gone within 24 hours.

Facebook also has this, and I think TikTok is even playing with the idea. Even Twitter did this too; they had Fleets for a little bit and then got rid of them. Ephemeral content, in my opinion, has little to no moderation. Some of my colleagues at the University of Washington, particularly Dr. Rachel Moran and one of our PhD students, Izzi Grasso, were working with me on Instagram as a particular platform, because ephemeral content is really easy to find there. Through a variety of ways, we ended up collecting a list of highly active Instagram accounts that shared vaccine misinformation and encouraged vaccine-hesitancy-type behavior: anything along the lines of "vaccines are bad," or "here are all the people who are protesting, why you've got to protest against vaccine mandates," the whole spectrum of vaccine content you can imagine.

When we looked at content that was posted to the grid, as it's called, a post that's permanent unless it gets removed by the user or by Instagram, oftentimes it was very benign, and you wouldn't even know that an account is anti-vaccine or vaccine hesitant, whichever term you want to use. But when you go into the ephemeral content, there is not only drastically more content, but content that is far more extreme in pushing vaccine hesitancy and vaccine misinformation, and it's not being taken down. Every day when we went to check the Instagram stories, there'd be content from these accounts. During this time, I think only one account went down while we were actively collecting, which was a massive big-name account, one named by the Center for Countering Digital Hate, whose handle at the time got censored.

I think that also adds to it a little bit: saying how much they're being censored, how much they're being moderated, also adds to it. We were looking at the ways that people navigate around these platforms, and some of the tactics we see are used inconsistently, so I don't know whether people think those tactics actually work, or whether it's more performative, for everyone else, to say, "Look, we're being moderated by this platform. We need to navigate around this." Sometimes people will do things like misspell the word vaccine, or use code words. People might have even seen this in their day to day: in images, they'll cross out the word vaccine, and they'll sometimes cover things with stickers. In some cases, the platform's affordances even promote it.

Instagram had this thing where if you put a "let's get vaccinated" sticker on your content, you got promoted; the idea was to boost people who are promoting vaccination. Except people who are vaccine hesitant would use this sticker to promote their own content. There are so many of these baked-in components that would normally be good, but are being used in this really negative way. That's a little bit of how these sociotechnical platforms, these social media companies, have all these wonderful tools to integrate and connect with people and find community, all these things that are really wonderful, but they can end up being used for less pro-social reasons, and in this case, vaccine misinformation.

But just because you're adding more moderation doesn't mean the problem goes away. Any sort of moderation you do, people are going to find a way to work around it, so moderation isn't the only answer when it comes to trying to solve this.

Justin Hendrix:

I've talked to multiple people working on any number of issues on social media from the harassment of women online, through to COVID disinformation, to political misinformation and they all will tell you that moderation intervention is simply not up to snuff. It's missing some extraordinary percentage of the infractions. On some level you have to wonder, is there even any point to it? Can we expect these social media platforms, or I should say, will these social media platforms ever be successful in stemming the tide of all of this stuff?

Kolina Koltai:

I think there's always going to be content on these platforms, but one of the big things that I push for, because I know that content moderation is a difficult, difficult problem and I don't envy anyone that's working on it, is that we need to think about who is disproportionately spreading content that is going viral. Think about the big names. Some of the largest names we see in vaccine misinformation, people who are anti-vaccination activists, are still on platforms. Why does Robert F. Kennedy Jr. still have a Facebook page? Why does Children's Health Defense still have an account on there? Why are Joseph Mercola's books still being sold on Amazon? Those are simple things, so when we say there's not enough, I'm like, "There are at least bare minimums."

There are things we could do that we're not doing, so I think there could be more. I'm not saying you're going to be able to track every single vaccine-related misspelling and categorize all of them; we're asking for things that are very easy. Even on Facebook: you have a link that you're already able to flag for misinformation, but you don't put a fact check on it when it's shared in a comment. Things like that seem simple enough to do. When people say moderation is insufficient, it's because currently it is insufficient, and what's being asked for is not unreasonable. But even with all that being done, there are still going to be holes. I think part of that is we need to think about them.

What are the affordances? It also depends on how much blame you assign. If you're thinking about what a social media company can do, I think that's really difficult because, at heart, for a lot of social media platforms it's going to come down to the bottom dollar. With the Facebook Files, we learned that there are things that Facebook could absolutely do to minimize the spread of vaccine misinformation, but people would spend less time on the platform, and less time on the platform means less money. So we can't fully rely on a social media platform to do all the things that are truly best for society.

Justin Hendrix:

With Rachel Moran and Izzi Grasso, you wrote recently about vaccine hesitancy, and you appeared to conclude that we've got to get beyond thinking about just social media platforms; that we have to ask some difficult questions about why society has lost trust in institutions and how we address some of the more fundamental, underlying issues. Is the focus on social media, the focus on tech, perhaps distracting us from those things? I'm speaking to you on the day that Barack Obama is set to deliver an address on misinformation and disinformation at Stanford, and I'm already seeing critiques online from certain quarters that this focus on misinformation and disinformation is a distraction from underlying social issues.

Kolina Koltai:

I think you can't have one without the other. To make fire, you need heat, fuel, and oxygen, and each of those is an important component. Vaccine misinformation and anti-vaccine sentiment have been around as long as there have been vaccines, well, well, well before the invention of social media platforms. But what social media platforms have done is make it much easier for people to connect with each other, share information, find communities, and carve out digital spaces in which this sort of ideology can thrive and flourish. And something like a global pandemic gives it the push to spread anti-vaccination sentiment on a bigger global scale than we've seen in a very long time. It's not all social media, though, because you see vaccine misinformation and vaccine hesitancy in other spaces beyond social media.

You see it in books, you see it on TV. Not to put anyone under the gun, but you look at Tucker Carlson's programs and you see a ton of vaccine misinformation happening there; that's not a social media thing, that's on the television. Again, pulling from some of the work I did prior to the pandemic: all of that work was laying the foundation for why the movement has seen so much success. It was really ramping up over the past decade, slowly growing, figuring out what's going to work, making connections with politics, things like that. And when I talked to the people in those spaces, a lot of them were, I think, particularly angry at the world, angry about the way that healthcare is set up.

Think about our relationship with pharmaceutical companies and how we come to trust them. Some of the criticisms that people in this space bring up about pharmaceutical companies are really valid. Think about what's happening with the opioid crisis, or the cost of insulin: things where, regardless of your vaccine stance, you say, "Yeah, insulin shouldn't cost $700." Or the Sackler family being part of promoting the spread of painkillers and contributing to the opioid crisis. These are criticisms of how we develop trust with pharmaceutical companies.

I used to go to events and talks, meetups that other anti-vaxxers go to, and I would go to the presentations that big-name people gave, like, "All right, here's how you help convince someone in your family not to vaccinate. Here are all these reasons why..." They would actually pull a lot of historic and current examples of reasons why we shouldn't trust the news, or pharmaceutical companies, or things like that. I'm joking, but I could almost turn anyone against vaccines by the end of the day, because some of the core questions, the components of why you start chipping away at trust, are really valid things that are true.

Think about women's experiences in healthcare in particular: oftentimes their symptoms or their needs are dismissed by healthcare professionals. That is a very real thing that has happened for many years and still happens today, and I think it disproportionately happens to women of color. So when you think about a woman coming in saying, "Hey, my kid has new symptoms," and getting dismissed by a doctor who says, "Oh, it's not a big deal, you don't need to worry, just do X, Y, Z," that adds to the problem. So we've got to rethink really difficult problems that we have in society, and what the other large reasons are why people might be more predisposed to becoming vaccine hesitant.

Justin Hendrix:

I want to think a little bit about what you might like to see happen with regard to the platforms taking action. You clearly believe that despite the fact that they're perhaps not fundamentally responsible for the situation... well, actually, let me back up a second. One thing you just said really resonated with me, which is the idea that these movements have built power for a long time on these social media platforms. We see that in a number of different contexts; I'm thinking particularly of election misinformation and the QAnon phenomenon, which was allowed to fester on social media platforms for some years before the platforms ultimately took action. You've got a similar dynamic here: once the harm becomes apparent, as it did in the COVID situation with vaccine misinformation, it's almost too late for the big moderation effort.

The ideas are already well disseminated and the communities have formed. There's so much overlap and connection between those communities that even if you stamp out certain tweets or Facebook posts, or maybe even kill a few accounts, it's hard to kill the ecology that these various communities have formed. I don't know, what do you make of that? Is that a fair assessment, that we're still living in a world that social media built three or four years ago, even if the platforms have moved to take more severe action of late?

Kolina Koltai:

Yeah. I think part of that is also recognizing that it's not just one social media platform. It's more than just Facebook, or Instagram, or Meta. I'll use Pinterest as an example. Pinterest is a great example because back in 2016, 2017, they were suffering from a vaccine misinformation issue. They had users creating boards, creating community, and it's a different type of social media platform that people don't often think about. You're like, "Oh, it's a bunch of moms." But it was potentially a real hub for vaccine misinformation, and there were papers coming out saying, look at what's happening on Pinterest, you could make a whole collection of links and accounts. Anyway, I believe in 2018 they decided they were just going to shut it down.

Initially they just blocked anti-vaccination content, so you couldn't search for anti-vaccine or vaccine misinformation material at all, and then within that year they updated their policy to say, "Look, we are only going to accept vaccine information from internationally recognized health organizations." They took a really extreme cut at it, and I think some people were upset. But because it wasn't 2022 when they made that decision, it went rather under the radar for a lot of people; the anti-vax community was a little smaller then, and Pinterest took a really strong stance. Even today, it's really hard to find vaccine misinformation on that platform. But that's just one platform, and so I think part of it is not just what each individual platform does, but what all platforms do.

In particular, I think about our major platforms. We are never going to get to the point where we can really moderate spaces like Telegram or Gab, platforms that are explicitly saying they're against moderation and aren't going to do any of it, or even WhatsApp. But I think we need to think about where people actually are, and we know that there is this cross-platform component. I'm kind of losing the thread on what your question was. But I think social media is undeniably a part of this equation. Even so, if I had a big button right here that I could hit to delete every single social media platform, so it'd be off everyone's phones, there were no logins, it's gone, you would still have people who are vaccine hesitant.

I think there would still be people who are predisposed to hesitancy or who still have questions about vaccines, and there is a lot we need to do to address that as well. Certainly social media has aided in the way that we form communities and spaces, connect with people, and get access to information. The thing that I often explain to people is that it's not just exposure to vaccine misinformation. My colleagues and I see vaccine misinformation all the time, every day. People at the Center for an Informed Public see misinformation on a variety of topics all the time, and it's not just exposure that makes someone a believer. We know that, because people can see something and still not accept it.

That's why we need to think about this as a multi-faceted thing. A lot of people like to focus on digital literacy and digital skills, and I think that's absolutely important. But there's also focusing on the social media platforms and thinking about the consequences: what are the repercussions for people who disproportionately share vaccine misinformation? I'm not talking about your aunt who shared a meme the other day; we're not worried about her. I'm worried about people who are making money, profiting, building a brand off of vaccine misinformation. What repercussions are there for that? There aren't any. I also think we need to rethink how we do healthcare in this country. If we really want to fix the issue of misinformation, we need to fix everything.

<Laughter>

Justin Hendrix:

We need to fix everything. Yeah, I mean, I don't think most people believe we'll ever fix everything, that we'll ever be able to totally remove this disinformation from the ecosystem. But I think most people would agree with you that we'd like to arrive at an equilibrium where there aren't massive multi-billion dollar companies profiting from it and exacerbating the problem. That seems to me to be the line we should draw.

Kolina Koltai:

Yeah. I mean, did you ever read or see National Enquirer magazines? You go in the grocery store, you see the National Enquirer-

Justin Hendrix:

Sure.

Kolina Koltai:

I remember as a kid seeing Bat Boy on there, and that is some sort of entertainment. There's definitely malicious disinformation, and then there are things that are less so, what we think of as parodies or jokes, and I think it was easier for some people to toe that line. There's surely someone out there who believes Bat Boy is real, but I think most people who saw that were like, "No, there is no Bat Boy hiding out in a cave or the basement of a celebrity's house." But it's a really tough time out there. It's not just an information issue; I think we need to think about it as a community issue as well.

I think a lot of people who end up falling into the anti-vax space find a really loving, kind, wonderful community of people and find it really difficult to leave. I've talked to people who were in those spaces and then chose to leave, and they said it was really tough, because you've found a group of people, friends, people that you trust in your life. If at some point you decide that vaccines are good and safe and efficacious, you have to also leave a community, your friends. I think that is a component. Your modern-day conspiracy theorist isn't a dude downstairs in a basement with a bunch of energy drinks; it's friends and family. It's people who have found a community with each other.

I think people are genuinely worried. Anyone who is vaccine hesitant is genuinely concerned about their own health, or the health of their loved ones, or the health of their kids. And that changes everything about how you try to fix something. I don't think just calling people idiots and telling them they're wrong is the way to go about it. I even had a friend of a friend, someone I've known for years, who we were only recently able to convince to get vaccinated. He's someone who's always been familiar with my work, who would say, "Oh, anti-vax is ridiculous," and yet I just couldn't get him to vaccinate, even knowing what I do and what I've done. He'd say, "Oh, you're going to have your biases." I think he got sucked into a community of other people who were also vaccine hesitant and found solace in that.

A lot of people ask me, and this is the big question I get: "What do I do? I have a loved one who shares vaccine misinformation. I have people in my family and among my friends who still won't get vaccinated." I think it's really easy to want to say, "Oh, just cut them out of your life." I don't think that's the right approach, because you have the most power in that relationship. Me, as someone who doesn't know that person at all, it's not what I say that matters; as a friend or family member, someone with a personal tie, you're actually going to carry more weight in trying to get them out of that space.

Continuously try to reach out and listen to them. Don't just tell them that they're wrong or that they're getting bad information. It's really tough to facilitate a healthy, productive conversation, to try to get them to look at other sources, or maybe get them questioning the sources they are trusting, and to continuously encourage that person to get vaccinated. It's really tough, and it's not going to happen overnight. In the same way that no one becomes anti-vaccine overnight, you're not going to get somebody who's anti-vaccine to become pro-vaccine overnight. It is a long, trying conversation; it took a year with my friend. I don't think people in that space are entirely lost, but it's a lot of work.

Justin Hendrix:

I recognize that you have just started your role at Twitter, and this interview isn't endorsed by Twitter or related to the work you've actually done there. But if you will, what enticed you about the opportunity to work on Birdwatch, and how do you imagine taking your research forward into your new role?

Kolina Koltai:

Yeah. For many, many years I have sat comfortably in the ivory tower, if you will (I didn't go to an Ivy League school). I think many academics, including myself, find it really easy to criticize a social media platform, and there's a lot to criticize, by all means; I will always tell you all the things that these companies have done wrong. But then the opportunity came up to work on Birdwatch, which seemed really interesting as a way to help combat misinformation and provide context. Because what we know about misinformation is that it's not always as clear-cut as "this is true and this is false." There's a lot of content that stays up because it might be true but is de-contextualized from its original source or its meaning, and sometimes you just need that additional context.

Imagine someone puts a quote out, and the quote makes a person look a particular way. But maybe it was a quote from when they were talking to kids, and that's why they're not using the most technical language. Instead of "Oh, this person is an idiot because of the way they're describing a phenomenon," it's "Oh, it's because they're explaining that phenomenon to a kid." So given the opportunity to take what I know about misinformation on social media, particularly about vaccines but also other science topics, I found it really exciting to try to apply that in a domain where I felt I could potentially do some good.

I think I will continue to be critical of social media platforms, but if I can take what I do know and use it toward good, trying to help fix some of the issues we see on those platforms, why not?

Justin Hendrix:

Well, I hope that we can catch up again at some point as you get further into this new experiment, and we'll be able to find out how it's going.

Kolina Koltai:

I'm really excited about it, and I am not a representative or spokesperson for Twitter or for Birdwatch or anything like that. But from what I do know, everything is open. Everything that I know about Birdwatch currently is what you see online, all the data is online. It does seem really promising and I'm excited to be part of a team that is trying to do something to help solve the issue.

Justin Hendrix:

Thank you so much for speaking to me today.

Kolina Koltai:

Yeah, thanks for having me.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
