Breaking the Social Media Prism: a conversation with the Polarization Lab's Chris Bail

Justin Hendrix / Apr 2, 2021

Chris Bail is a professor of sociology and public policy at Duke University, where he directs the Polarization Lab. He is the author of BREAKING THE SOCIAL MEDIA PRISM: How to Make Our Platforms Less Polarizing, which arrives on April 6th, 2021. I spoke to Chris ahead of publication for the Tech Policy Press podcast, The Sunday Show. This is a lightly edited transcript of the discussion. Click here to subscribe to the podcast.

Justin Hendrix:

So can you just tell me how you got to the Polarization Lab, a little bit about your journey to this place?

Chris Bail:

For about a decade now, I've been studying social media and political polarization. My first book was about how fringe ideas become mainstream, and in the process of writing that book especially, I learned that we need to understand the role of social media in this process. I mean, it's clear as day, and everywhere we look there are examples of fringe ideas becoming mainstream. And at the same time there was this remarkable moment in social science. People are calling it the golden age of social science. We have more data than we've ever had before. We're inundated with data. It used to be, years ago, that you'd survey 1,000 people or you'd talk to a couple of dozen people and then you'd tell your story.

But now, we have the ability to collect data about massive networks of people in relatively short order, so that's a huge opportunity, but there are immense challenges as well. And so this lab is really just an effort to corral all the data across disciplines with as many different perspectives as we can, because we think this is an inherently interdisciplinary problem, and to be frank, it stretches beyond academia too. We need to involve advocacy groups. We need to involve journalists like yourself. We need to involve government. And that's very much what we're trying to do right now.

Justin Hendrix:

So one of the things that you do in this book is to take on some of these popular narratives about how social media shapes political polarization. But can I step back a second and just ask about polarization itself? Where are we at? In the US, and maybe in other democracies, we think of polarization as being at an all-time high. Is that the case?

Chris Bail:

It depends how we define it. For the first time ever in the US, out-party hate has surpassed in-party love. So we hate the other party more than we like our own party. We discovered that in some research that I did with a blue-chip panel of academics: social psychologists, political scientists, psychologists. And so we know that affective polarization, that is, intergroup animosity, is growing. What's more complex is actual policy disagreement. There's evidence both ways, but it's probably not growing as much as this affective polarization, which many of us think is actually more sinister than the policy-based disagreement.

Justin Hendrix:

And so a lot of folks have looked at social media and at changes in the media ecosystem as playing a key role in this, but you challenge some of those narratives.

Chris Bail:

There's just an absence of high-quality research in the debate right now. The strongest and loudest voices are tech leaders or former tech leaders, and then policymakers. And none of these people have, in my view, really brought enough research to bear on the question. So to give just three popular examples: first, the idea of the echo chamber. It's a pretty pervasive idea that tech companies and social media platforms have trapped us in filter bubbles. Everybody knows the story. They prevent us from seeing the other side. Well, some of our research shows that stepping outside your echo chamber could make you more polarized, not less.

Another strand of research we did really tried to figure out what's going on with misinformation campaigns. If you read the popular accounts, Russia really tore this country apart in the last four years. We had some unique opportunities to actually study that with pretty sophisticated methods, and we couldn't find any evidence that people were changing their minds when they interacted with the Russia-linked Internet Research Agency, at least in the year that we studied. Finally, there's the idea that algorithms radicalize people. It's a super seductive idea, especially when it's delivered by these kinds of apostate tech leaders, people who allegedly profited from the very algorithms that feed us increasingly radical content. Even there, there's not a lot of evidence so far to support that hypothesis, but of course, we need much, much more data before we can rule out any of these popular narratives.

So I wouldn't want to say platforms are blameless or these things don't matter. But what I would like to propose is this: we've got so much energy on these three things that the evidence suggests are probably not going to move the needle very much, and might even move it in the wrong direction. And we're doing so little in the space of the supply problem: the actual people producing the vitriol on the platforms. One of the main things I wanted to do in this book is develop a theory of how social media shapes uncivil behavior, and then, secondarily, try to develop some solutions to counter these all-too-human tendencies from the bottom up.

Justin Hendrix:

So let me just press you on a couple of those things. One of the things I often think about is, if you were staring down at the planet from the moon, it might be reasonable to observe that some of these things don't have too much of an effect in the bigger picture, if we're looking at large-scale data or network analysis of social media. But if I'm standing on the street in Texas in the middle of a protest that's been convened by the IRA, with both sides in attendance at each other's throats, or if I'm watching what unfolded on January 6th, it's hard to separate the evidence before my eyes that social media has led to something untoward. So I don't know, methodologically or intellectually, how do you separate the anecdotal and the event-driven stuff from that large-scale analysis?

Chris Bail:

Yeah, this is exactly why I think we need a new analogy, and this is why I called the book Breaking the Social Media Prism. The social media prism is the idea that social media reflects back to us a very distorted view of reality. It will always harp on the most extreme parts of the continuum, and moderation, and moderates in general, will seem pretty much invisible. We've looked at this with data: something like 75% of tweets about politics are created by about six or seven percent of Twitter users. And these people have, for the most part, extreme views. Meanwhile, the average Twitter user never tweets about politics.

And so what we're seeing reflected back through this distortion, created by what I call the social media prism, is really an exaggeration of extremism and a muting of moderation. Yes, there are absolutely examples, and January 6th is a great one, where we see how social media can mobilize extremists. I write a lot about that in the book too, about how this can really create momentum. On the other hand, it's a totally different thing to say that social media alone is causing this. And I think that's where a lot of the blame is being cast right now. And I think that's dangerous because, if we don't understand the broader societal context that's shaping political polarization, and also the individual-level social psychological factors that are driving polarization as people interact with each other on social media, then not only do we get a distorted view of the problem, but we're going to come up with the wrong solutions too.

Justin Hendrix:

So if you were listening to the Congressional testimony last week on Capitol Hill, you heard Mark Zuckerberg make an argument that doesn't sound entirely dissimilar from what you're saying right now. Is he off the hook?

Chris Bail:

Definitely not off the hook. I would never go that far. Certainly, we've seen leaked information that suggests Facebook has been trying to look at this stuff. And not just Facebook; we've seen other platforms embroiled in this type of controversy. So again, I would never say they were blameless, and I do think platforms have a role to play. Even if an algorithm is creating a small effect and it's easy to fix, then let's fix it. The problem is, I think nobody can see what's going on. There's no evidence. And that is where I really put blame on the companies: the lack of transparency. Facebook employs dozens and dozens, maybe even hundreds now, of really talented social scientists who are doing really great work. They used to share that work publicly, and that's really dropped off lately.

And so those of us on the outside can only kind of try to peek in and see what's going on. And in the meantime, this creates a massive vacuum and nobody really knows what's going on. So of course, speculation is going to rise up to the top. Speculation by tech leaders and speculation by politicians who are very often self-interested.

Justin Hendrix:

For the first time, we saw politicians really take on the idea of the business model and what they refer to as surveillance advertising, a term one or more of the representatives used last week. And this focus on radicalization, extremism and polarization came out of the mouths of multiple Representatives in many different ways. Do you feel that lawmakers have gone too far in their critique, or where do you think the balance is?

Chris Bail:

Well, I just want to see some actual evidence, evidence that involves real social media users. In this book, one of the things that I discovered is that people can be profoundly different online and off. And again, this is the power of the social media prism: it distorts what we see, but it also distorts how people act. I think I can provide maybe a unique lens onto extremism, because we were able to talk to people. We did interviews over time, for several hours, with a large group of people; then we also surveyed them, and then we followed them online. And so we're able to compare the person people present themselves as on social media with the person they are offline.

And there are several cases that stick out, but the one that sticks out most for me is this guy I'll call Ray. And Ray, when we spoke with him, is the most civil, even deferential guy, who goes out of his way to say, "People online are just way out of control. They are feeding each other's vitriol. I try to avoid politics altogether." Then we linked the data that we had collected about this guy with his Twitter data. And it turns out this guy is just the biggest troll on the internet. I mean, some of the most reprehensible stuff I've ever seen, and I've been studying this stuff for 10 years. Meme after meme depicting Democratic leaders like Nancy Pelosi; use your imagination in the worst way.

And so the question is, how does this kind of Dr. Jekyll and Mr. Hyde transformation happen? And there, I think, those are the types of data points we need, but we need more than the story of one person. We need to be able to put it in a broader societal context. And that's where, again, I think the chief role of legislative reform should be: to create that data. Once we have the data, we can answer questions like, are algorithms really radicalizing people? Or, on the conservative side, is social media really censoring Republican voices more than Democratic ones? This might be, somewhat ironically, one of the few places where there's opportunity for bipartisanship.

Justin Hendrix:

One of the things that's become clear about Facebook's approach to this issue is that they refer to outside research. Nick Clegg, VP of Global Affairs at Facebook, had this widely shared Medium post this week that had a section on polarization. It refers to multiple pieces of research from Stanford, from Harvard, from Pew and the Reuters Institute. We know from BuzzFeed reporting that Facebook has prepared an internal manual or memo for its employees to talk about the issue of polarization. And yet in both the Clegg piece and what we know of that memo from Ryan Mack and Craig Silverman, they never refer to their own internal research. And yet we see that when employees leave Facebook, they often point to the fact that they were aware of research suggesting that the platform was in fact increasing polarization. So I don't know, what do you think they're looking at there?

Chris Bail:

The big problem is, there's an optics issue here. We've had efforts; a lot of people don't know about Social Science One. Social Science One was an effort by some leading academics at Harvard and Stanford to try to create an opportunity for academics like me to go into Facebook and get data. And the original idea was amazing. The idea was that there would be an independent panel of academic experts who reviewed requests to do research on Facebook with Facebook data, including experiments. And that this would be vetted by the experts and the experts alone, and no one inside the company.

Unfortunately, as far as I can tell, that didn't happen and is unlikely to happen. What has happened is Facebook's begun to share one dataset that describes who clicks on URLs. And you can do some stuff with that. For example, you can see how many Republicans aged 18 to 30 were looking at Breitbart in the month of April in 2019, and that's a great data point. But what we really need is to get inside the guts of platforms and understand how they work. We need to experiment on core features of the platforms. And one of the things that we've been trying to do in the Polarization Lab, and I discuss it in this new book, is a platform that we created for scientific research, because we basically gave up. We said, "Look, the experiments we want to run are just not going to happen. There's too much risk to Facebook, in terms of PR, in terms of legal stuff." After all, they have an obligation to their users. They can't suddenly make Facebook anonymous just to see what would happen if Facebook were anonymous. And yet, a lot of the research suggests that's something we should be looking at.

So we're trying to create a new path: create a social media platform for scientific research and pay people to use it, just like we'd pay people to take part in studies. And then we, the researchers, can turn on and off different features of social media, try to figure out which ones are most polarizing, and also control how people are brought into contact with each other.

Justin Hendrix:

So you've built a bunch of apps and bots and tools. They've got great names: Troll-o-meter, Tweetology, Echo Chamber Explorer, and the Polarization Pen Pals, which I think I'd quite like to try. These are similar to tools and platforms we've seen from other researchers, perhaps the NYU Ad Observatory or things The Markup has done. How do you come up with these? How often do you roll one out? And what's the lifecycle of one?

Durham, North Carolina, Thursday April 25, 2019: Christopher Bail, Ph.D., photographed at the Duke Center for Innovation and Entrepreneurship by Alex Boerner.

Chris Bail:

We've been working on these for about two or three years now. And one thing that's a little different about our effort is, we don't just want to show the disparity. So a lot of these, like Blue Feed, Red Feed, were tools that allowed you to see what Republicans are seeing versus what Democrats are seeing. These were efforts to promote awareness about, for example, what ads people might be seeing on Facebook. And that's great, but we really wanted to try to change behavior, to empower social media users to actually change things, because I've become pessimistic about the prospect for transformational top-down change. But I also think that the key driver of polarization is, after all, the people. So that's super depressing, that we are all a part of this, but it also means that we collectively have some potential to transform it.

And so these tools are really directed at social media users to do two or three things. The first thing is to become aware of the social media prism. So these tools, like the Troll-o-meter, will help you try to figure out if you are interacting with a troll. And we do that by training fancy machine learning models on the data that we've collected over years from thousands of social media users. We look at who actually engages in uncivil behavior and political trolling, and then we use these to calibrate the tools. So the idea is to promote a little more awareness about the possibility that the extremists most people are interacting with might actually not be representative members of the other side.
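
For readers curious what this kind of tool might look like under the hood, here is a minimal sketch of a supervised incivility classifier in the spirit of the Troll-o-meter. The labeled examples, features, and model choice are illustrative assumptions, not the Polarization Lab's actual data or methodology.

```python
# Minimal sketch of a troll-detection classifier: fit a supervised model
# on tweets labeled civil vs. uncivil, then score new text. All data
# here is hypothetical; the lab's real models and features are not public.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled training examples (1 = uncivil/trolling, 0 = civil).
tweets = [
    "You people are traitors and you know it",
    "Anyone who votes for them hates this country",
    "Interesting point, though I read the bill differently",
    "Thanks for sharing, here's a source with more context",
]
labels = [1, 1, 0, 0]

# Bag-of-words TF-IDF features feeding a logistic regression.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(tweets, labels)

# Score a new tweet: estimated probability that the author is trolling.
prob = model.predict_proba(["You are all brainwashed sheep"])[0][1]
print(f"Troll probability: {prob:.2f}")
```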

At scale, if more people become aware of this false polarization, this false sense of polarization that we have, then the hope is that they'll stop feeding the trolls. So on the one hand, let's empower people to avoid extremists. But on the other hand, we also want to empower people to understand how their own behavior contributes to this. So some of our tools will read in your Twitter feed and compare you to a broad sample of Twitter users to help you figure out: what about you? Are you an extremist? Are you in the middle? And maybe give people a little more self-awareness about how they themselves might contribute.

Now, we don't think people are going to use the tool and say, "Oh, I'm an extremist, so I'm going to tone it down." What we really need is more moderation, because the data is clear: most people are moderates, and most moderates don't talk, so we need to boost moderates. And so the second set of tools is really about boosting moderates and, even more than that, creating an incentive for moderation. Right now, the incentives in my view are all messed up. We have every incentive to say something sensational. The easiest way to get likes on Twitter is to say something anti-Trump or anti-Biden. There's just an army of people that will chime in and start liking that kind of content.

What we really need to incentivize is people who actually produce content that appeals to a diverse group of people. Now, that could be about politics, that's the way I conceive it, but it could be about demographics in general. That would actually make social media more effective at creating consensus. And so one way that we try to do that is to create a kind of status around moderate behavior. We have, for example, a bipartisanship leaderboard that ranks prominent elected officials, journalists, and advocacy groups according to how much their tweets resonate with both sides. And not to embarrass you on your own show here, but you are on that list, so you're doing something right. We could take a look at the data to figure out exactly what it is, but you're in the minority. Not everybody is resonating across party lines. In fact, most people aren't.
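
As an illustration of how such a ranking might work, the sketch below scores each account by the geometric mean of its engagement from liberal and conservative audiences, which rewards accounts that resonate with both sides. The handles, counts, and metric are all hypothetical; the lab's published leaderboard may use a different method.

```python
# Illustrative bipartisanship ranking: score accounts by how evenly their
# tweets resonate with both sides. The geometric mean is high only when
# engagement from BOTH liberal and conservative users is high. All
# handles and counts below are made up for the example.
import math

# Hypothetical likes received from liberal- and conservative-leaning users.
engagement = {
    "@moderate_journalist": {"liberal": 450, "conservative": 380},
    "@partisan_pundit": {"liberal": 3000, "conservative": 12},
    "@bridge_builder": {"liberal": 200, "conservative": 210},
}

def bipartisan_score(likes):
    """Geometric mean of cross-party engagement."""
    return math.sqrt(likes["liberal"] * likes["conservative"])

leaderboard = sorted(engagement, key=lambda a: bipartisan_score(engagement[a]),
                     reverse=True)
for account in leaderboard:
    print(account, round(bipartisan_score(engagement[account]), 1))
```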

But we want people like you to do more of that. We want to boost you and tone down the extremists. So we have the leaderboard, and then we have bots. We have two bots named Polly. These bots retweet people on the bipartisanship leaderboard. One of the Pollys retweets conservatives who our research shows have resonated with liberals, and the other does the same for liberals who have resonated with conservatives. So these are all tools; are they going to create transformational change in and of themselves? Of course not. This is about nudging people to become more self-aware, to recognize that moderation is out there, you've just got to find it. At a minimum, it creates something like a new kind of civic education.
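
A Polly-style bot could be sketched in a few lines against the older Twitter API. This is an assumed reconstruction rather than the lab's actual bot code; the credentials and account handles are placeholders.

```python
# Sketch of a Polly-style bot: amplify recent original tweets from
# accounts that rank high on the bipartisanship leaderboard. Uses the
# tweepy v3 / Twitter API v1.1 interface; credentials and handles are
# placeholders, not the Polarization Lab's real configuration.
import tweepy

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")
api = tweepy.API(auth)

# Hypothetical conservative accounts that (in this example) resonate
# with liberals; the mirror-image Polly would retweet liberal accounts.
resonant_accounts = ["example_conservative_1", "example_conservative_2"]

for handle in resonant_accounts:
    # Fetch each account's latest original tweets and retweet them.
    for tweet in api.user_timeline(screen_name=handle, count=3,
                                   include_rts=False):
        try:
            api.retweet(tweet.id)
        except tweepy.TweepError:
            pass  # already retweeted or not permitted; skip
```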

We can talk all we want about self-awareness, but what we really need to do is make it habitual, and that's where the tech comes in, I think. Following a bot makes this habitual; there's not a lot of effort involved. Or looking at the bipartisanship leaderboard and seeing where you're at. That's a simple thing to do, we hope.

Justin Hendrix:

So you draw a distinction in the book between polarization and extremism. Even as you have suggested here that you might not see a proven connection between the social media environment and polarization, you do see the social media prism driving extremism. Can you just touch on a couple of points that are in Chapter Five?

Chris Bail:

Yeah, definitely. We just have the wrong model of why people use social media. The conventional model, and this is something you see in Mark Zuckerberg, Jack Dorsey, and other tech leaders, is that social media is really a competition of ideas. We go on there, and in its best form, it's really about people deliberating about what's true. And at the end, the truth rises up and we're all better for it. Way back in 2000, a lot of people thought that was possible. And now it's clear as day that's not what's happening and that's not why people use social media. Yes, we might get a little bit of information here and there, but really what social media is doing is a lot more profound. I think it's actually beginning to shape the way that we understand ourselves and the world around us, and the way we create our identities.

And we know now that identity is so central to polarization, the "us versus them" mentality. What we don't know a lot about is how social media shapes that behavior. Sociologists have known for a century that every day we present different versions of ourselves. We observe how other people react, knowingly or unknowingly, and then we cultivate those identities that make us feel good about ourselves, that give us status or whatever it is. This isn't necessarily a conscious process, but we all do it all the time. And so the really interesting question, the one that fascinates me, is: what is social media doing to this process? And I think it's doing two things.

The first thing is, we have unprecedented flexibility in what type of identity we can present. I could be a middle-aged female breakdancer on social media, but anybody who meets me in person would laugh at that. That's one thing. And then the second thing is that we have these powerful new tools to monitor what other people think of us: like counts, follower counts, and so on. And of course, these things are profoundly misleading. If we suddenly get lots of likes when we say something anti-Trump or anti-Biden, very few people are thinking, "Well, who's liking this stuff and why?" It's just that sense of validation, especially for people like that guy Ray I was talking about, the Dr. Jekyll and Mr. Hyde guy.

This guy is a middle-aged divorced man who lives with his mother. There's not a lot going on for Ray. On social media, though, he's getting a kind of micro-celebrity status. And yes, it's artificial, and yes, it's other extremists liking his stuff. But for this guy it's actually really important; it keeps him going. Now, if we take that as our assumption, that social media is really not a competition of ideas but a competition of our identities, then it really helps us understand the growth of extremism, people like Ray, but also why moderates don't want to get engaged.

Justin Hendrix:

So, Chapter Seven. Maybe I'll give you one of these kinds of congressional yes-or-no questions. Chapter Seven: should I delete my account? Should I delete my account, yes or no?

Chris Bail:

No.

Justin Hendrix:

Okay.

Chris Bail:

And it pains me to say that. The irony of me writing this book is that I'm not the hugest social media user, and I see all the negatives. I can see how that idea is seductive. We just all need to go bowling together again, we all need to get outside together. Kumbaya. Like, great. That would be wonderful. But when we actually scrutinize the delete-your-account campaigns, we see a few interesting things. So first, yes, #DeleteFacebook trended briefly, and lots of famous people, Elon Musk, Will Ferrell, and others, deleted their accounts. But in the book, I describe discovering that one week later, one of the most popular search terms on Google was how to undelete your Facebook account. So people came running back.

Why did they come running back? It's because social media fulfills this human instinct for us. We all need to know what other people think of us, constantly. I mean, some people more than others, and obviously some people care a lot more about what happens on social media than others, and that's important. The addiction isn't just shiny lights and cute cat memes; it's really about a hyper-efficient way of monitoring our social environment. And that's why I think we're never going to get completely away from social media, especially because, obviously, young people are online in unprecedented numbers, but then also in the political space.

A recent study just came out of Harvard that suggests the vast majority of Republicans and Democrats live in areas where they're basically never going to encounter someone from the other party, and certainly under COVID, these trends are only getting worse. Like it or not, and again, I don't really like it, social media is probably going to be one of the last places where cross-party political deliberation is possible. So the question, I think, is how we can make it better, how we can make that deliberation possible, rather than, should we delete our accounts or not?

Justin Hendrix:

And that's where the book lands, on a better social media. And I presume you work with students on this question as well. Once the dust has settled, 10 years from now, what do you think tomorrow's social media networks will look like?

Chris Bail:

Yeah, it's the million-dollar question. It's important to have that historical perspective, because I'm always telling my students, "There used to be this thing called MySpace." They have no clue what MySpace was, but I barely understand TikTok, so there you have it. There's always going to be a generational displacement, but also, if we take the long view, something comes along and replaces the dominant platform every three, four or five years. Even Facebook had to buy Instagram to stave off that surge. So is Facebook going to go away tomorrow? No, of course not. It's got amazing market power. It's not going away anytime soon. But are there a lot of people dissatisfied with social media who would be open to trying a new platform? Absolutely, in my view. There are going to be a lot that fail; the graveyard of social media is getting heavily populated. And who knows what will happen with Parler or Trump's platform or all these kinds of things, but it's clear that people want something better. So yeah, the question is, how do we make that, and who makes it?

Justin Hendrix:

Chapter Six focuses on moderates and how they're muted by social media. What's that all about?

Chris Bail:

So let me tell you the story of a woman I'll call Sarah Rendawn. Sarah is a moderately conservative woman. She's from New York, and she's half Puerto Rican. Her dad was a cop. She's married to a guy who owns a gun and likes to shoot at a local range, a responsible gun owner with very moderate views. She reads the New York Times, reads The New Yorker, even if she doesn't agree with them all the time. And so one night, late at night, she tells us, she's on Twitter and the NRA posts something, and a bunch of people are piling on the NRA for being generally terrible. And she says, "Hey, responsible gun owners like my husband deserve respect," blah, blah, blah. Within minutes, people online had discovered from her Twitter feed that she had kids, and someone says, "I hope your kids find your gun and shoot you." That's pretty extreme, and it freaked her out. She deleted her Twitter account right away. She ran offline; she changed the locks on her house. I mean, she was that scared.

The tragedy for me, getting to know this person, is that this is a person with really nuanced and important views about things like race and policing. Our debate about race and policing right now is probably as polarized as any other. We need people like her to find middle ground. If there is middle ground, it's going to come from people like her. And yet, because of experiences like this, she is completely invisible online in terms of politics. You won't ever see her talking about politics. It turns out the story of this woman, Sarah, is the typical story. The number one reason people get harassed online now, according to a recent Pew report, is their political views.

A lot of people have an experience like Sarah's, and if they don't, they might have an uncle or an aunt or an in-law who doesn't share their political views. And for them, talking about politics on social media just makes Thanksgiving dinner even worse than it would be. There are all sorts of offline dynamics shaping what's going on online. And I think the single biggest problem is that people like her have no incentive to put out moderate views on social media. They're only going to get her in trouble, not only with the other side, but with her own party. If she says, "Hey, yeah, we should have background checks," which is something she believes, by the way, or "we shouldn't have an assault weapons ban," guess who jumps right on her? The extreme right, right away. For people like Sarah, there's just no incentive to engage. And so thinking about how we can get people like her involved in the public discussion, I think, is really paramount.

Justin Hendrix:

Are you optimistic or pessimistic, ultimately on this question?

Chris Bail:

I have been described as a dystopian idealist. I'm cautiously optimistic, only because we know that the scale of the problem seems a lot worse than it is. And we've known that for a long time, by the way. False polarization, our tendency to exaggerate the ideological extremity of the other side and downplay the ideological extremity of our own side, has been documented since 1980. It's just something that we do, and social media has kicked it into hyperdrive. So my hope is that we can develop tools to counteract these tendencies, and it's going to be a multi-pronged strategy. It's going to be users becoming more aware of their behavior and how it contributes to polarization, or, for people like Sarah, their lack of behavior.

Then, from the top down, it's going to be about creating different status incentives that reward people whose views resonate with diverse groups of people, instead of those who are only preaching to the choir. So yeah, I'm cautiously optimistic. We've got really, really smart people just jonesing to get a hold of this data. And I think the ultimate solution would be a purely evidence-based platform that could adapt. We know that Facebook and other platforms are running experiments internally, but there's no oversight, there's no public discourse about what's working and what's not, and we're not seeing the data. Again, platforms could be more transparent with the data and create a better opportunity for everyone to figure out how we solve this problem, which I personally think we'll be talking about for many, many years.

Justin Hendrix:

Well, Chris Bail, thank you very much.

Chris Bail:

Thanks. Thanks so much for having me.
