Election Misinformation Thrives on Major Social Media Platforms
Justin Hendrix / Oct 2, 2022
Audio of this conversation is available via your favorite podcast service.
The former President and his supporters continue to sow doubt in the outcome of the 2020 election, and in the election system more generally. Now, with the 2022 midterm elections just a month away, a number of observers are perplexed at the posture of large social media platforms, where false claims continue to fester and efforts to mitigate misinformation always seem puny compared to the scale of the problem.
This week we hear from three experts who are following these issues closely:
- Nora Benavidez, Senior Counsel and Director of Digital Justice and Civil Rights, Free Press and one of the individuals leading a coalition of groups advocating for tech firms to do more to confront election misinformation;
- Paul Barrett, Deputy Director, Center for Business & Human Rights, NYU Stern School of Business, and the author of a recent report, Spreading The Big Lie: How Social Media Sites Have Amplified False Claims of U.S. Election Fraud; and
- Mike Caulfield, Research Scientist at the Center for an Informed Public, University of Washington, and one of the authors of a recent post for the Election Integrity Partnership on the factors that shape the virality of a rumor.
What follows is a lightly edited transcript of the discussion.
Justin Hendrix:
So we're here today to talk about election misinformation and the intersection with social media platforms. And it's just a few weeks now until there is, of course, a major election in the United States. Elsewhere in the world, there are major elections about to happen in Brazil, of course, in October, and in multiple other countries as well. So a timely moment to do this.
I want to start with you, Paul. You've just published a report from the Center on this issue, and to some extent have tried to, I think, take into consideration how we define this problem and what the potential solutions are. But before we get into the solutions, can I ask you just to sort of state what you think the problem definition is at the moment?
Paul Barrett:
Sure. In a sentence, I think it is a lack of urgency on the part of major platforms, which purport to be concerned about mis- and disinformation related to elections, but are not acting in a way that meets that purported concern. In this country, in the United States, I think the big issue is election denialism: the spreading impression that many voters, particularly on the right, have that our elections are corrupt generally. This is a sort of expansion on the big lie that Joe Biden was not legitimately elected in 2020, and that's now metastasized into this conclusion, even an article of faith I would say, in a significant part of the Republican party, that elections generally are corrupt, and that if my side does not win, then the election was illegitimate. And that idea is being spread in a variety of ways by a variety of actors. One of the ways it's being spread is via social media. And I think those companies simply are not taking this growing crisis seriously enough.
It's very important to note that this phenomenon is particularly pronounced in swing states, in the handful of states that actually determine presidential elections because they're truly up for grabs: states like Arizona, Wisconsin, Pennsylvania, and Nevada. So I think that's where the heart of the problem lies at the moment.
Justin Hendrix:
Mike, I want to come to you next. Perhaps you can give the listener just a sense of the Election Integrity Partnership and also your research agenda, and say whether you agree with Paul. I know that the Election Integrity Partnership has done some of the most comprehensive work in terms of detailing false claims about the 2020 election and looking at phenomena that arise from the data sets that you've gathered there. And you've got some new results that you've just shared in a blog post about the virality of such claims. But maybe I'll just turn it over to you and ask you to hold forth.
Mike Caulfield:
Yeah. So what we look at primarily is election rumor. A lot of people think that election rumor is universally bad, but that's not the case. Rumor has a lot of really important social purposes, right? People spread rumors as they're trying to make sense of things. People engage in rumoring sometimes to elicit official denials. People say, "I think this thing is going on" because they want to hear someone say, "No, we'll show you that it's not." And all that is very healthy.
But at the other end of that, you have a set of claims where rumors and misinformation are really leveraged, not to engage in sense making, not to call attention to real problems of people in power, but to sow doubt in the legitimacy of the election itself, as I think we've seen over the past couple years, in ways that have really serious social effects.
And so what we do is we look at and analyze the variety of election rumor that flows through social platforms, and we look at the way it spreads and we look at the nature of the rumors themselves. And we try to understand those patterns so that we can address both legitimate confusion of voters and citizens, which is often leveraged against their interests, but also in many cases stuff that is more maliciously oriented and targeted. So it's really the whole range of that.
Justin Hendrix:
Before I go to Nora, could I ask you, Mike, to respond to Paul's problem definition with regard to where we're headed in 2022? Are you already beginning to see the effects of big lie-related rumors?
Mike Caulfield:
Yeah, so I guess a couple really short points on it. It's really common for people to cast doubt on the legitimacy of an election in a broad way. It's not historically unique. You can go back to any election, you can find people saying, "Yeah, I don't think that election was quite fair," or, "It wasn't fair," or, "There was some cheating." That's not particularly unique.
However, there's a couple things that are unique about our current moment. One is the specificity of these claims and just the sheer volume of them. It's not a few people focusing on one state in a general feeling of fishiness. It's thousands. It's literally thousands of individual claims that people are making. Two, to Paul's point, there's a systematicity, at least on the right, there seems to be a growing systematicity of these claims. It's not people making this claim or that claim, but building really this larger complex interlocking narrative about the election that incorporates all of these claims. And that happens on the left too to a much smaller extent. You saw some of that in 2016 in the primary race with Clinton and Sanders.
But that brings me to the third point, which is probably the most important point. The real place where these things do harm is where they have pathways to respectability, these sorts of claims that are misleading or completely fabricated. If they have pathways to respectability, if they begin to be discussed among people that have power and make decisions and make laws and have opportunities to intervene in various parts of the election process, that's where the real worry is. I just really do want to stress that. If your Uncle Bob thinks 2020 was stolen, I really don't care. Uncle Bob can think whatever he wants. The real worry is that it's starting to corrupt actual election integrity. Because as those, in many cases, lies seep into the process and the people that run the process, it's having adverse effects on the process itself.
Justin Hendrix:
Nora, I think of you as being both a leader of coalitions, and also an advisor to groups that are working at the grassroots on these issues. From your vantage, how do you consider this problem going into the midterm?
Nora Benavidez:
Well, I really agree that we have an urgency messaging failure. I really agree with Paul in many ways that this is largely overlooked. I would expand in one way, and I think it's just because of how I personally have seen, as a movement, we began so many years ago in the 2016 and even 2018 electoral context to think about bad actors and the infiltration of malign narratives. As that happened, there were these seeds that would do any number of things: divide people, wedge issues like immigration or abortion that helped define how voters then behaved despite being premised on lies.
And at that time, I personally, and I think a lot of people, took a more kind of content-specific approach to defining the problem. And what we've learned, and certainly what I've now seen over the last couple of years working more with and trying to pressure social media companies, is that the ecosystem has changed. Not just because social media companies themselves don't think this is urgent, but because they refuse to come to the table with civil society, with elected officials, with frankly any other sector, researchers, others, to engage on the ways that their platforms are actually shaping the information we consume.
So it isn't just that there's this bad stuff that exists, and we might see it, which is in many ways I think how some of us conceived of the problem years ago. But now it's that we've actually seen the business models, some of the algorithmic recommendation systems, other machine learning across platforms that feed us problematic stuff. And I say that in the most blanket terms. Whether that is hate, calls for violence, content that might suggest for certain voters that immigration officials will be at the polling locations. That's the kind of narrative that then, even if someone is a US citizen, they might be worried about going to the polling locations in their respective state. We saw that kind of content in 2020 on Twitter and on Facebook.
And so I just sort of think the problem isn't just an urgency one. It's the complete unwillingness of social media companies, which now play a large and not neutral role in our information consumption, to engage, and their inability and unwillingness to really put people and safety first. Those things are threatening not just the US elections; they are threatening elections around the world. We have seen that play out in the Philippines, in Kenya. We are seeing it play out in Brazil. It's no longer something that we can take a bite-size piece of and say, "Oh look, here's what's happening in the US. They should know better now." This is not a dress rehearsal.
We've now seen four major elections in the US alone in which these companies have known about the problem, been given evidence, and, I think, are now among what I view as the larger set of major actors. It isn't just the disinformation campaign producers; it's the arbiters of our information consumption that are really concerning to me now. And I think that plays out across the grassroots partners that I work with. Groups are beginning to really see that the issue isn't just a content one, but a production and systemic failure from a lot of different actors. And so that means that the solutions are going to have to be very complex. And it's kind of daunting at this moment.
Justin Hendrix:
Paul, does that correspond to what you learned in assessing the platforms' current policies and responses? I mean if Nick Clegg were on this call, he'd say, "Nora, what are you talking about? We're spending $5 billion on content moderation and trust and safety initiatives. We've got the largest voting information platform on the internet. We've made unprecedented investments in this area."
Paul Barrett:
I think that's a good channeling of what Nick Clegg would say. And I certainly don't disagree with anything Nora said. She can speak to her experiences and her colleagues' experiences in dealing with executives and employees from the various social media companies, obviously from her firsthand experience, which is much more extensive than mine.
In my own dealings with them, which I would compare more to those of a journalist than to an activist, even though we do engage in advocacy at the Center, I would say that for the most part it's the lack of urgency, as I phrased it, or something worse than a lack of urgency, as Nora describes it; again, I don't think that distinction is huge. I think there is a desire to minimize the problem and to basically throw out a bunch of policies which sound fine in the abstract, but which tend to have very significant, and at this point well-known, gaps and flaws, as Nora suggested.
Mike emphasized the significance of influential and powerful people embracing false narratives about elections. That it's not just his Uncle Bob that he's concerned about. It doesn't matter what Uncle Bob really says at Thanksgiving dinner. But when most of the Republican candidates for significant state offices in a state are making election denialism an essential theme of their campaign, that sets up a real echo chamber, at least on the Republican side of the aisle in that state. And I think it basically continues to dig the hole that we started digging collectively in 2020 even deeper when it comes to whether our elections are going to be seen as legitimate by a wide swath of society. Some of these failures, I think, are not secret anymore. We've discovered them and people have debated them, and yet they persist.
Facebook does, as you were suggesting Nick Clegg would, also talk about its extensive fact-checking operation, whereby it has relationships with, at this point, more than 80, I think approaching 100, outside organizations around the world that do fact checking for it. When those organizations determine that a given piece of content is significant, not just something frivolous, that it has some virality to it and that it is demonstrably false, then Facebook has committed to appending that finding to the piece of content, and demoting it so that it's much less likely to be seen by many Facebook users. All fine as far as it goes. I would urge them to consider removing the content altogether and keeping just a record version of it for people to study as needed, but in a way that wouldn't be disseminated in any way. But leave that to one side.
So you have this fact-checking system, and you have a remedy that flows from the fact-checking system. You are applying that during election season basically to everybody except some of the most influential speakers at that moment in time: political candidates and incumbents. Facebook has made it clear since 2019 that that is their policy, that politicians can distort the truth and disseminate falsehoods. And I think that's a central flaw in their whole fact-checking apparatus. There are similar flaws at other platforms.
Twitter is just one more example that I'll offer. Twitter has a so-called civic integrity policy, which again, in theory sounds fine. They put it in place, however, only during, as they define it, the pendency of elections. So in March 2021, the civic integrity policy was basically turned off and they stopped enforcing it. And then they announced in late August of this year, "We're turning the civic integrity policy back on." So for a 16-month period, they just weren't paying attention to that issue. And it's precisely during that period that the falsehoods about so-called ballot trafficking, corruption of election machinery, and other components of the larger election denial narrative were gaining momentum on places like Twitter, often in the voice of candidates who are very influential.
So I would agree with what Nora said. I think the longer you look at this, the more unnerving it gets. We'll have one set of problems this year, in the 2022 midterm elections. But I think the real danger here is that we're going to see some number, and it doesn't have to be a large number to be significant, some number of election deniers elected to office in a handful of swing states, the states that will determine who gets elected president in 2024. In 2020 we just barely hung on by our fingernails, because a lot of state officials actually did their job and upheld the popular vote and certified electoral votes and sent them to Washington, where Mike Pence did his job. Instead, we're going to have in several key states, I fear, people in key jobs like secretary of state or attorney general who are looking forward to undermining the election if the other side wins. And I think we could have a degree of chaos that will make 2020 look tame.
Justin Hendrix:
Mike, just back to the Election Integrity Partnership: it did issue a set of recommendations in its 292-page final report in 2020, which is an incredible document.
Mike Caulfield:
There's an abstract too. I just want to point out there is an abstract. But yeah. No, we did issue a bunch of recommendations, and some of them I think were taken seriously. But ultimately, the most important stuff is still broken. So let's talk about a couple of things. One, the sort of fact-checking model that we have. I mean, I think it's good that the platforms work with fact checkers. But the fact check arrives 36 hours later. And on average, if you look at the actual plots of how these things take off and die out, it's over. It's over 36 hours later. We see in chart after chart that we plot out that quick rise up, the slow decay, and then over here in the very trailing edge of that curve we see the fact check. And so that doesn't really work.
It's a model that could work with some other types of misinformation, stuff that's not around specific events but on a longer burn. Certain types of health misinformation are just persistent, without these sudden event peaks, and the fact-checking model can work with stuff like that. But for stuff where there is an event, and then there's this quick rumoring and conspiracy theorizing activity around it that boosts it up and propels it sometimes into a trending topic, the fact check comes too slow.
Paul mentioned another thing that has been incredibly important. The idea the platforms have that the election period is the only time they have to look out for election misinformation has done so much damage. So much damage. Just incalculable damage to our democracy, and I'm sure to democracies around the world. One of the things that we found is that during the election a lot of stuff kind of came in piecemeal. There were all these little separate events, some of them almost akin to urban legends, that kind of hit people one after another on a daily basis. And that's not good.
One of the things we found after the election, though, after the 2020 election, was that people came in and took all those individual events and built them into this large, as I said, complex interlocking conspiracy theory, right? And so you had all this stuff that was laying out there at the end of 2020. And then Twitter and other platforms shut down monitoring of the election and just allowed people to take all those little pieces, all those little Legos, and sort of build this larger election conspiracy. Which has now become, as Paul said, the belief system for some candidates. And you allowed people to construct this incredibly false but also dangerous belief system online because the idea was, well, the election is over. We don't have to deal with this.
We see more new claims, more new misinformation, about the 2020 election right now. I mean, we're getting closer to 2022, but we see more new misinformation about the 2020 election at this point in September than we were seeing in September of 2020. I think there are some things that you can look at. Is it hitting the trending topics as much? I think there is some stuff going on behind the scenes that is dealing at least with some of the push into trending topics and things like that. But we don't have visibility into that, so I can't say for sure if that's just luck, or if that's something more systematic. I can certainly say, on the sort of variety and volume of claims, we see at least as many, probably more, claims in September 2022 about the 2020 election, the past election, than we were seeing about that election in September 2020.
Justin Hendrix:
Nora, some of the coalition work that you've done has obviously been very helpful and useful and important. But when I wear my pessimist hat, I sometimes think of it as slightly free labor for the platforms, as groups are flagging information, flagging posts, flagging narratives, flagging problems on the platforms for the platforms. And I remember observing things like the Disinfo Defense League in the 2020 cycle, and it was all hands on deck. People were working 24/7, almost like being on a sinking ship and bucketing out the water as quickly as it comes in. The platforms are addressing elections in an episodic way, but civil society groups don't have the budget or resources to deal with these things all year long either. I don't know. How do you square those things?
Nora Benavidez:
Well, to be a little tongue in cheek about it, I do laugh when companies write back or after meetings that we've had, which I can detail a little bit here, and their final comment is usually, "And of course if we've missed anything, flag everything for us. It'll really help us." And I have to laugh because it's always sort of the CYA clause, if you will, to make sure that if they're not doing their job, at least someone's doing their job. "And thank you so much Nora and all of the civil rights groups that are focused on safety and democracy." So all I can do is kind of chuckle at that point.
Here's what we've been thinking about, and this is across a lot of different coalitions. I think there have been some really great moments over the last couple of years in which solution conversations have helped us move a little bit beyond the hand wringing. And that's really happening in the policy and regulatory space, which has been great. The corporate space, working with companies, has been so hard. And to that point, just see all of the things that we've talked about so far today: getting companies to meet with us, getting companies to respond, opening up what many have called this black box, where a lot of times researchers and journalists don't even have access to some of the most critical and basic information and data about these platforms.
So we've made some exciting strides on the policy front, to think about how we can protect users, how solutions and other kinds of long-term reforms could begin getting drafted and conceived of, how we build consensus. On the other hand, we've been trying to think also about, well, what can we push companies to do themselves? Because not everything can actually be legislated. There have to be some things that companies do. And that's where a lot of the coalition work has been, betwixt and between. I am not going to mince words. I think it's really hard when we know that these companies essentially have no good faith in coming to the table or sharing information. It's really hard to try to bring them along any more than exactly as you frame it, Justin. Their response is, "Do my job for me. It'll be great."
And so over the last many months we have felt, across different coalitions, it was important to help build a record that shows the companies, the major social media companies, are not treating these threats as anything more than anecdotal. They are. And I remember in 2021, I attended a sort of civil rights meeting with Twitter, as just one example. It was supposed to be a look back at the 2020 election. And there were some takeaways about how they were hoping to do better in 2022, and all of the other things like the Birdwatch program and what that might mean for accountability. Again, a consumer- and user-centric approach to moderation.
But at one point I sort of spoke up and said, "I think whether it's COVID or election content, or any other content that has the potential to be weaponized for people to believe something false, these are not threats that wax and wane, at least for you as a company. You have to treat these things as evergreen threats. Even if it feels like there may be surges, the surge doesn't come out of nowhere. There is no vacuum." And human doubt, human curiosity, human suspicion is an element Mike certainly could talk more to, I think, but that also isn't something we turn on and off. So if you hear just a tiny little whiff as a consumer online of something false or concerning about a COVID-related official doing something, there's no barrier to then prevent you from believing another kind of official will do something. Those are really porous human behaviors, right?
And so in talking with Twitter, I remember trying to kind of lay out this argument. You have to begin thinking about election integrity, civic integrity, all of the other ways that misinfo may infiltrate people's feeds as evergreen. And the response was, "Wow, that's really interesting. We hadn't thought of that." That was in 2021. Knowing that the midterms were coming, part of what we've tried to do across these different coalitions, namely Change the Terms, which is a coalition of about 60 civil rights and consumer advocate organizations, is pull together meetings and demands for companies at the highest levels to try to synthesize some of the problems, the evidence of failures, the evidence of inaction.
And for us to present a united front to say, "Listen, stop recommending hate. Stop showing people disinformation or calls for violence. Also, staff up and protect people all the time, year round, across the globe." And then finally, increase transparency for people like Paul or Mike or others, so that we really have a better sense of the whole pie of the problem. And over the last many months we've met with companies, we've presented these issues, we've tried to talk with them about the urgency.
And largely, the broad-strokes takeaway is that the companies have provided us very little intel, almost nothing beyond their public announcements from August or September, depending on when they came out with their election plans, and have committed largely to what they were doing in 2020. A little bit more in terms of language protection and trying to gather data across languages. But the issues haven't changed. They've only become more complicated, and the companies have also not changed. They've in fact stagnated and resisted engagement.
So the basic question you pose: how do you reckon with these episodic moments? Civil society can't do it alone. We can't. And yet we've been forced to build enough of a paper trail to try to show that failure is not something that happens just in the month before an election. It is too late. It is September. Well, it's the end of September right now as we're recording this. It is too late for any of the major platforms to implement any meaningful reform ahead of the midterms.
So let's just, for a reality check, understand that we are now dealing with largely what we're going to deal with over the next five weeks. And we are ill-prepared. The companies are ill-prepared. They don't come to the table with an interest in really rigorous debate about what it means to be more prepared. And we tried to do that in April. We tried to do it in May. We tried to do it in June. We couldn't even get meetings with some of these companies until July and August. And to me that speaks volumes about their lack of interest, about this being and remaining a problem for which they string solutions together at the eleventh hour.
Justin Hendrix:
So I want to ask a question, and this might be more for you, Paul, but Mike, please feel free to get in on this as well. Is part of this the political context, that there are different concerns about misinformation across the political aisle? And of course some individuals may end up in powerful positions if Republicans win the House in November. I'm thinking particularly of Jim Jordan, who was one of the most ardent individuals advancing the big lie in Congress before and after the 2020 election. Are the platforms on some level kind of bound to the politics of the situation? They know that the Republicans might have subpoena power, and they don't want to upset them by seeming to pitch in with civil society groups that may be regarded as working counter to their own interests?
Paul Barrett:
Rick Hasen at UCLA, the election law guru, offered that speculation, and I quoted him on it in my report as one possible explanation of what the motives are here. And that may certainly be an element of what's going on. I don't think it describes the entirety of it, though. The sort of perplexing malaise that Nora described, which I encountered as well, the seeming almost obtuseness about the urgency of the larger situation, I think has a number of explanations.
I think we've seen documented, particularly in connection with Meta, with Facebook, a hypersensitivity to attacks from the right, such that the company has for years now sort of bent over backwards to try to protect itself against accusations that it is biased against the right, by taking a pass on internal company proposals for various types of reforms that might have the practical effect of moderating more content, down-ranking or removing content that would piss off people on the right more than the left.
So that's a phenomenon that you can see illustrated over and over again, whether it's Zuckerberg's trips to the Trump White House, or even his recent somewhat bizarre three-hour-long performance art with Joe Rogan, a significant part of which consisted of his basically saying, "I'm really not that interested in these issues anymore. I'm focused on the metaverse." Which is a tendency at that particular company I think we've seen articulated repeatedly over the years: this notion that if only you guys would just kind of leave us alone, we would do our business, and somebody else would take care of these questions of truth and falsehood and political bias.
But I think there are other corporate issues at play here that deserve to be acknowledged. In connection with Meta, the company is seriously trying to pivot toward a new product line captured by this vague term metaverse, 3D hyper-immersive platforms that Zuckerberg sees as the future. Moreover, the company is really rattled by the degree to which it is losing market share to TikTok and short video, a product where it's just not competing, and certainly not competing for younger users. And also just generally mediocre-to-terrible financial results. Its stock price is way, way down. All of those things, sadly, weigh on the degree to which the company is giving priority to the issues that we're talking about.
Similar things are going on at Twitter. Everyone knows about the Elon Musk situation, where he made a sort of hostile bid for the company. The hostile bid became a non-hostile bid, and then he decided to pull back and say in the end, "I don't want to take over the company." And now they're engaged in litigation in state court in Delaware, which could determine the future of the company. Twitter is seriously distracted by that whole situation. And its economic results are poor. Twitter has never been a financially stable company to start with, and now it's really in great peril.
The degree to which all of these things are distracting from focus on the civic integrity issues, the election issues, I think has to be taken into account. It helps explain what to me otherwise seems like a bizarre indifference to slipping into a situation where we could have a repeat of 2020. It is bizarre at one level, and the companies still need to be held responsible regardless of their other business troubles. But their business-related troubles are very relevant to what we're talking about, and to explaining why the things that we're describing are going on.
Justin Hendrix:
Spoken like a true former business journalist.
Nora Benavidez:
I'd like to jump in, Justin. Can I say just a couple things?
Justin Hendrix:
Yes.
Nora Benavidez:
Because I think the political question is just so interesting. And my own personal worry and attention is never on the disinformation arc alone, or that it is dividing people, because that's a favorite line, though it is in many ways. I think the more troubling issue is the ways that lies help convince people, namely policymakers, to develop structural inequities. And that's something that's now happening at every level of government here in the United States. It's happening in our election laws, which are giving state officials the ability to usurp local election results. There is, of course, the early bird on the Republican side, if you will, that has claimed censorship, which is I think in many ways a misnomer here. And the idea that tech companies censor political viewpoints is inaccurate from a First Amendment perspective.
But together, all of these sorts of claims have helped create a climate and a kind of fever pitch of victimhood. And it's really persuasive: those who say they are censored, those who say they are unheard, must rise up. I mean, it's a very compelling story. And what it's led to are barriers and limits on basic rights. So the end result is not just that people have more hate in their hearts, but that the ways we engage with democracy and with each other are harder. And they will necessarily, I think, ultimately give government the ability to pick and choose who has the right to speak, the right to vote, the right to do any number of things.
And I'm not sure what that political moment now should be calling for from others, though it points to the need for a kind of existential consensus building. Because this is not the tobacco-era regulatory moment. This is something else. Comparisons have often been made to that issue from many decades ago, which policymakers grappled with. But this is so much more insidious and sneaky that I think ultimately it's going to take a while for that consensus building to reach the fever pitch it needs.
Justin Hendrix:
Mike, I'll put one last question to you. As a scientist studying this as a phenomenon that's occurring in these systems and in this broader political context, I don't know, do you see a trajectory here? Can you imagine 2024, 2028 being better somehow if certain decisions are made? Or, as Nora is saying, and to some extent Paul as well, are these platforms, the incentives that they create, and their sort of architecture ultimately incompatible with running elections that aren't tainted by disinformation and misinformation?
Mike Caulfield:
All right, I'm going to just deal with the first half of that. The second half of that's like a seminar course. But I don't know if people are familiar with this term from science fiction, terraform. Has everyone heard that term? The idea of a terraforming machine is you want to go and make a planet livable for a species. So you kind of throw a box on there, and it slowly changes the environment so that it becomes suitable for habitation, right? Rivers start flowing, trees start growing, and so forth. And of course some science fiction writers take that and turn it on its head, and they imagine scenarios where an alien force comes and tries to terraform Earth for its own purposes. And of course to us, that would be a dystopian hell, right?
I bring this up because, when we talk about trajectories, I wrote a post a while back that said misinformation terraforms its environment. With the misinformation and disinformation that gets out there and is produced, we sometimes talk about playing whack-a-mole with these individual items and so forth. But what happens over time is that disinformation pushes a sort of ideology into an environment, which makes it easier to push more disinformation. And that's kind of what you're seeing now, right? You see in 2016 some early signs of this. You see a broad embrace and dissemination of this through 2020. And now you start to see it moving up into the institutional level of these things, and that's part of what the ballot trafficking post was about; we're starting to see a lot of this stuff at the institutional levels.
And of course Twitter now has this problem. It wants to preserve the speech of elected officials and whatever, but the environment's slowly been terraformed to be really suitable for these conspiracy theories. And so when we talk about trajectories, I want to be optimistic, but we're never starting from zero. Every day we let this go on, the environment becomes more and more formed around these myths, more and more formed around these practices. And we not only have to start doing things right, but we've kind of got to undo where we've gotten.
And I think from a policy standpoint, I wish that was something that people would take more seriously. I think sometimes people hear researchers say, "Well, it's not about the individual items." It's not. It's not about the individual items. It's not about the game of whack-a-mole. But at the same time, you've got to learn to both play the whack-a-mole, which is building the next step of this thing, while addressing all the harmful practices and structures that have formed because you haven't been addressing them over time. And that's a really big challenge. I haven't seen any platform take that on to date. I wouldn't be in this business if I was a pessimist. Yeah, but it's a heavy lift.
Justin Hendrix:
Your comments remind me of listening to certain platform executives crow about their efforts to remove QAnon from their platforms after allowing it to fester and thrive, and seeing, of course, the harm that that created. Almost taking credit for taking down this network after allowing it to get to that point. Allowing so many people to find purpose and connection inside the QAnon conspiracy theory and community really boggles the mind on some level.
Mike Caulfield:
And this is a great example. I know we've got to go in a minute, but this is a great example of it. Because what people say is, "Well, all the problems now are on Telegram," right? The execs will say, "Well, hey, we're doing good compared to Telegram." Well, how did all those people on Telegram meet one another, right? Did they meet one another on Telegram? No, they did not. They did not meet each other on Telegram. So by letting that fester, and letting these people make their reputations and their living off of this and create these networks, you created an environment where, yes, once you pulled some of those people off, a lot of that moved over to Telegram.
But you own that as much as Telegram does. You're the people that put that whole party together. Just because you kicked it out at the last minute doesn't mean that you're not to blame. And so there's a sort of misunderstanding about these things in that way: every piece of content forms a network around it, which then forms and promotes more pieces of content. It's a cyclical thing. You can't escape that. And if you don't start to address it at that level, and start to de-terraform this environment, we're just not going to get there.
Nora Benavidez:
When I was a litigator years ago, I saw there being a kind of episodic attention, almost chaotic attention and fervor, around elections. That's just sort of par for the course in every nation, I think. And we're seeing it in the social media context, and in the misinformation and hate context, of course. How do we bootstrap ourselves, across all of these sectors, out of that? The truth is I'm not quite sure.
Another phenomenon that is equally fascinating to me is the lame duck period, where we're all convinced we can get everything done in the lame duck period, and we can't. And so I think it's just sort of a human behavior that we stuff everything in and think, oh, well, now is the moment. We procrastinated enough. But 2024 does loom very large. And as much as 2020 was a pretty terrifying experience, especially for those of us on the ground who were monitoring and hearing from voters, 2024 is going to be all the more heavy. And to me that poses the question of what we can do in the interim, across every sector, to gird ourselves against the interventions, the interruptions, and what's ultimately the distraction from making sure more people are engaged across the board.
Justin Hendrix:
Paul.
Paul Barrett:
Okay, well, I hesitate to follow these very good summary comments, but I'll offer a couple. One is something I should have mentioned earlier in this conversation, which is what I think of as kind of the Yochai Benkler caveat. He doesn't see it as a caveat; I do. There are a lot of sources for the problems we're talking about. And some sources are more direct, in that they are both the source and the disseminator of the lies and the misleading statements. Fox News is hugely powerful. Were it not for Donald J. Trump, some measure of this would not be going on. His singular corrosiveness and the contribution that he's made to our information breakdown cannot be overestimated. And then the endless podcasts and websites and on and on and on.
So not all of this comes from social media. Justin, as you and I have written, social media is not the sole cause of our problem with hyper political polarization. It couldn't be, because we had polarization long before social media was invented. Nevertheless, social media is a central amplifier of what we're talking about, and therefore well deserves all the attention we're giving it.
And a final, final thought is that we actually cannot solve these problems without the collaboration of these companies. Because of our First Amendment understanding, the government cannot weigh in on content policies, let alone content decisions. And we don't want it to; there's wisdom to the First Amendment. And because of the complexity and ever-evolving nature of the technology, the government wouldn't be able to keep up with these problems as they change and morph from month to month and year to year. So we're stuck to some degree. And we need these companies to regulate themselves to a much greater degree, even if we get government regulation, which many of us think would be wise, as a partial, and only partial, solution. So I think there's reason for pessimism. Unlike Mike, I'm a dyed-in-the-wool pessimist, and I'm quite worried.
Justin Hendrix:
Well, we've heard three different perspectives, and certainly expertise, with a mix of optimism and pessimism. And we'll leave the listener to decide where they land on that. But Nora, Paul, Mike, thank you so much for speaking with me.