
Understanding the People Who Turn Lies Into Reality

Justin Hendrix / Sep 1, 2024

Audio of this conversation is available via your favorite podcast service.

Renée DiResta, who serves on the board of Tech Policy Press and has been an occasional contributor, is the author of Invisible Rulers: The People Who Turn Lies Into Reality, published by Hachette Book Group in June. Reviewing the book in Vox, A.W. Ohlheiser writes:

DiResta’s book is part history, part analysis, and part memoir, as it spans from pre-internet examinations of the psychology of rumor and propaganda to the biggest moments of online conspiracy and harassment from the social media era. In the end, DiResta applies what she’s learned in a decade of closely researching online disinformation, manipulation, and abuse, to her personal experience of being the target of a series of baseless accusations that, despite their lack of evidence, prompted Rep. Jim Jordan, as chair of the House subcommittee on Weaponization of the Federal Government, to launch an investigation.

I had a chance to catch up with Renée last week to discuss some of the key ideas in the book, and how she sees them playing out in the current moment, headed into the 2024 US election.

The cover of Invisible Rulers: The People Who Turn Lies into Reality, by Renée DiResta. Hachette Book Group, June 2024.

What follows is a lightly edited transcript of the discussion.

Justin Hendrix:

Renée, I was thinking, reading this book, that it's like one of those ones where I feel like I lived it in many ways, observing all these phenomena that you are writing about and theorizing about. But also, I've followed your work obviously very closely, I've been in conversation with you over the years on these things. At the end of the book, you end in a place that I feel like has been the shared project of so many lately, right? This idea that we desperately need consensus. You write, "When we face profound collective challenges, addressing issues like climate change, pandemics, and technology transforming the workforce is impossible if we remain locked in factional warfare. Something has to change."

That seems like the project, I think, many Tech Policy Press listeners have been on, thinking about the problems of social media, thinking about the intersection of tech and democracy. I want to get into what you think should change. But I thought I'd just step back for a second, because littered throughout this book there are references to thinkers, some of them from a hundred years ago, or 50, 70, 80 years ago. I want to talk about your priors first. Chomsky, Bernays, Canetti, Dewey, Lippmann, all these names that keep coming back up. Who are the thinkers that are top of mind for you, that you find yourself continuously going back to?

Renée DiResta:

It's a great question. So first I guess I should say I wanted to write a book about propaganda, and that's why the folks you're mentioning are mostly these pivotal scholars in propaganda theory over the years. And I wanted to write about propaganda because after maybe almost a decade now of looking at social media manipulation, ways that narratives spread, ways that consensus has changed, I felt that a lot of the frames that we were using were actually not great. I really started to dislike the term misinformation about five years ago, maybe around the 2020 election. And so much of what I saw in the ether as we were looking at these things over the years, starting with... I opened with the anti-vaccine movement in the 2014, 2015 timeframe, was that it wasn't misinformation so much as it was information with an agenda. And we've always had a word for that. Propaganda is a great word for that.

And also the systemic nature of propaganda, the idea that there is an outcome that you want to see in the world, a phase shift that you want to see in the world. What do you do to get there? What is the network that you build? What is the activist movement that you create? What are the... We say memes now, but what are the slogans that you come up with to make ideas sticky in people's heads? What are the facets of their identity that you appeal to? What had changed was the process of propagandizing: it had become very democratized, so that now anybody could do it, and actually everybody was.

And so I set out to write a book about propaganda. And it's funny, because when I wrote the first pitch and handed it to... I met my agent because she reached out to me after some Ribbonfarm essays I'd written, and I gave it to her and she was like, "Yeah, no, nobody wants an academic treatise on modern propaganda." So the idea of orienting it around the influencer came about over maybe a three- to six-month period of talking about it with a lot of folks and saying, "Who is really new? What is really the most obviously distinctly different thing about how public opinion is shaped today?" And I decided that I was going to really orient the book around the relationship between the influencer, the algorithm, and the crowd. So this sort of triad, this system where one can't exist without the others, and I wanted to write it around that.

Justin Hendrix:

So how does that bring in Bernays? How does it bring in Lippmann and Dewey? How does it bring in Chomsky? Chomsky in particular seems to occupy a lot of your mind.

Renée DiResta:

Lippmann and Bernays are contemporaries, so they're writing... The works of theirs that I draw on came right after World War I. Lippmann is writing about whether it is in fact the obligation of the state to help people make up their minds. And it sounds very paternalistic when you say it today; in the current zeitgeist, that sounds quite creepy. But he's arguing that people think and feel in different worlds even though they live in the same one. And he advances the idea of the stereotype. That's a term that he actually coined. And it's the idea that people have these quick heuristics, these things that they grab, and they don't have time to go deep on any particular topic. And in order to be an informed citizen participating in a democracy, the idea is you should be well-informed. And so he says it is actually the obligation of the government to convey information to the people. It's the obligation of the government to digest the opinions of experts and then to communicate to the people not how they should think, though that's how we would probably interpret it today, but what they need to know.

And he uses the phrase, this is, again, the early 1920s, the manufacture of consent. The purpose of this is to manufacture the consent of the governed. That phrase also now in the modern era has something of a sinister tone, because, as we'll talk about, Chomsky pulls it forward into the 1980s and describes a very different media ecosystem than the one that existed pre-radio and pre-television. Chomsky is taking it forward to what happens as media really becomes mass media. Whereas Lippmann is writing about it in the sort of early nascent age of some of these technologies, and one that's very much still a print media kind of environment too.

And what you see from Bernays is he also works with Lippmann doing propaganda in World War I, sort of selling the war to the American public. But what he takes out of that is that not only should the government be doing this, but also this is advertising, this is public relations. This is how, by appealing to people with sticky phrases and slogans, by appealing to them as members of a group, by really emphasizing that notion of group identity, you can actually sell people products. You can create demand for cigarettes by making it seem like everybody else around you, everybody else who's like you, the people that you admire, also want cigarettes. And so he has a number of these kinds of campaigns that he undertakes, and he writes this book Propaganda about them. And again, the word is not a pejorative yet. That doesn't really happen until World War II.

So you see him describing his capacity to shape public opinion and the term invisible rulers comes from him, and what he says is there are invisible rulers who control the destinies of millions, and our minds are made up by people we've never heard of and never seen. It's not the politician who is the person who is shaping public opinion, it's the person behind the politician, it's the person who is thinking about how to give the politician the words to say, the person running the campaign, almost, is the person who is really responsible for your association with that candidate. And I was thinking about that quite a bit in the context of influencers because what he's describing essentially is marketing. These are marketing campaigns, marketing campaigns for ideas rooted in identity. And I think so much of what we see, particularly in influencer culture online today, does do exactly that.

Again, when the influentials, as they're called, begin to emerge as social media becomes prominent, the first people who realize their potential are not the politicians at all. It is the brands. It is the brands who recognize that this woman can sure sell you some shoes. That this young mom is the perfect face for Tide. And so there is this recognition that the same means, the same tactics, the same styles, the combination of rhetoric and identity and relationship and resonance, these things are very much present in the modern social media ecosystem today.

Justin Hendrix:

The thing I'm thinking about here, the theme of course running through this book, is the battle between elites over perception, the role of individuals, and the role of social media platforms in mediating that battle. But you also make important distinctions between terms: propaganda, misinformation, rumor. That also seems fundamental to, kind of, understanding your point of view.

Renée DiResta:

Propaganda, again, is information with an agenda, designed to serve the interest of the person creating or, in this case, spreading it. And one of the things that I spend a lot of time on is the way in which that used to be a very top-down process, very much kind of controlled. And this is maybe a place to allude to Chomsky. He's very much associated with the understanding of propaganda in the modern age because he takes the idea of manufacturing consent and he finds it very disturbing, something sinister that the government is doing. The government is not informing you. The government is actively misleading you. And Chomsky of course is writing after the Vietnam War, after a number of instances where the government quite demonstrably did in fact mislead the public, and the mainstream media, that sort of top-down propaganda machine, he sees as quite complicit in that.

And the book that he writes explains the incentives of modern propaganda toward the manufacture of consent. He goes into the incentives that I think are still present today, which is again why I wanted to update that model for the modern day, for the influencer ecosystem. So he's talking about ownership: how does who controls and owns the media influence the coverage? Then advertising: advertisers have influence, not only by virtue of the fact that you don't want to critique the pharma company putting the ads in the paper, but also because the pharma company wants to reach a particular type of audience, so you're also going to create content that attracts the audience that the advertiser wants to reach.

So there's a double layer of impact that funding model has. Then there's sourcing, the way that media actually can turn people into experts simply by quoting them as experts. And so they do have relationships, particularly in the '80s, with credentialed people, with academics, with government officials, and the media does participate in taking elite opinions and ensuring that they are what is heard. They're not really necessarily going down to the random independent person. They're much more focused on the relationships they have, and then also on maintaining those relationships, so they don't criticize their sources or their sources' organizations to the extent that they perhaps should.

And then there's flak. He describes it as media not wanting to cover things that attract controversy for them. They're fine creating controversy for other things, but not for themselves. And then the other thing is media creates enemies. It has to tell a story. And one of the things that is very common in storytelling is, who do we all collectively not like? And so in the age of mass media, when Chomsky is writing about this, he's describing a media that is attempting to reach a mass audience. And I think that this is one of the really big shifts that I try to get at: the incentive structure incentivizes creating a sort of hegemonic identity, is what he calls it, a national identity and national audience. The sort of small voices are left out, and the media manufactures consent by shaping what the public sees and framing things in a particular way. That's how I would describe propaganda, and we can talk about modern propaganda in a little bit.

Misinformation, the term that I really started to strongly dislike, is information that's inadvertently wrong, and people are sharing it because they believe it. Misinformation assumes a particular intent. It assumes you're accidentally wrong. That's how it's differentiated from disinformation, where there is a deliberate campaign to spread false or misleading claims or content to make people believe something that is actively not true. So with disinformation, there's a person, or people, back there trying to manipulate the public.

Misinformation though, it became a term that was applied to things where the fact was we really didn't know if it was true or false at all. And one of the things that is very distinctly different about social media as a system, as an ecosystem, is that something is going to go viral long before the fact can actually be known. And this is a little bit different from broadcast and print media because back then, coverage would be held at least a little bit longer. There would be maybe an announcement that an event had occurred. Maybe this is what we know at this time.

You all remember you're about the same age I am, right? You turn on the nightly news as you're following some crisis, and it's like this slow drip where the reporter just has to stand there with the microphone outside of the courthouse waiting for the person to come out, and we're going to go to commercial break now, and there would be this... You would watch the television for hours in these big... I remember this on September 11th, waiting for the one new detail to come out after it had been vetted, and that's just not how it works today.

So COVID misinformation was a term that got tossed around a lot when it was really things we didn't actually know yet. And the process by which people spread information from person to person, we had a term for that too, which is rumors: unverified information spread from person to person because people are interested, and they also do it altruistically, they want to help their community, and they might do it because the thing is sensational and shocking and everybody wants to hear about the affair that the politician had with the teacher or whatever.

And so we have these words like propaganda and rumors that have been around, that have been in our collective understanding of how stories have power, and how top-down versus bottom-up stories have a different kind of power. I felt like we had lost those terms, and we kept trying to shoehorn things into this problem of facts. And after all the work I did in 2020 on the election and then 2021 on COVID, I was like, I actually want to write about why that's wrong, why that is just not the right frame to use to talk about the experiences people are having. The crisis in the information ecosystem is not a new one; it's just that this current ecosystem has shifted propaganda and rumors into something that we should still be thinking of as propaganda and rumors, while also understanding the different inputs and power centers in that system.

Justin Hendrix:

About halfway through the book, you write about 2020, you write about that moment at the beginning of COVID, you write, "My memory of that time in early 2020 is of feeling disoriented. There had never been more people shouting theories, many with MD after their name, and yet somehow it felt impossible to know what was true. Entirely distinct constellations of COVID-19 experts began to emerge, some focused on lockdown, some on school closures, some on treatments." I think everyone remembers this, right? We remember suddenly following certain doctors who seem to have some special interest in some topic that was going to help us understand what was going on in the world. And yet, I guess on the whole, it didn't all add up to consensus. Certainly quite the opposite, right?

Renée DiResta:

Right.

Justin Hendrix:

What does COVID tell us about this interplay between elites' knowledge and information and the current media and information ecosystem we've got?

Renée DiResta:

See, I think elites is also a strange word, and a word that I think is... It's used as a pejorative now to refer to people who the "old media ecosystem," the top-down media ecosystem, would've considered a source, just to continue with that sort of Chomsky framing. I used to ask this question periodically in fights on Twitter with populists: "What do you mean when you say elite? What is an elite?" You've probably seen the idea that an elite is a person who holds approved left-wing opinions; that was how it was reframed. But we used to consider doctors to be quite elite, right? You go through a very extensive period of training, you have a very formal degree after your name, you're a credentialed expert, unambiguously credentialed. You have the MD, right?

And yet this sort of trend toward populism turned that word into something that I felt became increasingly... It meant nothing. It was just like, "Oh, those people over there that we don't like," particularly as it was repeatedly used by very influential figures, influencers, who had millions of followers, reach far beyond the average mass media figure, the average random reporter at the Washington Post, compared to some of the quite influential people who had built up a profile as themselves.

And so this question to me, what do we mean when we say elite and media? It really just became a coded language for whether or not you were an institutionalist or a populist. It ceased to be a term that had any actual meaning, or meaning that we could point to and say, "Okay, here's the criteria for being an elite today." It used to be that you had that kind of reach, that kind of megaphone, that kind of following, but now we're pretending that that's not true, that those people aren't elites. They're just people with massive powerful followings who are extremely rich.

The thing with COVID though was that it reflected the... what I describe in the book as the split into niches. COVID was... just truly phenomenal to see the extent to which your experience of what was happening was shaped by who you followed as you sat in your house doom-scrolling on your phone. And also this notion of who you trusted. And it started very early on with questions like should you wear masks, in early January, February of 2020. Initially the CDC was saying no. The health officials were, you might remember this, they were like, "No." You had people like Zeynep Tufekci writing about why the answer was yes, actually. You had online influencers, people who... data scientists with Twitter accounts saying, "No, you should be." I remember Andreessen had a sign on their door, "No handshakes, wear a mask." These sorts of things that got actually ridiculed in mainstream media early on. It was seen as very paranoid before that flipped. And then ironically, those people then came to hate masks.

So there were just all these big alignments that happened, not so much because there were demonstrable shifts in facts, but it started to feel like a lot of what was happening during COVID was that the facts weren't necessarily making it out, the institutions were not communicating. The CDC was not out there every day on Twitter or wherever else putting out the latest information. They were very reticent. They were waiting until they were sure. And so the voids were being filled both by people who attacked the CDC for being slow, for not entering the modern era, which I think is a very reasonable critique, and by people offering all sorts of crazy ideas. Remember, take colloidal silver, it'll keep you from getting it. High doses of vitamin C, that will save you. And these things that were random, but they would go viral because somebody was giving you something that you could do.

And that really became a very interesting thing to experience. I spent time in both the center-left and center-right kind of social communities and ecosystems, and they were not disaligned. There was more kind of vitriol towards the institutions early on in the center-right media ecosystems. But you did start to see the sort of drifting apart as so much of COVID became about identity. And that was even reflected in the vaccine numbers of course later on, this idea of not trusting the institutions, not trusting the vaccines. It really continues to snowball, and you wind up with these two parallel universes where people are very upset about completely different things, and actually quite contradictory things a lot of the time.

Even between the center left and the progressive left, you start to see some pretty big divides around trade-offs related to school opening. It just winds up being a very fragmented environment in which you don't get the sense that we're actually going to come to any agreement on this. We're just going to move apart, and different identity-based silos are going to take action in accordance with what their community is doing. And people also don't want to speak up. This I think becomes another thing that you see during COVID. It really does continue to solidify some of the dynamics where you'll be viciously attacked if you deviate from the orthodoxy of the group that you spend your time in, in part because the stakes seem so high. So that was my experience of COVID. I imagine yours was pretty similar.

Justin Hendrix:

I keep thinking, while you're talking, about that famous tweet from the World Health Organization from... I think it was even in March 2020, or early March 2020, that COVID is not an airborne disease.

Renée DiResta:

Yeah.

Justin Hendrix:

Something along those lines. It's almost perhaps maybe precisely that. But I guess that's part of the other issue too, is that science just simply doesn't move... Often, knowledge-producing institutions, these entities that are epistemic backstops or meant to be, they just don't move at the pace that this conversation is happening.

Renée DiResta:

But what's so interesting about it though is that it becomes such a function of identity. So let's take hydroxychloroquine. I was paying attention to hydroxychloroquine because we had a project... So we recycled the name. We called our project The Virality Project, but it was originally a study of state actor propaganda about the vaccines. Each of the eight of us at SIO, the Stanford Internet Observatory, had picked a country, and we were following narratives like what was happening in Nigeria, what was happening in Germany, what was happening in, I think, Saudi Arabia, the United States, Russia was one of them, oh, and China, duh. And so we had this project looking at how state actors were messaging the pandemic, because China was really out there, a full-on, full-bore propaganda machine, on it beginning in late 2019 as they began to realize that there was a crisis. And I wrote a couple papers detailing what they did, so I was focusing a lot on China.

But what was interesting about hydroxychloroquine is that it emerges originally in Southeast Asian Facebook posts as people are... physicians and others are saying, "We have this thing. It looks like it might be effective." And remember, they've been hit by COVID earlier. So you actually see the conversation move around the world, as hydroxychloroquine is also something that folks in Africa are quite familiar with. It's an anti-malarial drug. And so you see people in Southeast Asia and in Africa speculating about whether hydroxychloroquine will work long before it makes it to the US, where it goes viral in March of 2020 when Trump talks about it, and then all of a sudden people are talking about hydroxychloroquine. And then all of a sudden it becomes almost like an identity marker, like hydroxychloroquine is the thing that you should take, but you also see a reflexive pushback against it, because, well, this is a thing that Trump is promoting, so obviously it must be bullshit.

And there's the sort of politicization of basic things like this. Ivermectin of course becomes the next one, and ivermectin does not really move around the world in the way that hydroxychloroquine did. But what you also see in the hydroxychloroquine conversation, interestingly, is that in other places, when it's shown not to be particularly useful, it just drops, people just stop talking about it. Whereas in the US, people keep talking about it, because now it's this culture war, identity-based thing about whether you believe in hydroxychloroquine or not, as if belief is something that is divorced from fact and research. And yes, well, you might say, this is very interesting, this is very promising.

When the science does the work and finds that it is not effective in the way that people had hoped it would be, you would expect to see what happens in Southeast Asia and other parts of the world, where they just stopped talking about it. But that is not what happens in the US. And so you see this bizarre dynamic where even cures become part of the culture war. And as we move into what The Virality Project came to be known for publicly, which was the study of vaccine narratives, you see that same kind of thing happening. You see the identity-based kind of culture warriors in the US to a degree that is just not represented in a lot of other parts of the world.

Justin Hendrix:

Anybody who reads this book, I don't think that they're going to come away from it thinking that you are necessarily a fan of the current corporate social media environment that we have, or that you're necessarily invested in any particular technological outcome. You do show some enthusiasm for decentralization and some of these approaches like middleware, and possibly rejiggering the entire power dynamic in terms of how things like content moderation work, or who's made responsible for moderation in those types of systems. You have different types of suggestions for the social media companies. A lot of those will sound very familiar to Tech Policy Press listeners, things like making it possible for these things to be studied, thinking through the dynamics of what government regulators can potentially do.

But I'm going to zoom out, and I may slightly caricature where you've ended up on this, so you can correct me if I'm wrong. But it seems to me that you see the role of social media platforms and their trust and safety content moderation teams and policies as playing a role, but you're not terribly invested in the idea that that's going to be the solution. It seems to me that you're saying government has a role, but there are all sorts of limitations on what government can do, and for good reason, so we have to limit that a bit. It seems to me what you're calling for, somewhat, is for people and institutions to change, even though you're also saying human nature is fundamentally not going to change, so that's not going to change.

You have this idea of reset norms. I'm always interested in the norms. What if there was a norm that if I mislead someone, I have to walk into Times Square and rend my garments and beg the public's forgiveness? You could imagine a different society, right?

Renée DiResta:

Yeah.

Justin Hendrix:

But I don't know, that's a caricature of the general thrust of things. Do you ultimately come down to this is about institutions, this is about... we've tried to define the term elites, but is this about elites changing the way they behave?

Renée DiResta:

It's such a great question. I'm actually really glad you asked, because I always have this problem where I feel like there are people who have very deeply held beliefs that there is this one thing that, if it happened, things would get better, and I am not a person who believes that. I really believe you have a complex system here, and I mean that in a strict academic sense of the term, where anytime you change one facet of it, you're going to have downstream effects. This is how we designed SIO: to think about it in these terms. So much of the work we did was we said, "When this technology emerges, what are the cascading effects? What are the implications this will have?" For example, what will generative AI do to trust and safety? What will it do to disinformation? What will it do to spam and scams?

Anytime you transform the technological system, particularly a system that involves user participation, where the norms of online interaction are shaped in large part by the affordances that we're given, the things that platforms let us do, because they do ultimately control those structures, the things that they let us do from a design standpoint shape the kind of outputs that you get. Tobias Rose-Stockwell writes a lot about this, and I like his thinking on it, because it is a recognition that users participate in this process. We've talked about the influencers in propaganda, but the crowd, us, engaged in these online spaces, particularly highly homogenous, visible ones where people are really orienting around, here's my rose-in-bio crew, my MAGA crew, the K-Hive with the coconuts now, whichever kind of faction you're part of, you're there. You're given tools to get a message out, and we're all using them, but everybody's in competition with each other also.

So when you change design, you are shaping how people behave, and more importantly, you're shaping the content they create. This is like the media theory 101 element of it. When Facebook decides to shift the kind of content that it puts into your feed, I use an example of the Watch tab: first it's really heavily boosting this kind of comedic content creator, then it realizes that he's moved into producing spam videos, honestly, and then Facebook shifts what it's promoting, and you see his engagement go virtually to zero. So he then goes and tries to play again with, "Okay, what will the algorithm reward now?" So you have people designing content and messages for machines and for humans. And when you think about how you change a system, policy, education, and design are the three levers that I like to use. But when you're thinking about the design components of it, the design is going to change how people behave towards each other. It's going to change the content people create. More importantly, it's going to change what the platforms amplify and curate.

There's work being done right now on bridging algorithms. I know TPP has covered this quite a lot. And if instead of curating sensational content, you curate bridging-type content, you're going to have a different set of norms and a different set of content created, because people are responding to a shift in the incentive system. This was the whole reason I wanted to connect the book to Chomsky: Manufacturing Consent was a book about incentives. If you think about what the incentives in this system are and how you change them, design, I think, is actually the most interesting and, more importantly, most immediate lever. So that's why, even as I am very supportive of and have advocated for regulation as a tool to manage unaccountable private power, which is what a centralized large tech platform is, particularly when it's owned by one guy, regulation can only do so much. How long has it been since... We've been at this now... You and I have known each other since 2017.

What major bills have you seen the US Congress pass? What major regulation has the US government gone for to actually put even the slightest crimp in unaccountable private power? The answer is nothing, and the answer is nothing because they are polarized and they're responding to incentives from their bases, who sit there on Twitter all day long fighting with each other, and who have very deep opinions now about tech policy because it's framed as either safety or censorship. That is the environment that we're in here. No, I don't think that we're going to get the kind of regulation in the US that people hope will make magical changes. I think there are certain areas like transparency that provide a baseline understanding that enables harms to be discovered and assessed, and I think that's something we should be looking at. I think there are child safety laws that desperately need to pass. I think there are NCII (non-consensual intimate imagery) laws like the DEFIANCE Act that I'm largely supportive of. There are a few things where I think regulation can fix something that is quite tangible.

But a lot of what people are upset about, with propaganda, for example, or with misinformation or disinformation on the internet, is expressive speech, and I don't think the government should be interfering in that, and I don't think the government has a role to play in the day-to-day adjudication of content moderation. So this is why I spend so much time in the book saying, "You can do this and this." My preference is to lean into the design route. This is why I spend time talking about decentralization and middleware, because they're not perfect. They're not without trade-offs. There are some really big, frankly, in my opinion, bad shifts that will happen in a very decentralized world as far as content moderation goes, as far as who is responsible for some pretty egregious stuff, or not responsible, because no one will be.

But at the same time, I think that it is possibly a less bad experience from an unaccountable private power standpoint, because it is distributed, it is decentralized, and people have a little bit more agency. I feel like I'll stop there because it's a long enough rant, but I'd be very happy to have one bill to point to where I'm like, "This is the bill, this is the way." I just don't think that it exists.

Justin Hendrix:

I do want to just press you on the behavior of institutions and the behavior-

Renée DiResta:

Oh, yeah, yeah, yeah.

Justin Hendrix:

... of epistemic...

Renée DiResta:

Totally.

Justin Hendrix:

... these epistemic backstops that you've just come out of one of these institutions, and-

Renée DiResta:

One way to put it.

Justin Hendrix:

That's one way to put it. On the other hand, I think that your critique in the book is of a broader set of these institutions, whether it's universities or agencies or any other type of entity. It's almost like you're begging them to recognize the situation we're in and at least play by the current rules of the game.

Renée DiResta:

Yeah. So of the policy, education, and design levers, I throw that into education, really. The institutions, they do not cover themselves in glory. There are some things that are technological problems, and there are some things that are social problems that happen to be reflected in what we think of as technological problems, because how they manifest these days is often on social media, where they're most visible. But so much of the work that I did in 2021, like The Virality Project, even though it was miscast by politically motivated lunatics as some sort of vast censorship cabal, was actually a project to try to surface, "Hey, these are the most viral vaccine narratives of the week." And we posted the weekly ones as a PDF on our website. They were there. Every week it went up, and we sent it out in an email list that anyone could subscribe to. And a lot of the subscribers were people who were in public health, various state and local public health officials, and then of course people at the CDC and HHS and the Office of the Surgeon General.

And the point of it, as I saw the project, was to give them the things that they needed to respond to. It was not a project to say social media needs to take this down, and social media didn't take it down for the most part, just to be clear. But what they were not doing was being out there in the conversation. There were isolated groups of doctors, and I wound up writing a paper in Annals of Internal Medicine with this group called This Is Our Shot, because this is an academic endeavor ultimately, and the question was: here are these frontline physicians. Social media communication is not their job, but they too see the failure of the institutions to communicate. And they're saying, "Hey, we are willing and we are passionate about this. We want to reduce hesitancy by putting out messages as doctors who are still largely trusted," even in the Pew ratings or whatever. But what are we responding to?

Because if you're responding to what [inaudible 00:36:49] in your Facebook feed, and you're a physician following your friends, you're not going to see stuff about the vaccines being magnetized, or snake venom, or any of the wild theories that began to go around, or even the more realistic, and therefore more troubling to people, stories of side effects and things like this. And so the doctors wanted to contextualize this, and they didn't have a way to do it, because they were not getting this kind of assessment from the institutions. And so this question of how do you communicate to people who are credible voices, how do you put them into that conversation? Twitter employees at the time, this was pre-Musk, realized that they needed to be helping that however they could too, and you saw them doing that by credentialing doctors, by going and giving blue checks to frontline physicians so that people could at least know that doctor was who they said they were. This was before you could buy your blue check.

And so this was, again, a model of how do we elevate, or how do we bring in, people who have some knowledge, because there is such a thing as expertise. I know that the populists don't like that, but it's true. And so you have that model of how do you get them in there communicating in the modern era. And as you note, we did this with election officials also. It was the same thing. They're trying to run an election. You're seeing New York Times articles about this now. They're trying to run an election, and instead they're fielding FOIA requests from people who are convinced that there are Chinese bamboo fibers showing that the ballots came from China. They're still getting FOIA requests with this stuff. And as they're trying to run an actual election, there's nobody there saying, "Here are the things that are actually starting to go viral that you actually should be responding to."

So you need to have a way to pull those institutions in and to force them to communicate in real time, to the greatest extent possible, with the American public. I don't think that this should be a scandalous or wild thing to say. You can be communicating in real time saying, "Hey, we don't actually know the answer yet. Here's what we think is happening. That might change in 24 hours, but for right now, here's our best understanding of the facts." It's not that hard to do that. You just have to actually commit and prioritize doing it, as opposed to deciding that weekly, you're going to hold a press conference and speak to mainstream media journalists, and they're going to be the sort of arbiters of factuality. So no, when you alluded to my own institution, I felt like this too for us. I felt like when the rumors and conspiracy theories about our project started, we should have been out there saying, "Oh, come on, here's the facts, here's the truth, here's this, here's that," and showing people how the sausage is made, how the conspiracy theories operate.

I find people are very responsive when you do something like you get a bad-faith email from a journalist, and you post that email and you show people, here is how the story is going to be told. The story is already written. This is how this works. Please understand, this is the dynamic that's happening here. I don't know that you're going to convince, quote-unquote, the other side. You recognize at some point that smear campaigns are what they are, but if you're saying nothing, other people tell the story about you. Other people fill the void. And I just think that institutions are so bad at this today. And so yeah, the book was like a how do you drag them kicking and screaming into this era, or at least make them think differently about it?

Justin Hendrix:

It's Thursday, August 22nd that I'm speaking to you, the final day of the Democratic National Convention. I've been struck, listening to some of the rhetoric at the convention, by the attempts to undermine the strategy of false claims about the election, or false claims that emanate specifically from former President Donald Trump. And it's almost as if they've adopted some of the approach that you're talking about here. I don't know if you agree with that, or if there's something you've observed about the current moment we're in that you think corresponds to some of the frameworks that you've got in this book.

Renée DiResta:

So I've been fascinated by watching the strategy that they're running here, the comms strategy that they're running here, in part because it is very much meme-focused... The vibe shift happened in large part because there is this new energy, the campaign is very plugged in to the people who shape culture, and they're able to bring that in, and it feels fresh, it feels new. There's some curiosity from people who are maybe older and not hanging out on TikTok or Threads or wherever. Threads is old too. But you are seeing this extremely online, influencer-driven way of treating some of these things.

And one thing that I keep thinking about, going back in history, is there's this book by Saul Alinsky, I'm pretty sure it's from the early 1970s, called Rules for Radicals. He was a socialist leftist type figure, a labor organizer. And as he's writing this book, Rules for Radicals, it's about how do you actually stand up to institutional elites, what are the ways in which you attack them effectively? And what's been interesting about the campaign is they have positioned Trump as this legacy institutional figure. Oh, he was the president, he's not a... But when he ran in 2016, he was very much in that insurgent position. Now they have reframed him, interestingly, into this, "This is the old way. We're not going back," and it's dismissive.

But one thing that Alinsky talks about is the sort of psychology of how you conduct attacks like this, particularly on these figures. And he has this line that ridicule is man's most potent weapon. I was thinking about that nonstop as the JD Vance stuff started, in part because it is very hard to counter. It puts people on the back foot. They don't know how to respond. They don't know what to do. The couch thing was a masterful example, just literally something that some rando on the internet said. People found it funny. They were in on the joke immediately. And there's this other thing that Alinsky says, that a good tactic is one your people enjoy.

And it is a recognition that, for better or worse, that sort of trolling culture is online culture now. It is not just the purview of the right. The left realizes that it is going to use it and participate and go full bore at it also. And so I think watching that shift of the JD Vance memes being largely a function of... Nobody was misinformed. Nobody thought a man actually did what the tweets said he did. But they were in on it, they were participating, they were engaged, they found it fun, they found it funny. I think that people who are more traditional institutionalists, maybe not extremely online, I think my parents and stuff, found it very crude and tasteless, and like, "Why are they doing this?" So you do see that sort of cultural divide between people who see norms of good behavior in one way, versus the kinds of things that you're seeing in this different type of very participatory culture, where they're reshaping the frames immediately now around "those people are weird."

And it's a very hard thing to respond to. "No, I'm not weird. You are." Then you sound like you're on a playground, right? I was really curious to see if he would respond by getting in on the joke, which is what I think would've been the approach, but he didn't. Instead he said nothing, creating again a void whereby you had John Oliver saying he didn't deny it. So it was just an interesting, funny shift to see, because again, it feels like a generational shift has just happened.

Justin Hendrix:

I recommend this book to my readers, not just, Renée, because I've known you now, as you say, for some years, and feel like I've been exposed to a lot of the thinking and certainly followed very closely the events that went into it. The other thing is, I know that this book didn't just come out of an academic experience. This has been a rough and tumble few years for you, as you have both been involved in studying, and also been the subject of, many of the phenomena that you have covered in this book. I remain impressed that you've been able to pull off this book in the context of all of that happening, and I appreciate you speaking to me today, of course, but also your helping me and others to understand these issues over these years.

Renée DiResta:

Thank you. Appreciate that.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
