
Documenting the Assault on Disinformation and Hate Speech Research

Justin Hendrix / Nov 24, 2024

Audio of this conversation is available via your favorite podcast service.

During his recent campaign, President-elect Donald Trump made various promises consistent with the ongoing effort by Elon Musk and MAGA Republicans to target researchers and civil society groups that study issues such as propaganda and mis- and disinformation.

Today's guest has looked deeply at this effort, conducting an analysis of over 1800 pages of primary documents to identify the strategic approaches employed by these parties, including the House Judiciary Select Subcommittee on the Weaponization of the Federal Government, and the outcomes and broader democratic implications of their campaign. Philip M. Napoli is the James R. Shepley Professor of Public Policy, the Director of the DeWitt Wallace Center for Media & Democracy, and Senior Associate Dean for Faculty and Research for the Sanford School at Duke University. His findings are published in a new paper in The Information Society titled "In pursuit of ignorance: The institutional assault on disinformation and hate speech research."

What follows is a lightly edited transcript of the discussion.

Justin Hendrix:

Good morning. I'm Justin Hendrix, editor of Tech Policy Press, a nonprofit media venture intended to provoke new ideas, debate and discussion at the intersection of technology and democracy. During his recent campaign, President-elect Donald Trump made various promises consistent with the ongoing effort by Elon Musk and MAGA Republicans to target researchers and civil society groups that study issues such as propaganda and misinformation. Today's guest has looked deeply at this effort, conducting an analysis of over 1800 pages of primary documents to identify the strategic approaches employed by groups such as the House Judiciary Select Subcommittee on the Weaponization of the Federal Government, and the outcomes and broader democratic implications of their campaign.

Philip Napoli:

My name is Phil Napoli. I'm the James R. Shepley Professor of Public Policy at Duke University, where I'm also the director of the DeWitt Wallace Center for Media & Democracy.

Justin Hendrix:

Can you tell me just a little bit about your research interests and what you get up to at the center?

Philip Napoli:

Sure. So at the DeWitt Wallace Center, we do research. We also do different kinds of public engagement around a whole host of issues that fall under the media and democracy umbrella. So we do a lot of work on fact-checking, and over the years we've worked with the platforms on figuring out how best to integrate fact-checking into their content curation. We do a lot of work on the crisis in local journalism and how we can understand when and how and why news deserts develop, and what can be done about them from a policy standpoint.

But then we also do work on platform governance and the role that digital platforms should be playing in the public sphere, and whether there are any policy interventions there that might make sense. But a side interest of mine has always also been, broadly, the politics of research. How do our political dynamics affect how research that guides policy is conducted, in terms of who conducts the research? What are the questions they're able to ask? How is the research used in decision-making? I think that's a really understudied but important area of inquiry.

Justin Hendrix:

And of course that's what we're here to talk about today. I invited you after reading a paper you wrote in a journal called The Information Society, titled 'In pursuit of ignorance: The institutional assault on disinformation and hate speech research.' We've talked about these issues on multiple occasions on this podcast in the past. Our contributors at Tech Policy Press have addressed it in various ways. We've covered some of both the litigation and the other kinds of consequences of this institutional assault, as you call it. But one of the things that interested me about this paper is that you situate it in a longer recent history of the ways in which media research has become politically charged. Can you talk about that trajectory, and where you feel this history starts most usefully?

Philip Napoli:

Media research in particular lends itself to being politically charged because we're obviously talking about the realm of speech; we're talking about the democratic process. If you think about it, some of the origins of media research, particularly in the US, go back a lot to what we witnessed in Nazi Germany and the rise of political propaganda there, and then to what could be learned about the nature of Cold War propaganda and how to, as they used the term, immunize people from Soviet propaganda. But as we move into more recent times, consider things like media ownership. There was a time in the early 2000s when media ownership concentration was a very highly contentious policy issue, and the FCC took what, it's ironic now, but at the time was seen as a very bold step, which was: we're going to commission a series of studies to guide our decision-making.

And that was a change of pace from the traditional model of letting whoever wanted to participate in the process submit studies, which the FCC would sift through and decide which ones had the most merit. The dynamics around how the FCC commissioned that research, who was commissioned to do it, who had access to the data afterwards to make sure the findings were replicable, all of these became surprisingly contentious, or maybe not surprisingly contentious, issues at that time. And then, more recently, my most recent experience with this was in the early 2010s, when the FCC started inquiring into what they called the future of media. What is the future of local journalism, and what do we need to know about it so that the FCC can potentially ensure that communities' information needs are being met?

And we did research on that topic for the FCC, and we didn't know at the time that it was going to mean that we would start getting hate mail and people would try to get us fired from our academic jobs. We think about the politicization of so many different areas of policy research as a more recent phenomenon, but it got ugly even back then. And the idea there is that any kind of research in this space is a slippery slope to government regulation. That's something we hear a lot in this space. And of course the notion of research in this space then becomes intertwined with First Amendment freedoms. So yeah, it's been a fraught sort of space for quite some time.

Justin Hendrix:

Of course, that brings us to the more recent cycle, which you point to really picking up in the wake of the 2016 presidential election, when the platforms faced all kinds of scrutiny and critique about the ways that they facilitate misinformation and disinformation and various other kinds of online harms. You point out that the body of research on these phenomena has consistently identified a partisan asymmetry in the production and dissemination of mis- and disinformation, and the various politics around that make it incredibly fraught. You introduce this idea of agnotology, the social construction of ignorance. For my listener who may not recognize that word, and I must admit I didn't immediately bring the definition to the fore of my mind either, what is this concept and why is it important to understanding these phenomena?

Philip Napoli:

Sure. I think probably the best thing I did was follow someone's advice and not include that word in the title of the paper; it would have bored people instantly. But yeah, this is a field of study that has looked at the social and political conditions that can contribute to the creation or the maintenance of ignorance. Probably where it's been most pronounced is the fairly substantial body of research that explored, for example, the way that the tobacco industry strategically approached creating ambiguity and uncertainty about the health effects of smoking. They really approached the cultivation of ignorance as an intentional strategy.

So it meant doing things like attacking any research, of course, that demonstrated negative health effects of smoking, and conducting their own research that was often incredibly methodologically flawed but that would lead to different outcomes. In some cases, it meant lobbying in such a way as to make sure that government funding for research in this space was cut off; that's actually something we saw more in the realm of gun violence. So it's this idea that certain political interests can see the maintenance or the creation of uncertainty and ignorance around certain issues as an important part of a larger political strategy.

Justin Hendrix:

You write that the key point here is that ignorance need not be the result of passivity or inaction; it can be the result of concerted actions intended to maintain or cultivate ignorance. And then you go on to look at a corpus of documents that ties together what's been going on of late. You look at congressional investigations and the role of academic and nonprofit researchers. You look at state attorney general investigations and lawsuits. You look at advocacy organization lawsuits and, finally, platform efforts. So a lot of material. How did you go about pulling this corpus together, and how would you describe its contents?

Philip Napoli:

What interested me, importantly, was that this particular issue of the politics of research in this case fell under a larger umbrella concern about whether there was an improper level of engagement between government officials and digital platforms in particular. And the question here was whether or not the research being conducted in this space was essentially, in and of itself, an element of an improper level of government intervention into the speech rights of individuals and platforms. And what was striking to me was how I was watching this take shape across so many different contexts. Some of these things are obviously very high profile. The Weaponization Subcommittee that you mentioned before was holding a lot of hearings on this topic, and that generated a lot of material that I wanted to make sure I had thoroughly gathered. But we were also seeing these actions in a couple of states, where once again, what was interesting was that to some degree the targets were the researchers themselves: the idea that their actions, that just conducting research, represented essentially a violation of other stakeholders' speech rights.

We've seen some advocacy organizations also get involved in this space, and so there again, their lawsuits and the arguments that they were making were something I wanted to make sure I was able to gather. And then some platforms have been particularly aggressive in this space, X in particular, suing a number of nonprofits under the logic that the research they were conducting, demonstrating the prevalence of hate speech on the platform, was actually fraudulent and intended to undermine the platform's business model by scaring away advertisers. So I was just struck by how we were seeing these issues play out in so many different institutional contexts. It was a fairly substantial data-gathering enterprise; I pulled together over 1800 pages of documents when all was said and done, and I have no doubt that plenty of other important stuff probably slipped through the cracks.

Justin Hendrix:

To what extent does your look at this corpus suggest a degree of coordination among the various parties? I don't know quite how to describe them: are they the antagonists in this case, or perhaps the protagonists, depending on which way you look at it? The various parties from the committee, on through to those corporate actors, on through to the quote-unquote journalists who have engaged in a lot of the work around this. How coordinated do you think this institutional assault is?

Philip Napoli:

We have a few examples of there being coordination. In some cases, for example, folks who were testifying in the Weaponization of the Federal Government Subcommittee hearings were also folks who were engaged at the state level in some of those state-level lawsuits and information-seeking inquiries. We know that Elon Musk and X were involved in the Twitter Files, which provided a lot of the raw material that facilitated the work of the Weaponization Subcommittee. So we see some of this in terms of who testifies, who's participating in the writing of legal briefs, who's filing friend-of-the-court briefs, things of that sort. So there seems to be a fair bit of interaction in a lot of these areas.

Justin Hendrix:

It seems to me that the coordination and collaboration on this appears to be only growing, particularly as the House weaponization committee has indicated that its work will certainly continue into the next Congress. We see other efforts from folks like Brendan Carr, who will be chairman of the FCC, to get involved on these issues. And of course Elon Musk continues to pursue his interests through X and in other contexts as well. Is there anything from the work that you've done so far that points to what we can expect in the future with regard to this kind of collaboration?

Philip Napoli:

What I expect in the future, unfortunately, is for us to see more evidence of it having an effect. Some research enterprises have shut their doors. I know from conversations I've had with funders that some are starting to feel skittish about funding work in this area. It's certainly already happening, but we can imagine it becoming much more pronounced. I think we should expect to see any federal funding that is even in the vicinity of this kind of work drying up completely. And again, snuffing out the research is part of a much larger political strategy here. One goal is, of course, to limit the extent to which the federal government even has information that might inform the work of platforms, were they so inclined to act on it. But the other is to more broadly create a political environment where knowledge creation around what is and what is not false is a lot more anemic than it currently is.

Justin Hendrix:

So I don't want to ask you to necessarily call the match or call the game as it were, but is it your estimation that essentially this coordinated effort has worked? Have these various parties accomplished their goal?

Philip Napoli:

I don't know what absolute success looks like, but it's had an effect, and that in and of itself is unfortunate. We're seeing researchers also be a bit more hesitant to move into this space. But one of the things I saw back in 2012 when I was working with the FCC was the FCC basically saying, "Wow, this is a bit more of a hot-button political topic than we are in a position to engage with." Folks in leadership at that time reached out to some foundations and said, "Hey, can you guys pick up the ball and run with it?" So I think this is similarly an interesting time, in that it's a test of philanthropy: will they maintain a commitment here?

Universities, especially private universities, are reasonably well insulated, so you'd like to think that they would also continue to support the work of faculty and researchers who want to continue working in this space. I see this as very similar to those moments when the press really says, "All right, this is the time for us to step up and make sure, if ever, that we're doing our job as well as possible." I think the exact same applies right now for folks who are doing misinformation research, because we are entering a time again where, unfortunately, it seems safer to assume that our own government will be lying to its people more than we are even accustomed to.

Justin Hendrix:

It seems to me that it's a real open question whether universities will have the appetite to fight in this cycle. Most universities, of course, are recipients of federal research dollars, in some cases very substantial to their operations. And one of the things being threatened at this point is essentially the retraction of any funding, whether it has anything to do with disinformation or not, if a university is engaged in this type of research. This could potentially leave a lot of folks out in the cold.

Philip Napoli:

And that is sort of reflective of what we were talking about with coordination before. One aspect of the strategy that we saw very consistently across all of these different institutional contexts is taking the very idea of disinformation and trying to discredit it in the most basic way. In a lot of these documents, the word disinformation gets put in quotes, or disinformation researchers are described as pseudo-experts, creating a political environment where even the very notion of identifying anything as disinformation is seen as hostile to the First Amendment. So I think that's a key part of what we're going to continue to see going forward.

Justin Hendrix:

As far as the platforms are concerned, you've already mentioned that the platforms have changed some of their behaviors. Most notably, we saw in this election cycle a sort of retraction from trust and safety activities. There's also the phenomenon of restricting data access; that's something we've talked about with great frequency and in great depth on Tech Policy Press. Does it seem to you that the platforms, that the refs there, have effectively been worked as well?

Philip Napoli:

Oh, I think that's absolutely the case. And as you mentioned, with Brendan Carr and some of his ideas as they relate to Project 2025, we may see a really concerted effort going forward to re-conceptualize what kind of rights platforms even have to engage in content moderation, to the limited extent that they currently engage in it. I think the notion of these platforms as common carriers could resurface and perhaps be embraced in some consequential legal decisions going forward. To be honest, part of me wonders at this point whether any of the platforms would even care. I think what we saw over the past few years was the platforms throwing up their hands in some cases and saying, "We're damned if we do, we're damned if we don't. Why are we even bothering?"

God knows some of them are not in the position, or they would say they're not in the position, to dedicate the same kind of financial resources to it that they did before. So it does seem like we might be approaching a bit of a perfect storm in which, even if the platforms were being provided with research that's useful to them in making content moderation decisions, they wouldn't be in a position to pay any attention to it.

Justin Hendrix:

You suggest it's important to document the efforts against this disinformation research, to look at evidence of coordination and collaboration, to look at the types of organizations that are engaged in this activity, and to try to understand how they operate. Are there other suggestions that you'd make to anyone who might be listening to this who's part of this community of researchers, or who's concerned about this body of research and whether or not it continues?

Philip Napoli:

It would be really troubling. I think most important at this point might be deepening our understanding not just of what's being produced and circulated, but, as we're starting to do, learning a lot more about the types of people who are and who are not susceptible, the types of people who are compelled to share it versus those who are not. Because if the future is one in which the platforms aren't really going to be engaging in this kind of content moderation, the path going forward might be one we've heard a lot about in recent years: digital literacy, and trying to make sure folks are educated to be as resistant to, and as divorced from, the act of circulating and amplifying online disinformation as possible. That may be a huge ask, but I think we are still in the dark, maybe because there is no answer, as to what the effective strategies are for taking those folks who are very susceptible to misinformation and improving their ability to be resistant.

Justin Hendrix:

So let me ask you a slightly speculative question, or maybe a question that asks you to speculate. One of the things that strikes me in looking at all of these phenomena is that institutions that would normally defend the quote-unquote truth, or defend fact-finding, or defend research as it were, appear to be operating in a very different space than many of the individuals who advance claims against disinformation researchers and against the institutions that harbor them. Is part of this that we're essentially fighting an unfair fight? You've got this legion of Substackers and armies of folks on social media who grab onto various claims, whether they're true or not, and advance them. It doesn't seem to matter if stories are true or false. It doesn't seem to matter if someone makes serious errors of fact. They continue to be called to Capitol Hill to testify as experts. They continue to be lionized in this community as truth-tellers. The institutions that, for the most part, don't engage in Twitter wars, don't engage in the Substack universe, and rely very much on traditional media: how should they behave in this environment?

Philip Napoli:

This is actually a project we're working on now. There are so many different categories of institutions that essentially don't receive even a fraction of the criticism that they should about the decisions they make, and that utilize only a fraction of the authority they have to be more meaningful contributors to a healthier news and information ecosystem. Take an example like cable systems. I find it fascinating that in the wake of a nearly $1 billion settlement in which the Fox News Network was found to be intentionally disseminating disinformation, not one cable system said, "You know what? This is not the product we signed on for, and we're not going to carry it anymore." That seems like a reasonable reaction. And for months we were tracking every major cable and multichannel video programming distributor in the country to see if any of them might even issue a statement on this.

Nothing. Or take the Federal Election Commission. The Federal Election Commission has the authority to distinguish between news and political influence operations; that's what we call the press exemption, and it's something they apply very rarely. So they have abdicated a lot of their authority and responsibility in this space. You could go to other contexts, like press associations and who is and who is not granted membership. And then there are the platforms themselves, of course. That's been one of the most unfortunate narratives of the past decade and a half: they never really engaged in much meaningful distinction between who is providing true and accurate information and who is disseminating falsity, and the ease with which one can create a completely fake news organization on any platform continues.

And so I think there's this larger institutional responsibility, as you point out, that no one has ever really called into question as much as they should. And advertisers: we're seeing advertisers now apparently returning to X in large numbers, and advertisers and the decisions they make about where to advertise could be profoundly influential in helping to address some of these issues. But in none of these spaces are we seeing the kind of institutional social responsibility that I think we're entitled to ask for.

Justin Hendrix:

That might point very much to just the politics of the moment, where we are as a society. In looking at all of these documents, in thinking about these issues closely, and in considering the evidence put forward by the Weaponization Committee, by all of these various Substack folks, et cetera, are there points that you would concede to them? Is there anything that you empathize with in the arguments that they're making?

Philip Napoli:

Oh yeah, absolutely. I think the question of what is the appropriate boundary for the federal government's engagement with the social media platforms is an absolutely legitimate question. And that really rests on this not easy, but important, determination about when the boundary from persuasion to coercion is crossed. I think that is a legitimately important question, and I think it's completely fair to raise it: we do not want our federal government overstepping the bounds into coercion as it relates to trying to influence the behaviors of any kind of media outlet. But it's another thing when those who are engaging in the act of conducting research are lumped in as being part and parcel of that process. And of course there's a fundamental irony and hypocrisy there: research is also speech, and the idea that we're comfortable stifling one form of speech in favor of another form of speech is really problematic. The stifling of research is something that is, to me, hugely problematic.

Justin Hendrix:

Is there anything from those prior cycles that you mentioned, around the FCC, around media concentration, etc., that might point to how this cycle might end?

Philip Napoli:

I think back to what happened at the FCC around the media ownership issue, which was something they had to revisit every four years. How that ended up resolving itself, unfortunately, was that after a few cycles of actually trying to conduct relevant research, and watching how politically fraught that became and how many battles were fought around the validity of the research itself, the FCC eventually opted to return to the old model, engaging in that process without commissioning any research, operating in a bit more of a data vacuum, and becoming a bit more dependent on the advocacy-oriented research of the different stakeholders participating in the process. I fear that might be exactly where we're heading here, with the platforms themselves adopting a see-no-evil, hear-no-evil, speak-no-evil approach and saying, "It's not our business anymore to try to be as well-informed as possible about what's happening on our platforms."

Justin Hendrix:

Certainly a grim possibility. It strikes me, as I'm talking to you at the end of November 2024, that in 2025 we could see all of these issues crescendo, and we'll see if there are legislative or regulatory repercussions, or certainly repercussions in litigation and in the courts. Phil Napoli, thank you so much for speaking to me about this today.

Philip Napoli:

Oh, my pleasure. Thanks for having me.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Inno...
