Disinformation spiked heading into the 2020 elections. Some of it was designed to dissuade people from voting, and some was targeted specifically at the Democratic candidates for President and Vice President, Joe Biden and Kamala Harris. Much of it targeted Black, Latinx and other communities of color, and aimed to cause racial discord among them. And following the election, lies and propaganda about its outcome led to the violence at the US Capitol that was the subject of the latest impeachment trial in the US Senate – seeding narratives that will remain prevalent through the midterms and beyond.
Of course, disinfo isn’t new. Disinformation relies on age-old narratives spread in new ways. For example, the voter fraud disinformation campaign that spurred the violence on January 6th relied on old narratives about Black and brown criminality. Birtherism disinfo about Vice President Harris leveraged narratives of white nationalism, anti-Blackness, misogyny, and the essential foreignness of Asians to stoke fear among the electorate. Socialism disinfo piggybacked off both recent experiences with authoritarian leaders in Latin America and Cold War-era narratives that equate government regulation with dictatorship and nationalization of the economy. What is new is the speed at which disinformation travels, and the number of bad actors and chaos agents responsible for the manipulation. This is the playing field on which disinformation and misinformation are spreading. So how to combat it?
As strategists, here’s what we’ve learned so far:
DISINFORMATION IS A FORM OF SOCIAL CONTROL
Any analysis of power in the current moment needs to incorporate disinformation as a form of soft, decentralized power and social control.
Disinformation now travels at the speed of the internet, with governments, corporations, and new generations of digitally skilled chaos agents jumping on the bandwagon of information and data manipulation. The pandemic has put more and more people across the globe online for more hours in the day, and has limited access to trusted community sources of information that relied on in-person connections, such as church gatherings and neighborhood meetings. In this context, disinformation is becoming more effective at generating chaos and seeding doubt in reality. As a form of social control, disinformation is especially dangerous for three reasons:
(1) Unlike the disinfo or propaganda of the past, it can be difficult to determine the source of a particular stream of disinformation. For example, when Democrat-registered voters in Florida began receiving threatening emails, allegedly from a Proud Boy account, speculation flew as to whether the emails were actually from the Proud Boys, or whether they were a hacker hoax. There is not always a clear adversary, and the adversary is not always part of a larger organized force. Regardless of the independence, affiliation, or intent of the source, a great deal of disinformation supports right-wing racist, misogynist, homophobic and xenophobic agendas.
(2) Companies with outsized power create algorithm-driven echo chambers on the internet that funnel people into entire worlds of disinformation, where the majority of the material in their feeds or in the groups they join might be false.
(3) These disinformation echo chambers play a rising role in radicalizing the populist right wing – including its armed factions – and in recruiting unorganized people into its ranks. This activity has accelerated since polls closed on November 3rd, 2020, as many flocked to platforms such as Parler and Telegram.
NEW TOOLS ARE NEEDED TO NEUTRALIZE NEW THREATS
Civil society needs technology that helps organizations and institutions understand the larger picture of narrative battles in real time, and the role of disinformation within them.
For example, ReFrame and its sister 501c4 organization, This is Signals, are building new technology-assisted infrastructure to support organizing-driven narrative change. The approach is like doing a one-on-one organizing conversation on a giant scale. Adapted from Upwell, it combines machine intelligence with human intelligence to monitor the “narrative weather” and to track conversations over time. The tools used for machine intelligence scrape data from different platforms (for example: YouTube, Twitter, reddit, news sites, etc.) to yield broad trends such as spikes in conversation on topics like “police” or “socialism”. Then researchers apply human intelligence to home in on the content of these conversations among specific audiences (for example: what Black elders 65-80 years old were saying about police after George Floyd was murdered, or what Venezuelans on the right versus the left were saying about socialism in the month before the presidential election). Taken together, these methods let us aggregate what people are saying and where they are saying it. This helps us identify what is resonating and what isn’t with different audiences at different moments in time.
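The machine-intelligence half of this kind of pipeline can be sketched very simply: once mentions of a topic have been counted per day, a spike is a day that stands far above its trailing baseline. This is only an illustrative sketch, not ReFrame’s or This is Signals’ actual tooling; the counts, topic, and threshold below are hypothetical.

```python
from statistics import mean, stdev

def detect_spikes(daily_counts, window=7, threshold=3.0):
    """Flag days where a topic's mention count exceeds the trailing
    `window`-day mean by more than `threshold` standard deviations."""
    spikes = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and daily_counts[i] > mu + threshold * sigma:
            spikes.append(i)
    return spikes

# Hypothetical daily mention counts for a topic like "socialism":
# a quiet week, then a sudden surge in conversation.
counts = [52, 48, 55, 50, 47, 53, 49, 51, 50, 210, 180, 60]
print(detect_spikes(counts))  # → [9]: day 9 stands out against the prior week
```

In practice the flagged days are where human researchers would then dig into the actual content of the conversation, audience by audience, as described above.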
Just like in any good one-on-one conversation, it is necessary to listen 70 percent of the time and inject content to test 30 percent of the time. The ultimate goal is to equip organizers, activists, and advocates with the research they need to respond to conversations in real time, to help messages break out of small echo chambers and into wider public debates.
Florida was one of the battlegrounds that tested this technology and infrastructure during this election season. You name the disinfo stream, it was probably circulating in Florida. Organizers were particularly worried that disinformation labeling Biden’s platform as socialist would suppress votes from Cuban, Venezuelan and Colombian communities or would turn undecided voters over to Trump. They were also worried about disinformation targeting the integrity of the voting process. They believed that disinfo about vote-by-mail fraud would keep Black voters, familiar with targeted voter suppression, from exercising their right to vote. Florida voters were also the first targets of an email hoax that threatened Democratic party voters with violence if they didn’t switch their votes to Trump – voter intimidation initially pegged to the Proud Boys but later attributed to Iranian hackers.
This is Signals set up a narrative command center for Florida organizers to identify and track disinfo and other narrative trends, and provided them with information on how big or small a storm was brewing. The narrative command center integrated research briefings and strategy clinics into existing statewide organizing and communications infrastructure, which groups like New Florida Majority, Central Florida Jobs with Justice, Dream Defenders and others had been building for more than two years.
As Jonathan Alingu of Florida for All explained, “We weren’t thinking about disinfo until October, but we needed to be thinking about it much earlier. We had a system in place and were ready to go, so it was easier to integrate narrative research and disinfo into this.”
BRING DISINFO AWARENESS INTO ORGANIZING
Disinformation is here to stay, and it will continue to affect communities. Rather than fight it in fits and starts, or after damage is already done, organizers can mount a steady pushback against disinformation using tools that already exist.
In Minnesota, ISAIAH and Faith in Minnesota are taking a “strong offense is the best defense” approach to disinformation. Communications Director JaNaé Bates says they first and foremost train staff and members to use their own “spidey senses” and deeply held values to detect disinfo designed to harm their communities. Specifically, ISAIAH and Faith in Minnesota have used the race-class narrative to train organizers, influencers and member leaders in how to recognize racist dog-whistles, and how to subvert and respond to them effectively. They also teach stakeholders how to not unintentionally spread disinformation.
For example, Faith in Minnesota along with statewide partners started a newsletter called Repugnant, which features a pug dog who alerts members to disinformation and racially coded dog whistles. One of the issues was called “Don’t use the F word” and advised readers to avoid repeating the word “fraud” at all costs when talking about voting, even when trying to debunk claims of voter fraud.
A first step is for organizers to build a foundation of disinfo literacy among the base, as Bates is doing in Minnesota. Activists can do this by incorporating tools from the Disinfo Defense Toolkit, and other tools from any of the contributing organizations. Another early step is to begin integrating disinfo listening and response into campaigns, as Alingu is planning to do, starting with resources like the START tool.
TELL STORIES THAT ENGAGE FEELINGS
While communicators can’t just fight disinformation with content, no matter how constituency-specific it may be, they can make sure that the content they do create has more impact. Disinformation travels faster than factual information in part because of sensationalism, which activates people to share out of deep emotional impulses like fear and excitement. Disinformation streams give new emotional urgency to old narratives and thrive in voids of clear, factual and equally emotional information. So content must engage feelings, focusing on movement-building emotions like joy, rage, humor and pride. Examples of this include the Movement for Black Lives’ GOTV content and its victory video.
IT TAKES A MOVEMENT
Organizers concerned about disinformation need to combine forces with others: those who are running large-scale campaigns to hold platforms like Facebook and Twitter accountable, educating and organizing journalists, and building networks of trusted messengers, disruptors and meaning makers across all sectors of society (see the Movement Framework for Disinformation Response). It will take a whole ecosystem response, as well as implementing visions for community controlled platforms, to combat disinfo in more than just the whack-a-mole or “the more you know, the more you know” approach that dominates now.
As Dr. Joan Donovan says, “While I know the pandemic will end, or at least we will manage it through treatment and vaccines, I do not know how misinformation-at-scale will be slowed without a similar whole-of-society approach.”
The Disinformation Defense League has been a source of many of these lessons. The League was started earlier this year by The Media and Democracy Action Fund to fill a void in the larger disinformation field, and to focus specifically on disinfo targeting communities of color heading into the 2020 election. This formation is an important one for organizers to learn from and through which organizers can help develop a movement response that situates disinformation within a larger understanding of power.
“The great thing about an election is it forces you to look with clear eyes at the country you’re living in. It brings every social force out into play,” Max Elbaum said in a recent interview on KPFA. “We just got a good hard look at the country we’re living in. And if we’re going to change this country, we have to understand it.”
The same goes for combatting disinformation and shifting narratives toward a new common sense based on justice. To neutralize disinformation and change the conversation, organizers must understand both terrains – not just through an information-gathering approach, but through the lens of power-building.