
Twitter Whistleblower Anika Collier Navaroli Looks Forward

Justin Hendrix / Apr 30, 2023

Audio of this conversation is available via your favorite podcast service.

In the course of its investigation into the insurrection at the US Capitol, the House Select Committee on January 6 spoke to hundreds of witnesses, including social media executives with insight into the role that platforms played in propagating the false claims that motivated violence that day, and in connecting and facilitating the movement and organization of people who sought to overturn the election.

One of the individuals that testified to the Select Committee was a former Twitter official, Anika Collier Navaroli. I had a chance to speak with Anika earlier this month, to hear how her thinking has evolved in this time under the spotlight, and what she’s hoping to do next to continue her journey as an intellectual and an activist working at the intersection of tech, media and democracy.


What follows is a lightly edited transcript of the discussion.

Justin Hendrix:

Anika, we're going to talk about your career, the experience you've had over the last year and a half, which has been rather peculiar, and some of the thinking that you're doing lately on the topics that you've been working on for quite some time. But for my listener, can you just describe your career trajectory? You ended up prominently as an executive at Twitter, and came to national attention during the hearings of the January 6th select committee. But joining that technology company for you was part of an intellectual journey?

Anika Collier Navaroli:

Twitter was just one stop on my way. I've also had another job and a whole other career after Twitter. But this started for me many, many years ago. I remember back in, I think it was, the early 2000s, I was a high school senior. And I was in a debate class, and I went to Washington, D.C. for the very first time to talk in this, it's so nerdy, this simulated congressional hearing. And the topic that I was obsessed with was the First Amendment and the limits that could be placed onto it. So I've been doing this congressional hearing about free expression for a little while, actually.

And so from high school, I went and got my undergraduate degree at the University of Florida, where I studied journalism. And I worked at a newsroom that actually turned digital the summer that I worked there. And I became really obsessed with these kind of questions about how free speech that was built on this printing press was ever going to be able to transition into MacBooks, and InDesign, and the way that media was being processed, and these brand new things called social media at the time. This was back when Facebook required a .edu account, and it was the hottest thing on the streets.

And so I took that sort of interest and love and went to law school at the University of North Carolina. And there, I really continued to study what at the time was called cyber law. It was this really niche area of also called media law. I worked for the UNC Center for Media Law and Policy and was really honing in on these questions. Trying to figure out how this new technology was really going to change our lives, and really how we were going to implement these constitutional principles like free expression within these emerging technologies.

I then went and got my master's and specifically studied Twitter and social movements that were happening on Twitter at the time. The Arab Spring was happening, and Occupy Wall Street was happening. And there was this case of first impression involving Twitter and Occupy Wall Street, where they were trying to ask for user data and information to be able to identify where a protester was. And I was obsessed with figuring out how these new social movements and this idea of constitutional rights were going to be able to expand again into social media.

And at the time, people thought I was a little nuts, to be honest with you, and were like, "Are you sure this is going to..." It wasn't a thing. Trust and safety teams weren't a thing. Content moderation, commercial content moderation, didn't have a name. And I was obsessed with this thing. And I realized when I graduated that there were no jobs, really, in the field. And so I went and I worked at two different law firms, and I kept my eyes open, and I had this really unique sort of skillset.

And then, this job opened up at this new think tank called Data & Society back in the day. And I applied to head up their work, specifically looking at big data. That was the hot term of the day back then. Big data, and civil rights, and fairness. And so these were our very early ideas and understandings of technology not being neutral, and questioning algorithmic bias and fairness, and looking at emerging technology. This was also the time of the first BLM uprising, so a lot of new policing technologies. And thinking about policy implications of things like body-worn cameras.

And then I went and worked at an advocacy organization. And actually, one of my jobs there, I went to Twitter and spoke to the vice president, Del Harvey, and was saying, "These laws that you have..." These laws. These policies, which were basically the laws of social media companies. I was questioning the way that they were implemented, and the way that they were being written, and the gaps that existed there.

And I remember specifically her asking me, "Well, okay, if we have all these problems, how do we fix it?" And I didn't know, and I didn't have the answer. Because I realized I had no idea how the machine worked, or how Twitter worked, or what levers could be pulled to actually make this sort of change happen. And so a couple years later, I joined the team at Twitter and started doing those things.

Justin Hendrix:

So what happened then, of course, is that you found yourself at the center of a lot of the questions that were coming to the fore in the 2020 election cycle and shortly thereafter. And then, of course, you came on the national stage, first anonymously, and then attributed, by the January 6th Committee as one of the whistleblowers who came forward from Twitter. My listeners can go back and read your deposition. They can watch the video of your testimony.

Anika Collier Navaroli:

It's a lot of pages.

Justin Hendrix:

And so we're not going to go through the ins and outs of all of that. But can you just tell me what it was like this last couple of years in the wake of leaving Twitter, going through the process of testifying, and becoming a public figure in this way? What's that been like?

Anika Collier Navaroli:

Yeah, it's been, I think, life changing is really the best way to describe it. Everything about my life has been put in a blender and spit out. I worked in companies and jobs that were very low-key, and due to the nature of our jobs, we were kind of ghosts to begin with. And then, I started whistleblowing and had to live an even more secretive kind of life, especially doing it anonymously. I was doing something that was not public that I couldn't really speak about, and it was really difficult.

I think the way I described it in the Washington Post headline was like, "Terrifying." And it was. It was absolutely terrifying. It was absolutely isolating to have to sit down and realize, and look back on moments, and realize, like, "Hey, this was really important. Oh, my gosh, I can't believe this is what happened." And then have to come to the realization of it's important for these things to be known, and for the truth to be on the record, and to sit there and say, "Well, somebody should say something."

And then, to keep looking around and realize that, "Well, that someone is you. And it's up to you to be the person who says something." And so it has absolutely changed the trajectory of my career. I was working in trust and safety departments. I was working at Twitch in the policy department as a senior expert, while I was whistleblowing. And so it was difficult, and it took a lot of energy. And it still takes a lot of energy. And I think it was worth it.

I look back, and I kind of describe it and think about it as I had these moments in time where I would look and think like, "Wow, this seems to be important," or, "I think that this might be historical." And I had to ask myself, "When history looks back at this moment, who do you want to be?" And having to rise to the occasion, and rise to those moments, and meet those moments as they come has been... It's been tremendous. And it has changed my outlook on life. The way I walk around, the way I'm able to walk around. And it has also allowed a conversation to happen.

Justin Hendrix:

So since your testimony, and in just the last few months, some of the decisions that your team took, and that ultimately Twitter took, following the January 6th insurrection, like de-platforming Donald Trump, have of course been reversed, not just at Twitter, but also at Facebook and at YouTube. And this conversation that we've been having around the boundaries of free expression, protecting people from potential harm, and violence as a result of cascades of communication across social media platforms, we're still very much having right now.

I want to just open that up a little bit, maybe get your reflections now, both looking back, but also looking forward. What do you make of the moment we're in? On some level, it's a moment of retrenchment. We're seeing social media platforms pull back from some of the practices that they engaged in during the 2020 election cycle, during COVID. A lot of critique has come from First Amendment absolutists about some of their practices. What do you make of this moment?

Anika Collier Navaroli:

I think we're in a really important moment. I think we're in this moment that's right before leading up to another historic election, and it honestly feels a lot like déjà vu. Because having been a person who has been in the moment right before the election, I'm seeing a lot of the exact same things. And you mentioned this sort of re-platforming that has happened. And I wrote the op-ed that you all published, specifically thinking and talking about how these decisions, I believe, are dangerous.

And they are, I believe, ahistorical, not taking into account not just the testimony that I gave to Congress, but also Congress's own investigation and the information that was included within it. And so I think that it's an important time, and it's also a very scary time. You mentioned the sort of retrenchment. And I think part of the goal, and why I wanted to talk about January 6th and make sure that we got this on the record, was to understand the sort of failures that happened, to make sure that they wouldn't happen again.

And I am increasingly concerned that that is not happening. Because my testimony and the things that I said were all about filling the gaps in policies and making sure that policies were enforced. And we're now living in a world where the people who had my jobs no longer have jobs. There have been layoffs across the industry. And so not only do we have a policy problem, where the rules that are written on the books have gaps that don't account for the way that conversations and political conversations have been happening for the past several years.

We also don't have individuals who are there to make these arguments. We don't have content moderators around the world anymore who are able to look at these things and make these sort of nuanced arguments. We don't have the safety staff. And so it is, it's really concerning to me to see where we're at, and to especially think about where this could lead for the future.

Justin Hendrix:

On the one hand, we seem to have a kind of environment where social media platforms are making decisions about questions like whether to re-platform Donald Trump, based on a set of things: What is the threat scenario presently? How great is the risk of potential civil unrest? That seems to be part of the consideration. The other is based on whether they feel they can judge his individual utterances or the utterances of any other demagogue for that matter, and assess whether those are appropriate to their policies.

What's missing here? What's missing in this kind of equation for you? Part of your testimony, you talked a lot about coded incitement to violence. You seem to be kind of hinting at signals you were able to see on the platform of mounting... What's the word I'm looking for? Kind of mounting signals of potential violence. Something that went beyond just speech, but became a kind of collective physics, if you will.

Anika Collier Navaroli:

So you talked a little bit about the boundaries around free expression. And just to say very clearly and... excuse me... loudly, according to First Amendment jurisprudence, the First Amendment has never been absolute. There have always been limitations to free speech and free expression. And one of those that has existed is incitement to imminent lawless action or to riot. And that is what we saw happen on January 6th. And so I think a lot of these arguments around free speech and free expression are disingenuous, in that they don't take that into account, and they don't argue the reality and understand the reality of that.

There is this limit. And it's specifically called out within our jurisprudence, within our understanding of the First Amendment. And that's what happened on this day. And I think in addition to the imminent lawless action and this very sort of explicit incitement is what we called coded incitement to violence. And so specifically this sort of... Ian Haney López wrote the book Dog Whistle Politics. So this is language, or this sort of rhetoric, that signals information to those who understand and can receive what is being signaled. And within this case, signaling information about political violence.

And that has been something that not only happened, and that I talked to Congress about, and that we saw leading up into January 6th, but it's something that we're seeing continue to happen today. We are continuing to see members of Congress make veiled threats for... What are they calling it? A national divorce. Just a rebranded call for a civil war, which is outrageous. And it is being allowed to continue. And again, this rhetoric that we are hearing once again from Donald Trump himself, specifically in reference to his indictments and the legal cases that he is facing, is very much mirroring exactly what we saw in the lead-up to January 6th.

Justin Hendrix:

So part of your idea here about coded incitement, it's not so much even just the words that someone is saying, whether those directly lead to violence. So if I say, "Go ye forth, ye who follow me and commit this act on my behalf..." That's one way of thinking about it. You're suggesting that we take into consideration the now legible to us response of an audience or a following.

And that's part of what you were seeing at Twitter. So you would see when a political leader said certain things, that his followers, or her followers, would hear those things in a certain way. And you're suggesting that we should basically take that into account when we make decisions about how to moderate content?

Anika Collier Navaroli:

Conversation and communication do not happen within a vacuum. You have the very basic communication triangle. The information goes some place, it's received somewhere, and someone responds to it. And so looking at just one piece of communication is problematic. One of the things that I talked to Congress about, which was very, very visible, and which I don't think we talk a lot about, is the response that was happening to Donald Trump's rhetoric on January 8th.

And so this was the day that he was permanently suspended from the platform. And what we saw occurring that day, and what I told Congress and talked to them about, was we saw this exact sort of language that had been coming up to January 6th start all over again. But this time, instead of hashtag J6, it was hashtag J17. And so from our vantage point, we were seeing individuals planning again for, honestly, what looked like another insurrection.

And a lot of the commentary that we were seeing were things like people saying, "I wasn't able to attend last time, now I can make it." And so this conversation was happening not just about going to Washington D.C., but to state capitals all over the world. And so by taking this into consideration and looking at this, we were making the argument that this is going to happen again. And it was only by receiving the responses to Donald Trump saying things about however many American patriots.

It was that sort of coded language, and it was the way that it was received, with people saying, "We're going to respond with violence." And it was absolutely necessary, in my opinion, to understand the way that people hear things and the way that they are going to react to them. It's incitement to violence. So yeah, it's not just the words, but it is very much the way that it is received, the way that it is responded to. Because in those situations, I think incitement is a really, really hard thing.

Typically, we'd have to sit there and say, "Is this going to lead to violence?" It's sort of this prediction. We want to stop something before it happens. I think on January 6th we had... It's one of the clearest cases of incitement that we've ever had. We saw it happening, and then it happened. And then, on January 8th, it was, "Is it going to happen again?" And I think we were able to learn, thankfully, in that moment from history to make sure that it didn't happen again.

Justin Hendrix:

So do you think the law is set up to do this type of analysis? We've got the Brandenburg test. We've got other legal precedent here. In this particular situation, you're talking about almost like a mathematics or a threshold. Where if a certain following or political group is hearing instructions, even if they're not being uttered in a semantically obvious fashion, that perhaps there should be action taken by private platforms to step in and perhaps try to, I don't know, mute or otherwise kind of interfere in that cascading effect that we saw happen before January 6th. Is there a legal analysis here? Or is this mostly about the decisions of private platforms? Is that what you're arguing for?

Anika Collier Navaroli:

I think it's all of the above. So the legal analysis is something that doesn't come directly from me. There's the Rabat Plan of Action that has existed for quite some time, which talks specifically about things like intent and impact, and the way that communication happens, and the balancing test, which I think is very, very hard to do. And I will acknowledge it's very, very difficult to do. The other thing that I have constantly advocated for, especially as we think about free expression, is asking these questions of free expression for whom and safety for whom.

And so no longer just taking this power-blind approach to things, and recognizing that when decisions are being made, inevitably, we are allowing free expression to exist at the expense of someone's safety. And then, we are also allowing someone's safety, at times, to be put in jeopardy for the sake of someone else's free expression. And I think making these decisions without being cognizant that that is happening, and to whom, creates very, very dangerous situations that are consistently just reinforcing problematic power dynamics.

Justin Hendrix:

If you were to go back into another social media platform at this moment in time, and they gave you the reins of the trust and safety department, what would you do? What type of process or procedure would you put in place right now that would potentially protect from these types of events?

Anika Collier Navaroli:

I will be really honest with you and say that part of what I am reckoning with, and I think the whole industry is really reckoning with, is the question of whether being the person at the head of the trust and safety department is the way to make these decisions or the way to make these changes. And I think that self-regulation has only gotten us so far.

And so for me, I think a lot of this question is, should the reins be put in the hands of someone else? Because we have seen how irresponsibly they have been wielded within the hands of social media companies. So all that to just say, I don't know. I don't know.

Justin Hendrix:

I'm going to include that at the end, that part. But I want to ask you a couple of other questions. One of the things that comes through in your testimony and in your thinking around coded incitement is that there is a race analysis here. I want to ask you just to kind of expound on that a little bit. Some of your prior work had looked at questions specifically around race, questions about representation in media. And part of this coded incitement to violence question, you have to, as I understand it, look at it in a sort of historical context. The January 6th insurrection was not just an insurrection, it was in many ways a white supremacist insurrection.

Anika Collier Navaroli:

Yeah, absolutely. I think there's documentaries that have been made linking January 6th to other domestic terrorist events that have been perpetrated by white supremacists. And the book I mentioned, Dog Whistle Politics, that book, in addition to Michelle Alexander's amazing book, The New Jim Crow, when they talk about dog whistles, they're often specifically referencing what at the time was called the Southern Strategy. So this sort of rethinking, and this sort of remarketing or rebranding, of white supremacist language and lingo.

And so going from, "We can no longer just flat out call you the N word," to, "We're going to use something else." And so this has been going on for quite some time. And again, these books have traced it throughout history. And I think one thing, if we look back at 2020, and just saying this in my professional capacity at this moment, looking back at the language that was happening within the Republican Party at that time, it was very much a rebirth and sort of digitalization of the Southern Strategy.

And so we saw a lot of the exact dehumanization and hateful rhetoric that was just placed within the social media ecosystem, and it had the same repercussions that it has always had, which is death and violence to those who have been othered or those whose humanity has been removed from them and stripped from them, in such a way that individuals believe that perpetrating violence on them is okay.

Justin Hendrix:

So I think, clearly, in your analysis, the tech world that we have at the moment is not sufficient to the social and political challenges that we face. Are you able to see a different future? You're a relatively young person. This episode, even though it's hard to imagine, may just be one chapter in your career and life. What vision of the future gives you some hope or optimism?

Anika Collier Navaroli:

I think we have to hope that there can be a different future. And as much as I've been a doomsday-sayer through all of this, I don't think I would be doing this if I didn't believe that there was an alternative available. So one of the things that I have done for fun on the side while working in technology companies is reading the science fiction canon that a lot of the leaders of technology companies get their inspiration from. And so I have found myself in a lot of interesting works.

And I think about Snow Crash, as an example. I remember reading the first three pages or something, and the way that they described a woman's thighs, and thinking like, "Wow, what a world that we live in, that we have envisioned this new future, and there is baked-in misogyny right from the beginning." And the other thing that I did in reading all of these books, I went and looked for Black people.

I went and looked for people who looked like me, and I wanted to see what futures they possibly had. And I remember reading books like A Strange New... or sorry, strange to me, but a Brave New World to some. And reading about genetically engineered humans who are divided in this sort of caste system, and thinking like, "Well, where are all the Black people?" And shockingly enough, all of the Black people are still inferior within this future.

And so to me, while all of these things might be innovative, it's still the Gene Roddenberry effect. Where you can think of all of these great new species, but for some reason, all of the Klingons are Black and they're all hyper-aggressive, and they all have a problem with violence. And so we are replicating these exact same problems that we have within society, and we're putting them into our futures.

But what gives me hope is reading other sci-fi. And so I read things from Octavia Butler, I read things from N.K. Jemisin, and I see that there are worlds that not only include Black people, and queer people, and trans people, but these folks are thriving, or they are leading the charge, or they are the architectural brilliance behind the world that gets saved. And I realize that it matters who is writing the story, and it matters who is driving the creation of the narrative. Because it's amazing what can happen when it's open, and people are invited, and they exist.

And I think that we have for so long been creating such a narrow version of history that is being dictated not just by the straight White male CEOs of Silicon Valley, but the straight White male authors that came before them. And I think people like me show up to these places, and we have different visions and different versions of a society that we are looking at. And I hope that the work that I have done, and the work that others have been able to do, have created space within these technology companies, and in the world that we currently live in, for people to exist who wouldn't necessarily have had safety or the ability to exist within these worlds.

Justin Hendrix:

Is there a sense that it's possible to imagine a future that's more salient than the vision of the future that Silicon Valley is selling at the moment? How do we get there? Clearly, your voice is one of many that's pushing that direction. But can you see a kind of catalytic moment, another catalytic moment, in the future? Where perhaps the tide turns?

Anika Collier Navaroli:

I hope so. Again, I hope something changes it. I hope it's not another January 6th. I hope it doesn't take political violence or the attempted usurpation of a government in order for us to realize that we need to make change. I fear that we live in a society that requires something like that. And I fear for what could come, especially come 2024. I've said I worry that the very fabric of our society is at risk. Not just American democracy, but so many things. And so I hope that there's a way to make change. Something that has given me hope, you mentioned sci-fi, and I'll just randomly tell you this, something that has given me hope recently.

There's a new show on Apple TV, called Extrapolations. I don't know if you've seen it. But it's all about the sort of visions of what could happen if we don't fix climate change. And so for me, I've been thinking about it, and my world, and my field, and the sort of, "What does the world look like in which we don't fix social media?" And it's a little horrifying to think about, and it is dystopic. And yet, folks like me and so many other whistleblowers and individuals still continue to come forward, and to talk, and to share our knowledge and our guidance, with the hope that we can stave off the things that we fear.

Justin Hendrix:

What's next for you? What are you going to do from here?

Anika Collier Navaroli:

Well, I am continuing to write. As I mentioned, I've been doing this for quite some time. And so I've been thinking a lot about free expression, in theory and in practice, and the way that it falls apart in all of those ways. And so I am putting all of those dots together, connecting them, and thinking about all the things I'm seeing happening around me, and doing some writing on those things.

I'm also doing some speaking and continuing to use this public platform that I have ended up with in order to continue to tell the truth and to shed light on a lot of the work that is happening. And I want to continue to think about these things. I want to continue to teach. Teaching is one of my favorite things in the world. And so I'm looking forward to being able to put all of these pieces together and see where we go.

Justin Hendrix:

Thank you very much.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
