
Nina Jankowicz on How to Be a Woman Online

Justin Hendrix / Apr 21, 2022

Audio of this conversation is available via your favorite podcast service.

In the opening to her book, How to Be a Woman Online: Surviving Abuse and Harassment, and How to Fight Back, researcher and author Nina Jankowicz crafts an allegory of a woman going about her day, encountering creepy and increasingly threatening men in public. It recounts a commute on a cloudless morning, but things quickly get dark. The woman endures various encounters: at a coffee shop, on the metro, and ultimately at her office. It gets ugly. And yet the point of the story is that this kind of behavior is perpetual on social media, where women endure a tragic volume of misogyny and threats.

This book is intended to be a guide for women who face this material and yet wish to have a voice in the quasi-public sphere of social media. To learn more about its themes, I spoke with Nina just before it hit the shelves, both physical and virtual, on April 21st.

What follows is a lightly edited transcript of our discussion.

Justin Hendrix:

Nina, where did this book come from?

Nina Jankowicz:

So I had always dealt with some level of abuse, but it had never been particularly bad until September 2020, when I debunked a conspiracy theory about color revolutions, or alleged color revolutions, in the United States ahead of the 2020 US presidential election. I made a video about color revolutions; this is in my area of expertise, I'm a Russianist and Eastern European specialist by education. There was a theory going around calling what were grassroots uprisings in the United States conspiracies organized by intelligence agencies. I debunked that theory and explained why we weren't a candidate for a color revolution.

Some folks who were supporting that theory did not take kindly to it. They were on the right of the political spectrum, although they did attract some folks from the far left as well, and I endured two weeks of the worst trolling that I've experienced. Now, that doesn't mean it's the worst that anybody has experienced, I know many women of color have endured far, far worse, but it was a wake-up call for me, because at the same time I had been doing research at the Wilson Center about women in politics enduring similar abuse and gendered disinformation campaigns, and had previously done some research about how Russia in particular uses such campaigns against women in democratizing societies.

It really was a wake-up call for me. I had always, of course, intellectually recognized the effects that abuse can have, but I always liken it to this: because I'm an online researcher and my work is on the internet, it's like I'm in my office and there's a crowd of 100 people or more around me, and they are picking apart every part of my appearance, telling me that I'm only good for breeding, shouting violent threats at me, things like that. It really has an effect on you. Working through that with my friends and family, very helpfully having a support network that many people aren't lucky enough to have, I have a great network of women who were there to help me through that incident, really shone a light for me on how unequipped we are as a society to deal with online abuse, both in the policy and legislative space; of course, platforms aren't doing enough either.

Then I think a lot of women just think like, "Okay, maybe I'm in the public eye, maybe I'm just a regular citizen, nothing like this is ever going to happen to me." Unfortunately, with the way the internet is, with the way our political discourse is right now, you could be in the wrong place at the wrong time, metaphorically speaking, online and your life could be turned upside down in a day. So it was really important to me to, in the absence of better policy, equip women with the tools they needed to understand this phenomenon and to fight back against it.

Justin Hendrix:

So this book is a slim volume; you've got five chapters, and it's quite literally written as a guide, with chapter titles around things like security, adversity, policy, community, tenacity. But as I understand it, this builds on other work that you've done in your role at the Wilson Center. You set out, I understand, a couple of years ago to look at how particular female politicians were dealing with harassment.

Nina Jankowicz:

So again, this was building on work I had done in 2017, this is going way back, I was researching for my first book on disinformation in Central and Eastern Europe, usually perpetrated by Russia, and I kept coming across these stories of women in places like the Republic of Georgia and Ukraine who had been on the receiving end of horrific sexualized and gendered disinformation campaigns, where their faces were superimposed on naked women's bodies and this was put online and it was said that they would run around the streets of Kyiv naked, or one woman even had a fake sex tape released about her. These images and videos followed them in their professional lives to places like the United Nations.

So I had written about that and coined the term gendered and sexualized disinformation, and at the Wilson Center we wanted to look more deeply at that phenomenon, how it played into the broader spectrum of abuse online, and what the platforms could do about it. Because it was clear to me that this was having an effect not only in budding democracies, but in our democracy as well; we saw vitriolic rhetoric becoming more and more common since 2016, especially as we saw more women running for office and gaining office. So in the lead-up to the 2020 election, we followed 13 women in politics: 10 Americans who were running for office on both sides of the aisle, of a variety of ages and ethnic and racial backgrounds, and then three international figures: Jacinda Ardern of New Zealand; Priti Patel, the Home Secretary of the UK; and Chrystia Freeland, who at the time I believe was finance minister, and is now deputy prime minister. She is an interesting figure in particular because she has a Ukrainian background, so that's why we stuck her in there.

But we wanted to compare the tenor of the discourse and what we found on six social media platforms against these women in a two-month period. This study was pretty unique in that a lot of the other studies looking at online discourse cover a very short period of time and generally focus on Twitter, because the API is open. So we broadened the aperture to a bunch of other platforms: Twitter, 4chan, Parler, Gab, and I am forgetting one, but there were six platforms there, and we did it for two months ahead of the election. We found 336,000 pieces of abuse or disinformation, and this I think is a very low number, because of what we identified and called malign creativity. The abusers online are really good at avoiding detection. We had a list of, I don't know, probably about 50 to 75 different classifiers, that is, words that would pick up on abuse.

What we found is that so much of what women receive is actually meant to avoid detection by the platforms' content moderators and AI. So rather than the word 'bitch,' we will see b!tch, or they will use the C-word, but they'll space it out, with spaces in between every letter. So if you just write the C-word as your classifier, it won't come up. Even within that really substantial amount of abuse, what was staggering was that 78% of it was directed at Kamala Harris, which makes sense, she was a very high-profile figure at that time, running for the vice presidency, but it was just the most horrific abuse. It was sexualized, it was racialized, it was transphobic, and we found similar transphobic themes pretty consistently across the targets that we had researched. A lot of people seem to think that women in power can't be women, that they have to secretly be men to be in power and to be assertive. In particular, the women of color in our sample got far, far worse and more frequent abuse than their White counterparts.
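To make that evasion concrete, here is a minimal sketch, in Python, of why a naive keyword classifier misses obfuscated slurs, and how a simple normalization pass, undoing common character substitutions and collapsing spaced-out letters, can catch more of them. The keyword list and substitution map are illustrative assumptions, not the actual classifiers from the Wilson Center study, and real moderation pipelines are far more sophisticated.

```python
import re

# Illustrative character swaps abusers use to slip past keyword filters
# (an assumption for this sketch, not the study's real list).
SUBSTITUTIONS = str.maketrans({"!": "i", "1": "i", "0": "o", "3": "e", "@": "a", "$": "s"})

# Placeholder keyword list; the study used roughly 50 to 75 classifiers.
ABUSE_KEYWORDS = {"bitch"}

def normalize(text: str) -> str:
    """Undo common obfuscations before matching."""
    text = text.lower().translate(SUBSTITUTIONS)
    # Collapse runs of single letters separated by spaces ("b i t c h")
    # back into one token. This can over-merge legitimate single letters,
    # a trade-off a real system would handle more carefully.
    return re.sub(r"\b(?:\w )+\w\b", lambda m: m.group(0).replace(" ", ""), text)

def naive_match(text: str) -> bool:
    return any(word in text.lower() for word in ABUSE_KEYWORDS)

def normalized_match(text: str) -> bool:
    return any(word in normalize(text) for word in ABUSE_KEYWORDS)

for sample in ["you b!tch", "b i t c h", "bitch"]:
    # The naive classifier only flags the last sample; normalization flags all three.
    print(f"{sample!r}: naive={naive_match(sample)}, normalized={normalized_match(sample)}")
```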

The other thing that I should note is that we did, as I said, have both Republicans and Democrats in our sample, and we didn't really find a substantive difference between the abuse that women on either side of the aisle got. So this is something that everybody deals with, it's not a partisan issue, and my worry, and the worry of my fellow researchers, is that young women look up to women like Kamala Harris or Jacinda Ardern or AOC, or whoever you want to pick on the Republican side, Susan Collins was in our sample, and they look at the replies on these women's social media accounts and they say, "It's great that Susan and Kamala want to do this, but politics is not for me, public life is not for me."

That was borne out in some of the research I later did for the book, where I spoke with young women in high school and college and they said, "I don't want a lifestyle that public. My accounts are locked down." I really fear we're losing their voices not only in politics, but in our society more broadly. I'm a digital native, I grew up online, I had every blog known to man, which was probably not advisable at the time, but I had never been afraid of raising my voice in that way. I think there's a lot of self-censorship going on with Gen Z, and that's really scary for me to see, and frankly it breaks my heart.

Justin Hendrix:

The threats that these politicians, these women, endure are not just speech threats. You talk in your book about physical security and, more broadly, cybersecurity. I know you weave in stories of folks like Brianna Wu, who very much dealt with real threats to her physical security in the context of Gamergate. How do you address that in the book and think about physical security?

Nina Jankowicz:

Yeah. I tried to really bring this home for people, because a lot of people think it's just mean words online, but it's not. Three of the five women that I profile in depth in the book are Brianna Wu; Van Badham, who is a Guardian columnist in Australia and a playwright; and Brittan Heller, who is a lawyer in the States and works on issues related to tech, and who had been the unfortunate recipient of a bunch of abuse related to the AutoAdmit scandal, which I won't get into in detail, but it was a law school message board where people were saying horrible things about her and a classmate that then followed her into her career.

All three of those women, in several instances, have had to move their physical locations, either moving houses or taking out restraining orders with the police, because they have been attacked or stopped on the street. So these aren't just online threats; because of the vast amount of information that we have online about our daily lives, people can track you down pretty easily. One of the other women I spoke with, Cindy Otis, who's another disinformation researcher and a former CIA analyst, gave some really good tips about the types of information that you share online. This goes beyond your basic OPSEC and IT security, two-factor authentication, password managers, et cetera, although that's covered in the book as well; it's about thinking really strategically about the types of information you share online.

So Cindy said for instance, "If you share a picture of your car, you might be giving away one of the security questions for your bank or another account." She said, "You have to be careful about the patterns that you share online." So I go through an example in the book of let's say that I post on Instagram that I love to get tapas on Tuesdays with my girlfriends, you could then extrapolate very easily from my author bio on my books that I live outside of Washington DC, if you look on my Twitter I've shared some things from the Arlington County government before, because I want to inform people about probably vaccines or voting or whatever. So okay, I live in Arlington, Virginia. You can look up that there are seven tapas restaurants in Arlington, Virginia, and then you can look and see that photo that I shared of me and my girlfriends enjoying sangria and compare the décor to all seven of those restaurants, and somebody unsavory might show up at our next tapas gathering.

Luckily, I don't get tapas on Tuesday with my girlfriends, but it's very easy to follow that breadcrumb trail. That's even with being slightly careful, even if you didn't post that while you were there or check in at that place, you could make that deduction. Often, we make it much, much easier for unsavory characters. I love posting pictures with my dog, I had to change how I did that when I became a public intellectual, because his dog tags have our address and phone number on them, so now we turn his dog tags to face inside so that I can easily post pictures of him and not have to worry about that. But stuff like that is stuff that generally people don't think about, but it really has an effect on physical security.

Once you get in the mode of thinking about it, people might say, "Oh, you're being paranoid." But it's kind of set it and forget it, you just get into a pattern of the way that you share information about yourself online. Now, it doesn't really feel like a burden to me. Once in a while when I'm traveling and I want to share real time pictures, I sigh and wait until I get back to my hotel room. But other than that, I think it's just a smart thing to do that everybody should think about doing in today's day and age to keep themselves safe.

Justin Hendrix:

So all of these issues, to some extent, roll up to be national security concerns in your view. Do you think that the powers that be, the folks in positions of power in the federal government and its agencies, look at it through this lens? Do they think about misogyny online as a national security threat?

Nina Jankowicz:

I think we're getting there, but it's been a long time coming. A lot of these issues during the Trump era were unfortunately swept aside. We didn't want to talk about the racial aspects of it, we didn't want to talk about the economic or political aspects of it, and we certainly didn't want to talk about the gender aspects of it either. So I think that understanding is gaining traction in the government and in law enforcement, but it's taken a long time. Even if you look at a lot of the coverage of 2016 and the way the Russian campaign against Hillary Clinton went, gender and misogyny are not really mentioned.

But if you look at the ads that Russia ran, at the tenor of the discourse on Russian state propaganda outlets, gender plays a huge part in that. I've done some research looking at how women are treated in Russian-language and Polish-language Russian-backed state media, and it's very much the same. It's just part and parcel of how countries like Russia divide and conquer and try to amplify the fissures in our society. Let's be real, misogyny is endemic to most Western societies, and it definitely splits us apart. So there's a couple of ways that I'm worried about this. Obviously, one is the rote propaganda that we already talked about, and I did find in Russian state coverage of the 2020 election that many of the same tropes we saw in the US discourse were echoed in Russian state propaganda. They're not coming up with this themselves; they're looking at things that we're saying about our own politicians and amplifying them.

The second thing that I'm worried about, and you and I have talked about this at length before, is deepfakes. We often talk about deepfakes in the context of a nuclear threat or election lies and things like that, and that's all well and good, but of the deepfakes that exist today, more than 95% are non-consensual pornography targeting women. The technology is getting better and better, and there isn't a whole lot of understanding in general society of how to identify a deepfake. And I'm not fully bought into the idea that if men knew a porn video was a deepfake, they wouldn't watch it. If it's going to float their boat, it's going to float their boat.

So we have this issue where an adversary like Russia, Iran, or China, or whoever else has the technology, could put together a convincing deepfake of a female politician or official, and that would be incredibly damaging to her. It might not be damaging for a man, but it would be damaging for a woman, because, again, of that baked-in misogyny in our society. We're not doing very much to stop that right now. So there's a whole host of issues that I think we need to think more proactively about, not just the government, but the platforms as well, because they do not often put women first and foremost when they're creating new technologies. I hope that as we have more women representing constituents in positions of power, whether they're elected or appointed, we can start to bring that female-first perspective, and that intersectional perspective, to tech policy.

Justin Hendrix:

You've mentioned that the experience of these issues for Black women and other women of color is different and more pronounced than it is, of course, for White women, of which you are one. In other parts of the world, it's even worse to some extent. I've been reading over the last couple of weeks about some of the issues around misogyny online in India, for instance, and some of the campaigns that have caught fire there on platforms including YouTube and Facebook and WhatsApp, even women being auctioned online on those mainstream platforms. How do you think about the global picture, and what is the responsibility of the platforms when operating in these societies, none of which, as you say, are particularly good on these issues, but some of which are worse?

Nina Jankowicz:

I think this gets back to the idea that the platforms are very irresponsible in languages other than English and a few select Western languages. We've seen disinformation campaigns, even ones that aren't sexualized or gendered, and violent campaigns, to say nothing of Burma, that have just caught fire, and the platforms have done little to nothing to stop them. I think the gendered and sexualized abuse and disinformation issues that we see in places like India and other hierarchical, misogynistic societies are just way worse, and the platforms don't have the cultural or linguistic or social expertise to deal with them, not to mention the political will. They barely have the political will to deal with it here, in the country where they are based.

When it comes to these faraway countries, where this abuse is just generally more societally accepted, there's a lot less pressure for them to deal with it. India is a great example: one of the first effective fake porn deepfakes was in India, targeting Rana Ayyub, a Washington Post journalist, and she has endured an absolutely horrific amount of abuse. There's very little being done for women like her, or for women like Maria Ressa in the Philippines, who receive these sexually violent threats. I would love to see, especially for high-profile women, but really for everybody, a more targeted way to respond to abuse, both for those who are being abused and on the platform side.

So what do I mean? Right now, when you report a piece of content, it's just that piece of content or that account. You can't put together a picture, a 40,000-foot view of the entire system or campaign of abuse, because it's often dog-piling that has one stem, either a news story or one particular high-follower account that dog whistles for everybody else. If we could report at that campaign level, I think, A, it would be less re-traumatizing for targets of abuse, and B, that picture would help the platforms in identifying repeat offenders and the really vile vectors of abuse that occur. These people are often repeat offenders, as I said.
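As a rough illustration of what campaign-level reporting might look like under the hood, here is a minimal sketch in Python that groups individual abuse reports by the "stem" they trace back to, surfacing whole campaigns and cross-campaign repeat offenders rather than isolated posts. Every name here, including the stem_id field, is a hypothetical assumption for illustration, not any platform's actual reporting API.

```python
from collections import Counter, defaultdict
from dataclasses import dataclass

@dataclass
class AbuseReport:
    reporter: str    # the person targeted by the abuse
    offender: str    # the account that posted the abusive content
    content_id: str  # the individual reported post
    stem_id: str     # the news story or high-follower post the pile-on traces back to

def group_into_campaigns(reports: list[AbuseReport]) -> dict[str, set[str]]:
    """Aggregate individual reports into campaign-level views keyed by stem."""
    campaigns: dict[str, set[str]] = defaultdict(set)
    for r in reports:
        campaigns[r.stem_id].add(r.offender)
    return campaigns

def repeat_offenders(campaigns: dict[str, set[str]]) -> list[str]:
    """Accounts that show up across more than one abuse campaign."""
    counts = Counter(o for offenders in campaigns.values() for o in offenders)
    return [offender for offender, n in counts.items() if n > 1]

reports = [
    AbuseReport("target_1", "troll_a", "post_1", "stem_x"),
    AbuseReport("target_1", "troll_b", "post_2", "stem_x"),
    AbuseReport("target_2", "troll_a", "post_3", "stem_y"),
]
campaigns = group_into_campaigns(reports)
print(campaigns)                    # two campaigns: stem_x and stem_y
print(repeat_offenders(campaigns))  # ['troll_a'], seen in both campaigns
```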

So I hope to see that soon, and on the platform side I would love to see, especially for politicians, especially for high profile journalists, points of contact, real human points of contact that women can reach out to, to deal with this. I know Google has a similar setup for their Advanced Protection Program, there is a human, I am told, who is monitoring my account and ready to help me if the Russian government or whoever else is trying to access my account. I've never had to use that feature, but I take comfort in knowing that it's there. If I were dealing with horrific abuse like what I dealt with in 2020, it would be nice to be able to talk on the phone to a human and say, "Okay, here's what's happening to my account, what can you see, what can you do to at least make my timeline a little bit more bearable and hopefully to impose some consequences on the people who are sending me dick pics and are telling me that I deserve to be taken care of in the streets and things like that?"

They have the power to do that, and they have the resources to do that; I'm just one person going through a traumatic situation, and when you're in the middle of that, right now what we hear is that this kind of help is not really possible. We've seen Twitter take some steps toward a human-centered design in their reporting process, and I'm hopeful about that, but it's still not enough. Theoretically, this is 50% of their possible user base, and they're just ignoring it because, I hope this isn't the reason, but a lot of people believe it is, this drives engagement on the platforms. That's really sad. I would love to see a platform put a respectful dialogue, a respectful, safe place for people to exist, first. I would love to use that platform, and you'd hope that would be a selling point for a platform that did that as well.

Justin Hendrix:

How does this roll up to policy concerns? What should governments be doing? I assume the answer might be different in different contexts or different circumstances, but for instance, what would you hope that the US Congress might do?

Nina Jankowicz:

Well, the US Congress has been extremely slow to act on all of these issues, and I know there are plenty of representatives who care about this stuff, but I think it's a matter of first dealing with the broader regulation issues that we've been debating for six years now. I'm just not super hopeful that those things are going to get decided anytime soon. I have advocated for adding a stipulation to the Violence Against Women Act, when it is reauthorized, to include online abuse, and perhaps equipping law enforcement with training so that they are better able to understand what's going on.

There's this quote that I love from, I think, Soraya Chemaly's book Rage Becomes Her, although it might be from Danielle Citron's book, I read a lot of books before writing this one! At some point, someone described going to a police officer with examples of abuse from Twitter, and she said, "It was like I was drowning and they didn't know what water was." They just don't have the skills to understand what online abuse is and why it should concern them. If there's not a physical threat, your local law enforcement are often going to tell you there's nothing they can do about it, especially if the person's across state lines.

So we need some structure around that, we need to at least upskill police officers and local law enforcement to deal with these things and perhaps start some collaboration. In the UK, they're looking at making the content that they call "awful, but lawful" illegal. I think we need to think about that as well, because again, going back to the comparative situation, if I were walking down the street and there were a bunch of men yelling these slurs at me, the police would intervene, I could get a restraining order against these people who are coming back again and again to say these things to me. Online, that just doesn't exist yet. So I'm hopeful for that architecture to come into play.

But again, I think it's just not very high on the totem pole of things that people care about with online regulation right now. Until those broader structures are put into place, I worry that we're going to sink ourselves deeper and deeper into this hole, because, as I mentioned before, the platforms aren't thinking about the issues that will arise for women and people of intersectional identities as they're creating the technology. It's an afterthought. That has to stop, and that's why, when I'm receiving abuse, I try to build as much awareness as I can about the different ways that people are using loopholes to send abuse to people like me and the women I interviewed in my book; platforms need to know about that. Luckily, I know some of them read my Twitter and read my writing, so hopefully they're thinking about it in the right way.

Hopefully I'm inspiring people to be better online bystanders too, to not just scroll past when they see an abusive tweet, or a piece of content on any platform, but to report it, because that helps the platforms as well. Twitter is my main platform, so that's my example; I'm not picking on them. I think that we'll get there, but it's going to be slow going; it's not going to happen overnight. All of this is going to be incremental change, just like when we dealt with sexual harassment in the workplace and on the street; that didn't happen overnight either. So I know that's a very wishy-washy answer to what everybody can do, but a little bit at a time is, I think, the most realistic thing we can hope for: equipping law enforcement, making sure content that wouldn't be legal on the street is not legal online, and making sure that we're all sticking up for women's rights when we see bad things happening on the internet.

Justin Hendrix:

I assume there are broader societal changes we need to make in order to address root causes?

Nina Jankowicz:

Yeah. We're never going to get rid of misogyny, unfortunately; time has shown that it will continue to exist. But if we make the consequences for that discourse and language a lot more severe than they are, and right now I would say there are very few consequences at all, then I think the calculus of people who are sending anonymous abuse online will start to change. Whether that's them getting called out by other men who are witnessing what they're doing, or the platforms imposing some costs on abusers, not only deleting their content but shutting down their accounts and making sure they're not able to just create another trolling account, a secondary burner account, immediately after getting kicked off a platform, that sort of thing is really important. Because right now, people are doing this without any consequence. The worst that happens is that they delete the content; sometimes they're asked to delete their account, but that's pretty infrequent.

Justin Hendrix:

What do you hope this will look like in 10 years' time? Do you think we will meaningfully address these issues, or will it get worse? I don't know. I was thinking that, to some extent, some of the online abuse and harassment you're talking about feels to me like a response to the rest of society creating and imposing consequences for misogyny. Partly it's being driven by that; it's like the force of nature that is misogyny, having been pushed out of some spaces, has moved so forcefully into this online space. I don't know if you agree with that or think that has something to do with it.

I feel optimistic on some level that the platforms can create better tools and affordances, and could certainly invest more human and technical effort into solving these problems. But I don't know whether they'll do that without the stick of law; it seems to me unlikely. Right now, the scale of the problem versus the investment seems wildly off.

Nina Jankowicz:

Yeah, totally disproportionate.

Justin Hendrix:

So I don't know, I'm trying to imagine a decade hence, what has to change to make real meaningful change.

Nina Jankowicz:

I don't want people to think that it's women's job to deal with this either. The whole thing about the book is that I want women to feel empowered to speak and safe to speak, it's not about doing a tech company's job for them. Clearly, they need to do more in that space and we need better legal infrastructure to deal with the horrific things that happen to women online. I guess over the past five years, researching this issue, there's been a lot more attention to it, so I think that awareness building makes me a little bit hopeful and there have been commitments, public commitments made by platforms.

The only one that I have seen make significant progress has been Twitter; I haven't really seen much else from TikTok, YouTube, Facebook and its related platforms. So I'm a little worried in that regard, but I do think, again, as we build awareness, as we keep talking about it, as prominent women keep sharing their stories of abuse, there is a conceptual change that hopefully will happen. That being said, will a book like mine still be useful in 10 years? Yeah, unfortunately, I think so, because especially in this country we view the right to spout off, even along these completely ludicrous gendered lines that have nothing to do with actual intellectual disagreement, as a First Amendment-protected right.

So I think we're going to run into some troubles there. If you look at the UK Online Safety Bill, even as they try to make this content illegal, they're running into some difficulties of exactly how to do that, and they have a much less robust protection of freedom of expression than we do here in the States, or a much less robust commitment to it, I would say. Of course, they still have free expression, but they don't hold their own version of the First Amendment in their very flesh and blood the way we do. So we have that to contend with. I hope that my book will be obsolete in 10 years. It might be obsolete just because the technology has moved on and different tools and techniques are necessary to respond to the abuse, in which case I may have to issue another edition of it. I hope that's not necessary, but it's possible, it's definitely possible on the trajectory that we're on right now.

Justin Hendrix:

Well, fortunately or perhaps unfortunately, I'm not sure which to end with, but I suspect this book will indeed have an important shelf life. So Nina, thank you.

Nina Jankowicz:

Thanks for having me.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
