Taking Stock of Facebook vs. Trump: A Conversation with Erin Shields & Katy Glenn Bass

Justin Hendrix / May 12, 2021

Last week, the Facebook Oversight Board decided to uphold Facebook’s suspension of former President Donald Trump, who was booted from the platform hours after inciting a violent white supremacist insurrection at the United States Capitol to interrupt the certification of Electoral College votes that sealed his defeat in the 2020 Presidential Election.

But, the Oversight Board did not rule definitively. While it did find that the evidence “shows that Mr. Trump used the communicative authority of the presidency in support of attackers on the Capitol and an attempt to prevent the lawful counting of electoral votes,” it also found that “it was not appropriate for Facebook to impose the indeterminate and standardless penalty of indefinite suspension.” So, the Oversight Board says Facebook has to go back to the drawing board, sort out its policies and make a final determination about Trump’s account within six months.

For expert reactions, I spoke to Erin Shields, a National Field Organizer at MediaJustice, a grassroots movement for a more just and participatory media that fights for racial, economic, and gender justice in a digital age; and Katy Glenn Bass, the Research Director of the Knight First Amendment Institute at Columbia University. The following is a lightly edited transcript of the discussion.

Justin Hendrix:

This week, the Facebook Oversight Board delivered its much-awaited, much-anticipated decision on the curious case of Donald J. Trump. What were your first reactions?

Katy Glenn Bass:

I was surprised. I had expected them to issue an up or down decision: either you have to let him back on, or you have to kick him off permanently. I think they've sent some interesting messages to Facebook in terms of what they believe their role to be and what they intend to do in future interactions with Facebook on these cases. So it's a much longer and meatier decision than I had expected.

Erin Shields:

Similarly, I was surprised. I thought it was going to be an up or down decision. And I was also a little, dare I say, annoyed that we spent five months or so waiting for them to essentially kick the can back to Facebook. But I was happy to see that the Oversight Board affirmed that Facebook made the right decision in suspending former President Trump and removing the content, which we knew was the case all along.

MediaJustice has been calling for Trump to be banned for a long time, not just because of the actions of January 6th and all of the rhetoric that helped incite it, but for previous hateful comments and his behavior on the platform prior to that. So we were happy to see that, but just a little concerned about where that leads us. Are we going to have to wait another six months to actually get a permanent ban, or will they wait until this sort of fizzles out of everybody's mind and then roll back their decisions? So we're just kind of waiting to see, and also pushing Facebook on making that decision quickly and publicly.

Justin Hendrix:

Katy, I want to ask you a couple of questions about the submission the Knight Institute made beforehand. You were one of the nine-thousand-odd commenters, and you submitted a particular kind of proposal, which I saw somewhat represented in the Oversight Board's decision. Do you think that's true in some way?

Katy Glenn Bass:

In some ways. We had also recommended that they not decide whether or not he should be allowed back on the platform, but our condition was different. What the board has done here is say, "We're not going to make this decision. You need to come up with a clear and consistent policy and then apply it to him." What we had said was: you shouldn't make this decision. You should tell Facebook it has to commission an independent investigation ('independent' is important there) into the role the platform, its design decisions, and its policies may have played in facilitating what happened on January 6th. And after Facebook has done that study and published it, then we'll take up this question.

We were basically arguing that Facebook is trying to turn the massive problems with its platform into a simple yes or no question of whether one person gets allowed back on or not. And what's more, they're trying to kick it off to this supposedly independent board they've created and make them make the decision, so it doesn't even have to be Facebook making the call. And what we were saying to the Oversight Board is: you shouldn't allow yourselves to be played in this way, essentially.

Justin Hendrix:

And Erin, you mentioned that MediaJustice has called for action against Trump's account in the past. I assume that has to do with instances such as his calls to violence around the protests against police violence and racial injustice last summer. Were there other things that you were looking to in that regard?

Erin Shields:

That and other concerns. So here's our thing; I'm going to level with you. We've been concerned about the Facebook Oversight Board and its uptake of this particular issue. We engaged because obviously there was going to be a lot of media around it, and Facebook was going to use this to push narratives about how it's advancing free expression and all of these sorts of things. But what we don't want is to continue the spectacle of Trump. We also recognize that there are still other people with very wide platforms and networks who are pushing out hate speech, disinformation, and misinformation (including disinformation about vaccines) on Facebook's multiple platforms, including Instagram and WhatsApp. We're really looking to move on from this, right?

So what we want to see is really decisive action from Facebook. Facebook, we believe, has all the information it needs to make a decision, and that includes not only the actions around January 6th but other actions prior to that. We saw this as Facebook trying to push off its responsibility onto a board that we're supposed to believe is independent. That's not a strike against the people on the board, but rather against the all-encompassing way that Facebook likes to bring in experts and distort their work and expertise to benefit its reputation publicly. So we didn't really want to play into that too much. But moving forward, like I said, we're really looking to see Facebook make this decision itself.

And really, what we've been doing is calling into question the leadership of Facebook. Our years of campaigning and working with Facebook have led us to believe that the company really reacts and responds only when there's public pressure and spectacle. For as many years as we've been working with them and observing their moves, it just doesn't seem like it's in good faith to us. So at MediaJustice, we're actually looking to see some institutional changes at the top, because we really don't feel like we're going to get the changes we need until we have folks in executive-level positions who are willing to take this seriously, and to take it as more than just a PR matter, where Facebook is getting bad press and so acts, or looks accommodating, or looks like it's taking this seriously.

I think if Facebook took it seriously, they would have addressed a lot of the "Stop the Steal" content prior to January 6th; they were aware of it. I don't know if either of you were on the Facebook Oversight Board's stakeholder meeting, but they actually said that. The Oversight Board asked Facebook, the company, 46 questions, and Facebook outright declined to answer seven of them.

And one of the questions they refused to answer was how "Stop the Steal" and #StopTheSteal content was actually amplified by the platform. They refused outright to answer that, and I really feel like that's incredibly telling. So we want to dig into those points. It's clear that that's a pain point for them, and I think it's really telling about their role in everything that happened on January 6th and even before that. I'm sure there'll be something in the future, too, but I think that's where we need to be drilling down and digging in.

Katy Glenn Bass:

Those are really good points, and I wanted to add a little bit to some of what Erin has pointed out. On the involvement in "Stop the Steal", I think that's right, and I think it underscores a recurring issue at Facebook: I've never seen them step forward to take responsibility before they've been hit for it in the media. After the 2016 election, Mark Zuckerberg first comes out saying he thinks it's "pretty crazy" to think that Facebook played any role in it; only after mountains of evidence emerge does he admit that maybe there was a problem with the platform. After "Stop the Steal", Sheryl Sandberg first comes out saying this was organized on platforms that don't have the kind of safeguards or abilities that we have to check their stuff.

Later, it turns out quite a lot of it was actually organized on Facebook. And going even further than that, as you all may have seen, BuzzFeed published a piece last week, I think, about an internal investigation Facebook had done (which is not the same as an independent investigation) looking at "Stop the Steal", at places where they missed the mark or didn't do what they should have done, and at how effective their measures were. So not only did they know there was a problem there; after BuzzFeed reported on it, they pulled the internal report off their internal employee message boards. They won't even stand behind their own internal investigations, which just highlights the need for further pushback and pressure on them beyond this one Trump spectacle, as Erin put it.

And then another thing she mentioned that I wanted to underscore is the board and the board's mandate, which is a problem we raised in the Knight Institute submission here. The people on the board are good, smart people. I think they need to do a better job representing the global south and some particular categories of people. But they're smart, thoughtful people who have been given a mandate that prevents them from looking into the really critical questions about Facebook and what should happen there. They've been told they can only make decisions on cases where people have appealed a particular piece of content that's been taken down. Now Facebook has slightly expanded that mandate to include decisions to leave up certain pieces of content; you can appeal those too. Or the board can look at cases that Facebook directly refers to it, which is what happened in the Trump case. But they're not allowed to look at algorithmic amplification. They're not allowed to look at the way Facebook decides which users get recommended to which groups.

And in fact, when the board asked Facebook questions about that, it simply declined to answer. So that's a real fault with the way the board has been set up. It's a fault in my view; in Facebook's view, it's by design. They didn't want the board to be able to look at this. The board is obviously chafing at these restrictions, and the fact that it noted that Facebook refused to answer those questions is significant. But the board doesn't really have the independent power to compel Facebook to do more, to answer those questions, let alone to conduct an independent investigation. It can make non-binding recommendations, but that's all.

Justin Hendrix:

So it was 46 questions that the board sent to Facebook, and then, as Erin said, roughly seven that Facebook outright refused to answer or only partially answered. Those were questions about how Facebook's News Feed and other features impacted the visibility of the content from Trump that was deemed to violate its terms; whether Facebook had researched, or planned to research, its design decisions in relation to January 6th; and information about violating content from followers of Mr. Trump's account. So really, it was almost like the board was going for the crown jewels. They wanted a network analysis. They wanted to know what Facebook had done to understand its own role in this particular, and particularly damaging, event.

Katy Glenn Bass:

Yeah, I think that's right. But Facebook's response was, "That's not information you need to know."

Justin Hendrix:

On some level, this is also information that Congress might seek, I would assume, if in fact there's ever a national commission and a subpoena process.

Erin Shields:

I would certainly hope so. In the past, when we've seen some of these platform CEOs go up to testify before Congress in some of these committees, we've seen that congressional members are sometimes a bit behind in their understanding of platforms, asking how platforms make money, or why they or their constituents aren't seeing certain emails, all sorts of off-base questions. I have a little more positive outlook after some of the more recent hearings: congressional members are becoming a little more savvy about what to ask for, and they're also partnering with groups like ours and other advocates to figure out which questions to ask to make sure they're getting to the root of why this is happening.

I think we heard a lot about algorithms and amplification at the most recent hearing, a Judiciary hearing I believe, which I was really happy to see. So as we move forward and start hearing more about regulation, or just have congressional members hold tech CEOs' feet to the fire and use their bully pulpits more, I think they'll be asking those more pointed questions. And what I would really love to see is Representatives being able to call them on their non-answers and bad answers in real time, so that we can actually move forward with something, or leave a hearing with more information than we had going in.

Justin Hendrix:

Katy, you had called for that independent investigation. I believe there has been at least one independent investigation in the past; there was a human rights investigation around Myanmar. But did you have something in mind specifically around January 6th, and how that would work?

Katy Glenn Bass:

I think multiple investigations would be good, including one led by Congress. What we had in mind here was actually something that would compel Facebook to give independent researchers more access to its platform's data, which has been a recurring issue with Facebook as well. They don't like to let researchers see the kinds of data they would need to do studies of social media platforms and how speech travels on them. And in this case, what we were stressing is that any independent investigation needs to include people with the technological expertise to really know what they're looking at. The reason we didn't recommend that the board demand to see Facebook's algorithms, or something like that, is that the board is not going to know what it's looking at if you show it the code. But there are people who would know what they were looking at, and who would know what to look for on the platform. That's going to require Facebook to open itself up much more than it has wanted to in the past.

The Knight Institute has a separate, long-term conversation going with Facebook over amending its terms of service to allow more public-interest research to happen on the platform. Right now, any scraping of publicly available data on Facebook, or the use of temporary research accounts to do research on the platform, is considered a violation of its terms of service. And therefore, technically, a researcher could be held liable under the Computer Fraud and Abuse Act, which treats violations of terms of service as a crime.

Justin Hendrix:

In the few minutes we've got left, I wanted to ask how this decision, and the way it played out, changes the atmosphere around Facebook: around the Oversight Board, around accountability and potential regulation, even around activism regarding Facebook. I don't know quite how to put that question to either one of you, or what it prompts in your mind. Erin, you've been, as you say, campaigning against and in some cases working alongside Facebook on these issues for years. How does this change the nature of the relationship?

Erin Shields:

It feels like a neutral-to-negative change, and I'll say why. Originally, when we would campaign against Facebook, in collaboration with other groups, we were dealing directly with the company, and we were holding the executives making those decisions directly accountable for them, right? To us, the propping up of this Oversight Board felt like a diversion from that: don't hold us accountable for our actions, for our decisions, for the way that we're making policy or not making policy; we're just going to defer that for three, five, however many months to this board. We've established this board of experts who are deliberating on these decisions, and whatever they decide, you can take your gripe up with them. And what can we do? It's an independent board.

So even though the board came back with a decision that we think is favorable, that affirms what we had said before, and that itself pushed the decision back onto Facebook: how long did this take? And what organizing energy was expended to publicize this, to make sure Facebook wasn't able to use it as a way to get out of its responsibilities? Moving forward, we're not necessarily interested in engaging with the board in a way that further legitimizes its existence. Katy and I may diverge on this issue, but I think that as advocates, we're just really fed up with Facebook and the way it's been engaging.

But I think what this decision does do is give us the opportunity to again push Facebook to do what we've been asking it to do, which is deliver a permanent ban based not just on the actions, the posts, the content that went up around January 6th, but on previous things too: previous attacks on communities of color, on Muslim people, on undocumented people, on all of these groups that have had a lot of hate and terror directed toward them because the words of, essentially, our federal government were amplified over and over again, right? I put this in the statement we put out about changing the way political figures are handled on these platforms: for a long time, and perhaps even still, political figures and government figures were given a lot of leeway, and the content policies were actually less restrictive for them.

And again, what we've been saying for a long time, and what Change the Terms has been saying (I don't know if folks are familiar with Change the Terms, a coalition that works on content moderation), is that what should be happening is actually the reverse, right? People who have massive platforms and institutional power at their disposal should be held to a higher standard when it comes to the content they're putting out and having amplified on platforms, not a lower one. Not because it's newsworthy, not because they're government officials; precisely for those reasons, what they say should actually be monitored and held to a higher standard. And that's not just in the US; that's across the globe too.

Even though we do our work here in the US, I do think that what happens in the US sets a standard, because the institutions are based here, and any regulator that would have an opinion about this is based in the US. So it's important for us to consider that what happens here also affects people abroad, and to make sure that when we're pushing for these policy changes, we're not just considering the US context, but the global one too.

Katy Glenn Bass:

As of right now, I'm not sure the decision changes much at all, because we've yet to see what Facebook is going to do in response. The board has now made its supposedly binding decision in the Trump case, and it has also made a bunch of non-binding recommendations that Facebook has to reply to and indicate whether or not it will take up. But the fact remains that at any point, Facebook could just say, "You know what, this was a mistake, we shouldn't have done this, the board is disbanded." They might take a minor PR hit for a day or two over that, but it really wouldn't be much, and they know that. So I'm skeptical of what capacity the board has to really hold them accountable, and I think you can see that tension in the board's decision.

One thing I think the board's early decisions have been good for is highlighting how wildly inconsistent Facebook is in applying its own existing policies: highlighting things like the incoherence of the newsworthiness standard that Facebook has tried to implement and how that applies to political leaders, as Erin was referencing, and highlighting things that are so basic, and so shocking when you realize they're not in place already. Before the Trump decision came out, the board issued a decision related to India, about a video critical of Narendra Modi that was pulled down. One of their recommendations was that Facebook should probably translate its community standards into Punjabi, which is one of those things where you go, wait a minute, the community standards aren't available in Punjabi already? So on those sorts of basic, low-hanging-fruit things, I think the Oversight Board can do some good, but there's a whole range of other issues they're not going to be able to touch.

One thing that I found interesting in the decision on Trump, which I assume was intentional although I can't be sure, is that the ruling basically lays the foundation for a rationale for Facebook to issue a permanent ban. They have basically made the case that he has violated the standards so many times, and also that in order to be let back on, he would have to disavow things he's obviously not going to disavow, like all of the lies about the election. They have made it very easy for Facebook, if it wants to, to essentially just adopt that framework and say, "Here's what we think." So I think they probably meant it that way.

Justin Hendrix:

I read it the same way, and I think it does increase the likelihood that Facebook may rip the bandaid off and just go ahead and suspend Trump, possibly indefinitely or forever. But I suppose we'll see. On some level, it has now been kicked back to those senior executives, Erin, that you were concerned about. I'm sure it'll be Mark Zuckerberg and Nick Clegg, on some level, who make this decision.

Erin Shields:

Yeah, we'll see. I'm trying to remain optimistic. Again, I feel slightly jaded because I've been doing this, and interacting with the company, for so long. I'm hoping they'll look at what happened at places like Twitter, where there was a slight uproar for about two days and then everyone realized, "Oh, things can be normal here. We don't have to worry about somebody tweeting out videos." We just don't have to worry about hate speech at that level and volume on the platform, or about constantly monitoring what Trump is doing there. I think we could see something similar from Facebook. I'm not sure what they're hanging on to by continuing to drag this process along; that would be my question for them.

I'm hoping they'll make the decision well before the six-month mark, based on what you all were just talking about, this framework that's been laid out. I don't think it's going to take six months. And I think they'll also see a lot of increasing pressure from the advocacy community if it does seem like it's going to be dragged out longer and longer. So we'll see what happens. But I'll say that I am neutral to slightly optimistic about what will happen. We'll see.

Justin Hendrix:

And Katy, I'll put the last question to you, given the Knight First Amendment Institute's vantage point on the overall discourse around speech, the First Amendment, and content moderation, and the reactions you're seeing from different parts of the political spectrum. What do you think this does, if anything, to change the political dialogue in the United States around speech issues?

Katy Glenn Bass:

Not much, because I think the way our First Amendment is represented in these conversations is already intentionally distorted. The reactions we've seen, mainly from the right, claim that it somehow violates the First Amendment not to let him back on the platform. I mean, these people know that's not true. The First Amendment restricts government action. Moreover, Facebook and all these other platforms have their own First Amendment rights to decide who does and doesn't get onto their platforms. They know that, but it's a convenient cudgel to say that these platforms are violating the First Amendment. Interestingly, it seems like we're converging on the need for more regulation, though there's no convergence on what that regulation should be or how we get there. But the fact that the GOP, which is usually pro-business and pro-economic-liberty, or however you want to put it, is now calling to break up the tech companies or to regulate them in greater ways is something I wouldn't necessarily have expected a few years ago, and I wonder what potential it may open.

Justin Hendrix:

Thank you both for talking to me today.

Katy Glenn Bass:

Thank you for having us.

Erin Shields:

Thank you.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Inno...
