
Talking About An Ugly Truth with Sheera Frenkel and Cecilia Kang

Justin Hendrix / Aug 9, 2021

In the concluding chapter of An Ugly Truth: Inside Facebook's Battle for Domination, authors Sheera Frenkel and Cecilia Kang- both New York Times reporters- write that Facebook's business is set up to continue to dominate. Absent government intervention, that is almost certainly true. The company has tens of billions of dollars in cash reserves, and it just posted a record quarter of revenue, earning $29 billion. But the book is the story of a company that has achieved scale and financial gains at the expense of some pretty important things. Frenkel and Kang write that "throughout Facebook's 17-year history, the social network's massive gains have repeatedly come at the expense of consumer privacy and safety and the integrity of democratic systems. And yet," they write, "that's never gotten in the way of its success."

If you care about the intersection of technology and democracy, this book is a must-read. It joins other books- such as Roger McNamee's Zucked: Waking Up to the Facebook Catastrophe, Siva Vaidhyanathan's Antisocial Media: How Facebook Disconnects Us and Undermines Democracy, and José Marichal's Facebook Democracy: The Architecture of Disclosure and the Threat to Public Life- as an important chronicle of the rise of Facebook and the challenges it presents to democracies.

An Ugly Truth: Inside Facebook's Battle for Domination- HarperCollins, July 2021

Through hundreds of interviews, the book reconstructs the internal decision-making process at Facebook on a variety of issues and key controversies. It takes the reader on a journey from the company's founding, including the development of the News Feed that became central to Facebook's success, through to the company's paralysis on January 6th, as a violent insurrection overran the US Capitol.

It describes the company's political calculations during the Trump administration and chronicles how some employees- like Yaël Eisenstat, who is on the Tech Policy Press masthead- sought to get ahead of serious problems like voter suppression on the platform, only to be pushed out; or how then-head of security Alex Stamos reported evidence of Russian interference in the 2016 election to senior executives, to an icy reception. Indeed, according to the book, Stamos had been filing reports about the Russian effort for months and had even warned the FBI, but Mark Zuckerberg and Sheryl Sandberg were purportedly unaware of what the Russians were doing until Stamos took it to them directly- after the election.

The book's account raises questions about why more executives at Facebook, when they knew the company was misleading elected officials and the public, did not step forward as whistleblowers. It chronicles an internal debate at the company over whether the architecture of the Facebook platform favors populists like Narendra Modi in India, Rodrigo Duterte in the Philippines, or indeed Donald Trump. And it concludes at the moment of Mark Zuckerberg's decision to remove Donald Trump, only after the former president's incitement and support of the violence at the Capitol.

Frenkel and Kang note how, in October 2019, when Mark Zuckerberg gave his first major public address about his company's responsibility regarding free speech, he began with a lie- making up an alternative origin story for the company he started. That's the interesting thing about the book's title- An Ugly Truth. It's at once a reference to a controversial memo written by Facebook executive Andrew Bosworth, in which he justified the company's focus on growth as an effort to connect people across the globe, and to the reality that the connections it creates are not primarily for the public good, but for profit. An Ugly Truth is, in fact, a book about obfuscations and deceit- and, I'd argue, about the lies that Mark Zuckerberg continues to tell himself.

I spoke to the authors about the book's reception, and about whether the kind of change it suggests the company needs is possible under its current leadership and organizational structure. Below is a lightly edited transcript of the discussion.

Justin Hendrix:

So congratulations on the book, the reception has been incredible. This is not the first interview that you've done about it by a long shot, and it certainly is not going to be the last. So I want to ask you a few questions, not just about the book, but also about the reception that you've had to it. It's been a couple of weeks now that you've been out on the road. You've had a pretty extensive media tour and done lots of events, including a handful of Twitter Spaces and Clubhouses and things of that nature. How do you characterize the response so far? I mean, obviously it's on the New York Times bestseller list, but how do you feel about it a couple of weeks in? Maybe, Sheera, we'll start with you.

Sheera Frenkel:

I think we've both been really heartened by the response. It's been amazing, after years of working on a book, to see it out in the world and to hear everyone from members of Congress to employees at tech companies really respond positively. For me, it's really just been a dream. I was just talking to Cecilia earlier today about an employee at one of the big Silicon Valley social media companies telling me that they were reading our book as a kind of manual- like a 'what not to do' list. I mean, really studying the book. And I think that is sort of the most we could have hoped for.

Cecilia Kang:

Yeah, I would echo everything that Sheera said. And I would just add that it's been so satisfying and gratifying to hear from many of the people who we spoke to for this book, who have been telling us, 'You got it, you just nailed it, you got it right.' And other people, who weren't part of the process of this book, coming to us and saying, 'I was also in that meeting and that was absolutely what happened.' And we're so glad that that story is out.

The other thing that I would say is that it's been so satisfying to see how engaged people are with the themes that we brought up. We tried to cover so much: we really tried to focus on the problem of misinformation and how it comes out of the tools and the technology, but absolutely the business model as well, and really explore the leadership, the decisions, and the culture within. It was a hard book because we tried to do so much. And I think that what we've been hearing is that a lot of people got it. They get it and it resonates. I think one person put it really well recently when she said, 'You connected all the dots that needed to be connected. And now I'm thinking about this story differently. And I thought I knew everything there was to know about Facebook.'

Justin Hendrix:

So just for the sake of my listeners, can you reflect for a moment on the reporting process for this? You talked to hundreds of people. How does it work when you're pulling together a project like this?

Sheera Frenkel:

Well, you know, we, as reporters, often start with what's already in our notebooks. Both Cecilia and I had written so much about Facebook over the years and seen this pattern emerge of a problem being brought to light or a mistake becoming public, and Facebook apologizing and promising to do better. And so we started with the notes that we already had from conversations and articles we had done for the New York Times and looked at the nuance. I think what's unique about a book is that you can really flesh out what a scene feels like and what a meeting feels like.

And we tried to go to as many sources as we could because we wanted as many different perspectives as possible. Of the 400 people we spoke to for the book, the vast majority still worked at Facebook. And so if one of them gave us a telling of a particular meeting or a particular scene, we then independently tried to find other people who had also been in the room, because we wanted to know: is that just a single person's opinion? Or did multiple people in the room get that same feeling coming out of that meeting? Were they all affected the same way by what Mark Zuckerberg said? And so we were able to really, I think, create a more cohesive story out of all those interviews.

Justin Hendrix:

Clearly Facebook pushed back, almost in a boilerplate way. They said this is one of- they mentioned this number, this magic number- 367 books. I'm not sure where they got that one. And they disputed some of the instances in the book.

At this point, though, who are Facebook's defenders? Have you found any of them out there? Are there others that have pushed back on the book, apart from the company itself?

Cecilia Kang:

I mean, I really hope this doesn't come across as cavalier or arrogant, but no, nobody's been pushing back on the book. If anything, we're hearing more reinforcing feedback- and again, not from people who we've already talked to, but from a lot of new people who we're talking to. And we're also just curious, and we're still reporting, we're trying to reach other people. It's not just people who self-identify and volunteer to come to us. We're hearing from all kinds of people and trying to find out what their impressions are. We just keep hearing 'that's right,' you know, or we're hearing, 'well, I think that that's right. I don't know enough and I want to learn more.' And I think that's really satisfying.

As far as Facebook's defense goes, I just think that the evidence is clear in the book. As we both write in the authors' note, and as Sheera said, everything is multiply sourced. We went through a very thorough fact-checking process with Facebook. It was months long, with hundreds of points that we ran by them. No single scene should have been a surprise to them by the time the book was published.

So no- and I don't want to go too much on the defensive, because I don't think there's a lot to defend- but I do think that what's really important for people to know is that we want to defend our sources, because Facebook has talked about how these are all disgruntled people who talked to us. And that's just simply not the case. We can't go deeply into who our sources are at all, or say much more beyond this: we really think it's important for people to know that many of the people who spoke to us really like what they do. They believe they're doing something important. And they spoke to us because they saw something that they felt was wrong, and they wanted it to be corrected, or at least they wanted the true story to come out.

Justin Hendrix:

Can you maybe give a little bit of detail on a couple of the lines of criticism against Facebook where you feel that the company's critics overreach, or go too far?

Sheera Frenkel:

You know, I think that there are people who believe that Facebook is acting in bad faith- that Mark Zuckerberg and Sheryl Sandberg either don't care or are cavalier about the harms they do to people. And I would just say that one thing we found is that they often do feel really badly about things that have happened on the platform, whether that's Cambridge Analytica, Russian election interference, or the violence in Myanmar. We've spoken to multiple people who say that executives of the company have shed tears over what happened in Myanmar. And so I think it's important that people know it's not that these billionaire leaders of Facebook don't care. It's more a question of what they are doing to change things. What are they going to do differently at the company once they learn about their mistakes?

Justin Hendrix:

The book does focus the mind on organizational questions- questions of scale around the company- but also on these individuals, especially of course Mark Zuckerberg and Sheryl Sandberg, and on the question of change. Do you think it's in Mark Zuckerberg to accomplish the kind of change that the book suggests the company needs?

Sheera Frenkel:

You know, it's hard to say, because what we have heard from people within Facebook is that the one thing that would need to change is really Mark Zuckerberg. He would need to accept some level of oversight of his position. He would need to relinquish some of his control of the company, and perhaps put himself in a more vulnerable position where he has to accept criticism and the opinions of others. Right now Facebook's board is really just there in an advisory role. And all those things I just said- it just doesn't seem likely that Mark Zuckerberg is going to do them. People close to him say that he's more intent than ever to be the leader of Facebook and to continue to lead it into the future.

Justin Hendrix:

One of the senses I've gotten, listening to him in recent congressional testimony, is that he does seem to feel more definitive that Facebook is not to blame for some of the things it's being blamed for. His dismissal of responsibility around January 6th, for instance, seemed much more definitive than past dismissals around prior conflicts and controversies. Do you feel like that's true, or do you think that on some level Mark Zuckerberg is hardening?

Cecilia Kang:

I think Mark Zuckerberg has actually, from pretty early on, thought that other people have it wrong, and that other people are blaming Facebook too much. And certainly Sheryl Sandberg has felt that way for quite some time. But I do think that Facebook's very recent decision to accuse the White House of scapegoating the company when it comes to COVID misinformation is just part of a very long pattern where, just culturally- and I've covered Silicon Valley companies for quite some time- the defensiveness from Facebook feels a little bit more pronounced than from other companies. So I don't know if he's hardened in that way, but I definitely feel like there was a broad sense within Facebook, starting from 2016, when revelations of election interference on the platform first began, and then Cambridge Analytica, and then misinformation related to COVID.

And then there were the January 6th riots- one thing after another, Facebook felt like it was always the easy company to blame. And you hear that still now. We've heard in our reporting that Mark Zuckerberg, Sheryl Sandberg, and others said that the media was over-indexing in its coverage of Facebook scandals because it was jealous- the company was accusing the media of being jealous of Facebook's success and of its role in destroying the business model of journalism- which just can't be true. We're in the business of accountability journalism, and this is a big and powerful company. So of course we're going to write about the scandals, especially those that are so profound, like the ones at Facebook.

Justin Hendrix:

Reading the book and thinking about the succession of things- whether it's Russian election interference, January 6th, the question of COVID-19 misinformation- there's often this issue of, at first, denial, and then an attempt by the company to sort of reframe the question. We saw that most recently with Guy Rosen's post around COVID-19. Then there's been this issue, which Sheera brought out in her subsequent reporting, around the presence- or lack thereof- of the proof, the data that would settle the debate about whether in fact Facebook is taking too much of the blame, or just the right amount, or too little. With this latest issue with the White House- do you think we'll get to the point where, the next time that data science team says, 'we need the budget' to look into whether the platform is responsible for polarization, or the problem with vaccine adoption in the United States, or what have you, they'll get the resources to answer those questions?

Sheera Frenkel:

I don't know. I would hope so. Again, there's a pattern laid out in our book: this isn't the first time that an employee has internally raised a question or issued a warning and said, we should put more resources here, or we should spin up a team to investigate that, and has either not been given those resources or has been ignored. So yes, I'm always hopeful that companies listen to their own employees more, their own experts more; many people, including the White House, would like to see that change.

Justin Hendrix:

All this does bring up the issue of scale- the fact that this company is so massive that these problems are on some level so big. The numbers are so big; the data is so big. Is that the fundamental issue? Is this thing simply too large for anyone to get a handle on, much less one individual like Mark Zuckerberg?

Cecilia Kang:

It's really important- before we get to the issue of scale- to understand that growth is the most important goal. Growth and engagement, attention- all those things have led to scale, right? They've led to the scale of the amount of data Facebook has; they've led the company to chase network effects and so many new users. So yes, scale is a huge problem, in that they do not have commensurate guardrails in place. In other words, they don't have the precautions when it comes to safety, privacy, and other things at scale. That is a big problem. If there's any company that can catch up, it is Facebook, with the resources they have. But what we do show is that even when they do try to catch up- in the book, we show that when it comes to, for example, Russian election interference, there was plenty of warning ahead of time.

The security team was tiny, and they were trying to get reports escalated up to Zuckerberg and other top executives, warning of the interference they were seeing- especially towards the end, in 2017, on Russian ads and organic content- and time and time again, they were ignored. So it's not just that they are not scaling their security and safety operations commensurate with the company's growth; they're avoiding and delaying. And we show in many instances in the book- often after many, many warnings- that they avoid addressing these things, perhaps intentionally, for far, far too long.

Justin Hendrix:

I'm going to focus in a little bit on January 6th as an example of this, because it's one of the things that's confused me in reading the book, and also just in following the history of this. If you look back to August, September of 2020, Mark Zuckerberg himself was saying 'there's a real chance there may be civil unrest,' violence after the election. He put a lot of his own personal money into election defense, donating it to help with election security issues.

It struck me as really strange, then, to also read in your book that the company on some level seemed flat-footed when it came to January 6th, and that when the violence took place, they seemed to want to deny that anything on the platform had played any role. How do you square all that? On some level you sort of feel like Zuckerberg was prepared for the 2020 election, perhaps more so than ever before. He was saying some of the right things in the fall. And then a couple of days later you've got Sheryl Sandberg suggesting 'we had nothing to do with any of this.'

Sheera Frenkel:

Right. I think Mark Zuckerberg- and really, I would add, the security team at Facebook- were prepared for a lot in the 2020 elections. They were prepared for foreign election interference, and we saw them monthly issuing reports about taking down Russian networks, Iranian networks. They were finding these new foreign networks as quickly as they were being launched, to ensure that what happened in 2016 couldn't happen again. In some ways they were fighting the battle of 2016, but in 2020. Now, what they were not prepared for, and I think what they really struggled with, was misinformation that was being shared by Americans with other Americans- especially when some of that misinformation was being driven by the President of the United States. And this was kind of a corner that they backed themselves into when they decided that Trump was going to get a special carve-out on their platform, to be able to say and post things the average Facebook user was not able to do.

And so you had the President of the United States, even before the elections, telling people that there was going to be voter fraud and that the voting system shouldn't be trusted. Now, if you- the average Facebook user- posted some of those things, they would be taken down. But he was posting them, and so they were allowing them to spread. And between the vote in November and the Capitol Hill riots on January 6th, ferment and unrest and anger were growing on Facebook. They were watching this happen, and they took down some of the groups that were pushing 'Stop the Steal'- this idea that the election had been stolen from Donald Trump- but there were others that persisted.

And again, we saw a kind of haphazard approach, where they weren't quite sure what to do about a lot of these groups, even as their own security experts were telling them that anger was really boiling over- that it was getting really out of hand on Facebook. And that's why they were sitting in that room, watching the events of January 6th unfold- because they knew themselves that there was a potential for violence on that day.

Justin Hendrix:

Do you think on some level Facebook has a wrong set of assumptions about America? I remember there was one particular Facebook employee who, when he left, made a post saying that the types of problems we've seen elsewhere in the world are coming here now, and we're going to see these types of problems in the United States. I think what he was referring to was political violence, loss of trust in institutions- things we perhaps associate with happening elsewhere in the world- happening here, because the situation here has degraded, certainly from where it was when Facebook launched in 2004. Maybe the company has a set of assumptions about what goes on in the United States, or about what the threat landscape is here, that is wrong.

Sheera Frenkel:

There are some things that are country-specific, right? I think you could make the argument that in many countries, including Myanmar as an example, the things that went wrong on Facebook led to really horrible, real-life consequences, because they didn't have a free press. They didn't have NGOs or government institutions that were acting responsibly to deliver accurate information to people. And so all they had was Facebook. Myanmar was kind of a perfect case study of what happens when all you have is Facebook, and the information on Facebook is hate speech and misinformation. Here in the United States, we have a lot of safeguards, right? Not every American, but quite a few, believe institutions like the New York Times, the Wall Street Journal, the Washington Post, and trust them to deliver accurate journalism.

And so you have that, perhaps, pushing against potential misinformation on Facebook. I just think that what happened in the rest of the world could have served as more of a warning to Facebook. Human beings are human beings. And if you give them an algorithm that promotes the most emotive content, the most emotive content is often going to be things that anger people or produce a response of frustration- an extreme emotional response. And so you should know that if you create algorithms that surface that stuff, wherever you are in the world, that is probably going to drive people towards emotive content.

Cecilia Kang:

Yeah. One thing that I would add is that what we saw and reported in the book is that, especially over the last few years, Mark Zuckerberg has made the decisions when it comes to the technologies, as well as the policy decisions. So it is really important to understand that we are also seeing his evolution when it comes to content moderation and how misinformation should be handled. That's a lot of responsibility and power held by one person, and that has been a problem. As many people in the book say, there's nobody inside or outside the company at this point- including on the board- who oversees him or serves as a check.

So it's hard to get inside the culture and know what they look into. I definitely think that their priority is the business and growth, but as far as their assumptions about Americans go, a lot of the decisions, we can say, are being made by one person.

And we're seeing him kind of mature in real time and evolve his thinking. He gave a very famous speech at Georgetown- we have that in the book- where he shocked a lot of people within the company; it was the full expression of his philosophy that more speech will drive out bad speech. And he course-corrects when it comes to the pandemic and COVID misinformation, and that becomes a priority. And he course-corrects again when it comes to Donald Trump after January 6th. So we're seeing how one person's decisions set the point of view for the whole company.

Justin Hendrix:

Let me ask a question about one check that Mark Zuckerberg has essentially introduced for himself, which is the Oversight Board. In some ways, I feel like your book is coming out at the beginning of a maybe new period for Facebook; the Oversight Board's start of operations, I assume, almost coincides with where you finished the book. Are there any early indications, from your perspective, of whether the Oversight Board can be the check that the company needs to right its past wrongs and get onto a different trajectory?

Sheera Frenkel:

I think a lot of people welcome the Oversight Board as a group of independent experts. Many of the people on that Board have years of expertise in subjects that the average Facebook employee- and the average member of their policy team- does not. And so their thinking on this, I think, is really valuable for Facebook. However, here is the big caveat: they have no authority, right? They can only make recommendations, and it's up to Facebook whether or not it accepts those recommendations. And as we saw with the Facebook account of President Donald Trump, they, I think very wisely, said: in this case, you have not created policies or rules to explain your decision to remove him. Instead, you've kind of punted the decision to us and tried to get us to make the call for you, but we can't make a decision for you without you enacting basic policies for us to look at.

And so they ultimately handed that one back to Facebook and said, no, you've got to figure this one out for yourself. I think we're still seeing the Board emerge and find its footing. And for it to be of more value and have a greater impact, it would be interesting to see it actually given the authority to make decisions.

Justin Hendrix:

I think you did say that the Oversight Board essentially gave Zuckerberg the perfect out on Trump early on- though to some extent they did turn that back around on him. And I guess a lot of folks are hoping, exactly as you say, that they'll be given a bit more teeth in terms of the types of recommendations they can make and the extent to which the company is held to them. Let me ask just a couple more questions, thinking back over the types of questions you've gotten at every event you've done, and things you've perhaps learned since- it's only been a little while. If there was one piece of the book you could go back to and add to- I'm not going to ask you to fact-check yourself or tell me what you might change- one piece you wish you could go back and unearth more on, or get into more depth on, that you think would be important to continue to pursue, what would it be?

Cecilia Kang:

There is a lot on the cutting room floor- a much longer version of this book existed in draft form. It was the right decision to produce the book in its current form, because we wanted to move fast, and we think it moves fast for readers. But one thing that we both were really hoping to explore more: we had so much more on Zuckerberg and Facebook's position on global expansion and why that was so important for them. Myanmar was a really important chapter, and we're so glad that was in. But we had so much more on Zuckerberg courting China, and Narendra Modi and India, and trying to get into those markets, and how that was a huge goal for them. I think that would have been really important, because it is such a global company. We would have liked to have a little bit more global perspective.

Justin Hendrix:

That's so interesting, and I do think those are probably the stories you'll be writing for the next year or two. Sheera, anything that you would add to that, in terms of what you'd be keen to go back and do more on?

Sheera Frenkel:

Absolutely. I agree with Cecilia. I think both of us really felt like we could have included a lot more about what was happening in the rest of the world, because there are so many interesting case studies- about India, Sri Lanka, the Philippines- that we wish we could have included in this book, and really delved a little bit more into how Facebook entered those markets and what the effects have been.

Justin Hendrix:

The January 6th Select Committee is about to get underway. Nancy Pelosi has included, in the legislation establishing it, some language to do with online platforms. Any indications, either from your reporting for the book or otherwise, about how Facebook intends to comport itself with the committee, or what we can expect in terms of what the committee will do with regard to social media?

Cecilia Kang:

I think that Facebook will say that they want to cooperate with the committee. I think there are going to be really interesting things that come from discovery as the committee starts to investigate this more. Already, we've seen so much come out in the indictments- things that Facebook itself initially denied. Not that they happened, but the extent to which Facebook was an organizing platform for the groups that stormed the Capitol. So that will be quite interesting. Having observed and reported on Washington for so long, I just think it's really going to be hard to do something actionable after the committee's findings that does not curb speech, in a way that all Americans can feel comfortable with. So Sheera and I have talked a lot about how there is a way to look at misinformation and speech, and especially the spread of violent speech, that is almost scalpel-like in its approach- to be very specific about things that just should not be permissible, not necessarily as law, but even just across the industry as a standard.

Justin Hendrix:

Well, I'll just tell you both, as someone who teaches on technology, media and democracy, this book will 100% be on the syllabus- I suspect for years to come. So I'm very grateful for all the effort that you've put into this, and very grateful that you took the time to speak to me.

Sheera Frenkel:

Thank you.

Cecilia Kang:

You're so welcome- but thank you also for your interest. We follow you a lot on Twitter and we know you're honest, so thank you.

Justin Hendrix:

Thank you. And I wish you both a good afternoon.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
