Andrew Bosworth: The Ugly, Part Deux: Now in 3D

Justin Hendrix / Dec 22, 2021

An interview with Axios reveals the operating philosophy at Facebook (now Meta)

In September, when Facebook Chief Technology Officer Mike Schroepfer announced he would step down from his role after thirteen years, Facebook Founder and CEO Mark Zuckerberg tapped another man who has spent most of his adult life at the company to replace him.

Andrew “Boz” Bosworth, a 15-year veteran of Facebook, is known to be one of Zuckerberg’s closest confidants. But while some senior Facebook executives take pains to address the company’s failures delicately– such as Nick Clegg, a former British MP with a talent for affecting a kind of “righteous exasperation” with Facebook’s critics– Bosworth is known as a provocateur. If Clegg is Zuckerberg’s Superego, Bosworth is his Id.

Bosworth is perhaps best known for his much-referenced 2016 internal memo, “The Ugly”, which, taken as text alone, remains a totemic example of Silicon Valley callousness. Bosworth claimed the memo– revealed by BuzzFeed in 2018– was meant to push the internal debate about the company’s role in the world. At the time, some outside the company agreed. Conor Friedersdorf, writing for The Atlantic, said “the United States would benefit mightily if a senior leader at every major corporation attempted to lay bare the most powerful incentives shaping their enterprise and the most damaging possible consequences of their behavior,” while Casey Newton wrote at The Verge that “the Bosworth memo shows the company reckoning with its unintended consequences and the ethics of its behavior” even prior to the 2016 election.

In retrospect– particularly in light of the revelations brought forward by Facebook whistleblower Frances Haugen– that reckoning clearly never happened. Instead, the company continued to do precisely what Bosworth’s memo justified: pursue growth at any cost. Providing a rationale for Facebook’s rapacious growth in service to its mission to connect people, Bosworth wrote in his memo that “maybe it costs a life by exposing someone to bullies.” Now, the company is understood to have played a role in multiple atrocities, such as the genocide of the Rohingya in Myanmar.

The mass graves that hold the discarded remains from that tragedy were not in the frame last week when Axios interviewed Bosworth, capturing him in gently lit high definition at Facebook’s offices in Menlo Park. He goofed around in a VR headset with reporter Ina Fried before answering questions about Facebook’s role in the world. As the interview progressed, it became apparent that Bosworth’s thinking has hardly evolved since he wrote The Ugly; his perspective is much the same as it was five years ago. It’s worth looking at the conversation closely– what follows is based on a transcription of the full eight-minute interview that aired on HBO.

When Fried asks Bosworth how the company will avoid “terrorist planning going on or just misinformation from being spread” in the ‘metaverse’– Facebook’s preferred term for its future virtual world– Bosworth reframes the question:

Andrew Bosworth: I think it's not even really a metaverse issue. It's an issue that we face today with the tools that we have, like WhatsApp and Messenger. How do we want to balance our ability to communicate privately, private from governments, private from corporations, versus, 'I want to make sure that nobody's having a conversation that I don't like, and therefore we should sacrifice some of that privacy.'

The insinuation that those concerned about extremist activity or misinformation on Facebook simply “want to make sure that nobody's having a conversation that I don't like” is, of course, a cartoonish reduction of what the company’s critics actually worry about with regard to extremism or the spread of dangerous false claims, such as those related to the outcome of the 2020 election or the COVID-19 pandemic. But Bosworth is just getting started. Asked whether the company did enough to stop the formation of the Stop the Steal movement on Facebook– remember, the company’s own internal research suggested it failed to prevent the network harms that contributed to the violence on January 6– Bosworth dismisses the question outright:

Ina Fried: If you look at January 6th, a lot of the conversations leading up to January 6th happened online. A good number of them happened on your platform. Do you guys feel you did everything you could to stop it? Or is it more, 'this is an inevitable trade off of bringing the world together?'

Andrew Bosworth: When you bring the internet together, you bring together people who otherwise wouldn't find themselves, including people who are in marginalized or at risk communities. How do you do that without also bringing together communities that you'd rather not bring together? People who have violent ideologies. And I don't think it's a solvable problem. Those things come hand in hand.

Whether the trade-off is as inevitable as Bosworth suggests is worth considering. Few of the company’s critics think that all intergroup conflict on a platform as large as Facebook can or necessarily should be avoided– the real question is the extent to which Facebook’s design and algorithmic systems exacerbate the problem. Facebook’s own research– as well as mounting evidence from external empirical researchers– indicates they do. If that is true and it isn’t a solvable problem, then Facebook operates knowing it is contributing to conflict in the world. Which brings us to Fried’s next formulation of roughly the same question:

Ina Fried: The core of Facebook's algorithm, if you boil down what I'm sure is something incredibly complex to something quite simple, it's ‘what is it that people want to see.’ A lot of it is negative- the things that spur, that make our blood boil. Facebook has been less editorial in the past and more, ‘let's give people what they want.’ Where do you see your role going forward?

Andrew Bosworth: Any business has a concept called red revenue. The idea of somebody who came in and bought something, but it wasn't the right thing. So now they had a bad experience. You wish you'd never sold it to them. I think there's a similar concept here where it's things that people click on. There's no person on earth who is building a product that they want people to later regret the time they spend on it.

It is easy to think of products sold with little regard for whether the buyer might later regret the purchase. Think of cigarettes, alcohol, sugar, or heroin– or any number of other products that satisfy our cravings in the moment but cause problems for us down the line. People sell such things all the time– just not the sort of people to whom Bosworth or Zuckerberg want to be compared, since such products are often regulated or subject to other forms of close scrutiny. But then Bosworth switches gears, admitting that the company is engaged in a “process” to “inform our algorithms to optimize for different things.”

Andrew Bosworth: There is a process, not quite an editorial process, but a process of learning as we inform our algorithms to optimize for different things. Not just the amount of time that somebody's spending or not just the clicks. But how they feel about the time that they're spending.

So Facebook is changing its recipe to optimize for how the experience of using the product makes users feel. But is it doing so fast enough, given what it knows about the harms to individuals and to society? That’s the gist of Fried’s next question. And here is where the conversation turns back toward The Ugly:

Ina Fried: There are a lot of people that feel like Facebook, now Meta itself, hasn't been fast enough to respond to the negative consequences of its products. I think there was a worker that came out in the Facebook papers in August, 2020 that left and said, 'Facebook only addresses things when they get dire, they don't do it fast enough.' And then of course, five months later, we had January 6th. Can Facebook move faster? And you're going to be in charge of the technical direction starting next year. Can you do more?

Andrew Bosworth: If we took every single dollar and human that we had, it wouldn't eliminate people seeing speech that they didn't like on the platform. It wouldn't eliminate every opportunity that somebody had to use the platform maliciously.

Again, did anyone say Facebook’s goal should be to “eliminate people seeing speech they didn’t like on the platform”? Or that the platform should eliminate 100% of instances of malicious use? No. The room is getting thick with straw men, conflating reduction with elimination– a tried-and-true rhetorical tool. If Bosworth were wearing a tie, one might be forgiven for imagining a tobacco executive sitting in the chair: “If we spent every single dollar we had, we couldn’t stop ALL cancer.” But let’s keep going:

Ina Fried: The alternative is to not have these tools. And there are people who believe these tools are fundamentally unsafe, that our democracy is less healthy, our health is less sound because of misinformation. We just shouldn't have these tools because we can't solve for this.

Andrew Bosworth: Yeah. If your democracy can't tolerate the speech of people, I'm not sure what kind of democracy it is. I understand the speech of people can be dangerous. I really do. That is what we are talking about, a fundamentally democratic technology. But I do believe in giving people more access to information and more access to connect with one another. And not reserving those as tools for some small number of elite people. So I stand by the tools that we build. You talk to a random person. Do you use Facebook? Do you use Instagram? Do you use Snapchat? They do. And they like it.

Fried didn’t say we should shut down the internet, or that no form of social media might ever work. Fried is sitting inside Facebook’s offices: it seems fairly clear that the reference is to Facebook and the apps it operates.

But let’s allow, for the moment, that Fried is referring to social media more generally. Is Facebook a “fundamentally democratic technology”? Is anyone concerned about its use merely intolerant of the “speech of people”? Is the alternative really a return to elite gatekeepers, as Bosworth suggests? And is Facebook really just giving people tools to access information and connect with one another– or is it recommending information, communities, content, and contacts, operating a combination of algorithms and affordances that, at scale, create incentives that shape society? Fried opens the door to such nuance, but gets perhaps the most hardline response in the entire interview:

Ina Fried: I think it's more complicated. I love the way that I'm able to connect with my cousins that live far away. I don't feel better that COVID is worse in our country because of the spread of misinformation, some of which is happening on Facebook. Are you confident that the overall impact of what you do-- not just the good, but the overall impact-- is better than if we didn't have these tools?

Andrew Bosworth: The individual humans are the ones who choose to believe or not believe a thing. They're the ones who choose to share or not share a thing. I don't feel comfortable at all saying they don't get to have a voice because I don't agree with what they said. I don't like what they said.

Admittedly, Fried is setting up a binary here– a yes/no on the question of whether Facebook is, on balance, good for the world. But that’s not the question Bosworth wants to answer– he’s still on the last one. And he wants you to know: individuals are the ones to blame– not Facebook. Pay no attention to the science on collective processes and the behaviors of crowds– or to the computational social science his own company has helped to advance– which suggests the idea of the individual is, at best, overrated (more on this later).

Ina Fried: But when you look at the level of COVID misinformation that's spreading, are we still not getting it right when it comes to what speech is amplified on platforms like yours?

Andrew Bosworth: Our ability to know what is misinformation is itself in question. And I think reasonably so. So I'm very uncomfortable with the idea that we possess enough fundamental rightness, even in our most scientific centers of study, to exercise that kind of power on a citizen, another human, and what they want to say and who they want to listen to. Instead we have, what do people want to hear? Which is really the best way to approximate the algorithm.

In this response, you can hear echoes of conservatives such as Senator Ted Cruz (R-TX), who point to early moves by social media platforms to limit the spread of the Wuhan lab-leak hypothesis. Admittedly, Bosworth has a point when it comes to such genuinely contested subjects. But are there not forms of misinformation that are different in kind? We know drinking turpentine won’t cure worms, for instance. We may not have perfect science, but surely there are some subjects on which we can have a bit of epistemic closure?

Ina Fried: So do you think vaccine hesitancy would be the same with or without social media?

Andrew Bosworth: I think Facebook ran probably the biggest COVID vaccine campaign in the world. What more can you do if some people who are going to get that real information from a real source, choose not to get it. That's their choice. They're allowed to do that. You have an issue with those people. You don't have an issue with Facebook. You can't put that on me.

Ina Fried: Well, I have an issue not just with the fact that the people are saying it. I certainly support even their right to say it. I have a problem that it has such huge reach on your platform. Those voices are still getting amplified, even with significant effort to avoid it.

Andrew Bosworth: That's not a supply problem, that's a demand problem. People want that information. And I know that you don't want them to want it. I don't believe that the answer is, I will deny these people the information they seek, and I will enforce my will upon them. That can't be the right answer. That cannot be the democratic answer.

“You can’t put that on me.” Well, thanks to the Facebook Papers, we can put that on him. We have a good deal of information about how executives at the company sought to hide the scale of the COVID-19 misinformation problem, how its engineers assessed the platform's ability to detect vaccine misinformation as "bad in English and basically nonexistent" in other languages, and how it shelved solutions brought forward by its own employees to address the spread of vaccine misinformation.

Is there a middle path, Fried asks again, offering Bosworth the kind of easy out that might elicit a standard "we can always do more" response from a more politic spokesman like Clegg. Instead, Bosworth hammers the point that this is about individual responsibility:

Ina Fried: Is there somewhere in between where you're not completely preventing them from getting that information, but you are making it less easy for it to spread?

Andrew Bosworth: We're doing that. We are on the middle path, you just don't like the answers. But at some point, the onus is and should be-- in any meaningful democracy-- on the individual.

Bosworth is surely right that the company is attempting to contend with the harms on its platform. What is at issue is the success rate of its interventions. The company’s policies are only as good as their enforcement, and the Facebook Papers make clear that its performance is a long way from its promises. In particular, its technology– which Bosworth is presumably now responsible for– is just not up to the job.

But there is a deeper question here, and it’s worth stopping to consider: the role of the individual as an agent in a democratic system. The conception of the individual that Bosworth appears to be adopting is one that the cascades of behavior observable on his own platform would seem to call into question. What Facebook’s own researchers identify when they talk about phenomena such as “network or movement level harms” is not about individuals, but about the nature of the connections between them and how they behave in groups.

The idea of the "individual" is itself a fuzzy thing. As the scholars Jean Baechler and Suzanne Stewart point out in their paper “Individual, Group, and Democracy,” “The human individual is the geometric locus of an indefinite number of determinations, which each in turn may be defined along a scale that moves from the greatest generality to the greatest particularity.” Baechler and Stewart recognize that, where it concerns the ways in which individuals deal with the determinations that affect them, there is a further range or scale “that can be sketched between two extremes”:

At one extreme one would find the individual who has deliberately become conscious of the determinations and their singular, particular, and general manifestations, and who cultivates them all with the most scrupulous respect for their multiple definition. In relation to such an ideal– impossible as are all ideals– individuals would be fully and simultaneously citizens/economic agents/believers/ethical subjects as representative of a civilization/members of a collectivity/themselves.

This appears to be the ideal individual Bosworth is describing. But there is another extreme along the spectrum that Baechler and Stewart describe, of individuals who are not such ideal agents:

At the other extreme, one would find individuals who are engulfed by one determination, captives of their singularity and lived by an idiosyncratic psychological formation. An anomalous cohort of drug-addicts, marginals, religious fanatics, virtual or real tyrants, misers and coveters, and so forth pass by. Between these two extremes, one would encounter all those human types that make up average and middle humanity.

While it is convenient to make demands of the ideal individual and to put the onus on them, that is just not the real world. Bosworth would do better to look to ideas about the stewardship of collective behaviors, and to contemplate the meaning not of the connection of individuals, but of the nature of connection itself. Facebook is perturbing society– the means by which individuals, groups, institutions, and other units of organization across the human population interact– and putting final responsibility on the behavior of the individual is, at such scale, plain wrong.

If there is one thing the interview makes clear, though, it’s that Facebook executives such as Zuckerberg and Bosworth are unlikely to evolve their thinking much beyond where it was in 2016, or perhaps long before. These men– whose identities, power, and wealth are so bound up in the company they built– are unlikely to break free of the motivated reasoning that casts them as liberators, as champions of a new and purer form of democracy that crummy old institutions resist. No matter what verbal gymnastics people like Nick Clegg might perform, the Id has won inside Menlo Park.

Welcome to The Ugly, Part Deux. Now in 3D.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
