One Man's Outsized Influence on Facebook – and the World
Justin Hendrix / Apr 17, 2022
Audio of this conversation is available via your favorite podcast service.
In this episode we hear an account of the prominent role that one Facebook executive has played in US and global politics, making many key decisions that, over the years, have literally been engineered into Facebook and its policies.
Our guest is Benjamin Wofford, the author of a WIRED cover story titled The Infinite Reach of Joel Kaplan, Facebook’s Man in Washington: How one man came to rule political speech on Facebook, command one of the largest lobbies in DC, and guide Zuck through disaster—and straight into it.
What follows is a lightly edited transcript.
Justin Hendrix:
You start with an account of Joel Kaplan's role in shaping Facebook's response to some very inflammatory words at the height of the unrest following the murder of George Floyd in Minneapolis. So you have former president Donald Trump, he's posted an incendiary missive on Facebook, pledging the support of the US military to stop protests. And he appended it with what you call a "hellish augury," the phrase "When the looting starts, the shooting starts." So take us through these events and why you chose to start this examination of Joel Kaplan's role at Facebook with this moment.
Benjamin Wofford:
Yeah, so there are two reasons why this is a perfect illustration of the quiet, behind-the-scenes power and importance of someone like Kaplan. The first is that this is a perfect crisis for Facebook. In this moment, four years of Zuckerberg walking this tightrope are coming to an absolute head. Trump's post is slammed by the left, who see it as an incitement to violence and think that the post should come down. And the right, the conservative right in America, views this as a litmus test of whether or not Facebook is really committed to free speech, and it's very much wrapped up in a giant campaign that's been waged to suggest that Facebook is anti-conservative, for a lot of complicated reasons. It looks for a moment like there's no way to escape this, that Zuckerberg is going to have to choose. Does the post stay up or does it come down?
And the story, as we've known it, publicly unfolds one way: Trump calls Zuckerberg, essentially to ask for forgiveness, and then writes this follow-up post that basically clarifies his original stance, that this is not an incitement to violence. That it's really a prediction of future events, a warning, which makes the post basically kosher and magically moves it out of incitement to violence, which is a violation of Facebook's community standards, and into the green zone of being allowed to stay on Facebook.
But behind the scenes another story was going on, which hadn't been reported: that in fact, Trump did call Zuckerberg, but only after Zuckerberg personally called the White House to urge, really to plead personally, for an audience with Trump. And the person who arranged that phone call was the head of Facebook's Washington office, Zuckerberg's sort of chief consigliere in Washington, Joel Kaplan, who was listening silently on this call and had been in touch with the White House throughout that morning.
And people who heard this call said that Zuckerberg described having a staff problem, and that he had a tone of seeming to need Trump to, quote, bail him out. That wasn't Zuckerberg's quote, but people who'd heard this call listened to the urgency in Zuckerberg's voice and understood for the first time just how important it was for Zuckerberg politically to enlist Trump in the solution to this political dilemma, in how they were going to get out of it. And so the reason that we start with this is because it's a perfect crisis in many ways for Facebook. And it's a perfect illustration of Facebook's chief fixer, and that is Kaplan.
Justin Hendrix:
This is not the first time that Joel Kaplan has found himself at a turning point in history. Take us back 20 years, where did Joel Kaplan get his start in politics?
Benjamin Wofford:
Kaplan's political start follows his graduation from Harvard Law School in the late 1990s, where he changes his registration from Democrat to Republican, and he starts rocketing through the elite conservative ranks as a clerk to Antonin Scalia on the Supreme Court, a coveted clerkship in the upper intelligentsia of conservative American legal thought. After ending his clerkship with Scalia, he goes to work on the Bush campaign in the summer of 2000. And by accident or cosmic misfortune, Kaplan is thrown into maybe the central event of Bush v. Gore, which is the counting of the votes in Miami-Dade County. That election of course came down to Florida. Florida came down to these three counties. These three counties in many ways came down to Miami-Dade, and Miami-Dade came down to about 10,000 votes in one room on the 19th floor of a government building called the Clark Center, where Kaplan was assigned as an election observer.
That day -- this is in November -- saw the ignominious event that later became known as the Brooks Brothers Riot. In a nutshell, conservative operatives, many from Washington DC, as well as other activists, more or less stormed the outside of an election office that was conducting a hand recount that had been ordered by the Florida Supreme Court. There's a lot of debate about how rowdy and how violent this protest was, but it's believed that at least 50 to 80 people packed into a small room and tried to stop the count. And they tried to stop the count by basically fighting for the door that led into the election office where the hand count was going on. I think at this point somewhere between 200 and 300 votes separated the two candidates, Bush and Gore.
And these were 10,000 votes that hadn't been properly counted in heavily Democratic-leaning Miami-Dade. And for reasons that conservative operatives were actually quite candid about years later, they simply did not want those votes counted. Now, they had a legal rationale for this that's very complicated; it's not worth getting into. But the point is that one person who was in many ways leading this, I don't want to call it a mob, a group of protesters -- Democrats called it a mob, Republicans called it a protest group -- was a Republican Congressman named John Sweeney, who told operatives around him that he thought that if Gore took the lead in Miami-Dade by one vote, the election was over. And that it was his goal to make sure that Gore did not take a lead by even one vote in Miami-Dade.
So that's the context. This crowd stormed the vestibule and fought for the door. In some cases they got physical with people, including vote counters trying to get through the crowd. Democrats would later claim they were kicked. One named Joe Geller claimed he was punched. Was this protest violent? It was going right up to the line and kicking dirt over it. Inside, on the other side of the door, is Joel Kaplan, along with three other vote counters. And the chair of the board of elections, a man named David Leahy, pleads with the Bush campaign to go out and stop the protest because they can't count the votes. We know this, by the way, because it was almost certainly captured on audio, but it's recounted in a near-contemporaneous account by Jake Tapper, interestingly, who was then a Washington correspondent covering Bush v. Gore, and their conversation is reprinted in our story from Tapper's book.
Leahy says, 'We can't count the votes until the protest is cleared. Will the representatives from the Bush campaign go out and speak to the crowd?' And Kaplan and another observer drag their feet; they hem and they haw. And they basically say, no, we're not going to do that, unless the board commits to relocating the vote count downstairs. That happens, and a few hours later, for reasons that are hotly debated, but at least partly under intense pressure from this protest, Leahy and the canvassing board in Miami decide they're going to stop the count. The Gore lawyers are stunned. The Bush lawyers are thrilled.
Many people look back and say that Bush v. Gore was decided that day, that afternoon, on the 19th floor of the Clark Center, and perhaps history would've unfolded a different way if a young Kaplan, I think he was then 30 or 31, had gone out and told the protestors to calm down.
Later it was revealed that many of these protestors who had reportedly kicked and punched, fought for the door, and tried to stop the count were not in fact local activists. They were Washington DC operatives. They were identified later in video footage. Some had come from the NRCC. Some had come from the RNC. Several had come from congressional offices in Washington. It brought into starker relief whether Kaplan might have had an impact. He certainly knew at least some of these people. There's at least one who was a staffer on the Bush campaign. His name is Matt Schlapp, who later had a very prominent role in the Republican party during the Trump administration.
In any case, Kaplan then goes into the Bush administration. He's beloved. He's a loyal soldier, one of the very few people to stay all eight years of the Bush administration. He eventually rises to deputy chief of staff. And in that job, you are overseeing just a huge panoply of domestic policy issues. And this isn't in the story, but if you have driven a car built to low mileage standards, or you have gone to the airport and walked through TSA, or you have made a call overseas and wondered if the government was wiretapping your call, you have lived in the world that Joel Kaplan has built.
Justin Hendrix:
So Kaplan has played a very influential role in American politics, it's safe to say.
Benjamin Wofford:
He was influential in many of the main policies of the Bush administration. I'll give you just one anecdote that's different from the one in the story. There's a story about the EPA, and I don't know how interesting it is to your tech listeners. Kaplan, as we report, was influential in the Bush White House's last-minute decision not to regulate greenhouse gases. There's a little-known conflict here. Bush's own director of the EPA wanted to regulate greenhouse gases. He wanted climate change to be Bush's legacy, hard to believe, but it could have gone down a very different way if history had played out differently. Kaplan intervened and said no. Again and again, you see these examples where history might have gone just such a different way. And you wonder why, and there's Kaplan playing some role. I'll give you another example, which is that Kaplan was an expert negotiator, hugely respected on the Hill by both Democrats and Republicans.
I mean, he's got sterling relationships on both sides of the aisle, as a good political operative must. He was dispatched to the Hill to negotiate with Democrats on the FISA law. You might remember, in the Bush years, warrantless wiretapping. Kaplan went to congressional Democrats and struck a compromise to rewrite the FISA law, negotiating successfully with Democrats. Then Bush tried to pass immigration reform. It was the last real serious effort we've seen in the United States at major immigration overhaul.
And Kaplan is again dispatched to negotiate on the Hill. Democrats were overwhelmingly in favor of Bush's immigration reform, but it was dashed on the rocks of Republican opposition. Republicans turned on their own president. It's one of the first early moments where we actually see a glimmer of the Trumpism of 10 years later, where immigration becomes the Republican id of Washington politics.
So Kaplan negotiates successfully with Democrats on FISA, and then he tries to negotiate on immigration and it blows up as this massive failure because Republicans turned on their own president. And someone who worked with Kaplan in the White House actually told me that from the FISA negotiations Kaplan learned that Democrats could be reasoned with, but from the immigration debacle he learned the importance of keeping Republicans on your side, and what can go wrong when you don't. And it's a lesson that he would take to heart, I think, for the next 10 years at Facebook: that Democrats can be negotiated with, but Republicans will play hardball.
Justin Hendrix:
So let's talk a little bit about how Kaplan ends up at Facebook, because this is actually another kind of quirk of history. Turns out he had dated a young Sheryl Sandberg while at Harvard?
Benjamin Wofford:
Kaplan met Sheryl Sandberg on the very first night of Harvard orientation in 1987. They dated briefly that year and remained friends. They did keep up a cordial relationship. In fact, they crossed paths one day in January 2001, just before Kaplan went into the Bush White House. Sandberg, leaving the Treasury Department, actually threw a little brunch to welcome the Bush staffers to Washington. Clinton staffers were still very bitter about Kaplan's role in Bush v. Gore, and some people at the party wouldn't talk to Kaplan.
And so they had kept up this very cordial Washington relationship, in the way Democrats and Republicans do. Kaplan left the Bush White House after eight years, one of the very few people who stayed all eight, and went to work very briefly for an energy company in Texas. And Sandberg, who had just taken the helm at Zuckerberg's company and was building this advertising behemoth, was looking for someone, basically, to help lead this incredibly small Washington office for Facebook.
I mean, I think by 2007 or 2008 it had just three or four people. The Washington office at that time was led not by Kaplan, actually, but by Marne Levine, who is, again, a longtime Sandberg friend. But Kaplan gets recruited, basically, to lead US domestic policy in 2011. Facebook's issues at that time weren't about regulation. They were about image, basically: helping congresspersons learn how to use Facebook, hosting, for lack of a better word, diplomatic missions with members of Congress who would come into the Facebook office. But over time, Kaplan made his value clear during the Edward Snowden crisis of the Obama administration.
Zuckerberg goes to meet Obama in the Oval Office, and it's Joel Kaplan who accompanies him for that meeting, not Sandberg, not Levine. Come 2014, Levine moves to Instagram and Kaplan is elevated to the top job in Washington as the global head of public policy.
He brought a very Beltway style to solving problems. Staffers really liked and admired his leadership. He was a Marine for two years in the 1990s, between Harvard and law school, and had a certain esprit de corps, certainly with the men in the office, who liked that Kaplan would sometimes reference JJ DID TIE BUCKLE, the mnemonic for the 14 leadership traits of the Marine Corps. But women, too, thought that he was an even-minded and fair-minded manager. He commanded a great deal of admiration in the office. He's incredibly charming.
One person told me he could talk to a wall and the wall would have a great day. 'He'll sort of charm the shit out of you' was the quote one person gave me, and that included people who were sort of suspicious of Kaplan. And the suspicion came from a certain Beltway approach that clashed, in some sense, with the tech ethos of a company famously founded on the credo of move fast and break things.
Kaplan's idea was to be very cautious and make sure all sides were sort of kept at bay. You see this in early disputes about Facebook's policies, for instance on firearms, where Kaplan wanted to make sure the company wasn't falling afoul of the NRA. Throughout this period there's this rumbling happening in the Republican party. And in the summer of 2015, this game show host named Donald Trump comes down this long escalator and announces his candidacy for president. Like any company or political operative in Washington, Kaplan and plenty of people at Facebook are squinting at Trump and trying to figure out what this means for the Republican party. Trump slowly starts to gain steam, he's slowly rising in the polls, and we see him on the debate stage vanquishing his opponents.
Justin Hendrix:
So in December 2015, candidate Trump proposes banning all Muslims from entering the United States. He posts these remarks on his Facebook page, and even Republican staffers apparently thought the video violated the company's hate speech policy. But then Kaplan steps in.
Benjamin Wofford:
Yeah. So this is a big moment in the history of Facebook. Facebook is still, hard as it may be to believe, a young company whose political footing is sort of uncertain. And Facebook has a real decision to make, because this is 100%, as you said, a violation of Facebook's standards. There's an emergency meeting that's basically convened between the Washington office and Menlo Park. Kaplan actually beams in from India, where he's part of a delegation meeting with the Indian government.
But Kaplan is beamed into this meeting. And Kaplan is by all accounts the first to speak up and say, we can't touch this video. If we do, there will be a major backlash from the Republican party. This is a candidate running for president of the United States, a candidate in good standing. He may be violating our policies, but this is a warning, basically, to the executives in this meeting; Sandberg is there, Elliot Schrage is there, others are there. He's the first to warn the company that there will be a major backlash if this video is frozen, if it's shielded, if it's taken down; it will be viewed as censorship.
And there's the famously reported quote, it's actually not our quote, it comes from the great reporters at the New York Times, that Kaplan says, "Don't poke the bear." That taking down Trump's video, or answering it, or stopping its spread would be the equivalent of poking the bear. The bear being the Republican party. And after this comment the group discussion quickly converges on that consensus. So again, you see Kaplan not snapping his fingers and making this decision, but using his powers of persuasion to move the group toward a consensus. And literally by the end of this meeting, the executives (Zuckerberg's not present, but would later be briefed on this) have basically invented what comes to be known as the newsworthiness exemption.
The idea is that if you're a politician or a public figure, statements that would ordinarily be clear-cut violations of community standards don't get enforced; they're overridden, basically, by a concern for newsworthiness, the idea that Facebook's users should know what their candidates are saying. And people who were there for these events and lived through them felt like this was a major turning point for Facebook.
And so once again, you've got a major turning point in history, and Kaplan is there, playing a not insignificant role. It was a moment when Facebook decided that they weren't going to referee politics or try to shape political events, but sort of be a bystander to them. And obviously it's a hotly debated question whether or not they made the right decision. People years later would draw a line between that decision and a cascade of decisions that followed, decisions that people thought were appeasing the Trump White House and, basically, the Trump political movement and conservatism, and the extent to which they relied on social media and Facebook to make that political movement a reality.
Justin Hendrix:
So I want to talk a little bit about how some of these decisions end up getting literally engineered into Facebook. You bring up this effort that came along a little later, this Common Ground project. So Facebook has been accused of exacerbating political polarization in the US, helping to cause discord.
And it looks like some of Facebook's own staff in civic integrity are working on this problem. And we know now from the Frances Haugen leaks that they had done their own research and were aware that this extreme group of partisan users is sort of disproportionately responsible for the trouble there, and that, in fact, right-leaning accounts are perhaps more of a share of the problem than others. So Kaplan steps in, and you've got, again, these civic integrity staff on the one hand who are proposing an engineering solution, something that might have reduced the platform's role in exacerbating polarization.
Benjamin Wofford:
It's a great example of Kaplan's decision making. So as a quick note, I'll just say Trump wins the election, surprise, spoiler. And almost overnight, people describe Kaplan's power increasing exponentially. This was an office and a role that had been thinking sort of obliquely about policy questions, like what Facebook's stance on firearm sales on the platform should be, and it changed to a role that is injected, and Kaplan is injected, into the marrow of core decisions about what Facebook is, which is speech: the ideas that can be exchanged, how they're exchanged, the reach of algorithms. Kaplan's a lawyer. And very quickly you have a clash between engineers and data scientists, who have made it their life's work to design and run social media and the systems behind it, and a lawyer who is thinking broadly about the political consequences of what they're doing.
And this tension comes to a head perfectly in this program that's called Common Ground. It was first reported by the Wall Street Journal several years ago, and we know so much more about it now because of the Facebook Papers. What Common Ground was, was what some staffers called a Facebook solution to America's polarization crisis. A Facebook solution was what staffers were encouraged to think of as something broad, something big, something bold, a towering effort, basically, to solve big problems.
And what Common Ground basically was, was a suite of programmatic changes that would've changed the experience on the platform, in the simplest explanation, to sort of dial down the partisan acrimony of the experience of most people on the platform, but also the influence and the viral reach of people who built their brand using algorithms that now very clearly rewarded partisan outrage. Kara Swisher, I think, says it best, using the phrase 'enragement equals engagement.'
And this was a movement inside Facebook, a cross-jurisdictional team, that was literally taking that concept of the algorithm and putting it in the bullseye. And so we know from internal documents at Facebook, for instance, that researchers had found a correlation between Facebook use and what's called affective polarization, which is basically hatred of the other side trumping all logic and reason.
And their solution was this sort of three-pronged effort to reduce polarization through what they called aggressive interventions. They were going to tweak news consumption to rebalance media diets, basically, so that instead of your uncle receiving some crackpot news story, he might instead see the Wall Street Journal. And what they called self-segregation, they were going to try to replace with more overlap of different viewpoints.
So you'd be exposed to different viewpoints. They were going to try to build systems that replaced instability with "good conversations." These are right out of the slide decks of the Common Ground team. And they did this with a number of programmatic proposals, and people were incredibly excited. They hung signs around the Menlo Park office, which is what teams do at Facebook when they're working on a big project. And these signs said things like 'reduce hate' and 'reduce polarization.' There was tremendous excitement around this project.
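None of Common Ground's actual code or ranking signals is public, so as a rough illustration only, here is a minimal sketch of the kind of "aggressive intervention" described above: a feed re-ranker that dials down hyper-partisan virality and modestly boosts cross-cutting sources. All class names, fields, and weights here are invented assumptions, not Facebook's implementation.

```python
# Hypothetical sketch of a Common Ground-style feed intervention.
# All names, fields, and weights are invented for illustration; none of
# Facebook's actual ranking code or classifier signals is public.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    base_score: float    # engagement-predicted ranking score
    partisanship: float  # 0.0 (neutral) .. 1.0 (hyper-partisan), per some classifier
    cross_cutting: bool  # True if the source sits outside the viewer's usual media diet

def rerank(posts: list[Post], partisan_penalty: float = 0.5,
           cross_cutting_boost: float = 1.2) -> list[Post]:
    """Dial down partisan acrimony; modestly boost cross-cutting viewpoints."""
    def adjusted(p: Post) -> float:
        score = p.base_score * (1.0 - partisan_penalty * p.partisanship)
        if p.cross_cutting:
            score *= cross_cutting_boost
        return score
    return sorted(posts, key=adjusted, reverse=True)

feed = [
    Post("crackpot-outrage", base_score=9.0, partisanship=0.9, cross_cutting=False),
    Post("mainstream-report", base_score=6.0, partisanship=0.2, cross_cutting=True),
]
# The mainstream post now outranks the outrage post despite lower raw engagement.
print([p.post_id for p in rerank(feed)])
```

The design point the sketch captures is the one at the heart of the dispute that follows: the intervention does not remove anything, it only re-weights reach, which is exactly why its political ramifications, rather than any community-standards question, became the battleground.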
And there was one problem that these engineers had, and I don't think fully realized, which is that in the new system at Facebook after Trump's election, these ideas now had to go through a review process overseen by Joel Kaplan. And that's where things ran aground. Kaplan oversaw, basically, a policy review process that was called 'eat your veggies,' in which engineers had to answer questions about the political implications or ramifications of some of these ideas. And Kaplan is a great and very talented interlocutor; he's a great lawyer.
Imagine, if you can, a great prosecutor who can sort of unravel someone on the stand. People at Facebook say Kaplan is extraordinary at this. I mean, he can make almost any idea seem foolish or lacking for some specific detail, even when many people felt like the spirit of what Common Ground was doing was really good. And so Kaplan leads these sessions, and most of the ideas that would've had the greatest effect in the United States basically get killed. A few of them are brought to Zuckerberg. Zuckerberg also personally asks for some of these ideas to be scaled back, or vetoes them. So a combination of Kaplan and Zuckerberg acts as this filter for an idea that engineers really felt was a positive step toward making Facebook a less polarized place. What conservatives at Facebook will tell you, and you might find this interesting because this isn't in our reporting, but it's true.
What I'm about to tell you is that conservatives at Facebook say this was a very idealistic, very nice idea by left-leaning progressive engineers at the company who weren't living in political reality. That after Trump won, you couldn't just play around with Facebook's algorithm, because it would have big political consequences in a new reality where Republicans controlled all three branches of government. Well, they controlled Congress and the White House. And that the umbrage the engineers took was really sort of naive.
And the phrase 'eat your veggies' was the name of this review process, this star chamber, basically, where these engineers had to come and face Kaplan. The engineers, on the other hand, were really offended and kind of outraged, basically, by the notion that it be called 'eat your veggies.'
It conveyed the feeding of hard truths to idealistic liberals. I spoke to people on the other side of that inquiry who felt precisely the opposite: that it was Kaplan and the policy team whom they were trying to get to eat their veggies. In other words, they felt, and this is especially true of the civic integrity team, that Facebook was almost like a child that just wanted to eat dessert all the time.
That if you have a media diet that's just giving you a cortisol rush of enraging news, designed to keep you addicted to the platform, that's almost like a sugar rush. That's a diet based on something fundamentally unhealthy. And in fact, you see this in the Common Ground slide decks, where they're talking about healthy news diets. And so the Common Ground people thought that by calling this review process 'eat your veggies,' Kaplan had performed this sort of cynically genius rhetorical jujitsu in which he took the spirit of what they were trying to do and turned it on them.
But in their defense I will say one thing. People will debate about Common Ground, but one detail that's not reported, and that I think is important for your listeners to know, is that Kaplan at one point called Common Ground paternalistic, basically for suggesting that conservatives needed to rebalance their media diets. Kaplan and other conservatives at Facebook framed Common Ground as this effort by naive leftists. But people at Common Ground do not think that this was an effort by left-wing progressives.
In fact, Common Ground was hugely influenced by the work of a social psychologist named Jonathan Haidt, who is America's chief moderate, and progressives really don't like Jonathan Haidt. He's sort of a tut-tutting moral psychologist who has taken aim at cancel culture. He thinks the left is really out of control. And in fact, several of Haidt's former students were actually on the Common Ground project, working directly to take Haidt's ideas about civil engagement and make them real on Facebook. So it was certainly news to the people of Common Ground that this was some naive, leftist, progressive effort to sort of rein in conservatives.
And in fact, by the way, I'll just say one quick thing, because people really care about Common Ground. Even the lefties at Facebook were skeptical of Common Ground. So the conservatives have framed Common Ground as this naive lefty project, but the people at Common Ground were not naive lefties, and also the lefties at Facebook did not think that Common Ground was this big liberal chimera.
In fact, one person told me, and she's very progressive, she told me that Common Ground would have elevated Martin Luther King, proverbially, at the expense of Malcolm X. That more strident voices on the left actually would've been punished under Common Ground. So it calls into question, basically, I think, this conservative framing that Common Ground was just this sort of lefty pipe dream. That's probably more detail than you need about Common Ground, but it didn't make it in, and I always thought that was interesting.
Justin Hendrix:
I think she's right. And there are a lot of people even now talking about what to do about polarization on social media who don't understand that point. They think, oh, you turn the dial and you put people back together and that's great. But what you end up doing is, in fact, diminishing the extremes.
Benjamin Wofford:
That also gave me pause, that Common Ground was an idea that would proverbially elevate Martin Luther King, but punish the speech of Malcolm X. And sometimes in our discourse, you need a little Malcolm X.
Justin Hendrix:
Do I want to punish extremist environmental activists? Maybe extremism is necessary. The Civil Rights movement is a really good example; I think that's an instructive one. But let me rephrase this, because the thing I'm trying to ask is: it seems like the sum of Joel Kaplan's decisions has favored an extremist outcome. They favored a particular extreme with a political valence to the right. Do you think that's true?
Benjamin Wofford:
Well, I'll put aside my personal views after reporting this for a year and just appeal to your listeners to look at the data. I mean, the data is fairly convincing that that's what has happened. There's a lot of debate about why it's happened, but Facebook has known internally, and the Facebook Papers evince this point again and again: Facebook's own researchers have long known that right-leaning content is amplified more. And what we might call hard-right content, if there is such a thing as hard left and hard right, is rewarded, often without penalty.
And in fact there's a great study from NYU that we mention in the story that more or less shows exactly what I just said for the six months before the 2020 election and then January 6th. Right-leaning content was amplified much further than what was coded as centrist content, like a Wall Street Journal article, for instance, and hard-right or far-right content by far outperformed every other category, including in misinformation.
And you would think that there would be a proverbial penalty for misinformation. They looked at misinformation of both the far left and the far right, and also misinformation of a centrist valence. Misinformation of a centrist kind would be something like 'you'll never believe these 11 things that can cure your dog of eating chocolate': it's false, but it's not politically left or right. This is the study.
And what they found was that left-wing or hard-left misinformation and centrist misinformation did incur a kind of penalty. They were using CrowdTangle data to show this, a penalty meaning it underperformed what the average distribution of content would be. But hard-right misinformation not only didn't receive a penalty, it outperformed the median distribution and amplification of baseline content.
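The "penalty" Wofford describes is a measurable quantity: roughly, how a category's typical engagement compares to a neutral baseline. Here is a minimal sketch of that comparison; the engagement numbers below are invented placeholders standing in for the study's CrowdTangle data, and the category labels are simplified assumptions.

```python
# Minimal sketch of the amplification comparison described above.
# Engagement figures are invented placeholders, not the NYU study's data,
# which was drawn from CrowdTangle; categories are simplified.
from statistics import median

# interactions per post, keyed by (partisan category, is_misinformation)
engagement = {
    ("far_left", True):   [120, 90, 140],
    ("center", True):     [80, 60, 100],
    ("far_right", True):  [400, 520, 610],
    ("far_left", False):  [200, 180, 220],
    ("center", False):    [150, 170, 160],
    ("far_right", False): [300, 280, 330],
}

# Baseline: median interactions across all non-misinformation posts.
baseline = median(x for (cat, misinfo), xs in engagement.items()
                  if not misinfo for x in xs)

for (category, is_misinfo), xs in engagement.items():
    if is_misinfo:
        ratio = median(xs) / baseline
        verdict = "penalty" if ratio < 1.0 else "outperforms baseline"
        print(f"{category} misinformation: {ratio:.2f}x baseline median ({verdict})")
```

With these toy numbers, far-left and centrist misinformation land below 1.0x (a penalty), while far-right misinformation lands well above it, which is the shape of the asymmetry the study reported.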
And so it raises all sorts of questions about the algorithm. It raises questions about the culmination of decisions made over four years in which Kaplan has been really closely involved. Just to give you another example, and this has been reported by the Washington Post (we did some reporting but didn't publish it), there's a project called 'worst of the worst,' which was an effort to retool Facebook's hate speech algorithm. It's incredibly complex, but long story short, Kaplan had a key role in those decisions, which were basically about what types of groups are especially protected when they come under fire from people uttering just heinous things about people of a certain group membership, minority status, or background, whether they're gay, Black, or Latino.
And so it would probably be the work of a lifetime research endeavor to try to catalog, at every granular level, every decision that Kaplan has had a role in. But what almost everybody close to Kaplan and to the policy team has walked away and told me is that the history of Kaplan is now a history of Facebook itself. It's partly why this story is so long and feels like a history. Because his thinking about issues, especially on content moderation, is so enmeshed in the architecture of Facebook itself. And so it creates a curious paradox where he and Facebook's decisions are one, a kind of singularity, because he's so enmeshed in the decision-making process. And yet he's not singularly responsible for the way that these algorithms have borne out and created the reality we live in.
And so he's completely responsible and not responsible at all at the same time. And you see Facebook staffers really struggling, basically, with this paradox of how much responsibility to unload on Kaplan. The one thing I will say, which I think is a clear, concise, tangible reform that Facebook could make tomorrow, and that lots of people will tell you about Kaplan, is that Facebook has a very unusual structure. It's unusual in that the pipeline of hard decisions about the algorithm and content moderation and the team that lobbies elected officials are both wired to go through Kaplan. And that's very unusual. And in some sense the most benign way to look at Kaplan is that he is in a position that is inherently beset by a kind of conflict of interest.
Lots of people feel this way. Some people disagree. But just to put this in perspective, the best analogy I heard was: imagine if ExxonMobil had a structure in which the department that was in charge of setting regional gas prices and the department that was in charge of government relations both reported to the same guy. I think we all know what would happen. I think we know that if there was a tough vote coming up in Congress, some of those swing congressmen might see gas prices go down in their district. It's not quite the fox guarding the henhouse, but it's perhaps troublesome that the division at Facebook charged with making policies about speech, and about what kind of political speech can go on the platform, reports to the same person who's in charge of staying in good standing with many of the politicians who want that speech to stay up.
It suggests, one, that it's maybe a recipe for bad outcomes. Certainly lots of reformers think that. The former head of civic integrity, Samidh Chakrabarti, and the former CSO, Alex Stamos, are two examples of prominent people who fiercely believe that these two teams should basically be decoupled from Kaplan. But second, a charitable way of looking at it is that anybody in Kaplan's role would become a person of controversy. And so I think the most benign way of looking at this is that he occupies a role in which, because of the structure of Facebook, almost any decision they make is going to, I think, come under suspicion.
Justin Hendrix:
So you end up in perhaps an obvious place, which is January 6th. And this is an endpoint chronologically, but also conceptually, for you in this piece.
Benjamin Wofford:
Precisely, because of what I mentioned earlier: the history of Facebook is now a kind of history of Kaplan. And I don't say that frivolously. I'll give one example, a scoop in our story, of a decision that Kaplan was a part of that would redound in portentous ways, basically, on January 6th. It also speaks exactly to this conflict of interest, the implications of what can happen when the person who's running government relations and the person making key speech decisions is the same person.
So in 2019... there was a major election in India in the spring of 2019. And the civic integrity team was using an automated protocol to basically combat civic spam, which didn't rise to the level of coordinated inauthentic behavior, CIB, which is Facebook's sort of big bugaboo, but was targeting things that, while they weren't violative, while they didn't say 'I want to kill Democrats' or 'I want to behead Republicans,' were these fuzzier, gray-area, authentic networks that were spreading information that could have huge electoral consequences in the Indian election.
Basically, these protocols were doing in India what the protocols in the United States were often doing, which is, by implementing a neutral policy, they had the effect of flagging conservative electoral speech more, because there were more violations on that side of the political spectrum in India, just as in the United States. That spring, Kaplan actually flies to India and is called before basically the equivalent of a subcommittee in Parliament, where, according to Reuters, he's grilled about the effect of Facebook's algorithms on political speech in India. Because the BJP, which is the ruling conservative party of Narendra Modi, is seeing that these networks passing along civic spam or propaganda, sort of partisan material in an effort to swing the election, are getting flagged and shut down.
And so the civic spam protocol at Facebook gets flagged to Kaplan in this way. And a few days before the Indian election begins, the civic spam protocol is shut down; the enforcement of not just civic spam but all domestic coordinated inauthentic behavior is frozen, not just in India but globally. So think about what I'm saying. There were no cops on the beat for six months in the year 2019. And as far as I know this hasn't been reported, but maybe it has. These dragnets, these filters that were designed to stop bad actors from meddling in elections and doing all sorts of other things in a domestic context, these two protocols, one called civic spam, one called domestic coordinated inauthentic behavior, were frozen, and that was never publicized. Facebook quietly presented to the world that they were still enforcing these policies, but they froze them to do an internal investigation, basically, into why these protocols were flagging right-leaning voices more than left-leaning ones in the Indian context.
And when the investigative team at Facebook started kicking the tires on these protocols, they ran experimental reviews, basically, inside the United States using Facebook data. And what they found was that, just like in India, in the United States the classifiers were flagging right-leaning publishers like the Daily Wire and Sinclair, for example, for domestic coordinated inauthentic behavior. Because these information networks were highly networked and there was a high degree of misinformation, they matched what the protocol was looking for. But it wasn't a violation of community standards in the sense that there was violence or incitement to violence; it was in this gray zone. And that's what the protocols were trying to do. They were trying to look at things that on the surface could present as political activity, keep January 6th in the back of your mind, but actually had a more nefarious purpose underneath.
And so for six months these classifiers are frozen. I think actually one is frozen for three months, the other for six. And in the fall, when Facebook restores these protocols, they've raised the bar higher. People familiar with decisions made in meetings by Joel Kaplan and another vice president, Guy Rosen, described a higher standard now for catching people for domestic coordinated inauthentic behavior. Now you had to have a past history of serious offenses, like graphic violence or incitement or terrorism. And that was the way that you would catch people for domestic CIB.
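To make the shape of that policy change concrete: before the freeze, network-level signals alone could trigger enforcement; afterward, the same signals counted only if the actors already had a history of severe violations. Here is a simplified, hypothetical sketch of the two predicates; all thresholds, field names, and offense categories are invented for illustration, not Facebook's actual rules.

```python
# Simplified, hypothetical sketch of the raised enforcement bar described above.
# Thresholds, field names, and offense categories are invented for illustration.
SEVERE_PRIOR_OFFENSES = {"graphic_violence", "incitement", "terrorism"}

def flag_domestic_cib_old(coordination_score: float, misinfo_rate: float) -> bool:
    # Pre-freeze: network behavior alone could trigger enforcement,
    # even in the gray zone where no single post violated standards.
    return coordination_score > 0.8 and misinfo_rate > 0.5

def flag_domestic_cib_new(coordination_score: float, misinfo_rate: float,
                          prior_offenses: set[str]) -> bool:
    # Post-freeze: the same network signals count only if the actors
    # already have a history of severe violations.
    return (flag_domestic_cib_old(coordination_score, misinfo_rate)
            and bool(prior_offenses & SEVERE_PRIOR_OFFENSES))

# A highly coordinated "the election was stolen" network with no violent history:
print(flag_domestic_cib_old(0.9, 0.7))         # True  -> caught under the old bar
print(flag_domestic_cib_new(0.9, 0.7, set()))  # False -> slips through the new one
```

The last two lines are the crux of the argument that follows: a coordinated network of earnest actors with no severe prior offenses passes the old test and fails the new one.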
And the reason someone told me that they did this, the reason they actually made it harder to enforce policies against domestic CIB, was that it would be, quote, 'ultra defensible' if they were called before Congress, if they were called out in the press, if they came under fire from conservatives or anybody asking, why did I get taken down from Facebook?
And this was going to be their answer. You might be saying, what does this have to do with January 6th? Well, I talked to someone who was actually very close with Kaplan. Her name's Katie Harbath; she's fairly well known, has now left Facebook, but she ran their global elections team and remembered this episode. And in her view, with this decision, basically, to change the policies because of the effect they were having on conservative speech in India, and to change them globally, there is a line between that event and January 6th, whether metaphorically or allegorically. But the connection is instructive. She told me the coordinated inauthentic behavior problem in India was very similar to the kind of issue that presented itself in the fall of 2020 and just before January 6th.
It was highly coordinated networks of civic actors claiming earnestly to be part of a political movement, claiming that the election was stolen. But because they didn't have a past history, under the new standard, of violence or incitement or terrorism, one of the reasons it was hard to crack down on the Stop the Steal network was the policy created in the aftermath of the India decision, which Kaplan was directly involved in. It's not the only reason, but we know that connection because Facebook's internal researchers said so themselves. In Haugen's documents, and first reported by BuzzFeed, is basically an internal review team looking at how Stop the Steal got through the cracks. And one of the reasons they cite is that these actors were earnestly building this movement about an election being stolen.
And they didn't have a prior history of incitement or terrorism, which was the new bar. And so I talked to someone in civic integrity, who put it this way: in this moment, you can see the concern of having someone of immense power like Kaplan, who has the power to influence major policies about speech and the enforcement of speech that can have major electoral harms, and who also has the job of keeping governments and political actors happy. Because they tested this protocol. I don't want to say it would've stopped January 6th or stopped Stop the Steal, we'll never know that, but according to Facebook's own staffers and research it might have helped. An enforcement tool that was enforced more aggressively, but had this consequence of punishing conservative speech more when it was turned on in the United States, basically created this panic that the entire enforcement system needed to be shut down.
And one person told me all kinds of important stuff like this was blocked. I'm reading a quote now from someone familiar with this work, who said, "Would this policy have made a difference on January 6th? Who knows." But they said, "The sum total of this work that was blocked could have really made a difference." And in a moment like this you have to remember that these policies are global. And so this person says, "In the Indian context, if you ask the majority of people in the world, should we not have done these things so that one political party in the United States would feel happier? They would say, I live in Bangladesh. I don't fucking care. Please do the thing." This is someone on civic integrity who told me this. And Facebook would later say they made these changes to policy because it would make enforcement defensible.
But this person told me it basically raised the question: defensible to whom? That still is the rub. And this person said its defensibility is to a political party and a certain political movement, in this case a very narrow political movement in the United States. And what you can see in this episode, of a policy in India having vast consequences and changing the way that Facebook enforced policy in the United States, was, as this person basically said, that the entire world was held hostage to one political party in one country.
Because applying this rule in India had ramifications for the American conservative right, the entire global policy basically had to change. And that was a really powerful example, both of what it means to give one person so much power over political speech and government affairs, and of how decisions made years in advance can unfold in ways that have unforeseen consequences, including potentially in the instance of January 6th. There are other types of small decisions, too. Trying to figure out who's responsible for January 6th, and looking at Facebook's role, is like going through the forensics of an airplane crash. There's just so much there.
And in fact, we now know that the January 6th select committee in Washington has subpoenaed Facebook. And so Facebook may come under scrutiny for that reason. There are people in Washington who also want to know what Facebook's specific role was in the lead-up and the decisions that were made. But just to give one example, we know that there was a debate inside Facebook just ahead of January 6th about the break-glass measures. Facebook employed these break-glass measures that in many ways looked like some of the things that the staffers wanted to do two years before in Common Ground. These break-glass measures were about reducing partisanship. They were about stopping the amplification and virality of more fringe news sites and amplifying the viral reach of cooler-headed, just-the-facts, mainstream publications.
Those were some of the break-glass measures; there were others. And there was a brief debate inside Facebook in which the head of product, John Hegeman, floated the idea that, "Hey, maybe we can make these break-glass measures permanent." This was reported by the New York Times. And on the other side was a contingent that said, no, let's restore Facebook back to what it was. And Kaplan was one of the more prominent people making the argument to cancel the break-glass measures, against the argument to make them permanent. So is there any one decision? We'll never know. But Kaplan and his philosophy are very clearly on display in almost all of the major key decisions that got made throughout 2020 and before.
Justin Hendrix:
So I've looked at this question a bunch of different ways as well, around January 6th and Facebook's role. And despite Nick Clegg's arguments that people want to try to pin January 6th on Facebook, somehow, that's never really been what it's about for me. It's about: did Facebook exacerbate the situation? Did Facebook make it worse, and did it do it for profit?
But I have two last questions. One is about Nick Clegg, who has been elevated into this global policy position. Do you believe that signals anything about Kaplan's role? And then my last question is, you've talked about how much power has been imbued into Joel Kaplan's role and how much impact that's had in the world. But Mark Zuckerberg has given him that power. So on some level he represents Zuckerberg, he represents Zuckerberg's will. Do you think that ultimately what we've seen here reflects Zuckerberg's core politics?
Benjamin Wofford:
Yeah, I think absolutely. And the two questions are married, actually, in that way. People who have worked with Kaplan closely, staff that's still at Facebook, don't believe for a second that Clegg's promotion, such as it is, has anything to do with diminishing Kaplan's role, both in deciding these really close-call questions and shaping algorithmic speech policies, but also in putting out fires in Washington. So the way it's been explained to me is that Clegg's promotion is additive in many ways, but doesn't come at the expense of Kaplan. There are people who actually think, and I'm not some expert, that if anything it might come at the expense of Zuckerberg's role, Zuckerberg taking a less public role as sort of a global diplomat.
If anything, one person told me that far from diminishing Kaplan's role, Clegg getting this sort of big promotion might give cover, basically, to Kaplan's role. So much of the operational value of what Kaplan is doing is very sotto voce. And I don't want to say secretive, but he's the opposite of a public person. He's given almost no interviews to the press. Occasionally he will testify to foreign governments, but his is not a name, like Sheryl Sandberg's, that people know, and that's precisely how both Zuckerberg and Kaplan would prefer it.
And it dovetails perfectly with your second question, where I think you're absolutely right. One of the reasons that this is not a story, as some people have framed it, of Kaplan as a nefarious string-puller dicking around with the gears of Facebook behind people's backs, is that Kaplan and Zuckerberg have as close to a mind meld, maybe, as one could imagine. Lots of people have speculated that it's come at the expense of Sandberg, who has sort of fallen out of favor, perhaps, in Zuckerberg's inner circle. But Kaplan is most definitely in Zuckerberg's inner circle. The relationship is very close.
Someone who knows both men said the relationship is almost, it's hard to describe, but tantamount to that of a big brother. Kaplan's a Harvard grad, he's a generation older than Zuckerberg. He has shepherded Zuckerberg's ascent in Washington, sort of to the summit of power. I mean, we talk in our story about this charm offensive that Zuckerberg wages in Washington throughout 2019, and a lot of that has to do with Kaplan.
And so you asked earlier about the surprise that some people have that Kaplan has remained in his role, even when Democrats took unified control of Washington. You might think that there'd be some political turnover in terms of who's running Facebook's team in Washington if you have a tried-and-true conservative at the head of your Washington office. And Facebook has changed none of its leadership.
And a lot of that speaks to this famous culture that you've documented and written about, the value of loyalty around Zuckerberg at the highest echelons. But a lot of it speaks to, just, I think at this point, the bond, the sheer strength of that bond between Kaplan and Mark, between Joel and Mark. That relationship is real. Loyalty can trump a lot of things at Facebook, including the vicissitudes of politics in Washington.
And who knows: before much longer, if Republicans take back parts of Congress, as they are projected right now to do, then it will have made sense, perhaps, to have kept Kaplan in his role. But it has come at some cost. We have seen this year, in the Biden administration, no softening of the White House's stance toward Facebook. If anything, it's hardened, and President Biden specifically calling out Facebook by name, chastising them in the State of the Union, and inviting Frances Haugen as a guest in the Congressional gallery was seen by people who are very familiar with how Facebook interacts with the White House as just a huge shot across the bow. Whether or not Mark agrees that he is getting the value that he wants out of his Washington team in this relationship with this Congress is a question only Mark Zuckerberg and God really know.
It speaks to the strength of that bond and trust that he'd be willing to keep not just Kaplan but the entire leadership team in Washington, including Kaplan's deputy Kevin Martin, despite those headwinds. Someone told me that any other company would have fired their entire Washington staff by now, but they haven't. And I think part of that is that Kaplan does bring value. He will bring value again when Republicans take back Washington. But part of it is, I think, what by all accounts is a very deep bond that's personal and is forged in the fires of controversy and scandal, spanning Cambridge Analytica to George Floyd to January 6th. I think loyalty counts for a lot.
Justin Hendrix:
Well, we'll see what event is the next point in that history, I suppose. But Benjamin, thank you so much for taking the time to walk us through your piece.
Benjamin Wofford:
Yeah, that was fun. I hope we can do it again sometime.