According to a report by Ryan Mac, Mike Isaac and Sheera Frenkel in the New York Times, Facebook is expected to announce that it is forming a commission, separate from its quasi-independent Oversight Board, to advise it on election issues:
Facebook has approached academics and policy experts about forming a commission to advise it on global election-related matters, said five people with knowledge of the discussions, a move that would allow the social network to shift some of its political decision-making to an advisory body.
The proposed commission could decide on matters such as the viability of political ads and what to do about election-related misinformation, said the people, who spoke on the condition of anonymity because the discussions were confidential. Facebook is expected to announce the commission this fall in preparation for the 2022 midterm elections, they said, though the effort is preliminary and could still fall apart.
TechCrunch’s Taylor Hatmaker, referencing the Times report, noted that like the Oversight Board, a “similar external policy-setting body focused on elections would be very politically useful for Facebook. The company is a frequent target for both Republicans and Democrats, with the former claiming Facebook censors conservatives disproportionately and the latter calling attention to Facebook’s long history of incubating conspiracies and political misinformation.”
"So I guess this would mean two FECs—the Facebook Election Commission and the Federal Election Commission."
— Stephen Spaulding (@SteveESpaulding), August 25, 2021
But in advance of an announcement from the company, there are many open questions. For early reactions to the concept, I contacted experts in the Tech Policy Press community:
Emily Bell, Founding Director of the Tow Center for Digital Journalism at Columbia Journalism School
Facebook, and other ad platforms for that matter, have a role in society they are ill-equipped to perform. Efforts like this could provide mitigation for the worst practices internally, and help Facebook better understand its own responsibility. Which might be beneficial, to Facebook.
But an initiative like this can only be seen as an expression of corporate self-interest, not a long-term solution for civil society. It is designed, constructed and paid for by Facebook; it will never substitute for better regulation centered on civil society. We don't allow other critical functions that affect our safety and welfare to be regulated by private businesses. Elections are no exception. They are not Facebook's business, they are everybody's concern.
Yaël Eisenstat, Future of Democracy Fellow at the Berggruen Institute and a Researcher-in-Residence at Betalab
I absolutely agree that Facebook should not be left to single-handedly make the most consequential decisions about elections and democracy, especially when those decisions might be incompatible with their profit model. But I have deeply mixed emotions about this idea. When I was hired in 2018, it was specifically to bring a different background and perspective to their thorniest election questions. I was sidelined for questioning why we were not fact-checking political ads and eventually pushed out for trying to help ensure that we were not allowing voter suppression to occur through these ads. Having spent the past several years collaborating with a number of academics and civil society groups to continue pushing Facebook on these issues, including specifically warning both publicly and through other channels about the likelihood of post-election violence, I have seen no reason to believe they truly prioritize protecting democracy over protecting their business interests.
That said, with a lack of government action to regulate even some of the lower-hanging fruit, such as digital political advertising; Facebook keeping most of its data in a black box; the company's continued attempts to silence former employees through abusive non-disparagement agreements; and its active shutdown of initiatives such as the NYU Ad Observatory, it is hard to conceive of what more the public can do to actually demand accountability.
So, a few initial questions I would have before being able to take this idea at face value include:
1. How will they select “commission” members? Will they include some of their staunchest critics, or only those whose advice won’t conflict with their business model?
2. Will those members be obligated to sign NDAs, potentially preventing them (in perpetuity) from discussing what they learn about Facebook?
3. Will a “commission” have any actual authority over business and policy decisions and design choices, as opposed to merely providing thoughts on content moderation?
4. Will a “commission” carry more weight than Facebook’s own employees, many of whom have advocated for (and in my case, been pushed out for) questioning policies and business decisions that are detrimental to democracy?
5. Will Facebook be open to ideas that might mean different solutions for different countries, as opposed to the “but does that scale globally” retort that often stymies any attempts to address local political realities and election laws?
Eisenstat was Global Head of Elections Integrity Operations for political advertising at Facebook in 2018.
Ellen P. Goodman, Co-Director and co-founder of the Rutgers Institute for Information Policy & Law (RIIPL) and a Senior Fellow at the Digital Innovation & Democracy Institute at the German Marshall Fund
Facebook is responsible for what's on its platform, so it can't outsource responsibility. To the extent that it will submit to the decisions of a self-regulatory body with concentrated expertise in electioneering (presumably country-specific), that's probably better than going it alone. However, how much better it is depends on the composition, independence, and access to data of that body. Also, electioneering is not the lawless zone that other kinds of content delivery are. There are laws that apply to other media and that really should also apply (or be updated) for digital platforms. The expectation should be compliance with public law, not with private self-regulation, and to this end, we need updated laws (like the Honest Ads Act).
The other big difference between an election commission and the Oversight Board is that it's got to work really fast. And Facebook has to have much better-articulated policies not only about electioneering, but about the integrity of the electoral process, and much more transparency into how it's implementing those policies, for a commission to do its work.
Daphne Keller, Director of the Program on Platform Regulation at Stanford's Cyber Policy Center
With the Oversight Board, Facebook’s gamble was that passing the buck on difficult speech moderation decisions would have more upside than downside for the company. It hoped that avoiding responsibility for controversial choices – decisions that would inevitably antagonize large, vocal, or politically powerful groups – would be so beneficial, it warranted relinquishing control over those choices.
The reported plan to create a similar body for elections suggests that Facebook thinks its gamble has paid off. The Facebook Oversight Board has apparently taken so much heat off the company, it’s willing to try building another one. Of course, outsourcing decisions about elections is much more fraught than outsourcing decisions about breastfeeding images or tasteless jokes. Election-related decisions, almost by definition, can have serious consequences for democracy. They involve legal or public policy questions that are, or arguably should be, resolved by accountable public bodies.
It doesn’t help that journalists are calling this body a “Facebook Election Commission.” That moniker is uncomfortably close to that of the real FEC – the Federal Election Commission. As I pointed out back when Facebook was setting up its Oversight Board, “we should not fool ourselves that mimicking a few government systems familiar from grade-school civics class will make internet platforms adequate substitutes for real governments, subject to real laws and real rights-based constraints on their power.” Facebook’s privately constituted Oversight Board is not a “Supreme Court.” The new body won’t be an elections regulator. We shouldn’t want them to be.
Rebekah Tromble, Director of the Institute for Data, Democracy & Politics and Associate Professor in the School of Media & Public Affairs at George Washington University
1. There are lots of ways we might imagine Facebook setting up an elections-related advisory board, but I’m hoping the remit wouldn’t be focused on content moderation questions. I don’t think that would be particularly effective. However, I do think regular, sustained engagement with experts could help Facebook be more thoughtful about its policies ahead of elections, especially outside of the US.
2. Some have speculated that this is an attempt to dampen partisan criticism lobbed at Facebook by bringing in "neutral" experts. If that's the aim, I'm afraid it's doomed to fail. Whatever experts Facebook empaneled would become targets of partisan criticism and be used as "evidence" of Facebook's ill intent.
3. I do think, properly structured, some good could come out of such an advisory board, but my greatest concern is that this will be little more than a corporate whitewashing endeavor. We’ve seen that Facebook is actively pushing back against independent scrutiny from journalists and researchers, and I worry that Facebook would use this as a way to deflect criticism and avoid real accountability.
Have thoughts on the proposed commission? Consider proposing your ideas.
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Innovation. He is an associate research scientist and adjunct professor at NYU Tandon School of Engineering. Opinions expressed here are his own.