The Federal Government Can Act Now on the Facebook Whistleblower’s Revelations
Renée DiResta, Karen Kornbluh / Oct 29, 2021

Karen Kornbluh is senior fellow and director of the Digital Innovation and Democracy Initiative at the German Marshall Fund, and Renée DiResta is the technical research manager at the Stanford Internet Observatory.
With 17 news organizations publishing a fusillade of reports on the documents leaked by Facebook whistleblower Frances Haugen, governments around the world must decide how to respond to the problems of disinformation, radicalization, and harm to vulnerable groups on social media. The volume of evidence may finally spur lawmakers in the U.S. to act. Indeed, Senator Richard Blumenthal on Sunday told CNN’s Brian Stelter that “what we’re seeing here is a building drumbeat for accountability.”
However, the outrage may still not translate to congressional action; there have, after all, been hearings since 2017 highlighting several of these issues. As actress Cecily Strong, playing Senator Dianne Feinstein, said on Saturday Night Live recently: "What Facebook has done is disgraceful, and you better believe that Congress will be taking action. Right after we pass the infrastructure bill, raise the debt ceiling, prosecute those responsible for the January 6th insurrection and stop Trump from using executive privilege, even though he's no longer president. But after all that, you watch out, Facebook!"
While SNL is right that members of Congress who can barely agree to keep the government running may find it tough to push past the multitude of lobbyists and campaign donations arrayed against major tech reform, the industry should read the writing on the wall and commit to adopting a code of conduct. To blunt skepticism that such a code would lack teeth, it should include third-party monitoring, and the Federal Trade Commission should enforce violations of the code as violations of consumer protection law. Other relevant parts of the federal government must also step up and use every tool available to address these issues with urgency.
A social media industry code of conduct should focus on the two most prominent bipartisan concerns that have emerged from the recent revelations: lack of transparency, and the algorithmic promotion of dangerous content. Transparency is needed so that the public is no longer reliant on whistleblower document releases or the best efforts of independent researchers operating with extremely limited data to understand the full scope of the problem. Companies in nearly every important industry (food, aerospace, automobiles, energy) have to submit safety data to regulators. Platforms should commit to creating mechanisms to share a variety of privacy-protected data with researchers related to their performance against their terms of service, as well as insights into what is happening on the platform. Such data should include what content is driving engagement, and the impact of algorithmic recommendations and other features (such as groups) on the spread of content, particularly content related to politics and public health. Current transparency reports from the platforms are simply not sufficient; the companies must submit to routine, publicly available third-party audits of their terms-of-service enforcement.
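To give a sense of what structured, privacy-protected disclosures could contain, here is a minimal hypothetical sketch of a per-quarter transparency record; every field name is an assumption for illustration, not a format any regulator or platform has specified.

```python
from dataclasses import dataclass

@dataclass
class TransparencyRecord:
    """One quarter's disclosure a platform might share with vetted
    researchers and auditors (every field here is illustrative)."""
    platform: str
    period: str                            # e.g. "2021-Q3"
    violative_view_rate: float             # views of policy-violating content / total views
    pct_views_from_recommendations: float  # share of views driven by algorithmic ranking
    pct_spread_via_groups: float           # share of reshares originating in groups
    top_engaging_political_urls: list[str] # most-engaged political/public-health links, aggregated
```

The point of a fixed schema is comparability: auditors could check the same figures across platforms and quarters instead of relying on whatever each company chooses to publish.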
In the short term, the industry should also commit to introducing more friction to minimize the risk that platforms’ algorithms recommend violative content before moderators catch it. The whistleblower’s documents revealed that Facebook’s algorithms currently catch only a small fraction of violative content. In just one incident in July 2020, a COVID conspiracy theory video by a group calling itself “America’s Frontline Doctors” spread to millions of users across social media, racking up 20 million views in 12 hours on Facebook alone, before all the major platforms took it down for breaking their terms of service. This has happened repeatedly throughout the pandemic, and previously during the 2020 election. A “circuit breaker,” like the mechanisms Wall Street uses to prevent market crises, could prevent the viral spread of sensitive content in topic areas with high harm potential while human reviewers determine whether it violates platform policies. Platforms also urgently need to invest in more human moderators, particularly those with language skills and cultural fluency in underserved areas, to ensure both that harmful content is found and that false positives are not removed in haste.
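To make the proposal concrete, here is a minimal sketch of how such a circuit breaker might work; the `Post` structure, the topic list, and the velocity threshold are all illustrative assumptions, not any platform’s actual mechanism.

```python
import time
from dataclasses import dataclass

# Illustrative thresholds: real values would be tuned per platform and topic.
VIEWS_PER_HOUR_TRIPWIRE = 100_000               # velocity that trips the breaker
SENSITIVE_TOPICS = {"elections", "public_health"}

@dataclass
class Post:
    post_id: str
    topic: str
    created_at: float            # epoch seconds when the post went live
    views: int = 0
    amplification_paused: bool = False

def record_view(post: Post, review_queue: list[str]) -> None:
    """Count a view; trip the breaker if a sensitive post is spreading too fast."""
    post.views += 1
    hours_live = max((time.time() - post.created_at) / 3600, 1 / 60)
    velocity = post.views / hours_live  # views per hour since posting
    if (post.topic in SENSITIVE_TOPICS
            and velocity > VIEWS_PER_HOUR_TRIPWIRE
            and not post.amplification_paused):
        # Pause algorithmic amplification only: the post remains visible to
        # direct followers while human reviewers check it against policy.
        post.amplification_paused = True
        review_queue.append(post.post_id)
```

The key design choice, like its Wall Street namesake, is that the breaker pauses rather than removes: the post stays up for those who sought it out, but the platform stops amplifying it until a human has looked.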
Nothing is stopping the industry from developing such a workable code today. But if the companies will not act on their own, the government should use its convening and other powers to prod them. The U.S. Surgeon General, who recently found that online vaccine disinformation is a public health risk, or the Federal Trade Commission, which is charged with protecting consumers, could respond to the recent revelations by convening the industry to demand a code. They could press the companies to extend to the U.S. the code they will be compelled to adopt in the European Union.
In the past, the U.S. government has corralled industry to address important problems. With FTC prodding, companies developed opt-out processes for cross-device tracking, and Department of Justice concerns led to the creation of a hash database of known child exploitation content that platforms use to automatically block and remove such material. Individual companies implement these standards in ways that make sense for their particular circumstances.
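As a rough sketch of the underlying technique: hash-based blocking computes a digest of each upload and rejects files whose digest appears on the shared list. This simplified SHA-256 version is illustrative only, since production systems rely on perceptual hashes such as PhotoDNA that still match re-encoded or slightly altered copies.

```python
import hashlib

# Shared industry blocklist of digests of known illegal images (illustrative;
# real systems use perceptual hashes like PhotoDNA, not exact SHA-256 digests).
KNOWN_BAD_DIGESTS = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",  # placeholder entry
}

def should_block_upload(file_bytes: bytes) -> bool:
    """Return True if the uploaded file matches a digest on the shared blocklist."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_DIGESTS
```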
Leaving some details to self-regulatory industry groups has its benefits: it solves the problem of keeping up with a fast-moving industry with varied systems, an approach that has shown some efficacy in finance. There, the Securities and Exchange Commission sets and enforces guardrails, while self-regulatory organizations, notably the Financial Industry Regulatory Authority, write and enforce rules that are implemented by the exchanges. We will never be able to stamp out all false and harmful content; trying to do so is a fool’s errand that poses a threat to free expression. But an industry code that adds much-needed transparency and friction can provide a counterweight to shareholder pressure for growth at all costs.
This work is urgent, and there are some signs the government is already taking action. But as Ms. Haugen told the Senate, “Until incentives change at Facebook, we should not expect Facebook to change.”