
Obama's Right, Elon’s Wrong: Democracy Needs to Defend Itself Against Disinformation

Karen Kornbluh / Apr 14, 2022

Karen Kornbluh is the former Ambassador to the OECD and founding director and senior fellow at the Digital Innovation and Democracy Initiative at the German Marshall Fund of the United States. She served as policy director to President Obama during his time in the Senate.

Elon Musk’s offer to buy Twitter in order to protect free speech and therefore democracy demonstrates a lack of understanding of the threat that disinformation poses to democracies and of the changing transatlantic regulatory environment.

The Tesla and SpaceX CEO offered to pay $54.20 a share in a letter to the Twitter chairman stating, "I invested in Twitter as I believe in its potential to be the platform for free speech around the globe, and I believe free speech is a societal imperative for a functioning democracy." Musk has previously boasted he is a “free speech absolutist” and takes issue with platform moderation.

His pronouncements rely on an inaccurate portrayal of the marketplace of speech on social media platforms – where amplification algorithms are designed to maximize revenue, not debate, and moderation occurs under a cloak of darkness. But his ideas also ignore the harm disinformation causes to democracy and the ways that democracies are responding.

Musk might have taken time before composing his letter to watch former President Barack Obama's conversation last week with The Atlantic's Jeffrey Goldberg. Obama chose to focus one of his rare post-presidential policy discussions on the existential threat that disinformation poses to democracy. Before an audience in his adopted hometown of Chicago, Obama urged democratic societies to better defend themselves.

Obama acknowledged that during his own administration, he “underestimated the degree to which democracies were vulnerable” to disinformation. Unfortunately, the United States government has done little to address the problem since—despite the accumulation of overwhelming evidence that disinformation poses a grave national security risk and a domestic governance vulnerability. Today, a staggering 42 percent of Americans do not believe that Joe Biden is the duly elected president. Meanwhile, the Intergovernmental Panel on Climate Change declared that progress on addressing climate change has been undermined by disinformation. And just last year, Surgeon General Vivek Murthy declared vaccine-related conspiracies a public health threat that puts “lives at risk.”

Foreign actors’ ability to manipulate our information environment is a national security risk, as has been clear at least since the first volume of Special Counsel Robert Mueller’s report laid out the Russians’ “sweeping” and “systemic” social media disinformation campaign in the 2016 U.S. presidential election. Russians used government-backed outlets, like RT and Sputnik, as well as fake and conspiratorial sites and fake pages such as Blacktivist, to spread disinformation, amplifying stories with bots and trolls. And the foreign interference has continued. During the 2020 election, troll farms based in Eastern Europe administered some of Facebook’s most popular pages serving up content for Christian and Black American voters, ultimately reaching some 140 million users per month in the U.S.

Domestic extremists are at least as grave a problem. In the days surrounding the November 2020 election, users and pages based in the U.S.—including elected officials—promoted disinformation that led to threats of violence against poll workers. Stop the Steal, militia movements, and QAnon grew to reach millions of users online before the mainstream platforms took steps to contain them. No wonder, then, that the Federal Bureau of Investigation warns on its website, “[i]nternational and domestic violent extremists have developed an extensive presence on the Internet through messaging platforms and online images, videos, and publications. These facilitate the groups’ ability to radicalize and recruit individuals who are receptive to extremist messaging.”

This ability to spread conspiracy theories and extremism threatens democracies’ very ability to govern—with public health as an obvious example. An early COVID conspiracy video featuring America’s Frontline Doctors, hosted on Breitbart’s social media pages and channels, reached over 20 million viewers before all the major platforms took it down for violating their terms of service. Last September, a pro-China COVID-19 misinformation campaign spread across more than 30 social media platforms. As a result of these and other efforts, today 28 percent of Americans say they are uncertain about or unwilling to get vaccinated.

Despite the ongoing attacks on the online information environment and the limited effectiveness of the platforms’ own after-the-fact, whack-a-mole approach, the debate in Congress has stalled in large part on the kinds of concerns Musk raises – that any solution would force the government or platforms to be the arbiter of truth.

In his remarks, Obama cut through the past five years of hearings and hand-wringing by explaining that the problem is not simply the collateral damage we must endure for the sake of free expression. Instead, he identified three key ways democracies could better defend themselves against disinformation without sacrificing free expression. Europe is poised to address each of these, and recent proposals in Congress could do so as well.

First, he focused on product design—notably the algorithms, optimized to keep users online, that too often amplify and reward inflammatory content—which leaves platforms too easily exploited by malign actors. Product safety is not a novel challenge for democratic governments. As he said, “if someone says, 'I've got a proprietary process to keep the meat clean'—well, take it up with the meat inspector!”

This is exactly the principle Europe is applying in its Digital Services Act, which is in the final stages of negotiation. The DSA will require platforms to conduct regular risk assessments and subject themselves to independent audits. The U.S. is still far behind, but the proposed Digital Services Oversight and Safety Act (DSOSA) contains many of the same requirements.

The DSA will also address design challenges by requiring platforms to increase transparency – sharing appropriate information on their activities with regulators, independent auditors, researchers, and the public. U.S. legislation would increase transparency as well, including the bipartisan Honest Ads Act, which would apply broadcast rules for political ads to the online world. These could be supplemented with Know Your Customer rules like those used in the banking sector, to prevent dark money and foreign actors from hiding their identities when dealing with online platforms. Congress could further open the “black box” by requiring platforms to disclose other types of data to regulators, independent researchers, and the general public for greater oversight, as required by DSOSA as well as the Platform Accountability and Transparency Act proposed in the Senate.

Second, Obama drew on journalistic traditions, urging greater use of norms like fact-checking, relying on multiple sources, and separating news from opinion. Democracies are not powerless here either: Europe is updating its Code of Practice on Disinformation, while proposals in the U.S. would task a nonpolitical agency with working alongside industry to develop a voluntary code of conduct – as legislation introduced by Sens. John Thune (R-S.D.) and Brian Schatz (D-HI) would have the National Institute of Standards and Technology do.

Third, Obama suggested that democracies have become “flabby and confused and feckless around the stakes of things that we tended to take for granted” including “providing people with the information they need to be free and self-governing.” Ukraine is showing the world how to counter propaganda and flood the zone with accurate information. The U.S. government clearly has much to learn as it is struggling to convince its own citizens of essential facts about election administration or public health. More resources are also needed.

If Elon Musk wants to ensure social media plays its part in safeguarding democracy, he would do well to tune in when President Obama speaks again about disinformation next week at Stanford University and to learn more about the new rules coming down the pike in Europe. Rather than assume that the main problem afflicting democracies is too much content moderation, he might adopt President Obama’s guiding principle: “Does this make our democracy stronger or weaker?” It is long past time for both our corporate and democratically elected leaders to heed that call to defend and strengthen democracy.
