“How to Put Out Democracy’s Dumpster Fire,” an article published last week in The Atlantic by Anne Applebaum and Peter Pomerantsev, starts where a lot of “whatever happened to our democratic institutions?” articles begin: with Alexis de Tocqueville, noted author of “Democracy in America.” Tocqueville argued that America’s democracy succeeded where France’s failed following the French Revolution because Americans “practice democracy” together. This practice occurs through participation in “associations,” a broad term encompassing everything from actually serving in your town’s government to participating in civic societies, church leadership, or other community activities. As political scientist Robert Putnam has noted, participation in these associations has been declining for more than 30 years, and this decline harms democracy.
Applebaum and Pomerantsev take this argument a step further by claiming that digital platforms like Facebook and Twitter have actually displaced local associations, effectively privatizing and corporatizing the public square, and threatening democracy in their wake. Certainly, capitalism and democracy are not inherently linked; you can have one without the other. The Gilded Age saw the rise of an oligarchy that was increasingly beyond the reach of law and politics. That’s what gave rise to antitrust law. Like the Gilded Age before it, we are now at another turning point. Applebaum and Pomerantsev propose two possible fixes: (1) regulating algorithms; and (2) creating new public squares in virtual space.
These are both worthy areas of focus, and many of the initiatives they describe are laudable. However, I don’t think they provide a complete picture of the reforms required.
First, regulating algorithms is difficult, but not in the way The Atlantic article posits. The authors seem hung up on the fact that it isn’t obvious why algorithms behave the way they do, even to the companies that own them. What they propose is having researchers study these systems to provide more transparency into how they behave.
This is important, but I think it’s even more important not to let companies off the hook for what their algorithms do. Google, Facebook, and other tech companies speak about algorithms as if they fell from the sky, linguistically separating themselves from the algorithms they create. For instance, consider Facebook’s shifting explanations of how its News Feed works: these explanations tend to obscure Facebook’s business goals and minimize the company’s responsibility.
We can’t let these companies keep doing this. Frankly, these companies are their algorithms, and they must be held responsible for their outcomes. They set the questions each algorithm answers and choose the data it uses. New oversight may be needed, but we can begin by holding them to account with the laws we have. The FTC is already empowered to go after unfair and deceptive acts and practices. Many algorithms are unfair, and many are certainly deceptive. Let’s use the tools we already have to get at these harms.
Second, while the government has a role in fostering places for public conversations to take place, I don’t think we can depend on a “public internet” (like public radio or public broadcasting) to fix the discourse alone. Our local journalism institutions, like hometown newspapers, are dying. And the local news crisis is directly related to the harmful spread of misinformation on major technology platforms. One way to get platforms to pay for the harms they cause is to create a new “Superfund for the Internet.” This Internet Superfund would be used to bolster independent local journalism, while also requiring major platforms to engage in fact-checking.
We also need to aggressively enforce our antitrust laws, as well as create new competition rules for digital platforms. Congress can also create a new regulator for digital platforms. An expert regulator could create and implement competition rules for this market, as well as focus on consumer protection. This consumer protection work could include rules around the use of algorithms, in partnership with other enforcers concerned about the outcomes of algorithmic decision making. Just as in the Gilded Age, antitrust law alone won’t fix all our problems, but it will help.
Lastly, we need better privacy laws, ones that focus on use limitations, data minimization, and robust enforcement. This entire economy is built on data, and it needs to be better regulated in order to protect people from the worst abuses by companies, governments, and other actors that seek to manipulate individuals.
We can address these harms. We just need to be creative in our solutions.
Sara Collins joins Public Knowledge as a Policy Counsel focusing on all things privacy. Previously, Sara was a Policy Counsel on Future of Privacy Forum’s Education & Youth Privacy team and specialized in higher education. She has also worked as an investigations attorney in the Enforcement Unit at Federal Student Aid, as well as the Director of Legal Services for Veterans Education Success. Sara graduated from the Georgetown University Law Center in 2014, where she was the symposium editor of the Journal of Gender and the Law. After graduating law school, she completed a Policy & Law Fellowship at the Amara Legal Center, an organization dedicated to fighting domestic sex trafficking within the DMV area. Originally from Chicago, Sara attended the University of Illinois, where she received a B.A. in both Political Science and English.