Facebook’s Vice President of Global Affairs, Nick Clegg, recently published a defense of Facebook against accusations that it is contributing to polarization in American politics. The piece has produced a number of rejoinders. One in particular, from media scholars Shannon McGregor and Daniel Kreiss, argues that Clegg is aiming at the wrong target. They contend that racial inequality and the emergence of illiberal, undemocratic norms are a larger problem in American politics than polarization. They are right, but the question then becomes: why are we becoming more undemocratic and illiberal?
Facebook has radically changed the reasons why we talk to each other about politics. I hypothesize that it has done this by creating platforms that convert political talk into material for personal identity expression and maintenance. As a result, we tend not to talk to each other about politics to find some kind of mutual understanding, but for the purposes of projecting and maintaining an online identity of which politics is a part.
Political opinion formation and expression have become performative in ways they weren’t before Facebook. In my 2012 book, Facebook Democracy, I called Facebook’s platform an “architecture of disclosure.” What I meant is that the platform encourages intimate disclosure as a means of forming connections with others. By revealing our subjective experience of the world, and finding those who share it or can empathize with it, we draw closer to one another. Facebook cleverly monetized the intimacy of friendship and transformed it into something different. It made private intimacy public by providing a platform where that intimacy can be scaled up to a broader friend network.
There are great benefits to being able to “amplify intimacy” and find a community, especially for those whose voices were previously excluded. By putting ourselves “out there” in intimate ways that reveal truths about our collective experience, we can feel less alone, or feel justified in our anger. This is social media at its best. There are countless examples of movements, both social and political, that have been aided by the ability to reveal subjective experience.
However, this personalization of our everyday discourse for the purpose of connection comes with a cost: it moves us away from using political conversation to understand subjective experiences different from our own. An equally important part of political talk is to do what Karl Popper insisted was a hallmark of a liberal, open society: conjecture and refutation. A critical part of human development requires both disclosing to connect and disclosing publicly so we can evaluate whether our views of the world are valid. Put more simply, we need to be both affirmed and challenged.
Facebook is responsible for creating an environment that reorients cultural consumption into primarily identity maintenance. In Nick Clegg’s cogent defense of Facebook, he disputes claims that Facebook’s intent is to keep users engaged by feeding them outrageous, novel content. He points out that Facebook’s goal is simply to use its algorithms to comb through the various posts and updates (along with other metadata) to predict what users will find interesting. As he put it:
Every piece of content that could potentially feature — including the posts you haven’t seen from your friends, the Pages you follow, and Groups you joined — goes through the ranking process. Thousands of signals are assessed for these posts, like who posted it, when, whether it’s a photo, video or link, how popular it is on the platform, or the type of device you are using. From there, the algorithm uses these signals to predict how likely it is to be relevant and meaningful to you: for example, how likely you might be to “like” it or find that viewing it was worth your time. The goal is to make sure you see what you find most meaningful — not to keep you glued to your smartphone for hours on end. You can think about this sort of like a spam filter in your inbox: it helps filter out content you won’t find meaningful or relevant, and prioritizes content you will.
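Clegg’s description of ranking amounts to a score-and-sort pass: each candidate post gets a predicted-relevance score from many signals, and the feed is the candidates ordered by that score. The sketch below illustrates the idea only; the signal names and weights are hypothetical stand-ins, not Facebook’s actual model, which learns weights over thousands of signals.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author_is_friend: bool   # "who posted it"
    is_video: bool           # content-type signal
    popularity: float        # platform-wide engagement, 0..1
    predicted_like: float    # model's estimate you will "like" it, 0..1

def relevance_score(post: Post) -> float:
    """Combine a few signals into one 'meaningful to you' number.

    Real systems learn weights over thousands of signals; these
    hand-picked weights are purely illustrative.
    """
    score = 0.4 * post.predicted_like       # how likely you are to engage
    score += 0.3 * post.popularity          # how popular it is overall
    score += 0.2 if post.author_is_friend else 0.0
    score += 0.1 if post.is_video else 0.0
    return score

def rank_feed(posts):
    """Order candidates best-first, filtering attention like an inverse spam filter."""
    return sorted(posts, key=relevance_score, reverse=True)

feed = rank_feed([
    Post(author_is_friend=False, is_video=True, popularity=0.9, predicted_like=0.2),
    Post(author_is_friend=True, is_video=False, popularity=0.1, predicted_like=0.9),
])
# The friend's post scores higher (0.59 vs 0.45) and appears first.
```

The key point for the argument that follows: every term in the score is an estimate of what *you* will find meaningful, so the sorted feed is, by construction, a mirror.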
If the pre-Facebook problem was “mass society” that rendered the individual irrelevant, the Facebook era problem is the over-individuated society where “you” are given back to “you.” This is the foundation of the liberal enlightenment project. The individual is responsible for determining their notion of “the good.” But as Antoinette Rouvroy so astutely points out, you cannot “become” without some kind of reference point from which you evaluate whether “what you find meaningful” is what you SHOULD find meaningful. Algorithms that curate based on what “you find most meaningful” take us away from what Hannah Arendt referred to as a “world of things” we share in common. This world of things is where our view of the world is evaluated against other views of the world.
Arendt called this world of collectively imagining our shared world “action.” This world of action exists outside of “what you find meaningful,” but to the extent that we’re allowed to have our experience of the world personally curated by algorithms, we cease to “think” in Arendt’s terms. We cease to intellectually wander and imagine the world through the eyes of others. This is what C.S. Peirce called musement, or the purposeless playing with ideas. “Thinking” on social media is stripped of this sense of play because political talk becomes material for reinforcing or rejecting our own subjective evaluation of “what we find meaningful.”
There are dire implications to this. It goes without saying that Facebook doesn’t want users to spread anti-vax information, but it does want users to connect with one another. And if “feeling” that vaccines lead to autism taps into a similar suspicion among thousands, and if Facebook connects those thousands in its effort to give users “what they find meaningful,” what Facebook wants hardly seems relevant.
Facebook’s solution is to give users more control over the algorithm by allowing some customization of feeds. While these changes are valuable, they do not address the fundamental problem that social media poses: how do we restore the balance between our need for connection and our need to be challenged? This is not an easy question for Facebook to resolve. Facebook could include a “challenge my view” feature or a “surprise me” button that would serve that purpose. But would people use it, or is it a fundamental truth of the human condition that we prefer the comfort of confirmation to the vagaries of conjecture?
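The hypothetical “surprise me” button could be imagined as a re-ranking step that deliberately blends in items the relevance model scores low, i.e., content outside the predicted comfort zone. This is a sketch of the idea only; no platform is known to implement it this way, and the function and parameter names are invented for illustration.

```python
import random

def surprise_me(ranked_posts, k=10, surprise_ratio=0.3, seed=None):
    """Blend top-ranked posts with randomly chosen low-ranked ones.

    ranked_posts: list ordered best-first by predicted relevance.
    surprise_ratio: fraction of the final feed drawn from the bottom
    half of the ranking, i.e., content the model predicts the user
    will NOT find "meaningful".
    """
    rng = random.Random(seed)
    n_surprise = int(k * surprise_ratio)
    top = ranked_posts[:k - n_surprise]              # the comfortable core
    bottom_half = ranked_posts[len(ranked_posts) // 2:]
    surprises = rng.sample(bottom_half, min(n_surprise, len(bottom_half)))
    feed = top + surprises
    rng.shuffle(feed)                                # interleave the challenges
    return feed

# Example: from 20 ranked items, build a 10-item feed with 3 "challenges".
items = list(range(20))   # integer stand-ins for posts, best-first
mixed = surprise_me(items, k=10, surprise_ratio=0.3, seed=42)
```

Whether users would tolerate even a 30% dose of disconfirmation is, of course, exactly the open question the paragraph above ends on.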
Dr. Marichal is a professor of political science at California Lutheran University. He specializes in studying the role that social media plays in restructuring political behavior and institutions. In 2012, he published Facebook Democracy (Routledge Press), which looks at the role that the popular social network plays in the formation of political identity across different countries. His most recent work (with Richard Neve and Brian Collins) looks at the ways in which social media platforms encourage antagonistic political discourse and how they could be regulated. In addition, Dr. Marichal (with collaborators) is using computational social science methods on a number of projects, including an examination of fracking debates on Twitter, a study of candidate branding in 2016, and a study of political talk on Facebook. In 2018, Dr. Marichal organized a mini-conference on Algorithmic Politics for the Western Political Science Association. Currently, he is working on a book that looks at the damaging effects of algorithms on democracy through their creation of an “algorithmic mentality” among citizens.