
Nick Clegg and Silicon Valley's myth of the empowered user

Elinor Carmi / Apr 7, 2021

Facebook’s executives are stressed out, and it shows. Close observers of the House Energy & Commerce Committee’s most recent hearing on social media, extremism and radicalization noted that the tone has changed. Lawmakers finally seem to understand what journalists, academics, activists and legal experts have been saying for years: that social media, and Facebook in particular, represent a danger to democracy.

This realization has emerged slowly over the last decade, from the 2013 Snowden revelations to the 2018 Cambridge Analytica scandal. It was cemented when these lawmakers were hiding behind their desks while white supremacists were trying to kill them and take over the most sacred place for democracy in the United States. Of course, others - in Myanmar or the Philippines, for example - discovered how dangerous Facebook is years ago, but like many things that happen outside of the U.S., it is different when it hits home.

At the March 25th Congressional hearing, Representative Anna Eshoo (D-CA18) stated that she and several other Representatives are planning to propose a law that would ban the business model of surveillance advertising, which would strike at the core of Facebook’s business. Congress is perhaps answering the demands of activists, such as the coalition of digital rights organizations that started a campaign to ban surveillance advertising, as well as citizens who are increasingly skeptical of Big Tech companies. With antitrust regulators breathing down Mark Zuckerberg’s neck in both the U.S. and Europe, new European Union legislation in the shape of the Digital Services Act and Digital Markets Act, and more local changes in the UK, France and Australia, it is clear the public is starting to wake up and to try to curb the damage social media has done, and continues to do, to society.

So when I first read Nick Clegg’s latest defense of Facebook on his personal Medium page, I thought: where to begin? Should I mention again how Facebook’s newsfeed algorithms are engineered mainly to drive profit and engagement? Should I mention that Facebook still profits from disinformation around the vaccine, and that the same happens on Instagram? That it does not remove militia groups that incite hate and violence? That it went much easier on right-wing politicians and groups to avoid confrontation?

I could, but I won’t. There are scholars and journalists who have already addressed these issues. I want to focus on a narrative that Clegg promotes that deserves more attention - the myth of the empowered user.

Who is the ‘user’ in Silicon Valley’s narrative?

One of the main arguments Clegg advances in his piece, titled You and the Algorithm: It Takes Two to Tango, is that people - You - have the agency and free will to make informed decisions. Clegg is parroting liberal western ideas that assume we are all rational people who, given the right amount of information, will be able to make the right decisions for ourselves. Such an approach, developed by white and privileged men, of course neglects the systemic inequalities that influence the ways in which people make decisions. Such inequalities may stem from a person’s socio-economic background, mental and physical abilities, gender, race, sexuality, religious beliefs or level of educational attainment. It is the same rationale that brought us the current broken system, which binds people to contracts in the shape of long, jargon-laden ‘terms of use’ (which would take months to read, let alone understand) and which reproduces the power asymmetries between big technology companies and people.

This is, of course, not new; technologies and standards have always been created with an ideal User in mind - usually able-bodied men, as Caroline Criado Perez has shown in her book Invisible Women: Exposing Data Bias in a World Designed for Men. But with social media companies, The User is portrayed as an ‘empowered’ individual who has the tools at hand to ‘train her algorithm,’ as Clegg argues. In this fantasy version of the relationship between The User and the platform, The User can take full control over her mediated experience and the algorithms of social media. She simply has to operate the entirely neutral and transparent features the company provides in order to rule her world. She is not a ‘powerless victim, robbed of her free will,’ as Clegg says; she is the master of her Facebook universe.

Of course, these arguments are not new. As I show in my book Media Distortions, similar arguments were made by digital advertising associations in the early 2000s to persuade lawmakers that they did not need to be regulated. The associations developed ‘recommendations’ and ‘standards’ promising that they would act in an ethical way, and created awareness programs arguing that if people want to know how things work, they can easily find out for themselves. And if users don’t seek out these details, that is their responsibility.

Therefore, when people press ‘I agree’ to terms in the European Union, or simply use the platforms everywhere else, the responsibility to get educated about what happens within these platforms falls on them. You have the responsibility to know:

  1. That Facebook’s main source of revenue is advertising, and thus you are not its primary customer;
  2. That Facebook sends cookies and pixels to your device whether you are on the platform or elsewhere on the internet, that it does so whether you have an account or not, and that it tracks your behavior over time, including actions you may not be aware of;
  3. That the newsfeed algorithm changes all the time to prioritize different things according to political or business pressures;
  4. That commercial content moderators remove things from Facebook all the time, and that the criteria keep changing;
  5. That Facebook collaborates with many other companies, data brokers and advertisers, and that they trade pieces of you - your data - between each other all the time;
  6. That this data can be used to manipulate your experience on and off the platform, now and in the future;
  7. That the terms and conditions change all the time, without notification to Users;
  8. That even if you use ad blockers and have asked not to be tracked, Facebook will still surveil you, and you cannot prevent this, only limit it;
  9. That Facebook also owns WhatsApp, Instagram and other companies, and surveils you through all of these services to produce a richer profile;
  10. That it is not only your data - what is commonly called ‘personal data’ or ‘personally identifiable information’ - but also that of your friends, colleagues, lovers and acquaintances that is surveilled or obtained through Facebook ‘partners’ to make new connections and inform the platform’s understanding of you and your networks.

This is just a partial list of what people need to know and manage in order to be fully ‘informed’ and thus to participate with ‘agency’ and ‘control’. The truth is that most people, even the most educated and resourceful ones, simply cannot keep up with all of these things all the time.

As part of the research project I am working on with colleagues from the University of Liverpool, Sheffield Hallam University and the University of Glasgow - “Me and My Big Data: Developing Citizens’ Data Literacies” - we aim to understand people’s data literacy in order to develop appropriate education programs. We conducted a national survey and dozens of focus groups with adult UK citizens with varying digital skills. What we discovered is that the majority of people do not even know what data are, that they do not feel confident changing their privacy settings and - importantly - that they see no point in changing the settings because social media companies will change them anyway. Another interesting insight was that while half of the survey participants said they like platforms to personalize their experience, more than 90% said they do not want companies to track their behavior over time. The mismatch between these two answers comes from the fact that people do not understand that delivering a personalized experience requires companies to track their behavior all the time.

The survey and focus groups confirm that, far from the ideal User in Clegg’s fantasy, people barely know what types of data companies acquire about them, and know nothing about the other companies that participate in the surveillance ecosystem. Importantly, people from lower socio-economic backgrounds and with less education, including young people, have a weaker critical understanding of how the online ecosystem works - what we call “Data Thinking” - than others. The conclusion we drew from all of the focus groups was that people are very concerned about what social media companies are doing but feel they have no choice but to use them. This is not a new insight - other scholars have arrived at similar conclusions.

Facebook, like other social media networks, has created interfaces that are meant to hide what happens on the back-end in order to shape and manage what we engage with at the front-end. They call it UX, but many times we experience it as dark patterns. As I show in my book, this is why, at the end of the 1990s and the beginning of the 2000s, web cookies were designed to be concealed at the back-end while being sent to your device automatically, thanks to browsers’ default settings. This was heralded, then as now, as a path to agency: Users can always alter their privacy settings. But the truth is that most people never check their privacy settings, and the settings often do not offer much control anyway.
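To make that mechanism concrete, here is a minimal, hypothetical sketch of how a third-party tracking pixel and cookie work together - the domain, cookie name and code are my illustration, not Facebook’s actual implementation. Any page that embeds a 1x1 image from the tracker’s server makes your browser request it; the server’s Set-Cookie response plants an identifier that the browser then resends automatically, by default, on every later request from any site embedding the same pixel:

```python
# Hypothetical third-party tracking pixel server - an illustration only,
# not Facebook's actual code. Any page embedding
# <img src="http://tracker.example:8000/pixel.gif"> triggers this handler.
from http.cookies import SimpleCookie
from http.server import BaseHTTPRequestHandler, HTTPServer
import uuid

# A 1x1 transparent GIF: the invisible "pixel" embedded in publishers' pages.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
         b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
         b"\x00\x00\x02\x02D\x01\x00;")

class Tracker(BaseHTTPRequestHandler):
    def do_GET(self):
        cookies = SimpleCookie(self.headers.get("Cookie", ""))
        if "uid" in cookies:
            uid = cookies["uid"].value  # returning browser: same identifier
        else:
            uid = uuid.uuid4().hex      # first sighting: mint an identifier
        # The Referer header reveals which page embedded the pixel, so one
        # identifier quietly accumulates a browsing history across sites.
        print(uid, "visited", self.headers.get("Referer", "unknown page"))
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        # The browser stores this cookie and resends it automatically,
        # by default, with every future request to this server.
        self.send_header("Set-Cookie", f"uid={uid}; Max-Age=31536000")
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == "__main__":
    HTTPServer(("", 8000), Tracker).serve_forever()
```

The point of the sketch is the default: the browser stores and returns the identifier without the person ever being asked - exactly the back-end concealment described above.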

Clegg’s response to this dynamic is to promise more ‘transparency tools’ or controls, but so far the tools that Facebook has developed have been designed against people’s interests, not for them. For example, the Audience Selector, introduced in 2014, was presented as if it gave people control, but it was an interface design meant to tackle the ‘problem’ of people who self-censor. Other features that were meant to promote transparency, such as the ‘Why am I seeing this’ feature that Clegg mentions, are all but useless, as they give no information about the profile segments that led a specific company to target the user. The explanation is incomplete and misleading, as others have noted.

So this whole tango is built on a deceptive contract and a deceptive interface. And that is perhaps why Clegg chose the tango as a metaphor to begin with. He portrays the tango as a partnership between two dancers on equal footing, but in most styles of the dance there is in fact a leader and a follower. It is also a dance with a complex history, born out of many cultures, including the ceremonies of enslaved people. The Facebook tango, likewise, is not danced by two partners on equal footing. In a way, dancing with Facebook is more like having hundreds of electrodes from various companies attached to your body to track your movements. According to the highest bid at any given moment, Facebook will send you to dance with a different partner, sometimes several at the same time, each with their own interpretation of how the dance should be choreographed. Clegg is counting on you not to interrogate the metaphor - just as Facebook would prefer you to follow its lead.

Authors

Elinor Carmi
Dr. Elinor Carmi is a Lecturer in Media and Communication in the Sociology Department at City, University of London, UK. Currently, she works on two main projects: (1) The Nuffield Foundation project "Developing a Minimum Digital Living Standard"; (2) POST Parliamentary Academic Fellowship working with...
