Insta-Toxic: To Reduce Facebook’s Harms to Teens, Target its Data-Hungry Business Model
Sara Nelson / Sep 17, 2021

After many denials and much distraction, the truth is out. According to Facebook’s own internal research, Instagram is toxic to teen girls’ mental health.
This week, the Wall Street Journal reported on a collection of internal Facebook documents which, it says, demonstrate that the company is acutely aware of the harmful effects Instagram has on the mental health of teens. The Journal reported that Facebook’s own research showed that use of the image-sharing app Instagram, which it acquired in 2012, made at least one in three teen girls feel worse about their bodies, and revealed that among teens who reported having suicidal thoughts, “13% of British users and 6% of American users traced the desire to kill themselves to Instagram”, among other devastating statistics.
Much of the reported research focuses on the mental health effects of young users being shown certain content, which can then cause or exacerbate depression, eating disorders, or other mental health issues.
But there is an elephant in the room here. At the heart of how content is curated and shown to teens and everyone else on Instagram is the company’s extensive and invasive data collection.
This data collection is what animates Facebook’s advertising systems. It feeds the profiling of its users and plays a central role in deciding who sees or does not see certain content, ensuring that whatever content people are shown is likely to keep them engaged. When Instagram’s algorithm shows an image of a super-skinny influencer posing on a yacht to a user who has a tendency toward unhealthy eating, depression, or ambitions of wealth, this is in part due to the vast amount of data the platform has collected. That data is used to profile the user, and the profile leads the algorithm to predict that the user will be more receptive to this type of content.
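To make that mechanism concrete, here is a deliberately simplified, hypothetical sketch of engagement-based feed ranking. It is not Facebook’s actual code: the profile fields, interest scores, and weighting scheme are all invented for illustration. The point it demonstrates is structural: a ranker optimized purely for predicted engagement has no concept of whether the content it promotes is healthy for a particular user.

```python
# Hypothetical illustration only: a toy engagement-based feed ranker.
# The profile fields, interest scores, and weights are invented; this
# is not Facebook's actual data model or algorithm.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    # Inferred interest scores (0.0-1.0), built from past behavior:
    # what the user viewed, how long they hovered, what they clicked.
    interests: dict[str, float] = field(default_factory=dict)

@dataclass
class Post:
    post_id: str
    topics: dict[str, float]  # topic tags with relevance weights

def predicted_engagement(user: UserProfile, post: Post) -> float:
    """Score a post by how well its topics match the user's inferred
    interests. Higher score means more likely to be shown."""
    return sum(
        user.interests.get(topic, 0.0) * weight
        for topic, weight in post.topics.items()
    )

def rank_feed(user: UserProfile, posts: list[Post]) -> list[Post]:
    # The feed is ordered purely by predicted engagement, with no
    # notion of whether the content is healthy for this user.
    return sorted(posts, key=lambda p: predicted_engagement(user, p),
                  reverse=True)

# A user whose behavior signals body-image anxiety ends up with
# dieting-adjacent interests scored highly, so diet content outranks
# everything else in their feed.
user = UserProfile(interests={"dieting": 0.9, "fitness": 0.7, "luxury": 0.4})
feed = rank_feed(user, [
    Post("a", {"dieting": 1.0}),
    Post("b", {"cooking": 1.0}),
    Post("c", {"luxury": 0.8, "fitness": 0.5}),
])
print([p.post_id for p in feed])  # ['a', 'c', 'b']
```

Even in this toy version, the user’s inferred vulnerabilities directly determine what floats to the top of the feed.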
Companies like Facebook want to understand who we are. What we look at on the platform, how long we hover over certain content, when we use the platform, and much more can be used to build extensive internal user profiles. And this isn’t limited to the platforms themselves. With trackers like the Facebook pixel embedded in a huge number of websites, and equivalent SDKs inside apps, these companies can learn more about us even when we are not using their platforms.
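As a rough illustration of the general pattern, the sketch below shows how a third-party tracking pixel works: a site embeds a tiny image served from a tracker’s domain, and every page load fires a request that the tracker can tie back to one browser via its own cookie. This is a generic, hypothetical toy server, not the actual Facebook pixel; the endpoint, cookie name, and logging are all invented.

```python
# Hypothetical sketch of how a third-party tracking pixel works in
# general -- NOT the actual Facebook pixel. A site embeds something
# like <img src="https://tracker.example/px?event=PageView">, and
# every page load fires a request to the tracker's server.
from http.server import BaseHTTPRequestHandler, HTTPServer

# A 1x1 transparent GIF: the classic "pixel" payload.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00"
         b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
         b"\x00\x00\x02\x02D\x01\x00;")

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The tracker learns which page you were on (Referer header)
        # and who you are (its own cookie) -- even though you never
        # visited the tracker's site directly.
        visitor = self.headers.get("Cookie", "new visitor")
        page = self.headers.get("Referer", "unknown page")
        print(f"profile event: {visitor} viewed {page} via {self.path}")

        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        # Set (or refresh) a long-lived identifier for this browser.
        self.send_header("Set-Cookie", "uid=abc123; Max-Age=31536000")
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), PixelHandler).serve_forever()
```

The crucial point is that the visitor never navigates to the tracker’s site: the embedded image quietly reports their browsing to it, page after page, site after site.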
Then we may start to see super-skinny influencers, extreme workout ideas, or diet products, as the company slides us headfirst into a dangerous rabbit hole of ever more extreme content.
In Instagram’s case, the need to – in effect – addict people to the platform and keep them active for long periods of time comes from Facebook’s need to demonstrate growing engagement and continuous financial gains for shareholders. “Facebook executives have struggled to find ways to reduce Instagram’s harm while keeping people on the platform, according to internal presentations on the topic,” reports the Journal.
In Facebook’s 2020 annual report to the U.S. Securities and Exchange Commission, the company listed its “ability to add and retain users and maintain levels of user engagement with [their] products” as a risk it faces. It said “[t]he size of our user base and our users' level of engagement are critical to our success. Our financial performance has been and will continue to be significantly determined by our success in adding, retaining, and engaging active users of our products, particularly for Facebook and Instagram.”
In September 2021, the Wall Street Journal reported that Mark Zuckerberg had resisted internal calls to fix Facebook’s News Feed algorithm, which the Journal reports the company’s own data scientists said was making “angry voices louder”, because “[Zuckerberg] was worried they might hurt the company’s other objective—making users engage more with Facebook.”
But more fundamental interventions that go beyond tweaking algorithms and adding content moderators – such as substantially reducing the amount of data that is collected and used to profile people, and by doing so making the targeting of content and ads less harmful – are not on the cards. Extensive data collection and profiling is what the industry was built on, so companies are afraid to overhaul that business model and scare off shareholders. But changing it is precisely what might keep them in the market for the long term.
Alarmingly, in full view of its own research, Facebook says it still plans to develop a version of Instagram for children under 13 years old. Forty U.S. attorneys general wrote to Facebook asking the company to abandon the plan. Facebook responded that it would not show ads “in any Instagram experience [they] develop for people under the age of 13”. But again – this misses the point. While ads can be highly problematic, harm doesn’t come only from ads, but also from “native” content being pushed to users in a way that exploits their vulnerabilities. In either case, extensive data collection is what facilitates the harm.
What will the company do with these children’s data? Will young people’s exploration and expression of themselves on Instagram be held against them by the platform, sending them spiraling into self-doubt and depression?
Of course, if Instagram can get children using the platform, they may be more likely to continue to use it as they become teens and adults. And there are a whole lot of children being born every year who could help feed Instagram and Facebook’s endless desire for users and data.
At some point Facebook and other big tech platforms need to ask themselves – profit to what end? Growth to what end? As Facebook’s own research shows, the company’s technology has sadly played a direct role in teenagers taking their own lives. And teens are only one of the communities that suffer from the company’s poor policies. There are many smart and capable people leading and working at these companies who can surely develop business models not built on the exploitation of teenagers’ data and livelihoods.