Addressing the "Cursed Equilibrium" of Social Media Algorithms

Rebecca Rand, Justin Hendrix / Jan 12, 2025

Audio of this conversation is available via your favorite podcast service.

Last fall, Cornell University PhD candidate Cristiana Firullo gave a presentation at the Trust and Safety Research Conference at Stanford University during a session on understanding algorithms and online environments. Titled "The Cursed Equilibrium of Algorithmic Traumatization," the talk focused on the work Firullo is doing with her colleagues at Cornell to try to understand why social media recommendation systems may produce harmful effects on users. Audio reporter Rebecca Rand spoke to Firullo about their hypotheses.

What follows is a lightly edited transcript of the discussion.

Justin Hendrix:

Good morning. I'm Justin Hendrix, editor of Tech Policy Press, a non-profit media venture intended to provoke new ideas, debate and discussion at the intersection of technology and democracy. This week I'm pleased to once again bring you a segment of Evidence Base, highlighting new research on how technology interacts with people, politics and power. This segment is produced by audio reporter Rebecca Rand. Nice to be back in the virtual studio with you, Rebecca.

Rebecca Rand:

Thank you for having me here.

Justin Hendrix:

I wanted to bring you in because you've recently spoken with a researcher who's digging in a little deeper on this concern about the harms of recommendation systems on social media.

Rebecca Rand:

Yeah, that's right. She's adding to the body of research that's looking at how what pops up on, say, your Instagram feed can have negative impacts on your life and your mental state.

Justin Hendrix:

We're hearing about this a lot, of course, specifically as people are talking about the possible harms these platforms have on kids.

Rebecca Rand:

Yeah, that's right. Back in June, you probably heard the US Surgeon General Vivek Murthy propose putting black box warning labels on social media products. Of course, he can't actually do that without Congress, but he's one of a number of political leaders who's trying to mitigate this rise in mental health issues among young people.

Justin Hendrix:

This is an area of some controversy in the tech policy space right now, just how much certainty do we have that there's a connection between social media and things like mental health? We see this question raised in a lot of contexts.

Rebecca Rand:

Yeah, and there's still a lot we don't know.

Justin Hendrix:

So, what did you learn from your conversation with this researcher?

Rebecca Rand:

Her name is Cristiana Firullo, and she is a PhD candidate in the Department of Information Science at Cornell. She's applying some of her behavioral economics background to try to understand why recommendation algorithms would behave in a way that makes users feel, well, bad. Interestingly, her exploration is not just academic, it's personal too. She got into this after noticing how platforms like Instagram and Facebook exacerbated her own condition when she was struggling with an eating disorder. She would open up Instagram and she noticed something.

Cristiana Firullo:

I was constantly matched with content about exercise and food in a way that didn't leave space for other content. So, every time I was trying to use social media for entertainment, I was trapped. I was kind of constrained in having to watch these kinds of videos on how to cook healthy recipes or something.

Rebecca Rand:

What Cristiana is describing is what's known as a filter bubble, which limited her ability to see a wide range of content and intensified her focus on triggering posts. This really confused her, because she wondered: don't these platforms want users to have a good experience? Isn't that the best way to keep users like me coming back?

Justin Hendrix:

This is a difficult set of questions. But the reality is that there are a lot of competing priorities even within the platforms about how to engineer their products to achieve optimal results, including engagement and profit.

Rebecca Rand:

You're right. But Cristiana has a less nefarious explanation for what's happening here because she doesn't think social media companies are out to get us exactly. Or at least that's not the whole story for her.

Cristiana Firullo:

If you just look at, what is the actual goal of digital platforms? The goal is to give users a good experience, because if they are happy, they will come back tomorrow and the platform can monetize them. There is a misalignment, but I don't think it's deliberate. I think it's specifically caused by the lack of sophistication in the algorithms.

Rebecca Rand:

So, she thinks this lack of sophistication leads to what she calls a cursed equilibrium.

Justin Hendrix:

Okay, what's that?

Rebecca Rand:

It's a game theory concept that basically describes when two entities are locked in a bit of a feedback loop that keeps escalating certain behavior. Cristiana looked at Instagram's recommendation algorithm and said, okay, they're prioritizing content based on user engagement, but the algorithm has no idea which engagement is healthy enjoyment and which engagement is obsessive.

Cristiana Firullo:

Engagement, I don't think it's analyzed in a way that can really help to understand whether a person is vulnerable or not. The platforms cannot recognize that the person engages with that specific content because of the obsession.

Rebecca Rand:

So users who are obsessive are providing this really strong signal to the algorithm, stronger than healthy engagement: please feed me more of this. The algorithm doesn't know what to do with that signal except give you more. When it gives you more, you get more obsessive, and thus you and the algorithm are locked in a dance where it's making you more and more engaged. Internally you're saying no, but all the algorithm is hearing is yes.
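The feedback loop described here can be sketched as a toy simulation. Everything below is illustrative: the update rules and numbers are assumptions made for this sketch, not any platform's actual recommender. The point is only that when an algorithm treats engagement as its sole signal, and exposure itself deepens compulsive engagement, both quantities escalate together.

```python
# Toy model of the engagement feedback loop: an engagement-maximizing
# recommender that cannot tell healthy interest from obsession.
# All dynamics and coefficients here are illustrative assumptions.

def simulate(steps=10, boost=0.3):
    """Return (share_trigger, obsession) after `steps` rounds of the loop."""
    share_trigger = 0.1   # fraction of the feed that is "trigger" content
    obsession = 0.2       # user's compulsive-engagement level (0 to 1)
    for _ in range(steps):
        # The user engages more with trigger content the more obsessed they are.
        engagement = share_trigger * (1 + 2 * obsession)
        # The algorithm reads high engagement as "show more of this".
        share_trigger = min(1.0, share_trigger + boost * engagement)
        # More exposure deepens the obsession: the "cursed" loop closes.
        obsession = min(1.0, obsession + 0.5 * share_trigger * obsession)
    return share_trigger, obsession

print(simulate())
```

Running the sketch for more steps only pushes both values higher, which is the escalation Firullo describes: the recommender never receives a signal that would tell it to stop.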

Justin Hendrix:

It sounds like this algorithm is too simple, like it isn't able to differentiate between what people actually want and what they can't tear their eyes away from.

Rebecca Rand:

Correct. One thing she talks about is how engagement may be a flawed signal.

Cristiana Firullo:

The problem is that measuring happiness is not possible, so as everyone in science does, we need to use proxies. The proxy that they're using is engagement and this is a bad proxy.

Justin Hendrix:

Got it. So what have platforms done to try to mitigate this, like making sure there isn't harmful content there to begin with?

Rebecca Rand:

Well, that has to do with the other side of the equation, something she calls algorithmic traumatization.

Justin Hendrix:

How does she define that?

Rebecca Rand:

She says it's this idea that a lot of this obsessive content you're going to see isn't inherently harmful. It's pretty tame. It's not going to get taken down by moderators. She described it this way.

Cristiana Firullo:

What I'm talking about is not about things that should be moderated. I'm talking about content that is individually okay. People should and could share certain things. The problem is the joint recommendation of all these things. These things are jointly harmful.

Rebecca Rand:

It's like, one video showing a healthy smoothie recipe, another video showing how to strengthen your abs. Instagram has no reason to remove this stuff. It's seemingly harmless on its own, but when it's in this relentless algorithmic context, that's when she says it becomes a problem. You get diet tip after diet tip after diet tip, and the collective effect of those is what harms you. That's why content moderation hasn't been enough.

Justin Hendrix:

So what does Cristiana think of legislative efforts underway to shield kids in particular from these interactions with social media platforms?

Rebecca Rand:

Well, that's one interesting thing about her work. She actually thinks our focus on kids is way too narrow.

Cristiana Firullo:

So regulation is obsessed with the fact that we should protect minors because they're more vulnerable because of their age. I don't think this is necessarily true. I'm 27 and I'm vulnerable. The thing that no one has really considered is that this is not because these are generations that are different from older generations. The difference is that minors, teenagers in particular, use social media platforms on a daily basis and more than older generations. So the real difference is how they interact with each other, how they get information about the outside world, and this is affecting their mental health. That means that when they become adults, they will probably have the very same problems.

Justin Hendrix:

Interesting. So she's saying that the dose makes the poison, and that the average young person is getting a way higher dose.

Rebecca Rand:

Exactly.

Justin Hendrix:

Did she give any examples of vulnerable groups, aside from those struggling with eating disorders?

Rebecca Rand:

Yeah. So addiction was another one, particularly with online gambling. Another scenario she described, which wasn't even about obsession, involved people who are simply sick. Say someone has cancer and they go on Instagram because they want a bit of an escape. But because they've been Googling cancer treatments, they keep getting ads about cancer studies and cancer treatments instead of what they actually wanted: cat videos or whatever.

Justin Hendrix:

So on the solutions, what is she saying about how we can address this problem?

Rebecca Rand:

Well, Cristiana hopes that platforms will get on board with the idea that happy users are good for business. They need to do a better job of measuring users' overall well-being to make sure they're not accelerating someone's downward spiral. She says they need to use proxies other than engagement. I know a lot of listeners are going to be really skeptical that platforms will actually do anything with this information, so I asked her about it:

One thing you brought up before is that you thought that fixing these problems would actually be appealing to social media companies, because there are some ways where their profits are aligned with user happiness and enjoyment. I think a lot of people are really skeptical of that, and I want to ask you a bit about how you come to that belief.

Cristiana Firullo:

That's a great question, and you're right because many people don't buy that. What I'm trying to say is that I don't think that Facebook wakes up in the morning and says, okay, I want to harm people. I don't think this is what is happening. But I do agree that there is a lack of responsibility when these platforms neglect these features and these aspects of human data.

Justin Hendrix:

Well, I guess that's the job of policy people listening, to figure out and get them to stop neglecting this responsibility. Rebecca, thanks for breaking this down.

Rebecca Rand:

Thanks for having me on.

Authors

Rebecca Rand
Rebecca Rand is a journalist and audio producer. She received her Master's degree from CUNY's Craig Newmark Graduate School of Journalism in June 2024. In the summer of 2023, she was an audio and reporting intern at Tech Policy Press.
Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Inno...