Gaia Bernstein on Gaining Control Over Addictive Technologies

Justin Hendrix / Apr 2, 2023

Audio of this conversation is available via your favorite podcast service.

Earlier this year, Seattle Public Schools filed a lawsuit seeking to hold social media companies accountable for harms they allegedly cause to students' social, emotional, and mental health. The complaint against companies like TikTok, Instagram, Facebook, Snapchat, and YouTube seeks to change the way these companies operate and make them take responsibility for such harms. In a statement, Seattle Public Schools Superintendent Brent Jones said, “We need partners to work with us as we serve our young people, rather than companies placing a priority on profiting from the digital habits they’ve created as a way of captivating our students’ attention. Our obligation is to create the conditions for students to thrive and have high quality learning experiences. The harm caused by these companies runs counter to that.”

Across the United States, a growing number of such lawsuits seek to hold tech firms accountable for various alleged harms. My guest today is tracking such suits closely. Gaia Bernstein is a Law Professor, Co-Director of the Institute for Privacy Protection, and Co-Director of the Gibbons Institute of Law, Science and Technology at the Seton Hall University School of Law. She writes, teaches, and lectures on the intersection of law, technology, health, and privacy, and she is the author of a new book on the subject, just out from Cambridge University Press, titled Unwired: Gaining Control over Addictive Technologies.

What follows is a lightly edited transcript of the discussion.

Justin Hendrix:

I'm looking forward to having a conversation about this book and all the effort that went into it. But for my readers and listeners who may not be familiar with your work, can you just tell folks where you're coming from? What do you do research on? What do you teach about? And how did you get to the place of writing this book?

Gaia Bernstein:

I've always been interested in the way that technology affects people. Many of my colleagues are interested in how we incentivize technology creation, how we incentivize innovation. And I have always, throughout my career, looked at how people are affected and what the legal role is, which made me look at very, very different technologies: reproductive technologies, genetics, and of course information technologies and privacy.

And I started working on this because I basically noticed things were changing for me. I usually work in the morning, and I would sit down to work, and after two hours or so I noticed that I was getting nothing done. And what did I do? At the time, I was reading lots of blogs, I was surfing the internet, I was texting, answering emails. And I started paying attention more and more to what was happening. At first I thought, well, basically people are not paying attention. This was around 2015, '16. But at a certain point, people became much more aware and things were still not changing. And then I realized that the book I have to write is a different book. It's not a book about what's happening. It's a book about what should be done.

Justin Hendrix:

You wrote this book, I think it's fair to say, into the pandemic and through the pandemic. And so your observations about the way that technology was influencing your behavior and your life and your children's lives, which seems to be an important piece of this book, that kind of, I guess, went on steroids during the lockdown period.

Gaia Bernstein:

Yes. During the pandemic, especially during lockdown, which was quite extensive here in New York where I live, we suddenly felt what it means to be on our screens all the time, how it feels in our bodies, how tired we feel, how we crave human interaction. So we could suddenly see what the future could mean for us as we spend more and more time on screens. I think, at the same time, it gave me hope, because everybody started noticing. Before the pandemic, parents were paying lots of attention. I ran a school outreach program in half a dozen schools in New York and New Jersey, public schools in Newark, private schools in Manhattan, the whole range.

My law students spoke to kids who got their first cell phones in fifth or sixth grade. And I spoke to parents about balancing online and offline activities. And parents were very, very worried. But other people were not as worried. And something has changed through the pandemic. Many more people are paying attention and are much more aware of what they want their lives to be like and what they don't like about being on screens for so long.

Justin Hendrix:

You say in the prologue to this book that it interlaces the human with the legal. So you of course are doing this type of observation, and you're bringing in stories of things like overuse and addiction. But you also bring in the science. And I wanted to ask you about this idea you've just now signaled, that it feels like we're almost at, or maybe slightly past, a tipping point, not only with regard to the public's awareness that there's something wrong. And I'd agree with you, even in my own life, it feels like parents are uniquely aware of there being something wrong with our use of technology, particularly among young people. But it's also the science. It appears that there's more science piling up at the moment.

Gaia Bernstein:

Yes, there have been studies for years now, decades even, looking at the impact of screen time on cognitive development, on attention, on mental wellbeing, on social disconnection, happiness. In the last two or three years, first of all, there have been so many more studies, and also more literature reviews of the studies, so you can see a bigger picture. At the same time, we got more brain imaging studies showing the impact, especially on kids who are exposed to excessive screen time. And that really supplemented the psychology studies, which were already showing an effect on cognitive development years earlier. For children, at this point, it seems pretty clear that in certain areas there's a big impact. I would say cognitive development, mental health, attention. For adults, there's also lots of evidence, but for kids there seems to be more and more evidence of a public health crisis.

Justin Hendrix:

You acknowledge this in the book, but I'm certain there are people listening to this who may find themselves or think of themselves as in this category. There are a lot of folks out there who push back and say this is a moral panic, that these types of concerns have accompanied the advent of every new communications technology. We've always worried about this with regard to children and how it might scramble their brains, from the novel to radio to television certainly. What's different at this moment?

Gaia Bernstein:

I think the best comparison is to look at television, because people always said kids are spending too much time on television. They're not exercising, they're getting obese, they're not watching good content. But something is very, very different here in several ways. First of all, I like to call the television the human bonfire. At least there was something about all of us watching the same screen together, all of us conversing about the same thing. With screens, with our phones, you can sit with people in the same room, each with your headphones on, and each person is looking at something different.

In addition to that, yes, of course we always had commercials on TV, but we've never had what we have here, which is technology companies taking our human vulnerabilities and building an entire business model on keeping us online for longer. So basically we know we get everything for free. We get Gmail for free, we get Facebook for free, but it's not really free. We pay with our data and we pay also with our time. Companies need us to stay online longer so they can collect more information on us. The more data they have, the better they can target advertising to us, and then they need us to stay for longer so we're exposed to the advertising and purchase products. So they have this incentive to manipulate us and to have us stay online for longer. And they're using very well known psychology principles which basically make us stay in ways we don't even realize.

I'll give one example: they take away our stopping cues. There's a famous psychology experiment where people were given soup to eat, and people who were given a normal bowl would just eat the soup. But people who were given a bowl where you couldn't see the bottom ate 70% more. Why? Because the stopping cues were taken away. This is what's happening all over the internet. You have the infinite scroll on Twitter, on Facebook and Instagram. There's never an end. You go on YouTube or Netflix, you have the autoplay. There's never an end. So there is something much bigger going on here than just us watching something and preferring not to stop. There's somebody else fighting against our willpower in a way that's slanted against us.

Justin Hendrix:

You make comparisons, of course, to the way that cigarettes were dealt with, and of course also to food and the degree to which we've had to contend with the problem of too much fat, too much sugar in our diets, and processed meat and that sort of thing. But I want to ask you how you think about these kinds of comparisons as useful ways of thinking about the type of litigation that may be necessary with regard to tech firms. You call them the choice makers. I'd agree. They're the folks essentially establishing the framework and the kind of game dynamics that we're all responding to in the digital environment on some level. What are the lessons that you see from litigation in tobacco and in food that could potentially be applied to tech?

Gaia Bernstein:

I think there's a lot to learn. The main thing, the lesson I think is most important, is the way cigarette companies and food companies latched onto the personal choice and personal responsibility argument. So basically, when the tobacco litigation started and smokers and their families started suing, the tobacco industry argued, well, nobody forced them to smoke. They wanted to smoke. They're responsible for the health consequences, the lung cancer, the death. And courts accepted these arguments for years. Basically, they didn't attribute the responsibility to the tobacco companies. The same thing happened with food. A group of teens sued McDonald's because they were obese and suffered from diabetes. They used to eat at McDonald's every day. McDonald's argued, and the New York court agreed: nobody forced them to eat at McDonald's, nobody forced them to supersize, and therefore they, and not McDonald's, are responsible for this.

So this is exactly what we're already seeing with tech. When game manufacturers went to the FTC workshop on loot boxes a couple of years ago, they immediately pointed out that nobody forces gamers to use loot boxes, this addictive feature in games. Nobody forces them to play, and they or their parents are responsible. And in a way, tech is taking this even further, because tech is giving us digital health tools in order to show us how we can control our urges, because the responsibility is ours. So we get Screen Time on our iPhones showing us how much time we spend on them. You go on Instagram, you can set it so you'll know how long you've spent there. All of these are tools which do not go to the heart of the addictive features. They are really putting the ball back in our court, exactly like the tobacco industry did and like the food industry did.

Justin Hendrix:

I want to pause here and ask you about a couple of lawsuits that emerged, I suppose, after your book was already with the publisher. In the Pacific Northwest, in Seattle and I think at least one other school district in Washington, we're seeing these complaints about social media's negative effects on youth mental health brought by school districts against companies including Meta, Google, Snapchat, and ByteDance, which owns TikTok. What do you make of these suits? Are you following them closely? Do you think they represent the type of model litigation that you're imagining will lead to change?

Gaia Bernstein:

I think this is incredibly important. And again, I just want to look back for a second, because there's so much to learn from the past. Tobacco companies started losing in part because the attorneys general started suing for the costs that smoking imposed on the public health system. This broke the personal responsibility argument, because suddenly somebody who had nothing to do with smoking was bearing the cost.

Now what's happening here is interesting in two ways. First of all, it's exactly the same thing. You cannot blame the school districts, who are saying social media is addicting the kids, they stay on it for so long, they suffer from all these mental health consequences, and we have to deal with it, the costs are on us. So again, you have somebody who you cannot blame for choosing bearing the cost. Very, very similar. Very interesting. They're also using an interesting legal theory. They're saying that social media is a public nuisance, which is a very, very different way to think about it. So I think this is very interesting, and we'll see how this develops.

Justin Hendrix:

I'd say when I read about these suits and looked at this idea of social media as a nuisance, I couldn't help but think about how public workers in a range of different areas are having to contend with the nuisance that's created for them by social media. I spent a lot of time over the last year looking at the plight of election workers, the additional expense and security and just cognitive load they've carried because of the violent threats against them following the proliferation of false claims about elections. And this is a phenomenon that's true not just for election workers and teachers, but also for doctors during the COVID crisis. We could go down the list of a range of different public entities that have had to bear real costs, but also personal costs among the people who are involved.

Gaia Bernstein:

So I think that is part of the issue. I think all of us, in a way, even employers, are bearing the cost of this when their employees are online and spend all this time. Instead of working, they are sitting there on Facebook. And nobody can really block it. When I started working at a law firm, it was the beginning of the internet, so they blocked our access to the internet. You cannot do this anymore. You cannot really block people, because they need it for different things. So I think lots of workers, lots of entities are bearing the externalities of this.

Justin Hendrix:

So strategic litigation is one route to change that you see. The other is privacy, and the kind of activism around privacy, new legislation around privacy. You point to California and its privacy legislation as perhaps the boldest yet. But we're also seeing, of course, other states, cities, towns taking up that call. Why do you think folks need to take a different look at privacy?

Gaia Bernstein:

Privacy is important for containing technology overuse for two reasons. The first reason: it's part of the same business model. But the thing is, the fight over privacy has been going on for a longer time. So just by looking at privacy, we can see that there's already so much pressure on the business model. And if there are restrictions on companies' ability to collect data, this may affect the whole model, because if you cannot collect data, then it doesn't matter as much how much time people spend online, because you cannot target advertising to them as well as you did before. So I think there's a lot of hope that the pressure on privacy will help with technology overuse.

We can also learn a lot from privacy, because it's the same industry and they're operating in the same way. You can see how, first of all, for years the technology industry said about privacy, "We can self-regulate, we will solve the problem." After they denied the problem, they always said, "We have solutions, we can solve this." So we could already see how it's impossible to trust an industry where most of what's done is something you cannot see. It's not transparent. So heading down the self-regulation route after what happened with privacy is, I think, very problematic and not a good idea.

Justin Hendrix:

You raised the reality that of course in this country there's a big hill to climb called the First Amendment with regard to any legislation or even litigation perhaps against tech firms. How do you think that that hill will be climbed or how will that particular obstacle be scaled?

Gaia Bernstein:

I think we're likely to see tech companies bringing up First Amendment claims in several situations. One will be if there is a requirement, for example, to post warnings, like we have on cigarettes: this is hazardous to your health. Imagine if somebody goes on their social network and sees a warning: this could affect your mental wellbeing. For any required warnings, I'm pretty sure we're going to see a First Amendment reaction. We saw this with food. We saw this when San Francisco wanted to put labels on sodas warning about their impact on health. The beverage industry fought it in court. They said this is coerced speech, something they were not willing to say, and that it is a violation of the First Amendment. And they won. So we're likely to see this as well.

We could also see this if there are any prohibitions on addictive elements in design. My big concern is that we are going to see arguments that design features are speech. There have been cases that have already gone in this direction. There was a case with Snapchat and its speed filter, a design that caused people to film themselves while they were speeding, and the argument that this was speech failed. Many of these arguments don't fly, but some of them do. And if we're looking at Gonzalez and what's happening now, we don't know how this is going to come out and what the Supreme Court will focus on. But in a situation where they focus on algorithms as protected under Section 230, I can see this also [inaudible 00:21:15] basically strengthening claims that algorithms are speech when they're there to make us spend more time online.

Justin Hendrix:

So one of the things I like about this book is that you don't just stop with the criticism and the kind of problem definition, but you also set out a set of redesign principles. You name a sort of opportunity that perhaps is opening about how to rethink tech. What are some of those principles? What would you tell the makers that perhaps are in the audience that they should take into consideration?

Gaia Bernstein:

I think there are three principles. They're broad principles, and there are many ways to implement them, as we've seen in the many bills that have been floating around. The first one is eliminating clearly addictive features. Some features are clearly up to no good. Take a feature like Snapstreaks. All Snapstreaks do is get teens back on Snapchat. Basically, if you send a snap within 24 hours and somebody sends one back, you have a streak, and then you start counting the days. You each have to send a snap back and forth every day. And then you have this chart with all your friends. Let's say it could say 152 with a certain person. You get a special badge. But if you miss a streak, you lose everything. And for kids, you lose the whole friendship. This is there for nothing but to get people to go on Snapchat and see the ads. There's no content, there's nothing there.

So there are some elements like this, or the infinite scroll, which I mentioned earlier, that takes away our stopping cues, which are clearly there just to keep us on for longer. Some elements could be prohibited. Of course, as long as the business model is based on prolonging our time online, there will always be new ones. So any kind of action has to anticipate, or have some kind of provision that will cover, any future features which are there just to prolong our time online. That's one principle.

The second principle is about default settings. We now have an option on our iPhones: we can restrict our time on certain apps. We can change it, and we can also change it back. We also have an option of turning our phones gray. We can do all these things, but none of them are the default option. Now, defaults matter, because people think a few times before they change the default. If the default were that I have two hours on my phone, and I can extend it, people would view it as a recommendation. It changes behavior. So making these restrictions the default that users can override could change the picture.

And the third principle is having mainstream devices, not the things already on the market that let you have a light phone as an additional phone with fewer features, but an iPhone for people who want all the features they need, like their alarm or maybe even their Google Maps and their texting, but who are not going to be able to go on Facebook or just browse. And have a good phone like that for kids, so you can give your kid a phone that doesn't look embarrassing but has the features they need. So that's the other principle: having basically Google or Apple manufacture phones which are not made to make you overuse but are more moderate in a way.

Justin Hendrix:

You also describe a need perhaps to redesign the physical world and the spaces that we enter on a regular basis.

Gaia Bernstein:

The way we live, the places we spend time in, affect us a lot. And when I'm talking about spaces, I'll give two examples. One is a space that really matters in this whole story, and that is schools. What goes on in school doesn't stay in school. It infiltrates the home. Now, your school incorporates screens into classwork, everything is done on screens, and now, after the pandemic, we're having more and more schools incorporate games like Minecraft or Roblox into the curriculum. These game manufacturers have very active education departments. Teachers are posting on TikTok. All these things are basically increasing the amount of technology in the classroom. Federal policy encourages technology in the classroom. It's part of a model we had for many years: a laptop for every kid. Rethinking the use of technology in these spaces is important, because if your kid is working on a screen in the classroom, homework is also on screens when they get home. You don't even know what the kid is doing at that point. How can you tell your kid to get off Roblox if that's an educational tool?

So it's very important to think about what happens in school. France bans cell phones in schools. They want kids to talk to each other during recess. That's another option. It's happening in individual schools or municipalities in the US. And so that's one example of spaces. But spaces go beyond that. In New York City, if you go to the airports, all three airports have iPads at every table. On every table you would have four iPads. You order from them. You cannot see the people sitting in front of you because the iPad is between you. You could do nothing but use the screens. This is designing for overuse. We can design so that people do not overuse when they go to restaurants or to airports.

And I'll give a more recent example. After the pandemic, lots of restaurants still use the QR code. What does that mean? It means that the moment you sit down, you have to take out your phone in order to order. And from that moment on, the phone is on the table. Using regular menus sets a different norm. So there are many ways in which people, not just lawyers, can change how their businesses encourage use of technologies.

Justin Hendrix:

I want to ask you a little bit about the political context in which you're putting this book out. Just a couple of weeks ago, there was a hearing in the Senate about kids' online safety. And there appeared to be a general kind of consensus among the senators that something needs to be done, that laws need to be introduced, et cetera. And yet you had a pretty wide range of ideas on display. You had folks like Josh Hawley, Republican, suggesting, "Hey, let's ban all social media for kids under the age of 16." And then you had of course some of the witnesses in the room suggesting, "No, that's not the right way forward. Let's focus on design interventions." Perhaps things more similar to what you suggested here in the book. What do you make of the political context in the US at the moment? Perhaps then we'll widen it out and talk about other parts of the world.

Gaia Bernstein:

I think I'm hopeful. I'm hopeful because this is the one issue where there seems to be bipartisan support. And I think that the types of proposals that are made are usually not connected to a specific party. You could find Democrats who also think that you might want to ban social networks for kids. So I think there's hope for movement. And I hope also people are realizing the urgency, what we talked about before, the fact that there's so much data there. There's a whole generation of kids that spent over a decade exposed to excessive screen time. And many of the senators are parents. I think in this space, unlike the issues related to freedom of speech and social networks, we might be able to move forward.

Justin Hendrix:

Some of your suggestions reference some of the interventions that countries have tried abroad. You mentioned China and some of the restrictions that it's putting on use of gaming platforms, things of that nature. Critics of this way of thinking will look at this and say, "Really? You want to handle social media and games in the model of the Chinese?"

Gaia Bernstein:

My goal in this book was to open up the spectrum of options, because I think we've wasted a lot of time and nothing has practically been done. And East and Southeast Asian countries have spent the last decades experimenting, not just China, but Thailand, South Korea, Japan. Actually, it's interesting, because right now China and Japan have similar systems, which are very restrictive systems, basically restricting how much time kids can spend on games and on social media to very few hours a week.

And other countries experimented and decided to stop. They decided the systems did not work. They did not work because they had Cinderella laws that had kids turn off at midnight, and the kids went crazy because they couldn't play anymore, and then they tried to get into their parents' IDs. But there's such a wealth of data there. I think it's worth looking at what happened there, what we could consider and learn from them, and then making up our minds, and not wasting another decade fighting science wars instead of looking at so much data that's been collected from trying out laws, some of which failed and some of which succeeded, in many different countries in East and Southeast Asia.

Justin Hendrix:

A last question I would ask you is really back to that question about moral panic. There will even be some, from the left perhaps, who would read this book and say, "Listen, it's more important on some level to give youth access to these types of platforms to discover who they are, to be able to express their identities, perhaps to find their identity outside of the context of their home or immediate community." That it's worth it, that whatever sort of social pain we're experiencing from the adoption of these media, the trade-off is there. How would you address that?

Gaia Bernstein:

So, two things. First of all, not all content is alike. Social media is not like reading the New York Times. So we have to distinguish. If you are reading, you're making your own decision to stay online. That's fine. Read as much as you want. But it's different when you think you're going online for 20 minutes and realize an hour and a half has gone by and you've done nothing but click from one thing to the other. It's more about the way we're doing it, the lack of autonomy with which we're doing it, than about being exposed to content. And I think it's important not to mix the two.

The second thing I would like to say is that there's already a lot of action underway, and people should realize that. We mentioned the Seattle school lawsuits. There are lots of class actions by parents. Many of these bills that are coming up will fail, but looking at the past, things fail until eventually they succeed. We're not just starting out; there are lots of people involved in action. In Europe, for example, many countries have restricted loot boxes, which are an addictive feature in kids' games. So there are things going on, and I think it's important to remember the ways in which we can influence things. One is through exerting pressure on tech companies to redesign.

Another is through indirect pressure in order to change the business model. This is already taking place through antitrust action against big tech. Any change, any success there will destabilize these markets and create more competition. We might see new business models not based on time. And also remember that people can do things collectively. The main thing, instead of focusing on how unsuccessful we are at fighting it ourselves, is to realize that we can take it to the public sphere and do things through collective action. We've been stuck for too long trying to fight with ourselves in front of our computers, fight with our kids, trying to grab their technology. It's not going to be resolved this way.

Justin Hendrix:

Can you paint a picture of what the world would look like if everyone reads your book and the sorts of lawsuits that you're talking about here are successful and the kinds of laws pass that you suggest and tech executives maybe turn over a new leaf? What does it look like 20, 30, 40 years from now?

Gaia Bernstein:

Right now we're heading in a very clear trajectory. We're heading towards more immersion in virtual reality, towards smart cities, towards being connected everywhere. So the first thing is basically to stop and think for a second about where we're heading. Now, my goal is not a screenless world; we're not going to go back to that. Connectivity is here to stay, and there are lots of good things about it. But on the other hand, we can make autonomous decisions about what's a healthy online-offline balance for ourselves and spend time on screens, but be able to decide when I want to get off, because it's my decision. It's not some power I can't even see that's manipulating me to stay on.

So I think as a society, we never realized what was happening. We're a bit like the frog in the water. We took all these small steps, adopting another app, texting on the go, and we didn't pay attention. We never stopped to think if that's what we want, how much of this we want. I think the opportunity of taking a pause and thinking about exactly how we want to balance things is very important. And that's the reason why I gave so many options in the book. I'm not trying to dictate a specific balance. I just think that we have to think about what balance works, and the trajectory we were heading in, I thought, was scary.

Justin Hendrix:

Well, I appreciate you taking a moment to pause and talk about your book with me. Of course, that book is Unwired: Gaining Control Over Addictive Technologies. Thanks so much for speaking to me today.

Gaia Bernstein:

Thank you so much for having me.
