Protecting Kids Online: a Conversation with Sara Collins & Joe Jerome

On May 18th, U.S. Senator Richard Blumenthal (D-CT), the Chair of the Subcommittee on Consumer Protection, Product Safety, and Data Security, convened a hearing titled, “Protecting Kids Online: Internet Privacy and Manipulative Marketing”.

For reactions to the discussion, I spoke with Sara Collins, Policy Counsel at Public Knowledge and previously a Policy Counsel on the Future of Privacy Forum’s Education & Youth Privacy team, as well as Joseph Jerome, Director, Platform Accountability and State Advocacy at Common Sense Media, where he focuses on legislative and policy solutions related to children and digital media. What follows is a lightly edited transcript of our discussion.

Subscribe to the Tech Policy Press podcast via your favorite service here.

Justin Hendrix:

I was not surprised to find the two of you paying very close attention to this hearing yesterday. Tell us a little bit about what happened Sara, who was on the hot seat?

Sara Collins:

So the hearing yesterday was at the Senate Commerce Subcommittee on Consumer Protection. And it was all about the kids. Kids’ privacy, kids’ safety, kid tech. We were doing it all. And no one was getting grilled, although there were lots of call-outs of TikTok for not showing up. They still have not had their time in the congressional hot seat yet. But it was, again, academics and other advocates who do a ton of research and work at the intersection of kids and technology.

Justin Hendrix:

So give me a couple of the top line things that you heard yesterday that you thought were interesting. And in particular, what you heard from lawmakers in terms of the direction they were taking things.

Sara Collins:

The direction was kind of all over the place, and that’s one of the things that’s so tricky when you’re talking about the kids space, because there are impulses around protecting kids’ data. There are impulses around making tech safer in a safety sense. And then there are also impulses around giving parents more control. And those impulses don’t always align, and don’t lend themselves to the same policy solutions.

Joseph Jerome:

I don’t think it’s any secret at this point that there is bipartisan interest in kids and tech. If you’ve been paying attention to any of these tech hearings over the past six months, this has come up in the House and the Senate, but it really is, in some respects, a little bit unfocused. Sometimes it’s focused on the larger Section 230 debate and whether that is impacting kids. It gets into what I think is a really corrosive narrative around tech being addictive for kids. And this hearing was supposed to be about privacy.

All the witnesses came ready to discuss challenges that they’ve had with trying to monitor the app ecosystem, trying to understand whether and how and to what extent companies are complying with COPPA, the Children’s Online Privacy Protection Act. And they were still being put on the spot by lawmakers who were asking a whole lot of questions that really weren’t focused on COPPA and what COPPA addresses. On one hand, I think that’s good- COPPA isn’t the solution to everything. But on the other hand, it’s a privacy and marketing law, and there are a lot of other tech issues out there. So, I think sometimes lawmakers have a really big appetite for trying to protect kids from tech, but what they want to protect kids from, and how they want to do that legislatively, is an open question.

Justin Hendrix:

There were some specifics put forward about the challenges around privacy. We did get to hear a little bit of the concern around something like 50% of apps in the children’s space seemingly evading privacy protections for children.

Joseph Jerome:

I think all the witnesses hit on the fact that COPPA has this actual knowledge standard. The way it’s been interpreted, companies basically need to know with a hundred percent efficacy whether a user is a child. And of course, when you’re dealing with a random user online, you’re never going to know a hundred percent if it’s a kid. And so, this standard makes it really hard to hold companies accountable when they’re scooping up a lot of data that’s probably from kids, but that they don’t actually know is from kids. The other big issue is one a lot of folks hit on- Serge Egelman, one of the witnesses at yesterday’s hearing, has done a lot of work at AppCensus evaluating third-party data practices. It’s a complicated app ecosystem.

I don’t think either Sara or I can adequately describe it to you, but we don’t know what we don’t know. Instead, we just know that there are a lot of software development kits, third-party trackers that are potentially scraping a whole lot of data using it for ad targeting, using it for profiling, but we can’t actually confirm that. And I think there’s just a lot of uncertainty as to who should be responsible for cleaning up this ecosystem. Should it be the Federal Trade Commission, with more authority? Should it be the app stores, Google and Apple, to actually police these apps a little bit better? Or is there a role for self-regulation, which I imagine most advocates including myself are a bit skeptical of.

Justin Hendrix:

I was struck by one thing that Serge Egelman, who you referred to- the research director of the Usable Security and Privacy group at the International Computer Science Institute- said. He actually used this word that kind of stood out to me- this idea that social media platforms are “grooming” children towards certain behaviors that he regards as perilous. He mentioned ads that they’re seeing, perhaps ads that promote weight loss, unrealistic beauty and body standards, things of that nature. Drinking, vaping, all sorts of things that are apparently evading the filter.

Joseph Jerome:

It’s not all about ads. I think lawmakers get sort of confused- there’s been a lot of discussion right now about Instagram Kids, and Facebook has already said that Instagram Kids won’t have ads in it. That’s sort of missing the forest for the trees. You’re describing body image issues, all sorts of unhealthy cyberbullying behaviors on social media. Well, a lot of that stuff is targeting teenagers. A lot of that stuff is going on right now, and that is not really implicated or necessarily solved by COPPA, which is a children’s privacy law geared toward kids under the age of 13. So, I think there are gaps in protection that exist. And also, we need to be clear what we’re trying to solve for.

Sara Collins:

I also want to sort of push back on the idea- I really dislike the use of “grooming” or other sorts of behaviors like that. That’s the language of sexual predation. Social media networks, content creation networks, not having rigorous standards for their ads is not grooming. It isn’t good. We shouldn’t be defending that. And these platforms need to have more responsibility for the content that generates some money, i.e. ads and what they’re showing, whether they’re predatory, discriminatory, whether they’re being shown to children- that’s all very important. But I think what makes it so difficult to sometimes engage on kids issues is that it gets too sensational really fast. And that’s just something I dislike because you’ll notice when it gets that sensational and we frame it around kids or teenagers, or more specifically teenage girls, oftentimes the solution then becomes, how do we keep them off the internet? And that is something- as a former teenage girl- I am vehemently against.

Justin Hendrix:

And we do see some of the language of concern about female behavior definitely having a different tinge. And I don’t know if Serge meant to use that term in a way to evoke that double entendre or not, but it nevertheless did. So you’re right to point it out. I noted that Angela Campbell at Georgetown Law, who was one of the other witnesses, pointed out that in 21 years, the FTC has only moved to prosecute violations of COPPA 34 times. Is that just sort of another piece of evidence that not much is being done on this front? Or that COPPA is not the right framework to address these harms?

Sara Collins:

I will say the FTC has a lot of laws they have to enforce, so let’s at least be sympathetic to part of it. Another part is that the FTC has an incentive to take cases that are quick to settle. They’re trying to minimize costs. They’re trying to get precedents out there that show that they’re getting wins and getting money. And I think that’s where we’ve seen this actual knowledge standard become “you have to be a hundred percent certain you found a kid, and this is the kid.” You could imagine a different FTC over the years, with a different history, being much more aggressive about actual knowledge- having something that’s a little bit more malleable, a little bit more intuitive, or that at least encapsulates some of the behavior that gets waved away, like when companies sort everyone into “under 15” and say, “well, we don’t know.” You could imagine a regulator who took a much more aggressive approach, and they just didn’t.

Joseph Jerome:

I agree a hundred percent with that. We need to give the FTC some credit. I think a lot of advocates want them to do more enforcement. I would like them to do more enforcement, but they are resource constrained. They have taken some really bold cases. Certainly, if you look at the big YouTube settlement: YouTube is functionally a general audience service, and they were saying they did not have actual knowledge of kids, but at the same time, they’re telling advertisers that they can reach X number of children. And they had functional knowledge that there are plenty of younger users using their service.

I think the FTC sort of reached there to expand COPPA, but they’re constrained by what the law actually does. At the end of the day, they’re trying to provide redress to consumers and they’re limited. I also think we should acknowledge that COPPA can be enforced by state Attorneys General. That’s something that I spend a lot of my time working on, and AGs have brought cases too. I think Sara is a hundred percent correct that we want to bring cases that we can close. And there are elements of COPPA that are just tremendously under-enforced. The best example of that is that, if you read COPPA, it has data minimization provisions. This is the type of thing that I think privacy advocates are really strongly pushing for in a general privacy law.

COPPA compliance should require apps to not collect more children’s data than is necessary to provide an app or service. But that is not the part of COPPA that’s enforced. Instead, what we end up seeing is lots and lots of enforcement about whether there was verifiable parental consent, whether companies had certain knowledge of who their users were, and then what did a privacy policy say? So the larger policy questions posed by children’s privacy law like COPPA- those are the things that I think haven’t been enforced and litigated. And if we’re being honest, it’s very challenging to litigate.

Justin Hendrix:

On Friday, you had Senators Markey and Cassidy putting forward this Children and Teens’ Online Privacy Protection Act, which I guess would do some of the updating that you’re talking about to COPPA. What do we make of it? Is there any good stuff there?

Joseph Jerome:

So speaking as a representative of Common Sense, we have been a big proponent of Senator Markey’s work in this space and have supported this bill. To the extent we see lots and lots of lawmakers on both sides of the aisle say they want to do something to expand privacy protections for kids and teens- here’s a bill that’s out there, and we’d like to see people’s reaction to it. I’m hesitant to say this professionally, but I’m a little bit of a cynic personally. The reality is there’s bipartisan interest, and it seems, in some respects, the children’s privacy component is where there’s most consensus. The problem is the larger privacy debate is being fueled by concerns about the alleged patchwork of state laws, like the CCPA and Virginia’s law, and then this sort of industry interest in global interoperability with GDPR and the range of data protection frameworks we’re seeing in Brazil and China and Japan. And so, as a practical matter, I think a COPPA update would get a lot of votes if it ever reached the floor of the House or Senate, but that isn’t where I think the relevant committees are at this point.

Justin Hendrix:

Sara, you’re more of an optimist?

Sara Collins:

I don’t know if I’m an optimist. I am going to be a bit of a wet blanket about kids’ privacy. One thing that I dislike about the kids’ privacy discussion is that when you start with kids’ privacy, you’re inherently building a privacy law and then adding kids’ stuff on top of it. And there’s a lot of stuff in COPPA, as Joe said, that would be very at home in a good federal privacy law. So to the extent that COPPA 2.0 does that, that’s really great. It’s just that when I want to talk about kids and what kids need, especially for privacy, I want to be having that conversation on top of knowing that no one’s data is getting exploited, that there is data minimization across the entire ecosystem. I think that does two things. One, it makes it less fraught to comply with the kids’ law if everyone has to operate under the same sort of baseline privacy standards. And two, we can really start deciding, in the kids’ space, what exactly we’re trying to accomplish- whether it’s more parental control, better security, or different sorts of privacy questions, like maybe an eraser button, because while we don’t like the right to be forgotten in the US, we’re a little bit warmer to that idea for under-eighteens.

And the last thing I just want to flag about COPPA and its update- it still relies on parental consent. And as a privacy advocate who’s very skeptical of how useful consent is as a regulatory measure, or as a measure that actually protects people’s privacy, I don’t know why we put it in a kids’ law when most advocates don’t really want it in a federal comprehensive standard anyway. Again, it’s the same problems. Nobody has time to read the policies. You’re not really going to know what you’re consenting to. It’s really hard to understand how privacy works across apps and third parties and services- let’s just have a better baseline. And that’s something that I see again in this COPPA 2.0: it still functions under a model of consent.

Joseph Jerome:

Can I gently push back on that? And I don’t necessarily disagree, except I think it is important to acknowledge- and this is certainly something alarming I’ve heard from our friends in industry- where they say we should not expand COPPA because we should not be treating teenagers the same as kids. Well, the proposal currently before Congress in the Markey-Cassidy bill would require consent by teenagers. I do take Sara’s point that that doesn’t solve the larger problem we have with consent. When we talk about consent in privacy law, we are not talking about informed consent, akin to, say, when you donate a kidney or something. Really, I think consent is a proxy for how much friction and challenge we want to put into how companies can collect and gather data.

I don’t know if people want to be honest about that, but I do think it’s probably not accurate to really be talking about consent in any of these privacy laws, because it doesn’t work. I think it’s important, particularly for Tech Policy Press readers, to be aware of this. I think this entire debate, again, just shows how far behind the United States is on all sorts of tech regulation. The United Kingdom has an age appropriate design code that goes into force in September. A lot of the issues that came up at this hearing are also echoed in the online safety bill proposed by the United Kingdom. I know that there’s a lot of controversy about those proposals. But if you just look top to bottom at the regulatory proposals coming out of Europe, they’re night and day. They’re on to the next issue while we are still arguing about various opt-ins and opt-outs of privacy rules in the United States. And I find that, as a privacy advocate, to be tremendously frustrating.

Sara Collins:

Absolutely. I wish we were at the point where we can start talking about design decisions. I actually think that’s a really fruitful area of regulation. I think you could make really meaningful differences in how these platforms perform if we’re talking about design, and how design encourages or discourages certain behavior and what it leads to- especially if we’re taking that civil rights lens that we’ve had from the Facebook audit and Airbnb audit and other learnings. I think there’s a really broad horizon of interesting regulation and safety precautions we can be putting in place, or just different ways of thinking about tech policy and tech regulation.

And we’re still just stuck in, did we get a parent’s consent or did we get their consent? Is it opt in or opt out? What ads are shown? We can do so much more than that. We could be so much more creative than that.

Justin Hendrix:

Well, on the safety question, maybe I’ll just switch gears a little bit, kind of going back to one of the earlier hearings that you referred to earlier in this spate of hearings that have talked about children’s issues- at that March 25th hearing with Mark Zuckerberg and Sundar Pichai and Jack Dorsey, there were a number of Republicans and a couple of Democrats as well that brought up real safety concerns, concerns around the role of social media in causing depression among children and teens. And there was a report on NPR today that suggested on some level there’s a bit of a disconnect between what the platforms are saying and what researchers are saying. What do you make of this?

Joseph Jerome:

We don’t know what we don’t know. And I think everyone should take with a grain of salt anything that the largest tech companies say about their commitment to making a healthier and safer environment. I mean, the reality is we could all question whether they put their money where their mouth is, as a matter of policy. So Common Sense has worked with a number of stakeholders, in industry and outside of industry, to push the Children and Media Research Advancement Act, CAMRA. There are sponsors in the House and the Senate, and this is a proposal that would give money to NIH to conduct some of this independent research into how social media impacts young people.

Because the reality is a lot of this research is happening behind closed doors within the largest tech companies. And they aren’t exactly willing to share it, particularly if it gives them a potential black eye. There are solutions to this. And, to puff out my advocacy chest a bit: if lawmakers are seriously concerned about this, there are proposals out there that are widely agreed upon, that would fund useful research and actually give us answers to this situation- so that we don’t, frankly, have to rely on begging and pleading with Facebook to give us scraps of what they’re doing internally.

Sara Collins:

So, absolutely everything that Joe said. We have no idea what’s happening on these platforms because it’s so hard to do independent research. And I think you’re absolutely right- the only way to get this research done is by government mandate. One thing that I want to make clear, and something that stuck with me while I was watching The Social Dilemma- Tristan Harris very famously said, “Nobody panicked about the bicycle when it was introduced.” And then a whole bunch of scholars who study history pointed out the panic around bicycles, which centered on young single women going who knows where on bicycles. The reason I bring up the bicycle story is that we have all of this research about tech panics being linked to greater independence for young women.

I want the research that tells us if social media, or even mechanisms within social media, have a higher propensity to cause depression or anxiety. That is absolutely important. That’ll inform design decisions, and that’ll inform the types of work we’re trying to do. What I do not want to see is the research we’ve sort of been seeing, or at least the summaries of research, which is like “Facebook causes teen suicide.” One, that’s not helpful. Two, I’m not a public health expert, and I’m not sure how you can quite get from A to B in that instance. So, that’s just one thing that I want to keep at the front of my mind.

And the other is that at these hearings- at this one, and at others- we very rarely hear from teenagers. And I think LGBT groups would be so important to hear from, because oftentimes these are the teens whose parents’ or government’s interests are not necessarily aligned with their own. And so regulation that makes it harder for them to communicate, that makes it harder for them to find community, is troubling for me. So again, these are all things that have not happened yet. I don’t think we’re there yet, but I am wary of it. I am sensitive to it. And maybe it’s because I got to be a teen girl on the internet, with all the good and bad that that entails.

Justin Hendrix:

One of the comparisons that the NPR piece made to Facebook’s behavior was perhaps the easy comparison to the tobacco industry. And maybe that comparison goes a little too far. Sara, do you think you’re seeing that same kind of muddying of the waters- you know what I mean?

Sara Collins:

I mean, there was a researcher in that article who said that goes too far. There are probably mechanisms on social media that are more problematic than others. And again, I would love to see research on that. There is a lot of consternation about the like button and different commenting mechanisms, and about whether visual ways of communicating are more harmful than text-based ones. That is all research I would really love to see, because I think there is really important disambiguation there. The other thing I’d like to point out: most of the social media research focuses on Facebook, Twitter, Instagram, but that’s not the only place kids interact. There’s also Snap. There is AO3, Tumblr, Reddit. There are tons of places teenagers use what we would consider social or connective applications. So I would also want to see: are there good things about them, or are there better applications, better ways of communicating and growing your community, than others?

Joseph Jerome:

So again, complete agreement. I think the only platform Sara didn’t mention was YouTube. And I think you’ve done a lot of work in highlighting how YouTube’s impact here is often underappreciated. I would echo again that, yeah, we spend a lot of energy and time focusing on Facebook, when I think the reality is kids and teens aren’t on Facebook. Personally, I’m very interested in some of these up-and-coming social VR applications, the Rec Rooms of the world. Roblox is a fascinating place for younger people to congregate. And we don’t see that really appreciated, particularly as we get into conversations about teens.

And I guess as a final conclusion, I don’t want to give a free pass to the tech companies. I have as many problems with Facebook as the next person, but I think we’re, frankly, letting ourselves off the hook to suggest that the social media companies are the cause of everything that’s going wrong right now. The reality is we’ve got some systemic societal issues. I’m a tech policy person, but working at Common Sense over the past few years, I’ve had a really growing appreciation for the role of- and, frankly, our neglect of- things like civics education, digital citizenship, and media literacy.

I’ve worked on these types of civics issues. And I think we should acknowledge that our K-12 social studies courses don’t give students an honest understanding of our society and our country. And that’s a real problem. And efforts to improve that are getting bogged down in the same hyper-polarized situation we’re seeing across the board, where we can’t even really agree on what the truth is. And so, if you have a situation where parents don’t really agree on a shared societal truth, and policymakers aren’t willing to force the issue in terms of school curriculum, what do we expect of our young people at this point?

They’re trying to make their way through society, and social media offers a wealth of options to connect with people and learn new information- and has opened a Pandora’s box of problems. But we’re not operating from a shared foundation of what is true, and, frankly, of how to be respectful and thoughtful of your peers who may disagree. That might be a little bit of a cliched ending, but I think it’s easy for lawmakers on a bipartisan issue to blame Facebook, and less easy for them to, frankly, take a cold, hard look at themselves in the mirror.

Sara Collins:

As I worked through the entire pandemic, there are tons of issues that I’m aware of that I don’t touch professionally- whether it’s abortion access, climate change, or policing and the policing of Black men. These are all really salient social points for me. And I can’t imagine what it would have been like as a teenager to be confronted with all of this, especially with the education that I had gotten and was getting. Again, I don’t want to let social media off the hook. I think there are tons of things we can do to improve it for kids. But I do think there is a much better awareness among Gen Z of current societal issues. And maybe this is because I’m on Twitter too much, and I will own that, but there is a bit of a fatalism, a bit of a despair, that maybe we won’t fix these big problems.

Joseph Jerome:

Look, we’re all in tech policy. And I think Twitter is a perfect medium for folks that are based in DC working on policy. I love Twitter as much as the next person, but I don’t think it’s reflective of reality.

Sara Collins:

Yeah. I wonder what it would be like to be experiencing the world, or at least a lot of the world, through the lens of social media and all the different ways it bends your perception and how that changes your view and your interaction with the world. I’m sure it would energize some kids, but I’m just thinking of myself at 16, and it might’ve been overwhelming and almost despairing. That’s unsatisfying. I have no answer to that question at all, but it is something I do think about.

Justin Hendrix:

Well, that seems like it’s a complicated place to end, but probably actually ends up right where it needs to be, which is that it’s our job to think through some of these complexities as adults and hope to get to something better. So Sara and Joe, thank you very much.

Sara Collins:

Thank you.

Joseph Jerome:

Thank you.
