Transcript: Children’s Safety in the Digital Era: Strengthening Protections and Addressing Legal Gaps

Prithvi Iyer / Feb 20, 2025

February 19, 2025 - Washington, DC. The US Senate Committee on the Judiciary hearing entitled “Children’s Safety in the Digital Era: Strengthening Protections and Addressing Legal Gaps.” (Joshua Sukoff/Medill News Service)

On Wednesday, February 19, 2025, the US Senate Committee on the Judiciary held a hearing titled “Children’s Safety in the Digital Era: Strengthening Protections and Addressing Legal Gaps.” The hearing focused on the growing dangers children face online, particularly child sexual exploitation and other online harms, highlighting the failure of Big Tech to implement adequate safeguards, the inadequacy of current legal frameworks like Section 230, and the urgent need for bipartisan legislative action to hold tech companies accountable and protect children.

Witnesses included:

  • Rep. Brandon Guffey, Representative, South Carolina House of Representatives
  • Carrie Goldberg, Founder, C.A. Goldberg Law Firm
  • Mary Graw Leary, Professor, Catholic University of America, Columbus School of Law
  • John Pizzuro, CEO, Raven
  • Stephen Balkam, Founder and CEO, Family Online Safety Institute (FOSI)

What follows is a lightly edited transcript. Please refer to the original video when quoting.

Sen. Chuck Grassley (R-IA):

Good morning, everybody. In today's digital era, our young people face risks that previous generations couldn't even have imagined. Even though technology brings amazing opportunities for education and growth, it also opens doors to new dangers that we must confront. This isn't the first hearing we've had on this issue, and unfortunately, probably won't be the last. We held a hearing on this same subject roughly a year ago when we brought CEOs from some of the largest social media companies to discuss safety issues on their platforms, and we held a similar hearing a year before that.

On the one hand, this is alarming because the problem is getting worse. In 2023, for instance, the NCMEC CyberTipline received 36.2 million reports of suspected online child sexual exploitation, a 12% increase over 2022. And even though the numbers haven't been published for 2024, they're expected to go up.

Additionally alarming are the new technologies that are being used by bad actors to exploit children. Online predators can use generative AI, for instance, to take normal images of children and manipulate them to create novel forms of CSAM. In 2024 alone, NCMEC reported over 60,000, almost 61,000 instances of generative artificial intelligence CSAM.

Despite this, so far, Congress has enacted no significant legislation to address these dangers against children, and tech platforms have been unhelpful in our legislative efforts. Big Tech promises to collaborate, but they're noticeably silent in supporting legislation that would effect meaningful change. In fact, Big Tech's lobbyists swarm this Hill armed with red herrings and scare tactics, suggesting that we'll somehow break the internet if we implement even these very modest reforms.

Meanwhile, these tech platforms generate revenues that dwarf the economies of most nations. So, how do they make so much money? They do it by compromising our data and privacy, and keeping our children's eyes glued to the screens through addictive algorithms. Indeed, in one recent study, 46% of teens reported that they're online "almost constantly." This has had severe mental health consequences for adolescents. It has also led to a rise in sexual exploitation, as some algorithms have actually connected victims to their abusers.

Should such tech platforms be allowed to profit at the expense of our children's privacy, our children's safety, and our children's health? Should they be allowed to contribute to a toxic digital ecosystem without being held accountable? I believe to everybody, the answer is very clear. When these platforms fail to implement adequate safety measures, they're complicit in harms that follow, and they should be held accountable.

That said, there are some signs of encouragement. Just as new technologies are being developed that exacerbate harm to children online, so too are technologies being developed to combat exploitation. As one example, with AI rapidly evolving, open-source safety tools are being developed to recognize and report CSAM. Some of the witnesses here today will speak to the effectiveness of these tools.

Additionally, on a committee with some of the most diverse viewpoints in the United States Senate, we have actually advanced bipartisan legislation that addresses legal gaps in our current framework, especially those related to the blanket immunity that Section 230 provides. Last Congress, for example, we reported several online safety bills out of committee with overwhelming bipartisan support, and there are a number of bills being considered and refined this Congress, which we'll give attention to in due course.

That being said, we can't come up with a wise and effective legislative solution without first understanding the nature and scope of the problem. And so, that's why we're having this hearing today. Our witnesses come from various backgrounds and represent very diverse perspectives, all of which point to the need for our committee to improve legislation and continue our work to keep kids safe.

So with that, I'll open things up to ranking member Durbin to give opening remarks. After that, we'll hear from Senators Blackburn and Klobuchar. Then I'll introduce the witnesses and swear them in. Go ahead.

Sen. Dick Durbin (D-IL):

I want to personally thank you, Senator Grassley. This is unusual, a change in leadership in this committee, and yet an issue which we took up very seriously in the last few years on a bipartisan basis has survived the change, and in fact, this hearing is evidence of the determination of the chairman. I'd like to join him in that assurance that we're taking this issue very seriously.

It was almost exactly two years ago that this committee held a similar hearing. We heard from six witnesses about the harm social media does to our kids and grandkids. A mom whose son took his own life after he was bullied online. A young woman whose mental and physical health suffered as she chased the unattainable lifestyle depicted on Instagram and other apps. Experts who told us how Big Tech designs their platforms to be addictive, keeping users online for longer and longer so they can be fed more targeted ads. Individuals combating the tidal wave of child sexual abuse material, or CSAM, flowing across the internet.

At the end of that hearing, I told the witnesses and the many parents and young people in the audience that I was going to roll up my sleeves, get to work, and pass legislation to protect kids online. That spring, the committee reported five bills that help protect kids online. It included my Stop CSAM Act, and I want to thank Senator Hawley for joining me in that effort, which we hope to renew soon, along with bipartisan bills from Senators Graham, Blumenthal, Klobuchar, Cornyn, Blackburn, and Ossoff.

These bills were reported out of this committee unanimously. For anyone who's a newcomer to Capitol Hill or to this committee, you have the American political spectrum from one end to the other on this committee, and for us to do anything unanimously is nothing short of a political miracle. We did it. The Senate Judiciary Committee contains members across the spectrum, the most conservative Republican to the most progressive Democrat. It's almost unheard of to pass a bill unanimously, yet we did it five times.

One of these bills, the Report Act, later signed into law by President Biden, strengthened the CyberTipline, run by the National Center for Missing and Exploited Children. As for the rest, different story. Big Tech opened up a $61.5 million lobbying war chest to make sure these bills never became law.

Now, let's be clear, none of these bills are the silver bullet that would make the internet completely safe for our kids, but they would be significant steps towards finally holding tech companies accountable for the harms that they caused, the damages they caused, the death that they caused. And that's why the tech companies opposed them as strongly as they did. They didn't do it publicly. Publicly, oh, it's such a great idea, but privately, they just beat the hell out of us.

So just over a year ago, I called in the CEOs of the five major tech platforms, some of whom I had to subpoena, to demand answers on the record under oath. And that hearing produced some results. Several companies implemented child safety improvements just days before their CEOs came to testify, and Meta's CEO, Mark Zuckerberg, under pressure from Senator Hawley's artful questioning, gave a long overdue apology to the parents his platform had hurt.

But apologies and too little, too late reforms are simply not enough. The dozens of parents and survivors in that room, the thousands impacted across the country, demand more, and I, for one, plan to follow through. In the coming weeks, Senator Hawley and I will reintroduce the Stop CSAM Act. This bill will finally open the courthouse door to families whose children have been victimized due to Big Tech's failure to safeguard their online platforms. I hope Senator Grassley will help me schedule a timely markup on that bill.

And this week, I'll join Senators Graham, Whitehouse, Hawley, Klobuchar, and Blackburn to introduce a bill to sunset Section 230 of the Communications Decency Act in two years. This is long overdue. Section 230 and the legal immunity it provides to Big Tech have been on the books since 1996, long before social media was part of our lives. To the extent this protection was ever needed, its usefulness for this so-called fledgling industry has long since passed. I'm under no illusion that it'll be easy to pass legislation to protect kids online and finally make the tech industry legally accountable for the damage that they're causing, but they ought to face the same liability as every other industry in America.

Just last year, Big Tech and its allies in the House killed a bill, the Kids Online Safety and Privacy Act, a bill introduced, I believe, by Senators Blumenthal and Blackburn, that would have imposed a basic duty of care on tech platforms. It passed the Senate 91-3, but Big Tech killed it in the House. Couldn't even get it up for a vote.

The National Center for Missing and Exploited Children receives 100,000 reports to its CyberTipline every single day. That's not just a statistic. Each of these reports involves a victim. It could be anything from images of a toddler being raped, to a teenager being coerced, extorted, groomed, and encouraged to commit suicide. 100,000 reports in the United States every single day. I hope everyone keeps that in mind as we hold this hearing. We cannot wait. We have to move. I hope it drives them, the public to demand Congress finally do something.

Sen. Chuck Grassley (R-IA):

Senator Blackburn.

Sen. Marsha Blackburn (R-TN):

Thank you, Mr. Chairman. And I want to say thank you to our witnesses for being here today, and Mr. Guffey, we appreciate that you are here and sharing your story.

Mr. Chairman, you mentioned that it was over a year ago that we had tech execs in front of us and that nothing much has changed. That is the tragic part of this situation, that nothing much has changed. There's been window dressing, there have been ads that have been run saying, "Look at us, look at what we're doing." But unfortunately, there is no enforcement to this.

That is why it is still dangerous for kids to be online. They're still facing online threats, exposure, sexual exploitation, drug trafficking, promotion of suicide, eating disorders. And the thing that is so interesting is, in the physical world, there are laws against this. It is only in the virtual space that it remains the Wild West and our children can be attacked every single day, nonstop, 24/7/365.

It is long overdue, and the Kids Online Safety Act that Senator Blumenthal and I have worked on for years now has been mentioned already this morning, and there is such a broad bipartisan coalition, whether it's parents, principals, teachers, pediatricians, child psychologists, even teens themselves have come to us and have said, "Something needs to be done about this." We have had companies like Microsoft, X, Snap, who have supported this bill.

Unfortunately, kids are still being harmed online. I talked to a mom recently whose child died. They met somebody online who sold them supposedly a Xanax. They met him on Snap. They took what they thought was a Xanax and they died. It was fentanyl. So, these are the dangers that are there, and while there is broad bipartisan support, Senator Grassley mentioned the lobbying efforts of some of the Big Tech firms and how they went with distortions and lies to the House, and this bill did not get through. So, it is time to stop this and get it passed.

Now, Senator Durbin mentioned the bills we sent out of committee here last year. There was one that got signed into law, and it was the bill that Senator Ossoff and I did, the Report Act. And this deals with NCMEC's CyberTipline and increases the time that evidence submitted to NCMEC has to be preserved, and it gives law enforcement more time to investigate to get these criminals into court and then get them locked up. And we still have so much work to do.

Now, Senator Klobuchar and I are going to lead the Privacy Technology and Law Subcommittee, and these issues will be coming before us. We've got plenty of work to do. We're looking forward. Mr. Chairman, I look forward to convening this committee, working to make certain that we are pushing this legislation, that we are going to protect our children in the virtual space. Thank you, Mr. Chairman.

Sen. Amy Klobuchar (D-MN):

Well, thank you so much, Mr. Chairman, and I am truly looking forward to working with Senator Blackburn on this important subcommittee. As many of you know, Senator Lee and I chaired the Antitrust Subcommittee for a long time, but I actually think this situation right now with the possibility of moving on these bills is going to be a very positive development.

As Senator Blackburn just pointed out, despite the strong support that we have had from Senator Durbin and Senator Grassley and Senator Graham when he chaired this committee, or was the ranking member on this committee, we've just continued to run into roadblocks to passing these laws, and it's getting absolutely absurd. Senator Grassley's well aware of the antitrust tech bill that he and I lead, that hundreds and hundreds of millions of dollars are spent against it in TV ads, and despite the fact that the companies, FANG, as we call them, have agreed in other countries to some of these consumer protections, that did not happen in America.

And I think that this piece of it, of whether it's Instagram's promotion of content that encourages eating disorders, frightening rise of non-consensual AI-generated pornographic deepfakes, or the tragic stories of kids losing their lives to fentanyl-laced pills will most likely be leading the way as we continue to push our antitrust and privacy and news bills.

Just this month, this committee heard from Bridget Norring of Hastings, Minnesota. Her son, Devin, was struggling with migraines and bought what he thought was a Percocet over Snapchat to deal with the pain, but it really wasn't a Percocet; it was a fake pill laced with fentanyl. And with that one pill, as we say, one pill kills, he died at age 19.

For too long, the companies have turned a blind eye when young children joined their platforms, and used algorithms that pushed harmful content. They have done that and provided a venue for dealers to sell deadly drugs like fentanyl. We know that social media also increases the risk of mental illness, addiction, exploitation, and even suicide among kids. I will never forget the testimony of the FBI director telling us that in just one year, I believe it was 2023, over 20 kids had committed suicide just because of the pornography and the images that had been put out there, when they were innocently sending a picture to someone they thought was a girlfriend or a boyfriend.

That's why this committee has taken this on on a bipartisan basis, and I'm hopeful that this hearing will be the beginning of actually passing these bills into law. Representative Guffey, you and I met through Senator Cruz and the bill that he and I have, the Take It Down Act. We have an additional bill that Senator Cornyn and I have, the SHIELD Act, which is really important and has passed through this committee. And as you know all too well, the threat of dissemination alone can be tragic, especially for kids.

We need to enact the Kids Online Safety Act, which, thanks to Senators Blumenthal and Blackburn, has passed the Senate on a 91 to 3 vote, though as we know, some of these bills have stalled out in the House. We need to get the federal rules of the road in place for safeguarding our data. According to a recent study, social media platforms generated $11 billion in revenue in 2022 from advertising directed at kids and teenagers, including $2 billion in ad profits derived from users age 12 and under.

I am supportive, as was mentioned by Senator Durbin, of the legislation that he and Senators Graham, Hawley, and many others have introduced to open the courtroom doors to those harmed by social media by making those reforms to Section 230. That legislation was enacted long before any of this was going on, and somehow with respect to other industries, we've been able to make smart decisions to put more safety rules in place, just as those passengers on that flight that flipped upside down in Toronto were in seats that were the result of safety rules that were put in place. And yet when it comes to this, we just put up our hands and say, "No, they're lobbying against us, or they have too much money, or we like some of the people that work there," and we do nothing. And by doing nothing, instead of reaching some reasonable accommodations or settlements or things we can do on legislation, we just let them run wild at the expense of our kids' lives. Thank you.

Sen. Chuck Grassley (R-IA):

When you consider five bills got out of this committee last Congress, and over the last few years, Congress has only been in session about two and a half days a week. It's supposed to be a new regime; I'm not sure that it is. And I would hope that some of you folks on the Democrat side would push Republicans to make sure we keep the Senate in session more than two and a half days a week so we can get some of this done, because we had hardly any important legislation the last two years. We were basically just a confirming body. Take that however you like; I hope you enjoyed that like I enjoyed complaining that we were only meeting two and a half days a week when Democrats controlled the Senate. Now I'm going to introduce our first witness, Mr. Brandon Guffey.

Rep. Brandon Guffey:

Yes sir.

Sen. Chuck Grassley (R-IA):

You now serve in the South Carolina House of Representatives. Following the tragic loss of your son, Gavin, you became an advocate for mental health awareness and combating online crimes. And as Senator Blackburn said, we're sorry for the loss of your son, Mr. Guffey. It's probably hard for you to be here to talk about it, but thank you for being here.

Next we have Ms. Carrie Goldberg, plaintiff's attorney, founder of the law firm, CA Goldberg PLLC. She specializes in representing victims of sexual abuse, child exploitation, online harassment, and other forms of digital abuse.

Professor Mary Leary, a former federal prosecutor, current law professor at Catholic University of America. Professor Leary directs the Law School's Modern Prosecution Program and her scholarship focuses on exploitation of women and children. Professor Leary has an upcoming article that dives deeply into the history of Section 230, and its role in facilitating child sexual abuse material.

Mr. John Pizzuro is CEO of Raven. Started by former law enforcement, Raven gathers subject matter experts across multiple disciplines to help protect children from online exploitation. Mr. Pizzuro is a former commander of the New Jersey Internet Crimes Against Children Task Force Program.

Mr. Stephen Balkam, CEO and founder of the Family Online Safety Institute. This international non-profit is dedicated to making the internet safer for kids. Before founding the institute in 2007, Mr. Balkam spent 30 years as a leader in the non-profit sector championing online safety. His work at the institute brings together those in government, industry, and the non-profit sector to create a culture of responsibility.

Now, I'd like to ask you to stand and be sworn. Raise your right hand. Do you swear or affirm that the testimony you're about to give before this committee will be the truth, the whole truth, and nothing but the truth, so help you God?

All:

Yes.

Sen. Chuck Grassley (R-IA):

They have all answered in the affirmative. Mr. Guffey, we'll start with you and go from my left to my right.

Rep. Brandon Guffey:

Thank you, Mr. Chairman, distinguished senators. Thank you for the opportunity to testify today. My name is Representative Brandon Guffey, and I'm here to share why protecting youth from online dangers and holding Big Tech companies responsible is now my life's mission. Sometimes God sends you down a path that you never thought you would be on. In July of 2022, I lost my oldest son, Gavin Guffey, to suicide. On Gavin's last Instagram post, a week prior to his death, he said, "This week helped me look up to where my head should have been years ago. Jesus, in His word, has given me a high that no other can compare to His love." He ended that post with a less than three sign (<3).

On July 27th, Gavin would send out the less than three sign again on a black screen to his friends and his younger brother, Cohen Guffey, who's with me here today. At 1:40 AM, Gavin took his life. We quickly learned that Gavin was contacted on Instagram around midnight, at which point he told his friends that he would jump off the game to chat with her. And in just one hour and 40 minutes, my son was gone. The predator that contacted Gavin was extradited to the US two weeks ago from Lagos, Nigeria. The predator not only attacked my son Gavin, who was 17, but also began to extort my 16-year-old son, my 14-year-old cousin, and then myself. One of the messages I received read, "Did I tell you that your son begged for his life?"

I hope you ask, how is this possible? It's possible because Instagram removed the profile that attacked Gavin, but left up the additional profiles that predators use. One of those is the account that began to attack my family after Meta was fully aware of this predator. I vowed from that moment that I would make it my life's mission to protect children online and would not stop. I was shortly elected to the South Carolina House, and within four months of taking office, successfully passed what is now known as Gavin's Law. Sextortion education is now mandated throughout the state of South Carolina, so every kid at least has some awareness and doesn't feel alone like my son did that night.

I've worked with many states on similar legislation. I started a nonprofit speaking to teens about mental health and the dangers from Big Tech. I filed a lawsuit against Meta in January 2024, sold my businesses, and went to work for a tech company that provides tools to protect children. I've also become an advocate on the Hill urging members to see this for what it is, and that is the greatest threat to the next generation. In the two years of my advocacy, I've seen Big Tech's lobby fight us every inch, and Congress cave instead of listening to we, the people. I witnessed KOSA pass the Senate 91 to 3, then go to the House, where the speaker refused to let it be heard.

Senators Graham and Durbin have the Defiance Act. Senators Blackburn and Coons, the No Fakes Act. These are great bills; I've even taken notes and reintroduced them on a state level in South Carolina, like the ELVIS Act in Tennessee. Senators Cruz and Klobuchar led the Take It Down Act, which has already passed the Senate. I'd like nothing more than to be proven wrong about the inefficiency of Congress by having the House pass the Take It Down Act soon. Speaker Johnson, Chairman Guthrie, the ball is in your court with a bill to protect American lives. Please don't let us down again.

I've witnessed over 40 teens take their lives since Gavin, just due to sextortion, while we as lawmakers fight amongst ourselves. Will it take one of your own children or grandchildren to finally get fed up enough to move? Sextortion is only one of the many harms done to our children due to Big Tech's lack of accountability. Big Tech is the big tobacco of this generation. We see groups such as NCMEC and NCOSE give statistics over and over. We see parent survivors knock on your doors daily, and Section 230 will go down as one of the greatest disasters, allowing Big Tech to run rampant without repercussions.

We watched companies spend millions lobbying, fighting us in court, and continuously absolving themselves of responsibility. In this very chamber last January, I stood holding a photo of Gavin while Mark Zuckerberg offered a forced, pathetic apology. Where I'm from, we have a saying: "Don't talk about it, be about it." And until these companies can be held responsible for the billions they make off of advertising to our children, Big Tech will simply never be about it. I use this as an example: Meta pulled down 63,000 accounts in one day, in one country, just from Lagos, Nigeria, and just off of Instagram. Now, ask yourself, did they pull those down to actually help our children? And if so, why haven't they done more since? Or did they pull them down for a PR stunt? I beg to say that it was nothing more than a PR stunt, so they can get that pat on the back as if they are doing something good, but they have done nothing since.

I got way off script. But I want to focus on my main message: to Big Tech, as lawmakers, I think we have to say either get in line or get offline. And right now we have too many politicians making decisions based on their next election, and not enough leaders making decisions based on the next generation. Are we politicians or are we leaders? We can't just talk about it, we have to be about it. And if we can't protect our next generation, then what are we even fighting for? Tomorrow needs you, and our children need you now.

Sen. Chuck Grassley (R-IA):

Thank you, Mr. Guffey. Now, Ms. Goldberg.

Carrie Goldberg:

Chair Grassley, Ranking Member Durbin, and distinguished members of the Senate Committee on the Judiciary. My name is Carrie Goldberg, and I'm a lawyer who represents families catastrophically injured by Big Tech. I want to tell you about a few of the cases I've been working on for the past decade. I'm the originating attorney in a case against Snap where our clients' children were matched with drug dealers and sold counterfeit fentanyl-laced pills that killed them. The case now has 90 families in it from all over the country, including families that you heard from last week. And I'm also joined by my client Amy Neville, the mother of 14-year-old Alexander Neville.

In another case against Snap, criminals are exploiting a known security vulnerability to access CSAM and blackmail and extort kids with it. Yesterday the Ninth Circuit dismissed one of my cases, representing a severely autistic boy who at age 15 was funneled into Grindr's marketing campaign and was recommended to four different pedophiles who raped him over four consecutive days. In court, Grindr's lawyers said that they had no duty to restrict children's access to their hookup app.

In another case of mine, a 13-year-old, LS, was lured to the site BandLab, another site with no age restrictions. She thought she was meeting a 17-year-old boy, but it turned out to be 40-year-old Noah Madrano from Portland, Oregon. He posted songs openly on this music-sharing platform, including one called "Pedophile in a Minor." On June 24th, 2022, Madrano drove 15 hours to her home, abducted her on the way to school, stuffed her in the trunk of his car, and raped and abused her for eight days. Despite there being a national manhunt, BandLab refused to provide law enforcement with key information that could have led to her fast rescue. They wanted to respect Madrano's privacy, they said.

Finally, I represent the family of 16-year-old Aidan Walden from Colorado, who in July 2020 discovered a website that glorifies suicide, and learned on that website about a product that he could buy from Amazon, have Prime delivered to him, and use to end his life. Two months later, his grieving mother exchanged 57 messages with Amazon telling them about their product being amplified on a suicide message board. And yet Amazon, despite knowing there was no household use for this product besides suicide, continued to promote, sell, and deliver it for 26 more months. I now represent 27 other families who bought it after Amazon sold it to Aidan Walden and heard from his mother.

In all of my cases, tech has two main defenses: Section 230 and that they didn't know. Now, I was here a year ago with my clients, including Amy, when this committee so powerfully told the CEOs of Meta, Twitter, Discord, Snap, and TikTok that you were done with discussions and you wanted solutions. The most important thing I can say is that families want legislation like KOSA, sunsetting Section 230, the Defiance Act, SHIELD. They want laws that increase accountability, that create protection boards at the FTC and the CyberTipline, that create procedures to contest a platform's failure to remove CSAM. They want injunctive relief. And families want civil remedies against platforms when they've increased the risk of harm.

Now, take for example my case representing AM, one of the first cases to overcome Section 230 on trafficking and product liability. At age 11, AM was living a normal life in a normal town in Michigan when she went to a sleepover and discovered a website called Omegle, which matches strangers for private live-streaming. Omegle matched her with a man who made her his online sex slave for three years, extorting her, keeping her at the beck and call of him and his friends to perform for them, sometimes interrupting her at the dinner table or at school, even forcing her to go back on Omegle to recruit more kids. The abuse eventually ended when his home, which he shared with his wife's daycare, was raided and images of AM and other young girls were found.

In that case, Omegle did not intend my client's injuries. I could not claim that they knew who she or the offender was. Instead, I pointed to the mountain of evidence that Omegle had knowledge of how prevalent the harm was on its platform. I pointed to criminal cases, articles, exposés, academic journals. I just want to say two more things. As a result of how we pled the case, we advanced into discovery and acquired 60,000 documents exposing the extent of injured children, and that led to them agreeing to shutter Omegle forever on November 8th, 2023.

Now, we are at a consensus today, we are all here to not repeat history. Section 230 was supposed to incentivize responsible content moderation, instead it did the opposite. And as we look into the future, on behalf of the victims I represent, we are here to support laws that pressure platforms to know about the harms and to fix them. Thank you and I look forward to questions.

Sen. Chuck Grassley (R-IA):

Now, Professor Leary.

Mary Graw Leary:

Thank you, Chair Grassley, Ranking Member Durbin, and all the members of this committee. As has been mentioned, I'm really grateful for all the work this committee has done on this issue. The experience our children are having in the digital space is one fraught with danger for them. And one might want to ask, why do you have to work so hard? Why do you have to keep passing these laws? Why aren't the laws that Congress has had on the books regarding exploitation crimes working? And there are lots of answers to that, to be sure, but the common thread through this morning so far is Section 230 of the Communications Decency Act, which has been transformed into what I label a de facto near-absolute immunity regime. And what I mean by that is exactly what Ms. Goldberg just said: this was a law that was designed to incentivize platforms toward protection, and instead, it has incentivized them to harm.

I want to make about five points that I think will help frame our discussion about Section 230. The first two are what I call framing principles. When one reviews the text, the history, and the structure of Section 230 of the Communications Decency Act, it is clear that this is not a stand-alone law protecting freedom of the internet, as tech and its surrogates will try to argue; it is a law born out of a landscape of child protection. When you go back to the legislative history, there is no question that the Senate with the Communications Decency Act and the House with the Internet Freedom and Families Act were wrestling with the same question: as you look to the Telecommunications Act, how could you as Congress have a safer internet and other media for youth? Not whether, but how. That's the first point.

Second point: Section 230 of the Communications Decency Act must be regarded as an experiment. And I say that because when you look at the promises tech made back in 1996, and when you look at the supporters of the Internet Freedom and Families Act in the House, what you see is that they represented to you and to America that this would be a way in which we could protect our children. That was their claim; that was the promise.

Point number three: the experiment has failed. The experiment has failed for all the reasons that have been said already. And why has it failed? I would say to you, in addition to the reference to what happens here on Capitol Hill with regard to tech, the transformation of Section 230 of the Communications Decency Act into a law that incentivizes harm was not by accident; it wasn't something that just emerged from the internet. It was a systematic effort by tech and its surrogates to litigate throughout this country. And they went across the country over 30 years arguing not for the narrow, limited protection for good Samaritans that the Act states, but rather for broad immunity. Interestingly, as a side note, the word immunity is nowhere in Section 230 of the Communications Decency Act.

And that result has had human consequences, which we've heard today. To highlight one that's been said: 99,000 to 100,000 reports will come in today on the CyberTipline. But it also has important effects in the courtroom that Ms. Goldberg alluded to, and I want to highlight a couple of them. One is, keep in mind, this has become an immunity, not a defense, and that is essential for two important reasons. First, as an immunity, these cases are thrown out at a motion to dismiss, so there's no access to discovery. So when we say that victims, survivors, and states' attorneys general are shut out of the courtroom, we don't mean it's very hard to win these cases. We mean they are shut out of the courtroom; they do not have their day in court, notwithstanding the harm that they've experienced. And I label this reality the dual danger of de facto near-absolute immunity.

First, that shield, which has allowed platforms to engage in a list of criminal activities having nothing to do with publishing, has allowed this industry to grow to a massive scale where one individual or one small company can cause massive harm, as we've heard. But the other part of that dual danger is that because it is an immunity, there is no access to discovery. There is no way to look under the hood of this incredibly dangerous industry; there are no guardrails. And that means, as Senator Klobuchar pointed out and as I wrap up, that there are no guardrails against the harm that these folks will experience.

So I offer some suggestions for reform in my papers, but I think the key thing here is to keep the Good Samaritan protections that Section 230 has, but to get rid of the C-1 protections that have so distorted this incentivization toward harm. And I would just encourage the Senate to listen to the words of Justice Thomas, who has lamented the reality of Section 230 and stated, "Make no mistake about it, there is danger in delay." And we can do the math on that danger. If we accept that 99,000 reports to NCMEC will be made today, that means that 12,375 reports will come in during this hearing, and in the last five minutes I've spoken, there have been 344 reports. And if that's not reason enough to act, I don't know what is. Thank you, Chair.

Sen. Chuck Grassley (R-IA):

Thank you, Professor. Now Mr. Pizzuro.

John Pizzuro:

Chairman Grassley, Ranking Member Durbin, and distinguished members of the Senate Judiciary Committee, thank you for the opportunity to testify today. As the CEO of Raven, an organization dedicated to transforming the nation's response to child exploitation, I am here to urge decisive legislative action. Despite multiple testimonies before Congress, progress has been slow, hindered by special interest groups and financial incentives that favor the status quo. We must prioritize our children's safety and support those who protect them above all else.

New threats continue to emerge while old ones remain unaddressed. Artificial intelligence now enables offenders to manipulate regular images of children into explicit content, create images of children who do not even exist, and groom children en masse. Offenders now increasingly exploit children for financial gain in addition to their depraved sexual gratification, yet legislative inaction allows this crisis to persist. The tech industry has not meaningfully reduced online victimization; its voluntary cooperation with law enforcement is minimal, allowing offenders to continue exploiting children with impunity.

In 2023, for example, there were 36 million cyber tips, yet Apple, holding a 57% market share in the US, reported only 275. According to investigators in the field, Discord notifies users of legal process and subpoenas, enabling offenders to erase evidence before law enforcement can act and allowing them to continue to target our children. Electronic service providers permit offenders to rejoin platforms under new aliases with the same IP address while failing to block foreign IP addresses used for sextortion. This lack of enforcement emboldens criminals and leaves our children unprotected. Poor moderation, the lack of parental controls and age identification, and inadequate safety measures further expose children to these dangers.

As mentioned a lot today, social media algorithms push harmful content, enabling predators to reach victims globally. AI-powered grooming will allow offenders to manipulate children at scale, mimicking their language and behaviors to establish trust. Troublingly, even ChatGPT-like tools can provide information on grooming tactics when framed in seemingly innocent ways. These dangers extend beyond child exploitation to drug access, with platforms facilitating the sale of fentanyl and other illicit substances. Law enforcement is overwhelmed and under-resourced. Undercover operations have been highly successful in apprehending offenders, but the increasing volume of cyber tips has made proactive investigations nearly impossible.

In the US, there are 229,000 IP addresses right now trading peer-to-peer images of known child sexual abuse material, yet only 923 are actually being worked. Studies indicate that over 50% of those individuals are hands-on offenders with 8 to 13 victims each. The mental toll on those who investigate these crimes is severe. Prosecutors, child advocates, and law enforcement officers are exposed daily to horrific content, leading to burnout and PTSD. We must provide them with adequate wellness resources to ensure they can continue their critical work.

As a retired New Jersey State Police Commander, I have seen first-hand what can happen. Despite its critical role, the Internet Crimes Against Children (ICAC) program has been chronically underfunded, even though it is responsible for most child exploitation investigations in the US. While authorized at $60 million in 2008, only $31.9 million has been appropriated; that's $522,000 per task force per year to investigate child exploitation. That's why I urge everyone here to co-sponsor the Protect Our Children Reauthorization Act of 2025.

Children are our most valuable resource, and their victimization has lasting consequences for society. Raven stands ready to collaborate with members of the Senate, the House, the Trump administration, and the CEOs of Big Tech to develop effective solutions. Quite frankly, the phrase "talk is cheap" is 100% accurate; action is the only remedy. How many of our children and those who protect them will be impacted as a result of our inaction and debate? Make no mistake: right now offenders are winning, children are suffering, and those fighting to protect them are left to struggle without the support they need to rescue victims, hold offenders accountable, and bolster their own mental health in the process. Legislative action is overdue. The solutions are within your power. Our children are counting on you, and I'm counting on you. Thank you so much.

Sen. Chuck Grassley (R-IA):

Mr. Balkam.

Stephen Balkam:

Good morning, Chairman Grassley, Ranking Member Durbin, and distinguished members of the committee. Thank you very much for the opportunity to speak with you today. My name is Stephen Balkam, and I'm the founder and CEO of the Family Online Safety Institute. For nearly two decades, FOSI has worked with industry, government, academia, and the nonprofit sector to create a safer digital world for children and families. I'm also here as a father and a newly minted grandfather. Chairman, this is my third time testifying before this committee, having first appeared in July of 1995 at a committee hearing called "Cyber Porn and Children." While much has changed, our mission remains the same. We believe in a three-pronged approach to online safety: enlightened public policy, industry best practices, and good digital parenting. Our goal is to create protections for kids as well as empower young people to navigate digital spaces safely and responsibly.

We want to protect kids on the internet, not from it. Parents of younger children should have the strongest protections possible, including easy-to-find and easy-to-use parental controls. But as kids grow, our role as parents shifts from being helicopter parents to co-pilots, guiding them as they build digital resilience. Research shows that teens value online safety tools like blocking, muting, reporting, and privacy settings; teaching them to use these effectively fosters independence and self-regulation. We have found that empowerment is often the best form of protection. We must prepare young people to engage safely and thoughtfully with the digital world, equipping them with digital literacy and an understanding of their rights and responsibilities. Now, recently there have been calls to ban young people from social media and other online spaces. Blanket bans deprive children of any positive experiences they may have, are difficult to enforce, and open up too many possible unintended consequences.

After all, children have rights, including the right to safely access the web, to information, free expression, and to connect with others. Instead of blanket bans, we need thoughtful restrictions that include input from young people and that account for children's evolving maturity. While technical solutions such as age assurance are improving, there is no universally approved system as yet. It is challenging to get the balance between safety, privacy, and effectiveness right. And as I said recently at our annual conference in front of 350 industry leaders, "You can and must do better to create easy-to-find and easy-to-use controls for parents and online safety tools for teens and young people. You can and must do better to publicize and promote those controls and tools and you can and must do better to collaborate with each other to harmonize your tools across the ecosystem so that parents and teens are not overwhelmed with the task of setting and managing controls across countless apps, games, websites, and social media platforms."

In the meantime, Congress has taken some important steps in this space, passing COPPA 27 years ago and the CAMRA Act three years ago, which funds essential research on children's development and well-being, but there's still much more work to be done. Federal action is critical because states are now beginning to fill the gaps with their own online safety laws. Unfortunately, even the most well-intentioned laws often face legal challenges and create a fragmented regulatory landscape. A strong federal framework would provide clarity while allowing states to build upon it. So, Congress has the opportunity to lead with balanced and thoughtful policies, including passing a comprehensive data privacy law; funding ongoing research to inform evidence-based policymaking; prioritizing specific, targeted bills like the Take It Down Act and the Kids Off Social Media Act; encouraging industry cooperation to simplify parental controls and online safety tools; rejecting blanket bans in favor of thoughtful restrictions that include young people's input; and, critically, supporting digital literacy programs to build resilience in young users.

So, to conclude, let us challenge ourselves to reimagine what online safety can look like, not just as a range of restrictions, but as a foundation for resilience, confidence, and opportunity. Thank you and I look forward to your questions.

Sen. Chuck Grassley (R-IA):

Thank you all for your testimony. We'll have five-minute rounds of questions. I'm going to start with Mr. Pizzuro. AI has opened up new possibilities for bad actors to generate novel forms of CSAM. In fact, one recent report found that over 3,500 AI-generated CSAM images were posted in a single dark web forum over a nine-month period. Could you explain the challenge AI-generated CSAM poses for law enforcement and tech companies?

John Pizzuro:

Well, there are a couple of things. One, right now it's going to be hard to tell the difference, especially without forensic software, between what is AI and what isn't. Secondly, I could just take my phone now, take a picture of a senator, and then age-regress them. For example, you can be a forty-year-old male; I can now make you a 21-year-old female, and now I can make you a ten-year-old girl. And with AI within these apps, I can then nudify those images. So I think that's the challenge with AI, especially with sextortion. I don't even need to groom someone right now; I can just get an image off the clear web in order to do that. So that's going to be the complexity, and the challenge is going to be how we determine who is a real victim and who's not in a lot of instances.

Sen. Chuck Grassley (R-IA):

Thank you. Professor Leary, 230... Well, first of all, I heard your five points, so I'm not asking you to repeat any of them, but how would you advise reforming Section 230 in light of the current online ecosystem?

Mary Graw Leary:

Thank you, Senator. Well, first, as I say, when we talk about Section 230 and the provisions that tech points to, as this committee well knows, those are C-1 and C-2. C-2 is the Good Samaritan provision, and I would recommend that that stay in place. That incentivizes a platform to remove harmful material from its platform without being sued. The C-1 part of the statute should be removed. As has been pointed out by so many of you in your opening statements, it serves no purpose, if it ever did. Now, a myth has been created about it, that it somehow created the internet and that somehow the internet will break without it, and that's just simply not true. And if it ever was true, this is no longer a fledgling business that needs that kind of support. Instead, it needs to be treated like every other business.

Another important thing I would encourage the Senate to do with Section 230 is to listen to the National Association of Attorneys General, which has repeatedly written and asked Congress to include in it the ability for them to enforce their state laws, something courts have ruled they cannot do when tech has argued for an expansive interpretation of Section 230. In my mind, that is another courthouse door that is closed. It's a states' rights issue, and the entire architecture of combating exploitation of our children involves prosecution, protection, and prevention. And within that are multiple pressure points, including civil litigation, state prosecution, and federal prosecution. That, I think, would be an important amendment.

Sen. Chuck Grassley (R-IA):

Mr. Pizzuro, obviously it has taken a long time, and it may take longer still, for Section 230 to be reformed, and putting more things in could slow the process up. Beyond that reform and liability for Big Tech, what steps could companies take to protect children online today?

John Pizzuro:

Well, one of the things is what they already know. For example, as I mentioned in my testimony, Discord notifies users when they get legal process. Those are internal policies. They know what the associated IP addresses are, because if I get banned, I just create a new username from the same IP address. There are also IP addresses beyond the scope of the US from which children are targeted here. So these are things the companies actually know and can do something about.

Sen. Chuck Grassley (R-IA):

Professor Leary, the bills reported out of committee last year would impose liability on platforms for knowingly promoting CSAM, and others for recklessly promoting CSAM. As we continue workshopping bills in this committee, do you believe we should pursue a recklessness standard or a knowing standard, and what are the pros and cons?

Mary Graw Leary:

Thank you. I absolutely believe a recklessness standard is superior to a knowing standard. And again, Senator Durbin referred to the red herrings. One red herring that's out there is that recklessness is some very low standard that will somehow expose these businesses to an onslaught of litigation. A couple of comments on that. First, most businesses function having to act responsibly, and they often face a negligence standard. To anybody who says that recklessness is an easy standard to meet, I invite you: please come to my criminal law class and meet my criminal law students, who will be able to tell you the definition of recklessness, and they will tell you that it is challenging. Specifically, it is a conscious disregard of not just a risk, but a substantial and unjustifiable risk.

That is the definition of recklessness in the criminal context, and it can be used in other contexts as well. It requires not just an objective measure, but a level of subjectivity. It's sometimes referred to as risk creation. So that kind of standard is hardly a walk in the park for litigants. It is still quite challenging, and that's why it is a far better standard than knowingly, in my opinion.

Sen. Chuck Grassley (R-IA):

Okay. Senator Durbin.

Sen. Dick Durbin (D-IL):

Representative Guffey, thank you for coming back. I'm sorry for the circumstances that bring you here, but it shows real courage, and I know your family and friends have joined you-

Rep. Brandon Guffey:

Thank you.

Sen. Dick Durbin (D-IL):

... in coming here today. I recall the first time we met after a hearing a year or so ago before this committee, so thank you very much.

Rep. Brandon Guffey:

Thank you.

Sen. Dick Durbin (D-IL):

Mr. Chairman, was it a week or two weeks ago we had a hearing on fentanyl?

Sen. Chuck Grassley (R-IA):

Yeah.

Sen. Dick Durbin (D-IL):

Yeah.

Sen. Chuck Grassley (R-IA):

Last week.

Sen. Dick Durbin (D-IL):

And we had another parent of a victim who ordered what he thought was a Percocet; it turned out to be laced with fentanyl, and it took his life. So this is a life-or-death proposition that we're dealing with here, and you've lived it and are living it still. I think we ought to keep it in that context. Professor Leary, I am struck by one of your statements to the committee, that this notion of treating 230 as an immunity as opposed to a defense precludes evidence being gathered and discovery taking place. And you say in your remarks to the committee that that diminishes our knowledge of the actual goings-on at these tech companies and what they're doing and gathering. I recall what Representative Guffey said in his opening remarks: this is bigger than Big Tobacco. I know that issue. Over 30 years ago in the House, I introduced a little bill to ban smoking on airplanes.

It passed because Congress is the biggest frequent flyer club in the world, and we were sick of it. And it triggered a conversation and a discovery process, and AGs from across the country gathered together and did something significant with this industry. So I'd like you to expound a bit, if you will, on how this standard precludes our knowledge of what's actually going on in Big Tech and their response to this challenge. I think that the gathering of that information about the tobacco companies, the demonstration of their lying to the public about the safety of their product, for example, really led to their downfall. I think the same could be true here.

Mary Graw Leary:

Thank you, Senator. I think that you are a hundred percent correct on that. When a defendant has a defense, as I know the committee knows, but to be responsive to the question, there's a period of discovery beforehand.

Sen. Dick Durbin (D-IL):

Like contributory negligence.

Mary Graw Leary:

Exactly, exactly. Or things of that nature. There's a period of discovery where the plaintiffs who've made a good faith claim can get information to build their case, and the defendants can also provide information which may exculpate them. The way that Section 230 has been interpreted, it is an immunity. And so prior to discovery is when these platforms are coming into court and saying, "Judge, we don't have to defend ourselves. We don't even have to litigate this case. You should dismiss it now." Motions to dismiss, prior to discovery. The only way, I would say, that the public has learned a lot of the information about Big Tech, for example, that I believe led to KOSA and some of the other duty-of-care bills, has been through what? Congressional investigations. I'm reminded of the Backpage congressional investigation, which was a two-year investigation, or whistleblowers and hearings.

That's how we are learning this information, and only by getting this information can we then make informed choices about the appropriate legislative text. If I could just say quickly, Justice Thomas commented on this, and he underscored it when he said, "Look, let's keep in mind, if we fix Section 230..." That's not exactly what he said, but after that he said, "It would simply give plaintiffs a chance to raise their claims in the first place. Plaintiffs must still prove the merits of their case, and some claims will undoubtedly fail. But states and the federal government will be able to update their liability laws to be more appropriate for an internet-driven society."

Sen. Dick Durbin (D-IL):

If I can make one final point in the closing seconds here, going back to my analogy of smoking on airplanes and ultimately dealing with the tobacco issue in a much larger context: the initial bill that I introduced, which passed in the House, banned smoking on flights of two hours or less. People said, "What are you talking about? If it's dangerous, it's dangerous regardless of the duration of the flight." The reason was that I had a Minnesota congressman who was a chain smoker who was holding up my bill, and I went to him. He has since passed.

Sen. Amy Klobuchar (D-MN):

It's so timely that I arrived.

Sen. Dick Durbin (D-IL):

I went to him and I said, "Marty, how long can you go without a cigarette?" And he said, "Two hours." So I put that in the bill and he didn't object to it and it moved forward. There are things that we're dealing with in some of these bills, which are compromises to try to move the issue forward to make progress toward our goal. So don't assume that any language is final. It is all in flux and subject to negotiation. But thank you for joining us.

Sen. Chuck Grassley (R-IA):

Senator Lee.

Sen. Mike Lee (R-UT):

Thank you, Mr. Chairman. First, I'd like to thank all the witnesses for being here and for testifying on this important issue. These are not easy issues to talk about, and not easy in particular because of the tragic circumstances that have regrettably brought you here. Representative Guffey, I want to express my sympathy to you for the loss of your son, Gavin. No parent should ever have to go through that, and I want to commend you on your courage and the strength that you've shown as you continue to fight to protect all children.

Rep. Brandon Guffey:

Thank you.

Sen. Mike Lee (R-UT):

And, Ms. Goldberg, with what you've gone through, likewise, my heart goes out to you and to anyone else who has experienced the things that you're describing. For the past several years, I've strongly advocated for reforming Section 230 of the Communications Decency Act. And this is due to increasing concerns about how social media platforms are operating and how they're utilizing Section 230. The platforms have enabled child sexual exploitation, promoted harmful challenges to children, and facilitated drug trafficking, in many cases to minors. Now, first, I introduced the Protect Act on this point, which mandates stricter safeguards on websites hosting pornographic content. Victims of online exploitation have faced an uphill battle for years, struggling to get online platforms to remove images that were non-consensually obtained. The bill would require platforms to verify the age of, and obtain verified consent forms from, individuals uploading and appearing in content.

And the bill would require tech companies to take stronger measures to prevent the exploitation occurring on their platforms and force immediate removal of child sexually explicit material and revenge porn upon receiving notice that the content in question was uploaded without the legally required consent. Second, I introduced another bill as a complement to the Protect Act called the Screen Act. The Screen Act would require all commercial pornographic websites to adopt age-verification technology to ensure children can't access the sites' pornographic content. In the 20 years since the Supreme Court last examined this issue in earnest, technological advances have demonstrated that prior methods of restricting minors' access to pornography online were ineffective. Nearly 80% of teenagers between the ages of 12 and 17 have been exposed to pornography. This is especially alarming given the unique physiological effects that pornography has on minors, effects that are much better understood, and to a much more alarming degree, today than they were 20 years ago.

Finally, I introduced a third bill called the App Store Accountability Act, which would prevent underage users from downloading apps with pornography, extreme violence, and other harmful content while making it easier for parents to sue the gatekeepers of the content in question. Technology has advanced significantly over the last two decades. Modern age verification technology is now the least restrictive, least intrusive, and most effective means to which Congress has ready access to protect our children from exposure to online pornography. Ms. Goldberg, if it's okay, I'd like to start with you. In your view, should app stores such as the Google Play Store and Apple's App Store be held legally accountable for allowing minors access to harmful content?

Carrie Goldberg:

A hundred percent, app stores should have a duty. They are just a seller in this situation, and as we've said in our cases against Amazon, there's standards of seller negligence. So if you know that you are selling an unreasonably dangerous product, then there's liability.

Sen. Mike Lee (R-UT):

Liability, the same as there would be if you sold a tangible physical object unsuitable for minors to someone with knowledge of, or reckless disregard for, their age. Professor Leary, do you believe requiring pornographic websites to adopt age verification technology for visitors and for all people featured in pornographic images on those websites, while imposing serious consequences for uploading and hosting non-consensual pornographic content, would help children?

Mary Graw Leary:

I do think so. Obviously, with any piece of legislation the words matter, but the idea of anything that will create friction between children and their exposure to pornography is an important thing, and age verification can be one of them. I think the danger here is what I see as tech often directing things away from itself, as if there were one solution, when we have to have a multi-tiered, multi-level approach. And that's why a combination of all of the acts you've talked about, the SHIELD Act, the DEFIANCE Act, the Take It Down Act, the No Fakes Act, all together, really provides much more protection than one or two approaches.

Sen. Mike Lee (R-UT):

Thank you.

Sen. Amy Klobuchar (D-MN):

Thank you, Chair Grassley. It is wonderful to be here and to hear your incredible testimony. I got through three of you, I think, and I first want to lead with you, Representative Guffey. Watching your family behind you, I can imagine how difficult this must be and how heartfelt your testimony was. I don't know how anyone can listen to you and not want to get something done here. So I want to thank you for that. You've described how victims of these crimes often suffer from mental health trauma. Can you quickly elaborate on why even the threat of the non-consensual distribution of explicit images can be tragic?

Rep. Brandon Guffey:

The threat is the most dangerous part of it. Not even the sharing of the images themselves is as bad as the threat, because you are taking your deepest, darkest shame or your most private moment, and the threat of sending it out to complete strangers is complete vulnerability. And I believe that in this country, we've lost grace, and we too often kick people for the mistakes that they make. And we tell our kids that everything you do online will stay with you forever. Well, imagine if you just took your darkest moment and posted it online.

Sen. Amy Klobuchar (D-MN):

Exactly. Federal. Why should this be federal?

Rep. Brandon Guffey:

Well, on a state level, I can tell you from submitting and passing legislation. I have submitted things such as the Protect Act and the App Store Accountability Act. We need help on a federal level because Section 230 is causing states to go at this in 20 different directions.

If Section 230 isn't going to be fixed, and the states are fed up with how ineffective Congress has been, we're going to continue to try to go at it any and every way we can. But it would be a whole lot nicer to have a uniform code across the country instead of just protecting children in one state.

Sen. Amy Klobuchar (D-MN):

Like you might do with the airplane seat rules that I just brought up.

Rep. Brandon Guffey:

Yes.

Sen. Amy Klobuchar (D-MN):

You don't have those state-by-state. That would be very difficult.

Rep. Brandon Guffey:

Yes.

Sen. Amy Klobuchar (D-MN):

... to get any results. Mr. Pizzuro, could you talk about why it's important that Congress pass these bills to give federal law enforcement tools? As you know, Senator Cornyn and I have the SHIELD Act, which is really important and ahead of its time, and then the Take It Down Act requires the platforms to take these non-consensual images down immediately, but also makes sure that there is criminal liability for those who are posting them. Could you talk about why that helps federal law enforcement?

John Pizzuro:

Sure. The challenge becomes that, in investigating between state and federal, there are a lot of gaps. So as an investigator, there are areas where I can't successfully prosecute or don't have the actual law in order to facilitate things. Especially if you go to rural areas where, from a state perspective, there aren't really good laws, you're going to need that federal law. So what SHIELD does is fill that legislative gap in order for us to actually do our jobs effectively.

Sen. Amy Klobuchar (D-MN):

Good point. And another question on fentanyl and drug trafficking. The DEA recently found that one-third of fentanyl cases they investigated had direct ties to social media. Others, like the National Crime Prevention Council, estimate that 80% of teen and young adult fentanyl poisoning deaths can be traced back to social media. It's not just a statistic; it's actual lives lost. How does the design of algorithmic recommendations by online platforms contribute to the facilitation of drug sales?

John Pizzuro:

Well, I can tell you this: even going back to when I first started, not to age myself, when there were clone pagers and we were doing cartels, this is just the advent of technology. And with these tech companies and the AI algorithms, whatever they push, that's what kids are going to see. So it doesn't matter. One of the things I asked Meta, I asked Snap, I asked a lot of these companies: "Can you explain your algorithms?" No one can and no one will because, again, it's about business. It's about pushing that content, and that's what children are seeing. So that's what puts them at risk.

Sen. Amy Klobuchar (D-MN):

Thank you. My last question is for you, Ms. Goldberg. You have represented over a thousand victims of revenge porn, just to give people a sense of those numbers. Of course, there are tens of thousands out there who were never represented. Can you discuss the challenges you face in getting justice for your clients and why passage of federal laws like the SHIELD Act and the Take It Down Act would make a difference?

Carrie Goldberg:

Sure. When I started representing victims of revenge porn 10 years ago, there were three states that had laws, and everyone wanted to blame the victims and said, "You shouldn't have taken that picture in the first place." It wasn't until we were testifying about it that we actually made people realize that the liability needs to be in the hands of the offenders; there's a responsibility in being the recipient of it. But the bigger problem was that the platforms were the ones that were distributing the content at scale. Back in the old days, revenge porn could be photocopied and put on a car windshield. But now with Snapchat and Google and Meta, one picture can be seen by millions and millions of people, and we need the uniformity that Mr. Pizzuro was talking about.

Sen. Amy Klobuchar (D-MN):

Thank you. And I know my colleagues asked about Section 230, which I feel very strongly about, so I'll let that go. Thank you.

Sen. Chuck Grassley (R-IA):

Senator Klobuchar. Thank you.

Sen. Amy Klobuchar (D-MN):

Thank you, Chairman Grassley.

Sen. Chuck Grassley (R-IA):

Senator Hawley.

Sen. Josh Hawley (R-MO):

Thank you, Mr. Chairman. Thank you for calling this hearing. Thanks to the witnesses for being here, Mr. Pizzuro, let me just start with you. You've been working in the anti-exploitation space for a long time, both inside and outside government. Have I got that right?

John Pizzuro:

That's correct.

Sen. Josh Hawley (R-MO):

And so you know the trends about what we're facing online, what kids are facing online, probably as well or better than anybody. Is that fair to say?

John Pizzuro:

I would say pretty much so.

Sen. Josh Hawley (R-MO):

Would you say that CSAM, child sexual exploitation or abuse material, is there getting to be more of it or less of it?

John Pizzuro:

Oh, a hundred percent more, hundreds of thousands more. I mean, I can't even put a percentage on it.

Sen. Josh Hawley (R-MO):

Yeah, enormous amounts, right?

John Pizzuro:

Yeah.

Sen. Josh Hawley (R-MO):

Here's a measure of it. In 2023, there were 104 million images and videos of suspected child abuse material uploaded onto the internet compared to 450,000 in 2004. So 450,000 in 2004 to 104 million in the last full year for which we have data. Here's another statistic. According to the National Center for Missing and Exploited Children, the number of reports of child exploitation material went from one million in 2014 to 36.2 million in 2023.

John Pizzuro:

That's right.

Sen. Josh Hawley (R-MO):

So, in other words, it's just an enormous explosion. It's absolutely everywhere. So let me ask you about some of the remedies for this. If you are a parent, and I'm the parent of three young children, three little kids, if you're a parent of a victim of child sexual abuse material and your child's image has been used, they've been exploited, it's been used online, and you've got companies who have hosted that content recklessly, or intentionally, or negligently, or ... They've done it. If you're a parent, can I sue them and get them to take it down?

John Pizzuro:

Right now, no. You could sue them, but I don't know how successful you're going to be.

Sen. Josh Hawley (R-MO):

So if I went into court, if my kid is abused, their content is up online, we know the abuser but we've got these companies that are hosting the content and making money on it by distributing it, and I go to the company, let's say I go to the company and I say, "This sexual abuse material, this is my kid. This is online. I'm reporting it to you. I want you to take it down." Let's say they don't take it down. You're telling me I can't go into court and sue them?

John Pizzuro:

You're going to probably end up losing, and I think that's part of the problem.

Sen. Josh Hawley (R-MO):

It's a huge problem, is it not?

John Pizzuro:

Yeah.

Sen. Josh Hawley (R-MO):

You're exactly correct. You're exactly correct. The state of the law is I cannot go into court and hold these companies accountable. In fact, we had testimony just a few weeks ago of somebody sitting right where you're sitting, a parent whose child was sold drugs in this case over one of these platforms, over Snapchat. This parent went in, reported it to Snapchat. Snapchat said, "Oh, well, we'll do our best." They did nothing. The parent said, "I'm going to sue you." And the Snapchat executives laughed in her face and they said, "Oh no, you're not. You're not going to sue us because federal law prohibits you from suing us."

Let me just ask you this. In 2019, Facebook was fined by the FTC five billion dollars, five billion with a B, and their stock price went up. Now, what does that tell you about what these companies fear? Do you think they fear these government regulatory agencies that almost never bring suits and almost never bring enforcement?

John Pizzuro:

Oh, absolutely not.

Sen. Josh Hawley (R-MO):

Do you think that they fear lawsuits from parents who might get into court and get a billion dollar or a 10 billion dollar judgment?

John Pizzuro:

For them, it's the cost of doing business, right?

Sen. Josh Hawley (R-MO):

Yeah, exactly. And they're willing to pay it. Facebook paid that five billion dollars. Their stock price went up. They went right on doing what they were doing. But I'll tell you what they do fear: they're absolutely terrified of a parent coming into court, getting in front of a jury, and holding them accountable. And that is why it is high time, it is past time, that this Congress gave parents the ability to do that.

And I will just say again, for the approximately three millionth time in this committee, until Congress gives parents the ability to sue, nothing will change. These companies don't care about fines. They don't care about the regulations. In fact, the companies regularly come and sit here and offer to write the regulations. They say, "Oh, we're great public citizens. We'd love to help you write the regulations, Congress, and we promise to comply. We'll write them and then we'll comply." They won't comply. They buy off the regulators. What they fear are juries.

And this is why what Senator Durbin has done with his bill, which we worked on together, to give parents the right to get into court and have their day in court is absolutely vital, and I'm proud to be working with him on this. It passed unanimously out of this committee last year when he was the chairman, and I look forward to reintroducing it. We'll make it even stronger, even better this year. But I'll just say again, there is nothing more important that this Congress can do to stop this than to give parents and victims the right to get into court and to hold these companies accountable. Thank you, Mr. Chairman.

Sen. Chuck Grassley (R-IA):

Thank you. Senator Hirono.

Sen. Mazie Hirono (D-HI):

Thank you, Mr. Chairman. Thank you all for testifying. Representative Guffey, our hearts go out to you. We have been here many times already. Yes, I agree that we have to do something about Section 230, and I will get to something Professor Leary mentioned, but before that: enforcement is really important. I want to note that last week, when I was questioning Mr. Blanche, who is President Trump's nominee for Deputy Attorney General, I noted that protecting children online is an issue that unifies the members of this committee. That is why I was disappointed that he did not answer one of my questions. I explained that if we want to protect children, the last thing we should do is fire prosecutors who fight child exploitation and impose a hiring freeze that stops those vacancies from being filled. But that's exactly what's happening. So I think we should note the environment in which we are having this hearing.

Moreover, there was a funding freeze that briefly cut off funding to the Internet Crimes Against Children task forces that fight child exploitation in every state. So child exploitation is a multifaceted issue, and I want to get back to Professor Leary, who said that the states ought to have the right to go after child exploitation in court and that they are not able to do so because of Section 230. Does that cover both criminal and civil prosecutions by states?

Mary Graw Leary:

It has been interpreted that way. There's language in Section 230 of the Communications Decency Act about not enforcing a state law that is in contravention of Section 230. So courts have interpreted that as, "Oh, that means you can't enforce your state criminal laws," which was attempted in the [inaudible 01:34:30] cases, and I assume it would happen in civil cases under the C1 provisions of the statute.

Sen. Mazie Hirono (D-HI):

Well, so you would support legislation at the federal level that would allow the states to enforce their own child protection laws?

Mary Graw Leary:

One hundred percent. And I believe that my written testimony has a quote from the letters from the state attorneys general, laying this out again for the third time. And, again, speaking of unanimity, I don't know how else you get over 50 attorneys general, I believe including the territories as well, all in agreement on this point.

Sen. Mazie Hirono (D-HI):

And Representative Guffey and Ms. Goldberg, you would agree that we need to do something that would enable the states to enforce their own laws?

Rep. Brandon Guffey:

I would certainly agree with that. I think that's one of the tools in the tool belt. But yes, states need to be able to have the tools.

Sen. Mazie Hirono (D-HI):

Ms. Goldberg, you noted very briefly the Ninth Circuit and its decision in a case that you were involved with. Could you provide some background on the case, how Section 230 was involved, and what you think it demonstrates about the state of the law around Section 230?

Carrie Goldberg:

Yes. So that case is called Doe v. Grindr, and it accuses the dating app Grindr of advertising to children using Instagram and TikTok, with child models in school settings, and luring them onto the dating app. And, as I said in my complaint, there are statistics that 50% of gay kids who are sexually active have their first sexual experience with an adult they meet on Grindr.

Now, Grindr has no age verification and just absolutely turns a blind eye to the fact that there are so many kids who use its product and are inevitably recommended to adults. I claimed that this was a defective product, and that Grindr, because it knew about the problem, as I stated in my lawsuit, and was refusing to institute any sort of age verification, was also condoning trafficking. And the case got thrown out by the district court, and that was affirmed yesterday by the Ninth Circuit, so I never got to discovery.

Sen. Mazie Hirono (D-HI):

So it was thrown out because of Section 230 immunity?

Carrie Goldberg:

Of Section 230 and because of the incredibly high knowledge standard of actual knowledge that they were imposing, which they didn't have to impose, but they imposed in the trafficking claim.

Sen. Mazie Hirono (D-HI):

I support the general proposition, thank you, Mr. Chairman, that anyone who gets injured by someone else's actions ought to be able to pursue legal remedies. Therefore, I agree that we need to remove Section 230 immunity somehow, while still paying attention to the various unintended consequences that may flow from that kind of change. But where we are now is not where we ought to be, because this is a growing problem. Thank you, Mr. Chairman.

Sen. John Kennedy (R-LA):

Thank you, Senator. I believe I'm next. Representative, I'm sorry, but your boy's proud of you. You're doing good work.

Rep. Brandon Guffey:

Thank you.

Sen. John Kennedy (R-LA):

My late father used to tell me that you'll never know love until you know the love of a child. And I didn't believe him, but I do now. I don't know what I'd do if something happened to my boy. I'm just so sorry.

Rep. Brandon Guffey:

Thank you so much.

Sen. John Kennedy (R-LA):

Mr. Pizzuro.

John Pizzuro:

Sir.

Sen. John Kennedy (R-LA):

Social media is now a big part of childhood, isn't it?

John Pizzuro:

Yes.

Sen. John Kennedy (R-LA):

Can we agree that big parts of social media have just become cesspools of snark?

John Pizzuro:

I can probably attest to that, yes.

Sen. John Kennedy (R-LA):

Can we agree that social media has lowered the cost of being an a-hole?

John Pizzuro:

Yeah.

Sen. John Kennedy (R-LA):

Can we agree that big parts of social media have become cesspools of sexual exploitation?

John Pizzuro:

For sure.

Sen. John Kennedy (R-LA):

And I assume you'd agree with me if I said that social media has lowered the cost of being a pedophile, hasn't it?

John Pizzuro:

Absolutely. It made it easy access.

Sen. John Kennedy (R-LA):

Yeah. You're familiar with the National Center for Missing and Exploited Children's CyberTipline?

John Pizzuro:

Yes.

Sen. John Kennedy (R-LA):

Are the social media companies required to report instances of child sexual exploitation to the National Center?

John Pizzuro:

Of what they see.

Sen. John Kennedy (R-LA):

So the law says the social media companies have got to report these instances of sexual exploitation to the National Center. First, they have to look, don't they?

John Pizzuro:

Yes.

Sen. John Kennedy (R-LA):

Do they make any money when they look?

John Pizzuro:

No. If you just look at the Apple statistics I gave before: out of what, 36 million reports, 275 came from Apple.

Sen. John Kennedy (R-LA):

But they're not paid to look?

John Pizzuro:

No.

Sen. John Kennedy (R-LA):

Okay. In fact, they want people coming to their social media platform.

John Pizzuro:

More users, more money.

Sen. John Kennedy (R-LA):

Yeah. They want eyeballs so they can sell them advertising. So for them to look is inconsistent with their economic interest, isn't it?

John Pizzuro:

Correct.

Sen. John Kennedy (R-LA):

All right. Now, once they look and they find it, then they have to report it to the National Center, is that right?

John Pizzuro:

That's correct.

Sen. John Kennedy (R-LA):

Are they paid to report it to the National Center?

John Pizzuro:

Absolutely not.

Sen. John Kennedy (R-LA):

I know this is a difficult question. How many instances of sexual exploitation of children do you think are occurring and not being either looked for and/or reported by the social media companies?

John Pizzuro:

Well, I don't have NCMEC's statistics, but I can tell you that there are a lot of ESPs that don't actually even report. Some over-report, some don't report at all. So that's part of the challenge. And then, secondarily, it's voluntary, right, so with whatever they give them, there's no uniformity in the data as well.

Sen. John Kennedy (R-LA):

What happens if they don't look and/or they don't report? Are they punished?

John Pizzuro:

No.

Sen. John Kennedy (R-LA):

You're familiar with the Safer program?

John Pizzuro:

A little bit.

Sen. John Kennedy (R-LA):

Okay. It's a tool that uses AI to scan conversations and look for patterns that might be sexual exploitation of children. It's not the only algorithm out there. Do social media platforms all use that?

John Pizzuro:

I don't know how many do, I don't know if they're using technology, but they should.

Sen. John Kennedy (R-LA):

Are they required to use it?

John Pizzuro:

Nope.

Sen. John Kennedy (R-LA):

We've got to do something. This is my last question. Do you find it ironic that all of these people in Big Tech who dreamed about and talked about creating a utopia have managed to generate more hate and more harm than anyone could ever have possibly imagined, all to make money?

John Pizzuro:

And lots of money they made.

Sen. John Kennedy (R-LA):

Do you find that ironic?

John Pizzuro:

Very.

Sen. John Kennedy (R-LA):

Thank you all for being here. Senator Blumenthal, he's not only next, he's the only one left, but it's nice to see him.

Sen. Richard Blumenthal (D-CT):

Am I recognized, Mr. Chairman, Mr. Ranking Member or are you-

Sen. John Kennedy (R-LA):

I'm the Chairman.

Sen. Richard Blumenthal (D-CT):

You're presiding.

Sen. John Kennedy (R-LA):

I'm presiding, Blumenthal.

Sen. Richard Blumenthal (D-CT):

That may be the reason I'm the only one left.

Sen. John Kennedy (R-LA):

Could be. I'm looking forward to your questioning, and I'm going to turn the gavel over to Senator Blackburn.

Sen. Richard Blumenthal (D-CT):

Thank you, Chairman Blackburn. Representative Guffey, thank you for being here today, and I think our hearts go out to you. I know I'm not the first to have said it, but your courage and strength make an enormous difference. I know how strongly you supported the Kids Online Safety Act, and I am deeply grateful to you for your support and your activism, in going to Louisiana, for example, seeking to talk to Representative Scalise and Speaker Johnson on behalf of that bill.

You wrote an article that I would like to have entered into the record, if there's no objection, and there seems to be none. When you went to see Representative Scalise and Speaker Johnson, were you given an opportunity to talk to them?

Rep. Brandon Guffey:

No, sir. Neither myself nor the other parents. We did meet with Representative Scalise's staff, which, of course, was in district at that time, but even coming up here to the Hill, we were unable to meet with either one of the representatives.

Sen. Richard Blumenthal (D-CT):

Would you like to meet with them?

Rep. Brandon Guffey:

I would love to.

Sen. Richard Blumenthal (D-CT):

Well, we'll try to arrange it for you.

Rep. Brandon Guffey:

Thank you, sir.

Sen. Richard Blumenthal (D-CT):

I'm hoping they'll hear from you. I'm hoping they'll support the bill this time. Do you agree?

Rep. Brandon Guffey:

I one thousand percent agree.

Sen. Richard Blumenthal (D-CT):

Why don't you tell us, as a parent but also as an advocate and the author of that article, why you think some of the arguments made against KOSA based on a supposed free speech thesis are incorrect?

Rep. Brandon Guffey:

I believe it's all about following the money. If you look at Big Tech and their lobby, and you look at the narratives that get put out there, and you look at the representatives who fight against it, and you follow the money and where it ends up, I believe it comes down to that fear. As an elected official, I see it myself; you're often worried about what this will look like. That's one of the reasons why, whenever I was presenting, I used the phrase that we have too many politicians worried about their next election instead of leaders worried about the next generation. And I believe that it's a false narrative that has been put out there. The argument has been had over and over, and people will agree with you and then turn right around and share a false narrative.

Sen. Richard Blumenthal (D-CT):

I think the United States Senate has recognized that it's a false narrative through a strong bipartisan vote here, 91 to 3 in the last session. I'm hoping that we'll have that same kind of support again, and I thank my Republican colleagues, particularly Senator Blackburn, who has been such a steadfast partner in this effort.

I'd like to turn to Professor Leary. I think I misattributed the article to Representative Guffey, but maybe you can expand on his response on that free speech false narrative.

Mary Graw Leary:

Sure. Thank you, Senator. First, about free speech: I believe the article you're referring to is the op-ed that I and other scholars wrote, and it really dispelled these arguments about KOSA, arguments that we see again and again. In fact, it's interesting to look back in history at what some of tech has said over the years. I can go back to 2014, when they were making this argument. We can go back even before that, to 1996. They told us the Communications Decency Act was going to ruin free speech. Then they said it about the SAVE Act, then they said it about SESTA-FOSTA and, lo and behold, we still have plenty of free speech.

The thing to keep in mind with free speech is that the First Amendment is an amendment designed to help inform us on how to handle these sticky issues. It is not a reason to not engage in legislation, and yet it's being used in that manner. There's a distinction between speech and conduct, and specifically with KOSA: KOSA addressed conduct, not content. And so the speech argument was particularly misplaced with regard to that piece of legislation.

Sen. Richard Blumenthal (D-CT):

Thank you. In fact, KOSA affects the conduct involved in product design. There's no more limitation on free speech than there is when the federal government regulates the safety of the design of an automobile or a toaster or a washing machine, which it does. If they explode, there is liability for it. It's not free speech to design a defective and harmful product. It's conduct, and there is no censorship, no blocking of content in KOSA. Thank you all for your testimony today. Thank you, Madam Chair.

Sen. Marsha Blackburn (R-TN):

Thank you. And, Professor Leary, I'm going to stay right with you for my question. I appreciated so much that op-ed that you put together and the difference that you're making in showing that this was not a free speech infringement. This is, as Senator Blumenthal said, product design; as you said, conduct. But we know the reason that Meta and Google and these groups lobbied, spent millions of dollars lobbying against this, is because they have assigned a dollar value to each and every kid. I think the dollar value is $270. So our kids are the product, and it is so unseemly. To me, it is absolutely disgusting that they devalue the lives of young people in this manner.

Mr. Pizzuro, I want to come to you. Senator Klobuchar and I have the National Human Trafficking Database Act, which would establish a database at DOJ's Office for Victims of Crime and incentivize states to collect, enter, and share their data. What we're trying to do is get a full picture of what is happening in each of the 50 states when it comes to human trafficking. We have really had a tough time doing this and finding the people who are behind these human trafficking rings. I know your organization, Raven, has been supportive of the bill.

John Pizzuro:

Yes.

Sen. Marsha Blackburn (R-TN):

I'd like for you to talk for just a minute about why having a national database is so vitally important to breaking this modern day slavery apart.

John Pizzuro:

Well, the more data we have, the more we're able to understand, see, and react to. And I think that's part of the challenge: as states, we're so fragmented that we're getting data from just certain areas. So the more data we're able to actually collect, the more likely we are to be able to put together a comprehensive plan and understand how to go after certain trafficking.

Sen. Marsha Blackburn (R-TN):

And I want you to touch for just a moment on the use of AI-generated CSAM because what we hear from law enforcement is they're having to sift through so many images to figure out what is AI-generated and what is actual.

John Pizzuro:

That's a challenge. So right now, a detective investigating something can't tell the difference between what is a real image and what is not a real image. Technology exists now where I can make those images of whoever I want. I can make a child from an adult. And the challenge really becomes that now I could take your images off the clear net, off of social media, off of open profiles, and then turn that person into a child or, better yet, create sexually explicit images. The challenge is going to be that we can't see it unless we have the software capabilities to actually do that, which, again, we don't have.

Sen. Marsha Blackburn (R-TN):

I appreciate that. And Senators Coons, Klobuchar, Tillis, and I introduced the No Fakes Act to deal with AI-generated voice and visual likeness of individuals. We think that this will play an important role in a remedy for AI-generated CSAM. Representative Guffey, I'd love to get your thoughts on that.

Rep. Brandon Guffey:

On the No Fakes Act, I personally love it. I've actually resubmitted a bill that's very similar within the state, and I love the idea of using name, image, and likeness. I think that is a very easy thing to hit. As we talk about AI-generated pornography, one of the problems that we have is the argument that this is not a real person, so therefore, is it really a crime? Bills based on name, image, and likeness protect our citizens, as opposed to focusing solely on what the image is. It protects the citizen.

Sen. Marsha Blackburn (R-TN):

Let me ask you this, and congratulations on getting Gavin's Law passed.

Rep. Brandon Guffey:

Thank you.

Sen. Marsha Blackburn (R-TN):

Is there a way you can amend provisions of No Fakes onto Gavin's Law and begin to expand the protections there at the state level?

Rep. Brandon Guffey:

In South Carolina, unfortunately, no.

Sen. Marsha Blackburn (R-TN):

So it's going to be two separate, so you'll have to have a group of bills-

Rep. Brandon Guffey:

Yes, ma'am.

Sen. Marsha Blackburn (R-TN):

... that will do this. Okay. And I am over time. I am going to recognize Senator Schiff and turn the gavel to Ms. Moody.

Sen. Adam Schiff (D-CA):

Thank you, Madam Chair. Thank you all for being here and, Mr. Guffey, I appreciate your advocacy and want to express my condolences for the loss of your son. I can't imagine the trauma that you and your family have been through, but I appreciate your taking that trauma and using it to protect other families.

I have not had a chance, as a new member of the Senate, to really study the multiple approaches of the various bills, although some I supported in the House. But I wanted to ask you, Professor: we established Section 230 for the reasons I think you implied, which is that it was a nascent industry. They urged us to do so so that we would not stifle innovation. They also made the argument that, without 230, they would not moderate content because they would be sued if they did, and that 230 would encourage them to moderate content. Well, there may have been a time when they moderated content, but those days seem to be over.

It certainly wasn't enacted because it was believed necessary for the First Amendment. The First Amendment stands on its own two feet. In the absence of 230, companies could still plead a First Amendment defense to any case. What is your preferred approach? That is, is it a repeal of 230? Is it changing it from an immunity to some form of defense? Is it cabining 230 in some way by narrowing its scope? What are the merits of the various approaches?

Mary Graw Leary:

Thank you, Senator. I would say a couple of things. There was discussion in the deep background about this free internet and a nascent industry, and when we look at the policies and the findings at the beginning of Section 230 of the Communications Decency Act, there is language to that effect. But I would repeat: the overwhelming background and discussion was about the child protection piece. Therefore, I think that, better than repealing the entire thing, is to keep the C2 language, which gives cover, gives protection, to a platform if they remove anything they consider to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable. That will protect them. That's all they need. They do not need C1, which is what has been turned into this de facto near-absolute immunity.

I think also adding the states' ability to proceed is an important thing. And outside of 230, there is holding them liable when they host this material, which has been discussed at length as well. So I think that those, and some of the other things that I've listed, but I don't want to use up too much of your time, all work together to respond to this complex crime.

Sen. Adam Schiff (D-CA):

And Counsel, in your representation of clients in this area, what do you believe would be most helpful in terms of making sure that you can get the discovery you need and that we have established the right protections and the right burdens in terms of the platforms and the pipelines?

Carrie Goldberg:

Thank you. I agree, we are long overdue to just abolish Section 230, but what's important is that clients need to get into discovery so that they can actually know the extent of the problem. And the only way we can do that is if the standard is reasonable for parents to plead. If we have to show that the company knew about that picture or that exact victim, that exact perpetrator, there's no way a client's going to be able to overcome a motion to dismiss and get into discovery. So we need to actually have standards like negligence, which the law already affords in almost all causes of action.

Sen. Adam Schiff (D-CA):

So let me ask this question. I don't think there's any doubt that if the companies devoted their technological capability to trying to solve this problem, they could make enormous gains. They wouldn't be able to eliminate the problem altogether, but they could nonetheless prove very effective. What would you propose the new standard be, then? That is, if it's not going to be possible to completely do away with this, and the standard can't be perfection, how would you define the standard of care that you would expect the industry to follow, given that it hasn't had to follow any standard with the protections of 230?

Sen. Ashley Moody (R-FL):

And if you could just quickly answer that.

Carrie Goldberg:

Oh, sure. Well, these are products, so strict liability should apply here. If these companies have created a defective product, then all users should be able to sue them without having to even prove a duty if the product injured them.

Sen. Adam Schiff (D-CA):

Thank you.

Sen. Ashley Moody (R-FL):

Thank you, Senator. Appreciate you being here today. I have been so impressed with this committee. I am a Senator all of four weeks, so get ready. I bring to it an array of passions I have acquired, not just as an Attorney General but as a mother of a teenager right now. And I'm so impressed with the topics that we have focused on, and specifically this one. It was shocking to me that the Senate was able to move forward pretty unanimously on some protections for children, and they ran right into the House, which did not go along with some of those things. And I'm hoping that we can change that.

As Attorney General, obviously, I fought in court against many of the platforms. I investigated platforms for harms to children. I am the mother dealing with this now. In fact, I tell people all the time, it is really hard to be one of the first generations of parents trying to parent children when we don't understand what we're doing, because we don't understand the technology to the degree they do. In fact, when I'm going through some of the controls, I often have to ask my kid what that means, which seems to defeat the purpose. But here we are.

And while I can break down what we're addressing today into the privacy concerns of children, certainly, and harm to children, whether that is mental effects, addiction, or materials that they never would have been exposed to in the past but now have ready access to, one of the things I want to talk about quickly is the access to our children by predators and bad actors. I think this is the third lane that we read about repeatedly in the paper every single day. In my state, from doctors to predators to you name it, they're getting access to our children. Parents in the past could lock their children's bedroom doors and know they were safe at night. But that is not the reality anymore.

In fact, in my own child's school, there were five teenage boys. A woman was arrested for posing as one and luring them and molesting them online, I think using Snapchat and TikTok. And of course when you engage with the platforms, they will often deny that this is happening. But it is happening, and the best people that can represent that are the parents where it's happening to their children in their homes while they thought they were safe. And so I really commend you, as a fellow parent, Mr. Guffey, for taking your pain and channeling that into just frustration and anger, because that is what is going to get the attention of lawmakers.

I mean, we've tried to get the attention of the platforms. We've talked a lot about what needs to be done to force some restrictions, as opposed to them acting on their own, but we need to talk about what needs to be done through laws. And what I'm specifically concerned about, and I would open this up to whoever wants to answer this question, is what is the thing that we can do as lawmakers right now to stop predators from getting access to our children?

Let me start down there and then we'll come to you.

Rep. Brandon Guffey:

I want to use this as an example, because the comment was made that we essentially have different laws for the outside world than we do for the online world. Suppose I had a storage facility where I stored only guns, and I told you, as Attorney General, "Okay, the majority of my customers are law-abiding citizens, but we also have terrorists, and we're going to store guns for criminals." If I then told you that you had to have a digital ID to get into that locker, you'd think that's ludicrous. But that's exactly the way that we treat CSAM.

I mean, to me, I believe that if you're housing CSAM, you should be held responsible. But nothing is going to change until we open up civil liability. These are the world's richest companies since the inception of man, and yet they are immune.

Carrie Goldberg:

And I'd say that if you have designed a product where you are exposing children to predators and you can't stop that from happening, then it's a defective product. And all a parent or a victim should have to do to be able to sue you is just to show that you know about the problem and the extent of it.

Sen. Ashley Moody (R-FL):

And in your experience, and I understand that this has happened, when parents have demonstrated that this harmful material is online and demanded that the platforms take it down, so they've now been told about it and know about it, there have been refusals to take it down.

Carrie Goldberg:

Absolutely. And those cases get thrown out of court because the online platform says, "Well, I didn't know about that specific incident." Of course, they're not going to know about that specific incident, or they're going to say, "I didn't intend to harm that exact child."

Sen. Ashley Moody (R-FL):

And Mr. Pizzuro, I know you have law enforcement experience, and I'm grateful for that. Thank you. My husband is career law enforcement. Understanding that predators can now get to our children through online platforms and online, what is the number one thing you would recommend to prevent that, that we can do as Congress?

John Pizzuro:

Device-based age verification. You mentioned parental controls. If there were a framework for a parent to just shut the spigot off, and to make it easy rather than going through 25 different apps, that would do it. The companies have this capability. We can stop it at the device level. That's where we prevent children from getting to some of these images and offenders from getting access to those children.

Sen. Ashley Moody (R-FL):

Thank you. And since I am the acting chair, I don't want to exceed the boundaries of time, so I will turn it over to Senator Whitehouse.

Sen. Sheldon Whitehouse (D-RI):

Thank you. And I understand that I've been given permission to close out the hearing at the end of my questioning. I know you have another place to be, so don't hesitate to go where you need to be.

First of all, Ms. Goldberg, you said that repeal of 230 was long overdue. I'm hoping that that day is coming fairly soon, and that a bipartisan bill to do just that will be filed by a group of members from this committee before very long. As you also pointed out, there are standards by which to evaluate the conduct or misconduct of these big platforms that the law already affords, and everybody else has to abide by those same standards, whether you're a radio station, a newspaper, a manufacturer, or an individual. Some of them go back to the English common law that came over with the first settlers. And yet here is the idea, as Representative Guffey I think quite well described it, that institutions that are the richest since the inception of man shouldn't be bound by the law.

I adore Ron Wyden. I think he's a wonderful senator. He put Section 230 in when these platforms were in people's garages. And they've gone from that to being the richest companies since the inception of man. I'm not going to forget that phrase of yours, Representative Guffey. I like it. Yet there has been no change in Congress's response to the original rationale for having that Section 230 protection, despite repeated, grotesque failure by these entities to police themselves. It's not as if we're dealing with an array of platforms that have a demonstrated record of meeting the public interest in the safety of their product. Not at all.

As lawyer to lawyer, Ms. Goldberg, talk a little bit about when the Section 230 defense first kicks in and what that means in terms of you and your clients actually being able to get discovery, to take a deposition, to find out the truth of what actually transpired.

Carrie Goldberg:

What happens is that I file a lawsuit with all my facts, with everything that I can know, even though there's so much asymmetry of knowledge. I don't know the extent to which the platform knows about the exact problem or the overall problem. I can just base it on what's happened to my client, if they're alive. Otherwise, I have to go through their parents. So immediately, within 30 days-

Sen. Sheldon Whitehouse (D-RI):

So you file your complaint. You've made a claim.

Carrie Goldberg:

They file a motion to dismiss-

Sen. Sheldon Whitehouse (D-RI):

And they file a motion to dismiss.

Carrie Goldberg:

... saying, "We're just a publishing platform. We're not a product. This is just speech." And then they attempt to get the case dismissed. Oftentimes judges will grant it without even oral argument, and then we never get into discovery. So we never get the opportunity to show, or even know, the extent to which the platform has been tolerating and making money off of this exact harm. We don't have any information about other similar incidents. Nothing.

Sen. Sheldon Whitehouse (D-RI):

So it's a vehicle not only for evading responsibility for bad acts, but it's a vehicle also for covering up what actually took place. It would be slightly different if the Section 230 dismissal motion was something that you made at trial, for instance.

Carrie Goldberg:

Yes.

Sen. Sheldon Whitehouse (D-RI):

So that you'd have a full chance. But not even that.

Carrie Goldberg:

I also believe that even more terrifying to tech than facing a jury eye-to-eye is the discovery. It was the discovery that made Omegle shut down. I had 60,000 documents showing all these other similar incidents of child sexual abuse, and they shuttered their platform because they had no defense.

Sen. Sheldon Whitehouse (D-RI):

Discovery is a beautiful thing. Senator Padilla.

Sen. Alex Padilla (D-CA):

Thank you, Mr. Chair. Mr. Guffey, I just want to begin with you and let you know that my heart goes out to you for you and your family's experience, and I really appreciate your willingness to be here today to share your testimony.

Rep. Brandon Guffey:

Thank you.

Sen. Alex Padilla (D-CA):

I want to draw my colleagues' attention to the risk presented to minors by a relatively new consumer product: character-based AI chatbot apps. Many of these services have been flooded with age-inappropriate chatbots, which may expose young users to sexual or suggestive AI-generated imagery or conversations. As a father of three school-aged children, this is personal. Further, conversations with these chatbots can end tragically, as we've heard in reports. Since 2023, at least two individuals have died by suicide following extensive conversations with AI chatbots. So the threat, colleagues, the risk, is real.

Mr. Guffey, how would you recommend that this committee begin to think about or think through the risk posed by this emerging consumer product category?

Rep. Brandon Guffey:

When it comes to AI, I would have to lean on some of the other panelists up here and their expertise in addressing chatbots. Chatbots are something new that I've just started really looking into. And on the legal side, I'm not an attorney. I'm an angry parent who tries to throw it against the wall, whereas the attorneys are the ones who have to say, "This is what will hold up in court. This is what will not."

Sen. Alex Padilla (D-CA):

Well, we'd have to figure out the legalese, but I do think you have the most important voice here given your experience. I mean, the chatbot piece is just the next iteration of this technology.

Rep. Brandon Guffey:

Yes, sir.

Sen. Alex Padilla (D-CA):

We know what technology was when you and I were much younger. With what children have to contend with today, we can only imagine what's coming.

Rep. Brandon Guffey:

Well, and that's the exact problem. It's not just the problem of today; as tech evolves, our laws don't move fast enough to keep up. And I believe in having that liability, in being able to hold these companies responsible for what they are presenting. Instead of treating an online service as a service, if we can simply treat it as a product, then we can hold them to consumer protection laws.

Stephen Balkam:

Senator, if I may. I think we should think about the international context in which this is playing out, because the recent AI summit in Paris was called the AI Security Summit rather than the AI Safety Summit, which had taken place in the UK and, I believe, in Korea. There's been a shift away from the prevailing thought that we must make these products safe; instead, this administration in particular is urging the vast and quick expansion of these tools. And I think you have a role, and your colleagues have a role, in bringing that focus back. And I dearly hope you do.

Sen. Alex Padilla (D-CA):

Ms. Goldberg, you seem anxious.

Carrie Goldberg:

I do. One of my close friends, Matthew Bergman, is actually litigating a case against Character.AI, where the bot encouraged addictive behavior and ultimately led the child to die by suicide. And I think what we'll find is that there's a possibility courts will perceive this speech as the corporation's own speech (and in that case, Character.AI is owned by Google), and that it won't overcome a Section 230 challenge.

Sen. Alex Padilla (D-CA):

Okay. Very good point, actually. So what I would do, just in the interest of time, is invite all of you to respond to the same question after the hearing as part of our questions for the record, because I do want to get to at least one more topic. And I understand Senator Graham is on his way back as well.

Last Congress we had a hearing very similar to this, but instead of you five sitting in front of us testifying, it was actually the CEOs of the five largest social media companies testifying to the committee. And I had the opportunity then to ask them each about the parental tools that they offered, or didn't. But I think all of them have offered some sort of parental tool to help parents help minors safely navigate the use of their respective services.

I asked them to describe what those tools were, and more specifically what the adoption and use rates of those tools were. Because you can have tools and protections out there, and you can debate whether they're sufficient or not, but if people aren't even utilizing them, then what's the point? And sadly, they either didn't share how widely used these tools were and didn't provide the data, though they're all very big on data, or what data they did share demonstrated to us that the usage rates were actually very low.

So the conclusion, unavoidable and undeniable, is that the industry isn't doing enough to let parents know what resources are available and isn't investing enough into understanding why these so-called protections aren't being adopted at greater rates.

Mr. Balkam, in your testimony, you observed that these controls would better serve minors and their guardians if they were standardized, interoperable, and unified between apps, devices, and brands. How do you think we can make that a reality?

Stephen Balkam:

Well, I often use the example of the automobile industry. Back in the '50s and '60s, if you got out of one car and into another, you might not necessarily know where the blinkers or the light switches were. Even the logos for those were different in different car makes. Well, laws came into place in the '60s, and in fact now when you get into a new rental car, you know exactly where the indicators are, you know exactly where the lights are, and the symbols are all the same.

Well, I'd like to see the industry come together, ideally voluntarily, but if not, perhaps with some coercion, to standardize the ways in which parental controls and online safety tools work. Those tools, by the way, are the ones that teens and young people use to stay private, to report, and to block, oftentimes without even their parents' knowledge. In other words, let's have a standardized way of keeping our kids safe, and of letting teens keep themselves safe, that is not as confusing as what we've got at the moment.

Sen. Alex Padilla (D-CA):

Yeah, industry standards. It's not a new concept, and it tends to happen one of two ways. It either gets imposed by some level of government, with the industry coming along kicking and screaming, or they can actually live up to their responsibility, come together as an industry, and put forward a model that is transparent and that either works or at least can be measured, so we can hold them accountable when and where it doesn't.

I know it's been a long morning for all of you. Very, very much appreciate your participation in today's hearing and the work that you do and the perspectives that you've offered. I'm told Senator Graham is not coming after all, and so it falls upon me to not just thank all of our witnesses, but remind folks that the hearing record will remain open for one week for statements to be submitted into the record. Questions for the record may be submitted by Senators by 5:00 PM on Wednesday, February 26th. Unless there's anything further from the name plates, this hearing is adjourned. Thank you.
