Transcript: Senate Judiciary Committee Hearing on "Protecting Our Children Online"

Justin Hendrix / Feb 16, 2023
Senate Judiciary Committee Hearing on Protecting Our Children Online, Washington DC, February 14, 2023

On Tuesday, February 14, the U.S. Senate Judiciary Committee hosted a hearing titled "Protecting Our Children Online," convened by Chairman Dick Durbin (D-IL) and Ranking Member Lindsey Graham (R-SC). Witnesses included:

  • Kristin Bride - Survivor Parent and Social Media Reform Advocate (Testimony)
  • Emma Lembke - Founder, Log Off Movement (Testimony)
  • Michelle C. DeLaune - President and CEO, National Center for Missing & Exploited Children (NCMEC) (Testimony)
  • John Pizzuro - CEO, Raven (Testimony)
  • Mitch J. Prinstein, PhD, ABPP - Chief Science Officer, American Psychological Association (Testimony)
  • Josh Golin - Executive Director, Fairplay (Testimony)

This transcript may contain errors. Check quotes against the video of the hearing before reproducing.

Senator Dick Durbin (D-IL):

Today, February 14th, is already the anniversary of two horrific mass shootings: in Parkland, Florida, five years ago; at Northern Illinois University in DeKalb, 15 years ago. Now the friends and families of Michigan State students join in that grief. My heart goes out to them. Last Congress, this committee held 11 hearings on our nation's gun violence epidemic, and the Senate passed the most significant gun safety reform in nearly 30 years. But it's not enough. We have more to do. We've lost 5,200 Americans to gunfire already this year, and we're only halfway through February. We were able to come together on a bipartisan basis last year to close gaps in our law to help reduce shootings. We need to continue those efforts in this committee and this Congress, and all work to do so. We owe that to the families and communities who have lost so much. Today, the Senate Judiciary Committee will focus on an issue that impacts every family: keeping our kids safe in the internet age.

This little device here is an amazing source of information and communication, but it also has some properties, which we'll discuss today, that are not obvious as you glance at it. Why is it that children who can't really walk on their own, maybe not even talk yet, can operate one of these, can punch the screen to move things? There is a captivation that's taking place there in the minds of young people that continues. It is addictive, and we know that, and we also know that it's threatening, and we're gonna hear some stories today, tales of terrible results of communication through this device. The online world offers tremendous opportunities and benefits, but it's a serious risk and danger to our kids. In almost every aspect of the real world, child safety's a top priority. We lock the door and teach our kids not to talk to strangers. But in the virtual world, criminals and bullies don't need to pick a lock or wait outside the playground to hurt our kids. They only have to lurk in the online shadows of Facebook and Snapchat. In those shadows, they can bully, intimidate, addict, or sexually exploit our kids right in our own homes. I'd like to turn to a brief video at this point about the risks our children face.

Video:

A new report points to teens' use of social media and smartphones as a potential cause of mental distress.

The American Academy of Pediatrics and the American Academy of Child and Adolescent Psychiatry declared a national mental health emergency. This is a phenomenon like we haven't seen before.

In fifth grade, literally every single one of my classmates had a phone at that time, right?

YouTube, YouTube, YouTube.

Like I couldn't live without it. Social media is its own world.

Increasingly, we're leaving a trail of digital crumbs as we go through life because the Internet's becoming the basis for social discourse, for commerce, for healthcare, for entertainment.

Senator Dick Durbin (D-IL):

Mr. Zuckerberg, would you be comfortable sharing with us the name of the hotel you stayed in last night?

Mark Zuckerberg:

Um, no.

Senator Dick Durbin (D-IL):

I think that might be what this is all about.

Video:

Any society that fails to protect its children has lost its way.

We are the Phoenix 11. Sexually abused as children, reduced to child sex abuse images, and stripped of our dignity and humanity. We are survivors of sexual torture, child rape, erotic photoshoots, pedophile sleepovers, elementary school sex shows, streaming BDSM and twisted sexual desires whose digital images were trafficked worldwide. Hear our voice, see our strength, answer our call. We will not be stopped. We will not be silent.

Senator Dick Durbin (D-IL):

Exploitation of children is an urgent, growing threat. A report last year from Pew Research found that nearly half of American teens report being harassed or bullied online. Nearly half. As too many families know, cyberbullying, which is often relentless, cruel, and anonymous, can lead to tragic results. Social media can also cause a variety of mental health problems in teenagers, including anxiety, depression, stress, and body image issues. This has been well documented, and the big tech companies know it. But despite all these known risks and harms, online platforms are doing everything they can to keep our kids' eyes glued to the screens. In the process, they're vacuuming up tons of data they can use to build profiles and target our kids with even more ads and content. It's a lucrative business at the expense of our kids' privacy, safety, and health. We don't have to take it.

We'll hear from an outstanding panel of witnesses about the challenges to protecting kids online and the steps we in the Senate and this committee can take to help. I want to thank our witnesses, Kristin Bride and Emma Lembke, who've been personally impacted by this issue. They speak on behalf of many others, and they advocate for change to help spare others what they and their families have gone through. Thank you both for being here today. I want to acknowledge Rose Bronstein from Chicago, who is in the audience. She lost her son Nate to suicide last year after he was viciously bullied over Snapchat and other social media platforms. Ms. Bronstein, I'm sorry for your loss. We're also joined by experts representing the National Center for Missing and Exploited Children, law enforcement, the American Psychological Association, and the advocacy organization Fairplay. The big tech platforms are not here today, but don't worry, they'll have their chance.

We'll invite their leaders to appear before this committee soon to discuss how they can be part of the solution instead of the problem. Today's discussion builds upon years of important work by this committee. Ranking Member Graham held important hearings on this issue when he chaired the committee. I thank him for his partnership in organizing today's hearing. We consider it a bipartisan call to action. There are a number of worthwhile legislative proposals to protect our kids, such as the EARN IT Act, which enjoys strong bipartisan support in this committee. Additionally, for months, I've been working on a comprehensive bill to close the gaps in the law and crack down on the proliferation of child sexual abuse material online, the Stop CSAM Act. Today I'll be releasing the discussion draft of this legislation, and I hope to move forward with it soon.

I also want to acknowledge, she's here now, both senators here now, Senators Blumenthal and Blackburn of this committee, who have been leaders on this issue in another committee, the Commerce Committee, for a long time. I look forward to hearing our witnesses' ideas for reform, and I hope they can provide the basis for advancing legislation. Like we do in the real world, we need to protect our kids in the virtual world. This is not a partisan issue. It's an issue that keeps parents and children up at night. It deserves the attention of this committee and this Congress, and it deserves action. I now turn to the ranking member, Senator Graham.

Senator Lindsey Graham (R-SC):

Thank you, Mr. Chairman. One, I want to congratulate you for calling this hearing. It couldn't come at a better time. It's a great panel. I want the people testifying to understand that we're all listening to you, that our ears are open and our hearts are open to try to find solutions. This is the one thing, I think, that unites most Americans: most of them feel helpless. The American consumer is virtually unprotected from the adverse effects of social media; that needs to, and I think will, change. How do you protect the consumer? Well, you have regulatory agencies that protect our food and our health in general. In this space, there are none. You have statutory schemes to protect the consumer from abuse; in this space, there are none. You can always go to court in America if you feel like you've been wronged, except here.

So the American consumer is virtually unprotected from the abuses of social media, and of all Americans, I think young people are the most exposed here. Parents feel helpless. There's somebody affecting your kids you'll never see, and a lot of times it's a machine. Who's watching the machine, if at all? And the Surgeon General issued a report that's pretty damning about the business model, which is to get people to watch things as much as possible, whether or not those things are good for you. They make money based on eyeballs and advertising. There is no regulatory agency in America with any meaningful power to control this. There are more bills being introduced in this area than any subject matter that I know of. All of 'em are bipartisan. So I want to add a thought to the mix, Mr. Chairman: I'm working with Senator Elizabeth Warren from Massachusetts.

We have pretty divergent political opinions, except here. We have to do something, and the sooner the better. We're gonna approach this from consumer protection. We're gonna look at a digital regulatory commission that would have power to shut these sites down if they're not doing best business practices to protect children from sexual exploitation online. There were 21 million episodes last year of sexual exploitation against children. It was a million, I think, in 2014. This is an epidemic. It is a mental health crisis, particularly for young teenage girls. And we have no system in place to empower parents and empower consumers to seek justice, to fight back and protect themselves. That's going to change in this Congress. I hope so, Mr. Chairman. I look forward to working with you. I know Senator Blackburn's been very involved in the privacy space. I've worked with Senator Blumenthal on the EARN IT Act. So we're gonna work together the best we can to find solutions to empower consumers, who are pretty much at the will of social media. Some people are having their lives ruined, and it's now time for us to act.

Senator Dick Durbin (D-IL):

Thanks Senator Graham. I'm gonna ask our two colleagues, Senator Blumenthal and Senator Blackburn to give brief opening remarks. As I mentioned earlier, they've both been pioneers in the subject matter. Senator Blumenthal.

Senator Richard Blumenthal (D-CT):

Thanks very much, Mr. Chairman, and uh, I want to personally thank you not only for having this hearing, but for your very important interest and work on protecting kids online. And I'm grateful as well to Senator Graham for his partnership on the EARN IT Act. This cause is truly bipartisan, which Senator Blackburn and I think we are showing in real time here, in the work that we're doing together. The EARN IT Act can be a meaningful step toward reforming this unconscionably excessive Section 230 shield against big tech accountability. Uh, I think we need to be blunt from the beginning, because we know right now the central truth: big tech has relentlessly, ruthlessly pumped up profits by purposefully exploiting kids and parents' pain. Young people like Emma Lembke have been victims of big tech's hideous experiment, as President Biden rightly called it. Parents like Kristin Bride have lost beautiful children like Carson, parents whose tears and raw grief, as you came to see me in my office, have moved me with heartbreaking power.

But beyond heartbreak, what I feel is outrage. Outrage at inaction, Congress's inexcusable failure to pass the bill that you advanced courageously and eloquently, the Kids Online Safety Act. Outrage at big tech pillaging the public interest with its armies of lobbyists and lawyers, despite their pledges of collaboration. Outrage that you and other victims must relive the pain and grief that break our hearts and should finally be a moral imperative to action. We came so close last session. We need to seize this moment. We face a public health imperative, not just a moral reckoning. Our nation is in the midst of a mental health crisis. If you have any doubt about it, read the latest CDC survey that says three out of five girls in America experience deep depression, sadness, and helplessness that drives many of them to plan suicide.

It's a public health emergency, egregiously and knowingly exacerbated by big tech, aggravated by toxic content on eating disorders, bullying, even suicide, driven by big tech's black box algorithms leading children down dark rabbit holes. We have to give kids and parents, yes, both kids and parents, the tools, transparency and guardrails they need to take back control over their own lives. And that is why we must, and we will, double down on the Kids Online Safety Act, after five extensive hearings last session with Senator Blackburn at our Commerce Consumer Protection Subcommittee. And I thank Senator Maria Cantwell for her leadership. After deeply painful conversations with young people and parents like Emma and Kristin. After testimony from brave whistleblowers like Frances Haugen, who presented documents, not just personal anecdotes, but smoking gun proof that Facebook calculatedly drove toxic content to draw more eyeballs, more clicks, more dollars, more profits. After Facebook hid this evidence from parents, even misled us in Congress.

It's big tobacco's playbook all over again. The evidence of harm is heartbreakingly abundant, beyond any reasonable doubt. Action is imperative now, and I think these brave victims at our hearing ought to provide the impetus and momentum right now, urgently. The Kids Online Safety Act can be a model for how bipartisan legislating can still work, a message to the public that Congress can still work. We need to reform Section 230. Senator Graham and I are working on the EARN IT Act. I commit that we will work on major Section 230 reform, and it will be bipartisan. This mental health crisis will persist and take more young lives unless Congress cares more about the Kids Online Safety Act than it does about big tech. It's urgent that we move forward. And I am haunted by what one parent told me and all of us in advocating for the Kids Online Safety Act. She said Congress must act. It's a powerful call to action. And she asked, how many more children have to die before we make them a priority? Now is the time. Let's pass it. That's her quote. Mine is: Congress needs to heed that call and act, and do it now. Thank you.

Senator Dick Durbin (D-IL):

Thank you Mr. Blumenthal. Senator Blackburn.

Senator Marsha Blackburn (R-TN):

Thank you, Mr. Chairman. Uh, thank you for calling the hearing today. I appreciate that you and Senator Graham are turning attention to this. Uh, as many of you in the audience know, this is something that Senator Blumenthal and I have worked on for quite a period of time. We started on this about three years ago, and what you saw over the last couple of years was a series of hearings, and Kristin and Emma and others who came in to tell their stories and to provide us with information and to walk us through what was happening. So we have heard from parents and kids and teachers and pediatricians and child psychologists who are all looking at us and saying, this is an emergency. And anybody who doubts it, Senator Blumenthal just held it up, and I have also read the CDC report that just came out, where you talk about youth risk behavior, and guess what?

Social media is one of those items that is a part of that risk. And we have just taken it to heart. We've listened not only to the testimony in the hearings, but to many of you that came separately to our offices to talk to us and to say, this is our experience and we want somebody to know about this, because something needs to be done. It is almost as if these social media platforms are operating in the days of the Wild West and anything goes, and when these children are on these platforms, they're the product. They're the product. Their data is taken, that data is monetized, and then it is sold to the advertisers who are going to feed more information to these children. And we've come up with this Kids Online Safety Act. Now, we got close last time, and we almost got it through the finish line, and we didn't.

So, new Congress, a new start on this, and we're so pleased that the Judiciary Committee is working with us and the Commerce Committee, and we hope to get it done. There are some things that ought to be a given. These social media platforms ought to be required to make these platforms safer by default, not just safer if you go through the 20 next steps, but safer by default. That ought to be required. We should also have a requirement that these platforms have to go through independent audits, not their own research. Now, some of us have said in these hearings we've done, and you've heard these social media companies say, well, we're always auditing ourselves. But whoever knows what that audit shows? Not you, not me, nobody knows. They like to keep that to themselves because, as Senator Blumenthal has said, eyeballs on that site for a longer period of time.

It's more money, money, money in the bank. And who pays that price? Our kids. Our kids. Um, our legislation was supported by 150 different groups. Now, in a time where politics is divided and you hear left and right, to get 150 different groups to come together and support something, I think that's a pretty good day. I think that shows a lot of support. So we realized that much of the reason these groups were coming out and supporting the transparency and the accountability and the duty of care was because they realized talking to these social media platforms was like talking to a brick wall. They could not get a response, and because of that, something different was going to have to be done. Senator Graham said it well in his comments. It is imperative that we take action, because this is a health emergency. If you don't believe it, read the CDC report. When you have a majority of children that are experiencing adverse impacts from social media platforms, you have to step in and do something. And that is what we are working to do. We welcome all of you, thank you to our witnesses, and we look forward to the hearing today.

Senator Dick Durbin (D-IL):

Thank you, Senator Blackburn. Let me say at the outset, uh, to explain to any newcomers, we have two roll call votes that are gonna start in just a matter of minutes, so members will come and go. That is no disrespect to the subject matter or to our witnesses and guests, but we are gonna do a tag team to make sure there's always someone here, uh, to follow your testimony, and try to gather after the roll calls. But that's the circumstance. Let me welcome the six witnesses. Kristin Bride is a survivor parent to Carson Bride, and she's a nationally recognized social media reform advocate and a founding member of the Screen Time Action Network Online Harms Prevention Group. She advocates for online safety for kids as a member of the Council for Responsible Social Media. She collaborates with other organizations to raise awareness and advocate legislation to hold big tech accountable.

Emma Lembke is from Birmingham, Alabama, a second-year political science major at Washington University in St. Louis, and the founder of Log Off, a youth movement that works to uplift and empower young people to tackle the complexities of social media. Ms. Lembke also co-founded Technically Politics, a youth lobbying campaign dedicated to advocating greater regulation of big tech. Michelle DeLaune is president and chief executive officer of the National Center for Missing and Exploited Children, the first woman to lead this organization. During her two decades at NCMEC, Ms. DeLaune has witnessed firsthand evolving threats to our kids, including the explosion of child sexual exploitation online. John Pizzuro serves as CEO of Raven, an advocacy group that focuses on protecting kids from exploitation and supporting those who fight for them. Previously, Mr. Pizzuro spent 25 years in the New Jersey State Police, with the last six years as commander of their Internet Crimes Against Children Task Force.

There he led a team of 200 individuals and 71 law enforcement agencies. They apprehended over 1,500 people who preyed on innocents. Dr. Mitchell J. Prinstein is chief science officer for the American Psychological Association, responsible for leading their scientific agenda. Before assuming this post, he was the John Van Seters Distinguished Professor of Psychology at the University of North Carolina at Chapel Hill. His research is focused on adolescent interpersonal experience and psychological symptoms, including depression. Josh Golin is executive director of Fairplay, the leading independent watchdog of the children's media and marketing industries. Fairplay holds companies accountable for their harmful marketing and platform design choices and advocates for policies to protect children online. In his role, Mr. Golin regularly speaks to parents, professionals, and policy makers about how to create a healthier environment.

After we swear in the witnesses, each will have five minutes for opening statements. Then senators will have rounds of questions. So first, let me ask that all the witnesses stand to be sworn in. Please raise your right hand. Do you swear or affirm the testimony you're about to give before this committee will be the truth, the whole truth, and nothing but the truth, so help you God? Let the record reflect that the witnesses have answered in the affirmative. Ms. Bride, please, if you will, start our round.

Kristin Bride:

Thank you, Chairman Durbin, Ranking Member Graham, and members of the committee. My name is Kristin Bride. I am a survivor parent and social media reform advocate, and a member of the bipartisan Council for Responsible Social Media. I am testifying here today to bring a face to the harms occurring every day resulting from the unchecked power of the social media industry. This is my son Carson Bride, with the beautiful blue eyes and amazing smile and great sense of humor, who will be forever 16 years old. As involved parents raising our two sons in Oregon, we thought that we were doing everything right. We waited until Carson was in eighth grade to give him his first cell phone, an old phone with no apps. We talked to our boys about online safety and the importance of never sending anything online that you wouldn't want your name and face next to on a billboard.

Carson followed these guidelines, yet tragedy still struck our family. It was June 2020. Carson had just gotten his first summer job making pizzas, and after a successful first night of training, he wrote his upcoming work schedule on our kitchen calendar. We expressed how proud we were of him for finding a job during the pandemic. In so many ways it was a wonderful night, and we were looking forward to summer. The next morning I woke to the complete shock and horror that Carson had hung himself in our garage while we slept. In the weeks that followed, we learned that Carson had been viciously cyberbullied by his Snapchat friends, his high school classmates, who were using the anonymous apps YOLO and LMK on Snapchat to hide their identities. It wasn't until Carson was a freshman in high school that we finally allowed him to have social media, because that was how all the students were making new connections.

What we didn't know is that apps like YOLO and LMK were using popular social media platforms to promote anonymous messaging to hundreds of millions of teen users. After his death, we discovered that Carson had received nearly a hundred negative, harassing, sexually explicit and humiliating messages, including 40 in just one day. He asked his tormentors to swipe up and identify themselves so they could talk things out in person. No one ever did. The last search on his phone before Carson ended his life was for hacks to find out the identities of his abusers. Anonymous apps like Whisper, Sarahah and Yik Yak have a long history of enabling cyberbullying and leading to teen suicides. The critical flaws in these platforms are compounded by the fact that teens do not typically report being cyberbullied. They are too fearful that their phones, to which they are completely addicted, will be taken away, or that they will be labeled a snitch by their friends.

YOLO's own policies stated that they would monitor for cyberbullying and reveal the identities of those who engage in it. So I reached out to YOLO on four separate occasions in the months following Carson's death, letting them know what happened to my son and asking them to follow their own policies. I was ignored all four times. At this point, I decided I needed to fight back. I filed a national class action lawsuit in May 2021 against Snap Inc., YOLO and LMK. We believe Snap Inc. suspended YOLO and LMK from their platform because of our advocacy. However, our complaint against YOLO and LMK for product liability design defects and fraudulent product misrepresentation was dismissed in the Central District of California last month, citing Section 230 immunity. And still, new anonymous apps like NGL and Send It are appearing on social media platforms and charging teens a subscription to reveal the messenger or provide useless hints.

I speak before you today with tremendous responsibility to represent the many other parents who have lost their children to social media harms. Our numbers continue to grow exponentially, with teen deaths from dangerous online challenges, sextortion, fentanyl-laced drugs, and eating disorders. Let us be clear: these are not coincidences, accidents or unforeseen consequences. They are the direct result of products designed to hook and monetize America's children. It should not take grieving parents filing lawsuits to hold this industry accountable for their dangerous and addictive product designs. Federal legislation like the Kids Online Safety Act (KOSA), which requires social media companies to have a duty of care when designing their products for America's children, is long overdue. We need lawmakers to step up, put politics aside, and finally protect all children online. Thank you for this opportunity, and I look forward to answering your questions.

Senator Dick Durbin (D-IL):

Thank you, Ms. Bride. Ms. Emma Lembke.

Emma Lembke:

Hello everyone. My name is Emma Lembke. I'm originally from Birmingham, Alabama, but currently I am a sophomore studying political science at Washington University in St. Louis. I am humbled and honored to be here today. I created my first social media account, Instagram, in the sixth grade as a 12-year-old girl. To 12-year-old me, these platforms seemed almost magical, but as I began to spend more time online, I was met with a harsh reality. Social media was not magic. It was an illusion, a product that was predicated on maximizing my attention at the cost of my wellbeing. As my screen time increased, my mental and physical health suffered. The constant quantification of my worth through likes, comments, and followers heightened my anxiety and deepened my depression. As a young woman, the constant exposure to unrealistic body standards and harmful recommended content led me towards disordered eating and severely damaged my sense of self. But no matter the harm incurred, addictive features like autoplay and the endless scroll pulled me back into the online world, where I continued to suffer. And there I remained for over three years, mindlessly scrolling for five to six hours a day. I eventually reached a breaking point in the ninth grade, and I began the long and difficult process of rebuilding my relationship with technology in a healthier way.

Senators, my story is not one in isolation. It is a story representative of my generation, Generation Z. As the first digital natives, we have the deepest understanding of the harms of social media through our lived experiences, but it is from those experiences that we can begin to build the most promising solutions. It is only when young people are given a place at the table that effective solutions can emerge and safer online spaces can be created. The power of youth voices is far too great to continue to be ignored. Through Log Off, I have engaged with hundreds of kids across the globe who have shared their experiences of harm with me. I have listened as young people have told me stories of online harassment, vicious cyberbullying, unwanted direct messages. But most powerfully, I have heard as members of my generation have expressed concern not just for our own wellbeing, but for younger siblings, for cousins, and for all those to come after us.

While our stories may differ, we share the frustration of being portrayed as passive victims of big tech. We are ready to be active agents of change, rebuilding new and safer online spaces for the next generation. Ten years from now, social media will not be what it is today. It will be what members of my generation build it to be. We want to build it differently. We want to build it right. I came here today as the representative for those young changemakers, to be the voice not just of those in my generation who have been harmed or who are currently struggling, but to be a voice for all of those 12-year-old girls yet to come. The genie is out of the bottle, and we will never go back to a time where social media does not exist, nor should we. But make no mistake: unregulated social media is a weapon of mass destruction that continues to jeopardize the safety, privacy, and wellbeing of all American youth. It's time to act, and I urge you, Senators, to take meaningful steps to regulate these companies, not just for my generation, but with my generation. Integrating youth lived experiences into the regulatory process is essential to getting it right. Thank you for having me here today, and I look forward to answering your questions.

Senator Dick Durbin (D-IL):

Thank you, Ms. Lembke. Uh, Ms. DeLaune.

Michelle DeLaune:

Thank you. Good morning, Chairman Durbin, Ranking Member Graham, and members of the committee. My name is Michelle DeLaune, and I am the president and CEO of the National Center for Missing and Exploited Children. NCMEC is a non-profit organization created in 1984 by child advocates to help find missing children, reduce child sexual exploitation, and prevent child victimization. I'm honored to be here today to share NCMEC's perspective on the dangers that are facing children online and how we can work together to address these challenges. We have reached an inflection point in efforts to combat online child sexual exploitation, and we need congressional intervention to pass legislation that I'll be speaking to today. Last year, the NCMEC CyberTipline received over 32 million reports. These reports contained over 88 million images, videos and other content related to child sexual exploitation. To put these numbers into perspective, we're averaging 80,000 new reports each day.

The internet is global, and unfortunately, so is this crime. 90% of the reports that we received last year related to individuals outside of the United States, and the remaining reports, about 3.2 million, related to US individuals. The report numbers are staggering, but the quality of reports is often lacking, and there are significant disparities in how companies report. For instance, companies have no duty to report child sex trafficking or online enticement of children. Some companies choose not to report sufficient information for those cases to be properly assessed and investigated. And some companies choose not to submit the actual images or videos being reported, or any information that could be used to identify a suspect or a victim. And we're just seeing the tip of the iceberg. Very few companies choose to engage in voluntary measures to detect known child sexual abuse material, and those who do proactively look for it make the most reports.

Congress has the opportunity to send a powerful message to victims that they are not powerless to protect themselves when abuse imagery of themselves has been shared online. Currently, child victims have no recourse if a tech company takes no action to remove and report sexually explicit imagery in which they're depicted. At the core of NCMEC's mission is helping children and supporting survivors, and we do a lot to support survivors, but we need Congress to help address the complexities that survivors face in this space. The following legislative measures are urgently needed to support survivors: laws that require that content seized by federal law enforcement from offenders be sent to NCMEC for victim identification and restitution efforts; laws enabling child victims of extortion and enticement to have immunity when reporting their images to NCMEC; laws enabling minor victims to have legal recourse if a tech company knowingly facilitates the distribution of their sexually abusive imagery; regulations to implement the remedies promised to survivors in 2018, when the Amy, Vicky, and Andy Act was passed by Congress; and laws to make sure that we are using the appropriate words when we're discussing these crimes.

Child sexual abuse material, not child pornography. And while we struggle to address the current volume and complexity of online child sexual exploitation, additional threats to child safety online are occurring. When a platform implements end-to-end encryption, no one, not even the platform itself, has visibility into users exploiting children. We believe in a balance between user privacy and child safety. When tech companies implement end-to-end encryption with no preventive measures built in to detect known child sexual abuse material, the impact on child safety is devastating. Several of the largest reporting companies have indicated that they will be moving to default end-to-end encryption this year. We estimate that as a result, two-thirds of reports to the cyber tip line submitted by tech companies will go away, and these reports will be lost simply because tech companies have chosen to stop looking for the material. And we can talk about lost report numbers, but behind every report is a child, and the abuse doesn't stop just because we decide to stop looking for it.

We look forward to working with Congress and other stakeholders on solutions. In closing, NCMEC is proud to support many excellent legislative initiatives from last Congress, including the EARN It Act, the End Child Exploitation Act, and the Preventing Child Sexual Abuse Act. And we look forward to working with Congress to ensure these legislative measures become law in the current term. I thank you for the opportunity to appear before the committee to discuss the protection of children online. We're eager to continue working with this committee, survivors and their families, the Department of Justice, engaged tech companies, and other nonprofits to find solutions to these problems. Because like you, we believe that every child does deserve a safe childhood. I thank you and I look forward to your questions.

Senator Dick Durbin (D-IL):

Thank you Ms. DeLaune. Mr. Pizzuro.

John Pizzuro:

Chairman Durbin, Ranking Member Graham, and distinguished Senators, thank you for this opportunity to testify on protecting our children online today. There are countless victims, infants and children, being raped online, as well as victims of extortion. The sad reality is we're failing to protect our children from the threats they face online. Those who would protect our youth are overburdened and under-resourced, which makes those children vulnerable. I'm here today as the CEO of Raven, an advocacy group comprised of 14 professionals, including nine retired Internet Crimes Against Children (ICAC) task force commanders, who have committed their lives to advocacy and the protection of children. I'm retired from the New Jersey State Police, where I served as the commander of the ICAC Task Force. We witnessed children targeted by offenders across all platforms. No social media or gaming platform was safe, from apps such as Snapchat, Twitter, Kik, Telegram, Discord, LiveMe, and MeetMe to gaming platforms and online games such as Minecraft, Roblox, and Fortnite.

And these just represent a fraction of the places where offenders regularly interact with children. If the platform allows individuals to chat or offers a way to share photographs and videos, I assure you there's a very real danger that offenders are using that access to groom or sexually exploit minors. Children are made vulnerable on these platforms as a result of poor moderation, the absence of age or identity verification, inadequate or missing safety mechanisms, and the sheer determination of offenders. As the New Jersey ICAC commander, I struggled with the significant increases in arrests, victims, and investigations we faced each year. These challenges were frustrating and present for every ICAC task force commander throughout the United States. The most staggering increase we faced was self-generated sexual abuse videos of children ages seven, eight, and nine. The online landscape is horrifying because offenders know this is where our children live, and they recognize there are not enough safeguards to keep them at bay.

The details of these cases shock the conscience. There's no shortage of case reports describing the sexual abuse of 11-year-olds; or the mother who was targeted by an offender because her five-year-old was too young to text but was the age of interest for the offender; or the offender who bought a stuffed animal for the 10-year-old he was going to rape, along with a bottle of Viagra and other sexual devices for when the Viagra failed. Today, law enforcement is no longer able to proactively investigate child exploitation cases due to the volume of cyber tips. As a result of that increase, law enforcement agencies have been forced to become reactive, and most can no longer engage in proactive operations such as peer-to-peer file-sharing investigations or undercover chat operations, which target hands-on offenders. Sadly, most of the investigative leads provided by service providers through NCMEC to the ICAC task forces are not actionable, meaning they do not contain sufficient information to permit an investigation to begin.

The lack of uniformity in what is reported by service providers results in law enforcement being forced to sort through thousands of leads, trying desperately to identify worthwhile cases. Peer-to-peer file-sharing investigations and operations used to allow ICAC task forces to efficiently locate and apprehend hands-on offenders. In the last 90 days alone, there have been a hundred thousand IP addresses across the US that have distributed known images of rape and toddler sexual abuse, yet only 782, less than 1%, are being worked. Right now, the darknet, including Tor, has become the newest online haven for child exploitation. Some forums and boards contain the most abusive child exploitation videos and images law enforcement has encountered. Chat forums allow offenders to create best practices on how to groom and abuse children effectively. There's even a post named The Art of Seduction, explaining how to seduce children, that has been read more than 54,000 times. Based upon what I've experienced, I can confidently tell you three things at the moment: the predators are winning, our children are not safe, and those who are fiercely committed to protecting them are drowning and will continue to do so unless we can get them the resources they need. I thank you for the opportunity to testify here today, and I welcome your questions.

Senator Dick Durbin (D-IL):

Thank you very much. Dr. Prinstein.

Dr. Mitch Prinstein:

Good morning, Chairman Durbin, Ranking Member Graham, and members of the Judiciary Committee. Thanks for the opportunity to testify today. Psychologists are experts in all human behavior, and we've been studying the effects of social media scientifically for years. In my written testimony, I've detailed a variety of caveats, limitations, and clarifications that make it challenging as a scientist to offer causal statements about the effects of social media. In short, online activity likely offers both benefits and harms. Today I want to discuss the specific social media behaviors and features that are most likely to harm, and which youth may be most vulnerable. Unfortunately, some of the most potentially harmful features are built directly into the architecture of many social media applications, and kids are explicitly directed towards them. To date, we have identified at least seven sets of results that deserve more attention to safeguard children from risk. I'll briefly describe these here, but first, it's critical to understand that following the first year of life, the most important period for the development of our brains begins at the outset of puberty.

And this is precisely the time when many are given relatively unfettered access to social media and other online platforms. In short, neuroscience research suggests that when it comes to seeking attention and praise from peers, adolescent brains are all gas pedal with weak brakes. This is a biological vulnerability that social media capitalizes on, with seven psychological implications. First, our data suggest that the average teen is picking up their phone over 100 times and spending over eight hours online a day, mostly on social media. Psychological science reveals that over half of all youth report at least one symptom of clinical dependency on social media, such as the inability to stop using it or a significant impairment in their ability to carry out even simple daily functions. Second, as compared to what kids see offline, data suggest that exposure to online content changes how youths' brains respond to what they see and influences teens' later behavior.

These are psychological and neuroscientific phenomena occurring outside of youths' conscious awareness, suggesting a potentially troubling link between likes, comments, reposts and teens' later risk-taking behavior. Third, although many platforms have functions that can be used to form healthy relationships, users instead are directed to metrics and follow accounts that don't really offer psychological benefits. For this reason, social media often offers the empty calories of social interaction, which appear to help satiate our biological and social needs but do not contain the healthy ingredients necessary to reap benefits. Research reveals that in the hours following social media use, teens paradoxically report increases, rather than decreases, in loneliness. Fourth, data suggest that approximately half of youth experience digital stress, a phenomenon resulting from too many notifications across platforms, a fear of missing important social updates, information overload, and anxiety about whether their posts will be well received. More digital stress predicts increases in depression over time.

Fifth, a remarkably high proportion of teens are exposed to dangerous, discriminatory, and hateful content online. This predicts anxiety and depression among youth, even beyond the effects of similar content they see offline. Sixth, the more time kids are online, the less time they're engaged in activities critical for healthy development, most notably sleep. Sleep disruptions at this age are associated with changes in the size and physical characteristics of growing brains. And last, new evidence suggests frequent technology use may change adolescent brain growth to increase sensitivity to peers' attention and change teens' self-control. So what do we do? First and foremost, we must increase federal funding for this research. $15 million will not move the needle. The funding for this work should be commensurate with our commitment to protect children. Second, parents and teens must become better educated about these emerging research findings. Recently, more than 150 organizations led by APA called on the Surgeon General to create and distribute teaching resources so families could minimize risks and maximize benefits from social media.

Third, more must be done to protect youth who belong to traditionally marginalized communities. Warnings on harmful, illegal, hateful, and discriminatory content should be mandated, yet content and spaces scientifically proven to offer social support and vital health information to members of these communities must be saved. The manipulation of children to generate a profit is unacceptable. The use of children's data should be illegal, and the use of psychological tactics known to create addiction or implicitly influence children's behavior should be curtailed. Social media companies should be compelled to disclose both internal and independent data documenting potential risks that come from the use of their products, so parents, teens, and regulators can make informed decisions. APA is heartened by the focus on mental health in Congress and eager to work with this committee to develop legislation and help enact bills that will protect children. Your actions now can make a difference.

Senator Dick Durbin (D-IL):

Thank you, doctor. Mr. Golin.

Josh Golin:

Thank you, Chair Durbin, Ranking Member Graham, and distinguished members of the committee, for holding this important hearing. My name is Josh Golin and I'm executive director of Fairplay, an organization committed to building a world where kids can be kids, free from the harmful manipulations of big tech and the false promises of marketers. We advocate for policies that would create an internet that is safe for young people and not exploitative or addictive. You've heard today from witnesses about a litany of online harms that have had a devastating toll on families in our society. These harms share a common nexus: big tech's business model and manipulative design choices. Digital platforms are designed to maximize engagement, because the longer they capture a user's attention, the more money they make by collecting data and serving targeted ads. As a result, children are subject to manipulative design and relentless pressure to use these platforms as often as possible.

Over a third of teenagers say they are on social media almost constantly. Overuse of social media displaces critical offline activities like sleep, exercise, offline play, and face-to-face interactions, which in turn undermines children's wellbeing. Big tech's profit-driven focus on engagement doesn't just harm young people by fostering compulsive overuse. It also exploits their developmental needs, often at the expense of their safety and wellbeing. For example, displays of likes and follower counts, which take advantage of young people's desire for social approval, invite harmful social comparisons and incentivize interactions with strangers and the posting of provocative and risqué content. Additionally, algorithms designed to maximize engagement fill young people's feeds with curated content that is most likely to keep them online, without regard to the user's wellbeing or potentially harmful consequences. So on platforms like Instagram and TikTok, depressed teens are shown content promoting self-harm, and young people interested in dieting are barraged with content promoting eating disorders.

A report last year from Fairplay detailed how Meta profits from 90,000 unique pro-eating disorder accounts on Instagram that reach more than 6 million minors, some as young as nine. How did we get here? For one, the last time Congress passed a law to protect children online was 25 years ago. The digital landscape has changed dramatically in unforeseen ways since the passage of the Children's Online Privacy Protection Act. And that law only covers children until they turn 13, leaving a significant demographic vulnerable to exploitation and harm. Consequently, the social media platforms that define youth culture and shape our children's values, behavior, and self-image were developed with little to no thought about how young people might be negatively affected. At this point, it is clear that tech platforms will not unilaterally disarm in the race for children's precious attention. Nor can we expect young people to extract themselves from the exploitative platforms where their friends are, or expect overworked parents to monitor every moment that their kids are online.

We need new legislation that puts the brakes on this harmful business model and curbs dangerous and unfair design practices. Such legislation should: one, extend privacy protections to teens to limit the collection of data that fuels harmful recommendations and puts young people at risk of privacy harms; two, ban surveillance advertising to children and teens to protect them from harmful marketing targeted to their individual vulnerabilities; three, impose liability on companies for how their design choices and algorithms impact young people; four, require platforms to make children's privacy and account settings the most protective by default; and finally, impose transparency requirements, including access to algorithms, that enable outside researchers to better understand how social media impacts young people. Last Congress, the Kids Online Safety Act and the Children and Teens Online Privacy Protection Act, two bills which together would do all five of these things, advanced out of the Commerce Committee with broad bipartisan support.

The committee votes followed a series of important hearings in the Senate Judiciary and Commerce committees, as well as the House that established a clear record of harm and the need for new online protections for young people. We've named the problem and debated the solution. Now's the time to build on last year's momentum and disrupt the cycle of harm by passing privacy and safety by design legislation. Let's make 2023 the year that Congress finally takes a huge step towards creating the internet children and families deserve. Thank you so much for having me here today, and I look forward to your questions.

Senator Dick Durbin (D-IL):

I want to thank all the witnesses and, uh, as you noted, some of the members are going to vote and will return. At the bottom, uh, of this discussion, from the legal point of view, is Section 230 of the Communications Decency Act, which I'm sure you're all aware of, as to the liability of these companies for the speech that is broadcast or is exercised over their, uh, social media. It provides that companies will not be treated as the publisher or speaker of any information provided by another person; gets 'em off the hook. Uh, the EARN It Act, which we are debating here, would change that ballgame: unless there is a provable effort by these companies to police their own, uh, product, uh, they would be exposed to liability. And I will tell you, as a former trial lawyer, uh, I invite them to take on, uh, the media that ignore that responsibility after the EARN It Act is enacted into law. I hope that will be soon. Ms. DeLaune, when you told the story about encryption, uh, inhibiting the cyber tips that come your way, uh, I couldn't help but be struck by the numbers that you used. Last year, 32 million cyber tips were sent to NCMEC, your organization, concerning child sex abuse material. Upwards of 80%, or 25 million, of those would be lost if the companies adopt end-to-end encryption. Would you bring that explanation down to a level where liberal arts majors are with you?

Michelle DeLaune:

Absolutely, Senator. Thank you. Again, end-to-end encryption serves a very important purpose, but end-to-end encryption with no mitigation strategy for the detection of known child sexual abuse imagery is unacceptable. What we have seen is that the vast numbers of reports to the cyber tip line are because a handful of companies have voluntarily chosen to look for and seek out known child sexual abuse material. By simply turning off the lights and no longer looking, the abuse doesn't go away; the abuse continues. Just nobody is able to actually investigate, intervene, and help a child. Uh, you know, we really support a balanced approach. There are disagreements and discussions among many stakeholders regarding how end-to-end encryption can balance user safety and user privacy without having children as collateral damage. Um, you know, we also want to speak to the privacy of the children who are depicted in the imagery that is continuing to be circulated. These are images, as Mr. Pizzuro mentioned, of children being sexually abused and raped. They also are entitled to privacy. So we do look for a balanced approach that will help support user privacy and not leave children as unfortunate collateral damage.

Senator Dick Durbin (D-IL):

Let me, uh, open another subject for inquiry, and that is the statement by Dr. Prinstein and Mr. Golin, which kind of reflects, Ms. Lembke, on your decision at a very young age to do something about what you considered to be a problem. I'm trying to square this: the possibility of diverting people from conduct which apparently is almost addictive in its nature and moving them to a different level. Can you comment on that?

Emma Lembke:

Yes, sir. And Senator, thank you for your question. I think what is important to note is that social media is not all bad. Members of my generation understand it to be a multifaceted entity, one where we can connect with each other, we can explore our identities, and we can express ourselves on a new dimension. The difficulty, though, of reaping these benefits in these online spaces as they are right now is what the status quo creates: I, a 12-year-old girl, could go onto Instagram and research a healthy recipe, and within seconds be fed pro-anorexia content. There are steps that companies can take to place meaningful safeguards so that this content does not harm young people, and so that we can begin to go into these online spaces in a safer and more productive manner, reaping the benefits of a technological era.

Senator Dick Durbin (D-IL):

Dr. Prinstein, your comment on that?

Dr. Mitch Prinstein:

I agree. The adolescent brain is built to develop dopamine and oxytocin receptors in an area of the brain that makes us want to connect with peers, and it feels really good when we do. The area of the brain that stops us from engaging in impulsive acts, called the prefrontal cortex, does not fully develop until the age of 25. So from 10 to 25, kids' brains are built in such a way as to make them crave the exact kind of content that social media can provide, with like buttons and reposts, but they are biologically incapable of stopping themselves from incessant use of these platforms. That vulnerability is being exploited by these platforms.

Senator Dick Durbin (D-IL):

And the question is whether or not on their own kids can solve the problem. Do they need help?

Dr. Mitch Prinstein:

They need help.

Senator Dick Durbin (D-IL):

What kind of help?

Dr. Mitch Prinstein:

Reminders telling kids that they've been on for longer than they intended; helping kids to stop the signals that are coming through social media in the form of likes, reposts, and algorithms that are showing them content, feeding them the next video, feeding them the next post. Those are all actually making things much worse from a neuroscientific perspective. If there were controls in place that were age-based, to make sure that kids were being blocked from engaging in this unbridled kind of craving for social attention and dopamine responses, that could significantly address the issue.

Senator Dick Durbin (D-IL):

Thank you. I'm gonna recognize, uh, Senator Grassley and then Senator Coons is gonna preside as I make a dash to vote and return. So Senator Grassley, floor's yours.

Senator Chuck Grassley (R-IA):

Thank you, Mr. Chairman. Thanks to all your witnesses. I'm sorry I missed your testimony, for reasons that have already been explained to you. Uh, I'm glad that we're here, uh, discussing this very important issue today. Uh, I happen to be a father, grandfather, and great-grandfather, but regardless, we all gotta be, uh, with this worthy cause that we're discussing today. Congress has and will continue to play a crucial role. Unfortunately, Congress has had to intervene at times in the past. Uh, just wanna remind people of the Larry Nassar, uh, case, dealing with young girls and the botched investigation by the FBI. Senator Ossoff and I got a bill passed that would further provide federal intervention in the case of those crimes being committed, if they're committed outside the United States by somebody following young people to international meets.

It's also important to hold online service providers accountable for keeping our children safe. The EARN It Act, which I was an original co-sponsor of last year, ensures online service providers that fail to crack down on certain content are not able to escape because of Section 230, uh, intervention. And protecting children, uh, online also means combating human trafficking, and Senator Feinstein and I have passed, uh, legislation in that area as well. Of course, it's impossible to discuss protecting children online without pointing out the unfortunate role that social media and the internet play in the drug overdose deaths among our children. And I look forward to discussing the strategy to prevent those.

So I'm going to go to Mr. Pizzuro first. Uh, recently, uh, an Iowa family lost their daughter because she bought a fake prescription pill from a drug dealer on Snapchat. Uh, it contained fentanyl. Her family is suing to try and hold Snapchat accountable. One particular allegation is that Snapchat's algorithms connected their child with a drug dealer who she did not know previously, which I would find especially disturbing. So for you, uh, to the best of your ability, can you explain to this committee how Snapchat's algorithms protect children against drug dealers?

John Pizzuro:

Thank you, Mr. Grassley. As far as Snapchat and the algorithms, I'm not a hundred percent sure how Snapchat is doing it, but I can talk to the broader experience of cell phone usage as far as apps and drugs. Because whether it's narcotics, whether it's child exploitation, whether it's pictures and videos, whether it's emojis, everything is done through that social media; again, that's where children are. So it's very easy to target them specifically in those realms. So I think a lot of times you're going to have that, again, whether it be fentanyl, whether it be marijuana; it doesn't matter the drug, but the scope is where I can target those individuals, and the offenders, as well as the individuals selling, know that.

Senator Chuck Grassley (R-IA):

You said you couldn't speak specifically to Snapchat, so I was gonna ask you what, uh, that, uh, social media needs to do differently to stop what's happening. But you could answer the second part of that question: what can the government do better?

John Pizzuro:

Well, the government can do a lot better as far as what we're talking about today. Uh, we need a little bit more, uh, first of all, uniformity, age identification, identity verification. There's a lot of times where the users... tomorrow I can go get a phone and be whoever I want to be. I can get a phone, I can create an app, I can create a fake email address, and then use it for, for whatever reasons I need to. So from that perspective, from the tech companies we need a little bit more of that moderation, in that aspect of who's on what end of the phone.

Senator Chuck Grassley (R-IA):

My next and last question will be to Ms. DeLaune, if I'm pronouncing your name right. Technology created these problems, and technology advances will be essential to fighting these problems in the future. So can you tell me about the tools available today to address the online dangers to children, and what more should social media and online platforms do to protect children?

Michelle DeLaune:

Thank you, Senator. There are various initiatives and technologies that are being used by some social media companies, uh, certainly not all. Uh, and because of these tools, such as searching for known child sexual abuse material, companies are able to surface it. Uh, there are other companies that are voluntarily, uh, choosing to look for online enticement and instances where children might be extorted online, where, uh, offenders target them for imagery or for financial gain. Uh, there's an important aspect of companies being transparent about what tools they're using, not only for the consumer to understand what platforms are doing, but also to share with one another what the best practices are. When everyone is speaking freely, we're able to see not only what works, but also what significant gaps still exist.

Senator Chuck Grassley (R-IA):

Thank you very much. Senator Coons.

Senator Chris Coons (D-DE):

Thank you. And, um, thank you to Chair Durbin and to Ranking Member Graham for both convening this hearing and for your ongoing work to find a bipartisan path forward. Ms. Bride, Ms. Lembke, thank you for your testimony today, uh, and for making clear and purposeful what we all know, which is that, uh, far too many Americans are spending time on social media, and in particular for young Americans, it can have harmful, even destructive or toxic impacts. Um, we have limited research about exactly what the effects are of the design choices that social media platforms are making on childhood development and on children's mental health. We all know they design their platforms to hold our attention longer and longer, and we know from your testimony and many of us through, uh, personal exposure that it is not helpful, but we need to better understand why it's harmful and how it's harmful so we can craft solutions that'll move us forward.

Uh, I've worked with Senators Klobuchar and Cassidy on a bipartisan bill, the Platform Accountability and Transparency Act, that would make social media companies work with independent researchers to validate and ensure that we understand how these platforms impact our children. Uh, the, um, Surgeon General of the United States came and spent a day with us in Delaware, visited a youth center, and listened to some of our youth from Delaware and some mental health professionals and public health professionals talk about this nationwide public health crisis. Dr. Prinstein, um, you call in your testimony for greater transparency and reporting requirements for social media companies, including better data access for researchers. What kinds of questions about children's mental health would we be able to answer with greater data access? What data do researchers need that they don't currently have access to, and what are the barriers to their access?

Dr. Mitch Prinstein:

Thank you so much for your question. There are numerous barriers. We don't have the funding to be able to do the research that we need to do. We actually find that a number of, uh, academics who are pursuing a career in research on social media are recruited by social media companies themselves, which, uh, offer salaries that make it very hard to compete in an academic environment. Um, the data that social media companies have would allow for a better exploration of exactly what it is that kids are viewing, how they're using social media, what they're seeing, and how that's related to future behaviors, including when they log on, what they share, and how they share that information. It would be tremendously valuable for scientists to be able to understand those questions and link them specifically to mental health. In fact, there is no such access right now, which is severely hindering our ability to work scientifically in this area.

Senator Chris Coons (D-DE):

Thank you. Mr. Golin, you also call for Congress to implement transparency requirements to allow independent researchers to better understand the impact of social media on young Americans. The Platform Accountability and Transparency Act would require platforms to disclose information about how their algorithms actually operate so that we could conduct that research in a reliable and, um, stable way. Do you agree this would help parents ultimately to make better informed decisions about the social media products their children consume?

Josh Golin:

I think transparency and researcher access is a critical piece of the equation. We shouldn't have to rely on courageous whistleblowers like Frances Haugen to understand what the companies already understand about how these technologies are impacting our children. So I think it's incredibly important that we have, uh, that we have transparency requirements and researcher access. I will say, though, um, that we can't stop there. We need also, at the same time, to have a duty of care for these platforms to limit their data collection and the, what they're doing with that data. So I wouldn't want to see transparency be a, you know, kicking the other policies down the road. We need to limit what the platforms are doing at the same time that we get a view into what they're doing.

Senator Chris Coons (D-DE):

I agree with you. Um, look, many of us have the strong sense, based on testimony we hear, based on our own experience as parents and community leaders, um, that this, as Senator Blumenthal called it, this toxic experiment on our children is going badly wrong. I look forward, uh, to joining in support of the Kids Online Safety Act, for example. Um, but I also think we need to get underway, uh, with better funded, broader spectrum research, um, so we know exactly what is happening and what isn't, and how we can fine tune our responses. Mr. Pizzuro, if I might, uh, I appreciate your work, uh, to protect children by leading New Jersey's Internet Crimes Against Children Task Force. What were the biggest problems you faced, uh, when investigating leads generated by cyber tips? And how can Congress provide resources or improve the quality of those investigations?

John Pizzuro:

Well, there's a lack of uniformity. Uh, so what would happen is that there's so many tips. So, like New Jersey, for example, I think this year had 14,000; when I was there in 2015, it was 2,000. And the challenge is, is that there are tips within that that will result in a significant arrest. But the challenge is the volume, and the ESPs, the providers that are actually giving us that information, do not give us that information. And if you go from a tip perspective, if I asked everyone in here who had an iPhone, we don't get any tips from Apple, right? So that's now double that. So I think those are the challenges. We need to have that better information. We need to have visibility where we can actually protect witnesses.

Senator Chris Coons (D-DE):

Last question, if I might. Ms. DeLaune, in your testimony, um, you said most sextortion offenders are located outside the U.S. You mentioned particularly Nigeria and Côte d'Ivoire. How could we better work with international partners, uh, and law enforcement to combat this growing problem?

Michelle DeLaune:

Thank you, Senator. Uh, yes. The problem with sextortion, we're seeing a rapid increase of exponentially more reports now regarding children who are being targeted, uh, for money. Uh, it's aggressive. We talk to these victims, we talk to their parents on the phone, and it's heartbreaking. Uh, there has been a coordinated effort amongst law enforcement, um, to identify where these, uh, offenders are coming from. This is an organized crime syndicate. Certainly there are offenders, uh, all around the world. We are seeing that there's, uh, a criminal component, um, uh, with, uh, Nigeria and Ivory Coast, um, in some instances. Uh, and we're also working with the tech companies because the tech companies, it, it takes all partners here to be able to find the solution and sharing elements between companies. Um, because offenders and children move from platform to platform, it's really important to be able to share that information so we can stop, intervene, make an adequate, uh, uh, uh, good report that law enforcement would then be able to safeguard a child and hopefully hold an offender accountable.

Senator Chris Coons (D-DE):

Thank you. Thank you all very much for your testimony. Senator Graham.

Senator Lindsey Graham (R-SC):

Thank you all. It's been a very, very helpful, uh, hearing, uh, Ms. Bride, after the tragic loss of your son, you complained to certain apps, uh, that allowed, um, bullying without naming who the person was. Is that correct?

Kristin Bride:

Yes, Senator.

Senator Lindsey Graham (R-SC):

And what response did you get?

Kristin Bride:

I reached out to Yolo, the anonymous app that was used to cyberbully my son. I told them what happened to my son, and I asked them to follow their policies, which required that they reveal the identity of the cyberbully. And I was ignored all four times.

Senator Lindsey Graham (R-SC):

Okay. So, uh, you filed a lawsuit against these products, is that correct? Yes. Uh, you're alleging they were unsafe, Dr. Prinstein, is that right?

Dr. Mitch Prinstein:

Yes. Uh, I believe there are a number ...

Senator Lindsey Graham (R-SC):

Let me ask a question first. Sorry. Do you believe these products aren't safe, uh, the way they're configured today for children?

Dr. Mitch Prinstein:

The research is emerging, but we have a number of reasons to think that some of the features, uh, that are built into social media indeed are, uh, conferring harm directly to children.

Senator Lindsey Graham (R-SC):

Are you recommending to the committee that these, uh, social media companies put warning labels on their products like we do with cigarettes?

Dr. Mitch Prinstein:

I don't think that would hurt at all.

Senator Lindsey Graham (R-SC):

Okay. Uh, back to Ms. Bride. So you sued and you were knocked outta court 'cause of Section 230, right? Yes. Okay. So how many of you, or, Dr. Prinstein, do you, are you a practicing psychologist, psychiatrist?

Dr. Mitch Prinstein:

I'm not. I'm a clinical psychologist. I'm not practicing at the moment.

Senator Lindsey Graham (R-SC):

Okay. Do you have a license?

Dr. Mitch Prinstein:

I do.

Senator Lindsey Graham (R-SC):

Now, how many of you have a driver's license now that can be taken away from you if you do certain things? Are any of these social media companies licensed by the government? The answer is no. Is it pretty clear that Section 230 prevents individual lawsuits against these social media companies? Everybody's nodding their head. Is there any regulatory agency in America that has the power to change the behavior of these companies in a meaningful way? The answer is no. Are there any statutes on the book today that you think can address the harms you've all testified regarding?

The answer is no. You can't sue 'em. There's no agency with the power to change their behavior, and there's no laws on the books that would stop this abusive behavior. Is that a fair summary? Where we're at in 2023? All the witnesses nodded. Do you think we can do better than that? Isn't that the reason you're here? The question is, why haven't we done better than that? Uh, Senator Blumenthal and I had a bill that got 25 votes on the Judiciary Committee. There are 25 of us. I can't think of any subject matter that would bring all 25 of us together. So, Mr. Chairman, in spite of all of our differences, let's make a pledge to these people. Ms. Lembke, what, how do you say your last name?

Emma Lembke:

Lembke.

Senator Lindsey Graham (R-SC):

Do you believe that your generation particularly has been let down?

Emma Lembke:

Yes, Senator. I do.

Senator Lindsey Graham (R-SC):

And you worry about future generations even being more harmed?

Emma Lembke:

Yes, sir. Every day.

Senator Lindsey Graham (R-SC):

The behavior that we're talking about is driven by money. In terms of social media, the more eyes, the more money. Is that correct? So the financial incentive of the social media companies exists today to do more of this, not less. Everybody nodded in the affirmative. Mr. Pizzuro, you said, of the platforms that sexual predators use, is Twitter one of them?

John Pizzuro:

Yes. Every platform. There's, I don't think there's a platform that I haven't seen used.

Senator Lindsey Graham (R-SC):

Okay. So if we did a regulatory consumer protection agency to hold these people accountable, would that be a step in the right direction?

John Pizzuro:

I believe so, yes.

Senator Lindsey Graham (R-SC):

If we change Section 230 to allow more consumer pushback, would that be a step in the right direction? Everybody nodded. If we pass the Online Child Protection Act and the EARN IT Act, would that be a step in the right direction? Everybody nodded. Mr. Chairman, we know what to do. Let's just go do it.

Senator Dick Durbin (D-IL):

Thank you, Senator Graham. And I accept the invitation. I might add that the Commerce Committee has jurisdiction on this issue too. And I've spoken to Senator Cantwell; she shares this sentiment. Uh, wouldn't it be amazing if Congress could do something on a bipartisan basis? And, uh, why not start here? So let's continue with this hearing and with some resolve. Senator Blumenthal.

Senator Richard Blumenthal (D-CT):

Thanks, Mr. Chairman. And I, I want to add, again, my thanks to Senator Graham for his partnership on the EARN IT Act. Uh, we've worked together on this measure that recognizes the excessive breadth of Section 230, and the idea of the EARN IT Act is very simple: that if any company wants to have any defense or immunity against legal action, it has to earn it, it has to earn it. That's why we named it the EARN IT Act. And it is a beginning, it's a step, not a stride, but it will mark major progress if we are able to pass this measure. And I am grateful to the chairman for his support. Senator Grassley, for his, uh, I'm gonna embarrass myself a little bit. Uh, I began working on this problem when Big Tech was Little Tech, and NCMEC was so importantly helpful in this effort, and it has continued.

So I want to thank NCMEC for your continued support and work in this area. And, uh, to Emma Lembke, uh, Log Off is exactly what we need. And I'm going to go a little bit outside my lane here and suggest that we have you and a number of your supporters and members back here, and that we do a little lobbying, with you talking to my colleagues, which I think will overcome the massive number of lobbyists and lawyers that Big Tech now has. And, uh, you know, Kristin, you have been such an eloquent and moving advocate, but so have many of the other parents who've sat with Senator Blackburn and me, and those conversations and meetings have been some of the most really powerful moments. So I would invite you again to come back. I know that for both of you and for others in this position, it's difficult to do because you are reliving that pain.

You are going through that loss. And so I, I want to thank you for your continuing effort, and I'd like to invite you back too. Uh, the EARN It Act and the Kids Online Safety Act are the least we can do, the very least we can do to help begin protecting against big tech. And, uh, the chairman has suggested that maybe we'll have big tech come back. Uh, frankly, I'm less interested in big tech's words than big tech's actions because they've said again and again and again, oh, well, we're for regulation, but just not that regulation. And if it's different regulation, oh, well, that's not quite it either. Um, so we're gonna continue this work. And, uh, my thanks to everybody who is here today. I want to ask, uh, Dr. Prinstein, uh, because this report that the CDC came out with today talks not only about girls and the crisis they are going through in this country, but also about LGBTQ+ young people and how they particularly are going through this crisis. Could you describe for the committee how the addictive and harmful content affects them, maybe more than others, either through bullying or other kinds of, uh, toxic content driven at them?

Dr. Mitch Prinstein:

Absolutely. Thank you. The LGBTQ+ community is experiencing a disproportionate amount of mental health issues, particularly related to the stress that they experience as a marginalized or minoritized group. Um, they are also experiencing a much higher rate of self-harm and suicide compared to others. Um, the research on social media has demonstrated a remarkably high proportion of posts that are discriminatory or hateful, either to the entire LGBTQ+ community or to individuals based on their LGBTQ+ status. So it's very important to recognize that, that online discrimination does have an effect on mental health directly. It is important, however, to recognize that the online community also provides vital health information and does provide social support that can be beneficial to this community. So, um, it's a complex situation, but one that deserves tremendous attention. Thank you.

Senator Richard Blumenthal (D-CT):

Thank you. Thank you to all the panel for being here today. Thanks, Mr. Chairman.

Senator Dick Durbin (D-IL):

Senator Blumenthal. Senator Cornyn.

Senator John Cornyn (R-TX):

Thank you, Mr. Chairman. Thank you to the witnesses for being here today. Um, as I've been listening to the testimony, it's just another reminder of how frustrating and maddening and, frankly, infuriating it is that Congress has been unable to deal with this in a more timely and a more targeted manner. Um, but I'm also reminded of the fact that technology does not move at the speed of legislation. And it seems like, uh, the people who profit from this, uh, from this technology, um, these apps, uh, are, uh, very adaptable to whatever obstacle, whatever, uh, penalty that, uh, Congress might impose. But, Mr. Pizzuro, I think it was you, uh, that made a comment. It really jumped out at me. You said we ought to make use of children's data illegal. Did you say that?

John Pizzuro:

I'm, I'm sorry, Senator? Uh, no, I didn't.

Senator John Cornyn (R-TX):

Excuse me, doctor, you said that. Okay. And in, in thinking about the, the model, uh, the business model of these apps, they're primarily designed to, uh, hoover up data, including personal data, and then use that data then to, uh, apply algorithms to it to provide additional enticement or encouragement for, to continue using that, uh, app. Is that correct, doctor?

Dr. Mitch Prinstein:

Yes. Yes, it is.

Senator John Cornyn (R-TX):

And so if we were able to figure out how, how to make, uh, use of a minor's data illegal and had appropriate penalties that would attack the business model, uh, and go after the people who profit from, um, from this technology, correct?

Dr. Mitch Prinstein:

I believe so.

Senator John Cornyn (R-TX):

Well, maybe there's something, um, fairly straightforward we could do in that area. Uh, because as I said, obviously legislation moves very slowly, and, and these, uh, and the people who profit and benefit from this sort of technology are very adaptable and are moving at a much different speed than we do. Um, Ms. Bride, we all, um, grieve with you over the loss of your, of your son. But in listening to your testimony, it seems to me that you did just about everything that a parent might do to protect your child, and yet you weren't able to completely protect him from the cyberbullying. Can you talk a little bit more about the role of parents in protecting their children? And are there other things that parents should do that you weren't able to do or didn't occur to you at the time?

Kristin Bride:

Thank you for the question, Senator. Um, yes, parents absolutely have a role, like we took, in talking to their kids about online safety and managing screen time. But we're at a situation right now where, if I can give you all a visual, it is like a fire hose of harmful content being sprayed at our kids every day. And it's constantly changing. And I wish I could testify and say all you have to do as a parent is these five things, and you can hand the phone over and, and, and your kid will be safe. But that would be irresponsible of me. And this is why we need to go back to the source. The source of the harm is the social media companies and their dangerous and addictive products that are designed to keep our kids online as much as possible. And in the example of anonymous apps, what better way to keep kids online than to let them, in a public forum, say whatever they want to each other without their names attached?

Dr. Mitch Prinstein:

Okay,

Senator John Cornyn (R-TX):

Dr. Prinstein, you make the point about, uh, needing more investment in, um, in mental health studies and, um, and, uh, resources. You're probably aware of this, but I'll just remind you and remind all of us that in the, uh, bipartisan Safer Communities Act that Congress passed last summer, we made the single largest investment in community-based mental healthcare in American history, uh, together with additional resources for schools. In that context, it was in the wake of the shooting at Uvalde and the obvious, uh, failure, uh, of the, uh, mental health safety net, such as it exists, um, to, to deal with, uh, young men, in this case, who, uh, fit a dangerous profile of self-harm or harm to others. But could you speak briefly to the, to the workforce challenges? If we make these huge investments in, uh, mental health care, uh, we need people to be able to provide that care, uh, trained professionals and other associated, uh, professionals. And where are we today in terms of providing, uh, that sort of trained workforce to deal with the need?

Dr. Mitch Prinstein:

Thank you so much for the investments that you all have made so far. Unfortunately, it's just a start. The federal government currently funds the training of physicians at a level 750 times more than the amount that's invested in mental health professionals. The CDC report that you just saw, and, uh, a number of senators have discussed, is likely a direct product of that disparity. It's highly critical that we are funding psychologists and other mental health providers with the same commitment and at the same level that we do our physician workforce and think about physical health. Also, thank you for noting the slowness of our progress in the social media area as compared to the rapid way in which social media changes. This is also why a commitment to research on the effects of social media on mental health is so urgent now, because for us to do a study to learn how social media will affect kids over many years, it will take many years to do that study. So we must start immediately investing, uh, much more in that research. Thank you.

Senator Dick Durbin (D-IL):

Thanks, Senator Cornyn. I'd also like to recognize the presence of, uh, former House Democratic Leader Dick Gephardt and former Lieutenant Governor Healey of Massachusetts for being here today, and their work on the Bipartisan Council for Responsible Social Media. Thank you for joining us. Senator Whitehouse.

Senator Sheldon Whitehouse (D-RI):

Thank you. And let me double down on that welcome to Governor Healey, for her work as Attorney General. Because my questions are gonna be about the legal situation here. Ms. Bride, you mentioned in your testimony that your class action lawsuit was thrown out, um, in large part because of Section 230 immunity. Is that correct?

Kristin Bride:

Yes, that is correct.

Senator Sheldon Whitehouse (D-RI):

So, we're having kind of a bipartisan moment here today, with the Blumenthal-Blackburn legislation, with the Durbin-Graham hearing, and I would be prepared to make a bet that if we took a vote on a plain Section 230 repeal, it would clear this committee with virtually every vote. The problem where we bog down is that we want 230-plus: we wanna repeal 230 and then have X, Y, Z, and we don't agree on what the X, Y, Z are. I would encourage each of you, if you wish, to take a moment when the hearing is over and write down what you would like to see with respect to Section 230. If this is not your area, fine, don't bother. Um, would you be happy with a flat Section 230 repeal? Would you like to see Section 230 repealed with one, two, or three other things added?

What would your recommendations be? As we look at this, it strikes me that when you repeal Section 230, you revert to a body of law that has stood the test of hundreds of years of experience, hundreds and thousands of trials in courtrooms, uh, around the country. And we know pretty well how to deal with it. And we've also had the experience of honest courtrooms being very important when powerful forces full of lies need to be brought to heel. And, um, nobody knows better than Dick Blumenthal the tragedy of the families of Sandy Hook and the lies that were told about what took place that day. And it took an honest courtroom to hold the prime liar in all of that accountable. And there is a lot of lying told about the Dominion Corporation, and it took an honest courtroom trial, still underway, discovery still happening, but in the honest courtroom, you have the chance to dig down and see what were the lies and who should be held accountable, rather than just have it all be fought out in the noise of the internet and the public debate.

So to me, it seems like an enormous amount of progress would be made if we would repeal Section 230. And, um, your thoughts on that from each of you would be very compelling. If there's something somebody would like to say right now, I've got two minutes left and you're welcome to jump in, but I mean, if you just can't hold back and you've got your answer ready, but I'd really be interested in the considered judgment of anybody who would care to answer about what the world would look like if Section 230 weren't there. Ms. Bride?

Kristin Bride:

Thank you, Senator. I would like to see, at a minimum, Section 230 repealed to the point where these companies can be held accountable for their own policies that lure kids into their products. Like in the case of the anonymous apps: we monitor for cyberbullying, and we reveal the identities of those who do so. If you have that policy as a company, you need to be able to follow it, like every other industry in America.

Senator Sheldon Whitehouse (D-RI):

Thank you. Yeah. The things we're looking at, I think most closely here are, first, the company owns its own policies and ought to be accountable for them. That has nothing to do with something that pops up and then gets put on a platform. And when should they be accountable for what's on the platform? These are the basic operating systems designed by them of their platform, and they should own that period, end of story. And the other is when they're on notice, when something is up on their platform and they know perfectly well that it's up there and they know perfectly well that it's dangerous and they don't bother to deal with it responsibly because they know that they won't be held accountable. They can do whatever they please to try to generate clicks off even dangerous content. So those are the areas we're looking at. And, um, I look forward to hearing the advice, uh, from this terrific panel. And I want to thank, uh, Chairman Durbin and Ranking Member Graham for hosting this. Um, Senator Blackburn had stepped out and returned. Now let me just say, uh, thank you to her and to Senator Blumenthal for your terrific work together.

Senator Dick Durbin (D-IL):

Thank you, Senator Whitehouse. Senator Blackburn, you're next.

Senator Marsha Blackburn (R-TN):

Thank you, Mr. Chairman. And thank you to each of you. We are glad you're here, uh, for everyone on the panel, and you can just give me a thumbs up. And I am making the assumption that you all support the Kids Online Safety Act. Okay. The record will reflect y'all are all for it, and we appreciate that; we think it is necessary. Thank you to each of you for your testimony and also for your advocacy. We appreciate this.

Ms. DeLaune, I wanna come to you, if I may, um, on the End Child Exploitation Act, um, that I had filed last Congress, and we have this back up again. Um, this is something that we've done because what we realize is the necessity for, for child exploitation to be reported to NCMEC's cyber tip line. And the bill unanimously passed through the Senate last year, and we are hopeful to get it finished. So give me just about 30 seconds on why this bill is important.

Michelle DeLaune:

Thank you, Senator, and thank you for your leadership, um, on this particular act. Sure. One of the most important components is the extension of the retention, uh, period. Uh, many of the ESPs, obviously, uh, when they're making reports to us, the tech companies, from the moment they make the report, there is a 90-day retention period during which the companies agree to hold that material in case law enforcement chooses to serve legal process and gather more details. As we've demonstrated, um, with the exponential growth in numbers and the number of law enforcement leads that we are sending out, it is simply not enough time for law enforcement to be able to assess a report and determine whether or not an investigation must ensue. So extending the, the data retention is an important part of this act.

Senator Marsha Blackburn (R-TN):

And that was a wonderful suggestion that came to us from advocates, to extend that, because it takes longer sometimes for individuals to come forward and for law enforcement to piece that together. And the goal is to keep our children safe. Yes. So we appreciate that. Ms. Bride, I wanna come to you, um, again, and as always, we know how you grieve your loss, and our sympathies are with you, but also our action to get something done. Let's talk about fentanyl and the impact of fentanyl, uh, and the way children have met dealers, whether it's on Instagram, TikTok, Snapchat, YouTube. Um, we have worked on this issue about how these platforms need to be held accountable for the illegal activity that is taking place. And you spoke beautifully about Carson and the bullying that was taking place with him. But we also know from other parents that you and I have met with that there's the introduction to drugs, the, uh, acquaintances they think are children, and then they find out that they're being groomed to be pulled into using drugs or being groomed to be pulled into sex trafficking. And that is one of the dangers that are there, that luring in, that addiction of social media. And Emma, you spoke so well to that, and we thank you. Um, but let's talk a little bit about how we should be protecting children from meeting these drug dealers and pushers and traffickers online, and how easy it has become for these people to impersonate children and to then ruin the lives of our children.

Go ahead. I'd like for you to speak to that. I know your advocacy is in that vein.

Kristin Bride:

Thank you, Senator. When we have met with other parents, and you've been in the room as well, we have parents who have lost their children to fentanyl-laced drugs. And the frustration with them is they also can't get the drug dealers taken off the platform. Um, I think I would defer to somebody else on this topic to speak, um, as that's not my specific area of expertise.

Senator Marsha Blackburn (R-TN):

Ms. Bride, let me ask you this. And for any of you, uh, for parents that have lost their kids to drug dealers, do any of you know of a drug dealer that has been apprehended, charged, indicted, convicted? No.

Isn't that amazing? It goes back to Senator Graham's point that something needs to be done about this. They're using social media as their platform. Uh, Dr. Prinstein. Oh, Mr. Chairman, my time is out. I guess I will need to yield back to you. I had one more question.

Senator Dick Durbin (D-IL):

Thank you Senator. Senator Hirono.

Senator Mazie Hirono (D-HI):

Thank you, Mr. Chairman. And, uh, I thank all of the panelists and everyone, uh, in the audience, uh, and those who are watching these proceedings. I get, of course, the utter frustration that you all are sharing with us. And of course, I thank my colleague, uh, for her advocacy in getting something done. Dr. Prinstein, there is a definition for addiction. And, uh, would you say that, uh, the millions of young people who are on, on social media, that they are exhibiting what amounts to an addiction to these platforms?

Dr. Mitch Prinstein:

Within the science community, we're debating over the use of that word a little bit right now to depict social media, but I do think there's agreement that there is clearly a dependency on social media, which we can see in kids, uh, suffering from many of the same symptoms that we see in the DSM, the diagnostic manual, uh, for addiction to substances. It seems to apply quite well to the description of kids' behavior and dependency on social media.

Senator Mazie Hirono (D-HI):

And the additional danger to an addiction to social media is that, um, this is such a, a negative, uh, kind of information that they can get. They're bullied, they're hassled. There are, uh, all kinds of, uh, horribly negative, uh, kinds of, of, um, uh, messages that they get from this particular platform, which may be, you know, a little bit different. So we do have treatments normally for addiction. Do we have treatments for addiction to social media?

Dr. Mitch Prinstein:

I don't believe those have been adequately studied.

Senator Mazie Hirono (D-HI):

And we probably should study it. And, and that gets me to Ms. Lembke. You started on social media in sixth grade, was it? Would, would you say that you were addicted to social media?

Emma Lembke:

I will say that I exhibited, and thank you for your question, Senator, a dependency that was stated here today, but I do not think that I alone can define what that addiction means. I think that other members of my generation and other young voices should be integrated into these ongoing conversations into what constitutes an addiction moving forward.

Senator Mazie Hirono (D-HI):

Well, did you have a hard time not, uh, going to social media on a, on a regular basis? On a daily basis, you spent up to six hours? Absolutely. On these platforms. So regardless of what the medical definition might be, uh, when you were spending six hours on a platform that didn't make you feel terribly good about yourself, how was it that you finally broke yourself of this dependency?

Emma Lembke:

Well, thank you, Senator, for your question. It, it took getting to a, a breaking point where my anxiety was so great, my depression was incredibly acute, and my issues with disordered eating were rampant. It took about three to four years getting into the ninth grade where one day I heard the buzz of a notification and I had the Pavlovian response to instantly grab for it. And suddenly in that moment, I asked why, why was I allowing these companies to have so much control over me? And that question has led to many more and has gotten me here today to speak up about the importance of having youth voices at the legislative table.

Senator Mazie Hirono (D-HI):

So I, I appreciate your mentioning that. Um, your, um, sort of the light going off in your head, is that the kind of experience that a lot of young people who are so dependent on these platforms can have, where on their own they will decide, I just can't take this anymore? Or is that one of the reasons that you created Log Off? Can you tell us a little bit more about what this, what your program or the movement does to help young people?

Emma Lembke:

Yes, Senator. Thank you. I think each young person who struggles with this issue comes at it from a very different angle. For me, it took reaching that breaking point; for others, they continued to be harmed. And that was the reason I created Log Off. It was to seek out other young people who were frustrated, who were struggling, who were angry and wanted to talk to each other, members across our generation who understand the experience better than any other group of people across this world. So I created that body in order to have those conversations and to work collectively to move forward in building effective solutions and in discussing those complexities of the online world and of living through a digital childhood.

Senator Mazie Hirono (D-HI):

Thank you very much for your, uh, stepping up. I only have a little bit of time; I wanna get to Ms. Bride. There, there's been a lot of discussion about Section 230. A number of us have bills to, uh, reform Section 230, as do I. I think one of the concerns, though, is with the wholesale elimination of Section 230. I do support, you know, holding these platforms responsible for the kind of, uh, hugely harmful content, but it does get into First Amendment freedom of speech issues. So we need to be very aware that as we reform Section 230 to enable, I would say, lawsuits like yours to proceed, that, uh, we do it in a careful way to avoid unintended consequences. But I, I just, I wanna share with you our deepest sympathies for what you, uh, continue to endure. And the rest of you, thank you very much for your testimony. Thank you, Mr. Chairman.

Senator Dick Durbin (D-IL):

Thank you, Senator. Senator Lee.

Senator Mike Lee (R-UT):

Thank, thank you, Mr. Chairman. Ms. DeLaune, I'd like to start with you, if that's all right. NCMEC does a great job of highlighting a lot of these problems, uh, and the pervasiveness of CSAM, um, through the, uh, through the CyberTipline. Um, it's my understanding that about 32 million reports of CSAM were made to the CyberTipline last year, and I believe you said in your testimony that of those 32 million reports, only about 6%, um, uh, can be referred to, uh, US federal or local law enforcement, um, here in our country. Is that, is that right?

Michelle DeLaune:

That is correct, Senator.

Senator Mike Lee (R-UT):

So of the 32 million reports that we start with, we're already down to about 3.2 million that can be actionable here, uh, that can be reported to law enforcement here. Uh, would you be comfortable estimating about how many of those 32 million images of CSAM end up being removed from the, from the internet? I think you said in your testimony somewhere that it was, uh, maybe 55% of those. So I'm guessing 1.7 million.

Michelle DeLaune:

We, we have a lot of numbers <laugh>. Uh, so for the 32 million reports that are coming in the door, uh, the reports are coming in from the tech industry mostly, uh, in addition to public reports. They are reporting users who are using US platforms to transmit child sexual abuse material. Clearly we have global companies here in the United States, so approximately 90% of the leads that are coming in, uh, are going back to other countries where offenders are uploading child sexual abuse material. Gotcha. So we're down to a smaller amount of about, uh, 3.6 million reports here in the United States that we are able to refer to law enforcement. Um, it goes to the, the point of there is a lot of disparity in a long line of, of, uh, issues that will impact the actionability of a CyberTipline report. There are some basic key things that are necessary and are currently voluntary for tech companies to provide.

That would be the images, uh, or videos or the content that, uh, meets the, uh, standard of apparent child pornography. It would be baseline information regarding the geographic location of where law enforcement should be reviewing this lead to determine if an investigation should ensue, uh, basic information on a user who uploaded the child sexual abuse imagery. And if a victim, uh, existed, um, if they have any information. That's the baseline information that law enforcement needs. We estimate of the reports that we were able to provide to law enforcement last year, 55% of them may have been actionable, meaning they meet all of those criteria, which tells us there's a lot of improvement that can happen at the, at the beginning of the pipeline, that quality information coming in, so law enforcement can make proper assessments.

Senator Mike Lee (R-UT):

Yeah, that makes, makes a lot of sense. Now, Mr. Pizzuro, you've done some fantastic work helping kids who are in actual or imminent danger. Uh, I know that rescuing, um, kids who are in distress should be a priority. I'm guessing, uh, that the removal of the CSAM images from the internet, um, can't take quite as high of a priority as rescuing the kids from imminent danger. Uh, is that, is that the case?

John Pizzuro:

That's true. And one of the things is, you know, from, from the investigative standpoint, is those proactive cases where we're really targeting those egregious offenders.

Senator Mike Lee (R-UT):

Gotcha. Makes sense. Look, bottom line, uh, pornography is very bad. It's especially bad for young people. I think it's bad for everyone. Uh, but it, it subjects young people to significant, uh, and somewhat unique harms. It's bad enough that children were abused to make these images in the first place, but every single time these images are viewed or shared, a child is re-traumatized again. It's one of the reasons why last year I, I introduced a bill called the PROTECT Act. This is a bill that would require any websites hosting pornographic material, um, uh, on, on a commercial scale to put in place a removal mechanism and remove images at the request of the individual who appears in them. And it would also require websites to verify the age of individuals appearing in pornographic material, and also, uh, they would have to, to verify consent. Uh, they'd also be penalized for hosting CSAM and any other items, um, uh, that were in there that shouldn't be.

And then the victims or their authorized representatives could petition for those images to be removed from the website. And I think that would, would help with that. Ms. DeLaune, in your testimony, you mentioned that current law needs to be changed, Ms. DeLaune, I'm sorry, um, that it needs to be changed in order to help people be able to share those images with NCMEC and with law enforcement. And I'd be happy to work with you on that, um, uh, to get that done and to incorporate that into my bill, the, uh, the PROTECT Act. One more thing: these things are all important. That's why at the end of last year, I also introduced another bill called the SCREEN Act. This bill would require that any commercial website hosting pornographic images has to verify the age of users on their site and block minors from viewing graphic material. I look forward to working with my colleagues and the witnesses before us today and the organizations they represent to get those bills across the finish line. Finally, I just want to thank, uh, uh, you, Ms. Bride, and you, Ms. Lembke, for sharing your stories in difficult, heart-wrenching circumstances. Thank you.

Senator Jon Ossoff (D-GA):

Thank you, Senator Lee. Uh, I'll be managing time for a moment while Chair Durbin votes. And, uh, I'm up next, followed by Senator Kennedy. Uh, I want to thank our panel, uh, for your, your testimony, in particular, uh, Ms. Bride, uh, to you for bringing your advocacy to the Senate amidst this nightmare that you and your family, uh, have lived and continue to live. Uh, Ms. Lembke, thank you for your extraordinarily well-considered and powerful testimony. Um, Ms. DeLaune, as you know, and as you mentioned in your opening statement, Senator Grassley and I, uh, have legislation to strengthen federal protections against sexual abuse of children, including online exploitation. And, uh, we were able to pass that legislation through the Senate last Congress with bipartisan support, not yet through the House. We're hoping to do that this Congress with your help. Uh, and a key aspect of this bill is to ensure that the law is keeping up with technology, uh, and to ensure that when abusers use webcams or online messaging platforms to target children, that, uh, the full strength of federal law can be brought to bear to prosecute them and to protect children from other crimes.

Can you describe briefly, please, Ms. DeLaune, the necessity of ensuring that relevant federal statutes keep up with technology and how these threats evolve?

Michelle DeLaune:

Thank you, Senator. Thank you for your leadership on that with Senator Grassley. Um, we look forward to, uh, you know, continuing to work with you and your staff. It is important, as we're talking about, uh, the continual evolution of threats to our children. Technology, uh, as was mentioned earlier, moves much faster than the legislative process, and it's very important and encouraging to be here today, uh, to hear from all of you, uh, kind of leading the charge here of ensuring that the legislative proposals and legislative pieces, um, that you're considering are actually matching the technology. What you mentioned, Senator, about live streaming, um, that's being, uh, considered, uh, in your bill. We have seen an evolution with, uh, children being sexually exploited where there is not a physical abuser who is actually physically touching them. And we need to ensure that the legislation actually reflects that children are being exploited, children are being sexually victimized by individuals, uh, in different countries, in different states and different rooms. Uh, and this is something that we continue to see, where offenders are moving children from social media platforms, maybe where they introduce themselves, and then move them to a, um, uh, a different platform where they would have live abuse ability, as well as individuals who are selling children for sexual performance online. So thank you for, for recognizing that the evolution of technology needs to be reflected in legislation.

Senator Jon Ossoff (D-GA):

Thank, thank you, Ms. DeLaune. And the same legislation, uh, that I've offered with Senator Grassley also strengthens law enforcement, uh, as they prosecute those who cross state lines or international lines to abuse children. What are you seeing now in terms of trends and dynamics in so-called sex tourism, particularly, uh, as it pertains to the abuse of children?

Michelle DeLaune:

Sex tourism, certainly you still have people who are, uh, traveling to other countries, uh, uh, taking advantage of lax laws, um, and poverty, uh, to sexually exploit children. We do, of course, now see an increase, if you want to call it sex tourism, of individuals who are virtually streaming, live streaming, sexually exploiting children in impoverished countries, uh, and paying them via, uh, you know, online apps. So this is something that we continue to see as a problem, actually getting worse because of the, the new ways that people can communicate via live streaming.

Senator Jon Ossoff (D-GA):

Our bipartisan legislation, as you know, will help to crack down on online abusers as well as those who cross state lines, international lines to attack children. I thank you for your continued support for the legislation. Uh, finally, um, just briefly, Senator Blackburn ran out of time and had another question, uh, that she wanted to ask. I wanna make sure to get that to Dr. Prinstein. And, Ms. Lembke, you, you, uh, in a very candid and personal way, described the impact, um, that, uh, the use of these technologies had on your psyche. And, uh, I know that in particular for other young people around the country, they've experienced the same dynamic. Um, the, the formation of dependence, the impact on self-image, uh, and mental health. And I thank you for sharing your story. And I want to ask you, Dr. Prinstein, if you could just speak for a moment about the long-term negative psychological impact, uh, that in particular young people can experience as a result of their use of social media, and how we in Congress should think about addressing that.

Dr. Mitch Prinstein:

Scientists are working as fast as we can to give you those answers. Um, it's something that requires us to follow kids as they mature and see how it is that they develop. We do know that there are numerous online communities and opportunities to engage with content that actually teaches kids how to cut themselves, how to engage in behaviors that are consistent with an eating disorder, how to conceal these behaviors from their parents and adults. And they sanction young people when they discuss the possibility of engaging in adaptive rather than maladaptive behaviors. Many of these online posts and communities have no warnings, no trigger warnings to indicate that these might be concerning for kids. And of course, that's something that is directly associated with kids' likelihood of engaging in these maladaptive behaviors themselves.

Senator Jon Ossoff (D-GA):

Thank you, Dr. Prinstein. Deeply disturbing, and certainly warrants regulatory attention. Appreciate your testimony. Senator Kennedy, you're next for five minutes.

Senator John Kennedy (R-LA):

Thank you, Senator. Um, many of the companies that we're talking about are American companies. Uh, not all big tech is American, but we certainly led the way. These companies are, are very successful. They're very big, they're very powerful. They're really no longer companies, they're countries. Um, and they're going to oppose any, any of this type of legislation. It's why virtually nothing with respect to big tech has passed in the last five years. I wanna be fair. I think that, um, social media has made our world smaller, which is a good thing, but it has made our world coarser.

Um, and if I had to name one fault, it wouldn't be the only one, but I would say that, uh, social media has lowered the cost of being an a-hole. People say things on social media that they would never say in an interpersonal, um, um, exchange. Adults, even though it's, it's, it's depressing sometimes, can deal with that. It's hard for young people. We, we've talked about a number of problems that are presented by social media, data privacy, sexual exploitation, but also mental health, and the impact that, that I think it's clearly having on, uh, particularly young women in, in the Gen Z generation, 10 or 11 to, um, to 25 and 26. Uh, they're living their lives on social media, and they're not developing interpersonal relationships. It's making them very fragile. It's, it's, uh, reaffirming this, this culture of victimhood. Um, they're not getting ready for the world. So let me cut to the chase. I'll start with Mr. ... am I saying it right? Golin. I apologize. Uh, for young people, defined as, uh, people under the age of 16, should we just abolish social media for them? Don't let them access it.

Josh Golin:

You know, um, things are so serious that I...

Senator John Kennedy (R-LA):

Give me some quick answers cuz I'm gonna go down the line.

Josh Golin:

We should consider all options, but I think it makes more sense to focus on a duty of care and changing how these platforms operate. Practically, keeping kids under 16 off may be impossible. And I would also say it's not just social media; a lot of these things happen on video game platforms as well.

Senator John Kennedy (R-LA):

Do you really, you think it'll really be easy to change the attitudes of, of these social media companies?

Josh Golin:

If you create a duty of care and you limit the data that they can collect?

Senator John Kennedy (R-LA):

Right. I think they have a duty to care already. Um, what about you, doctor?

Dr. Mitch Prinstein:

I think we desperately need to educate parents.

Senator John Kennedy (R-LA):

I know we need to educate, but should we just tell kids, look, it's, it's a lot like alcohol. Um, this stuff is addictive and until you're 16, you can't access social media.

Dr. Mitch Prinstein:

There are benefits that also come from social media, and I don't know whether it's realistic to keep kids off of it completely. I think practicing moderation with close parental supervision, with, uh, substantial education coming from the school...

Senator John Kennedy (R-LA):

Here's a news flash for you: a lot of parents don't care, doctor. Mr. Pizzuro?

John Pizzuro:

Senator, basically there should be something. If I bought a phone tomorrow, there should be, at the very least, a terms of agreement where I can't even access that phone until I go through a three-minute or five-minute video.

Senator John Kennedy (R-LA):

Okay, Ms. DeLaune?

Michelle DeLaune:

An acknowledgement that when you build a tool that allows adults and children to communicate with one another or find connections, that there's a duty of care to ensure that you're creating a safe environment for those kids.

Senator John Kennedy (R-LA):

Well, I think there's clearly a duty of care. The issue is how to enforce a duty of care. Go try to pass a bill enforcing that duty of care in the United States Congress and, uh, and see what the reaction from big tech is.

Michelle DeLaune:

Right? Absolutely. And, and creating these tools, recognizing that these incidents are going to happen and finding ways that, that children...

Senator John Kennedy (R-LA):

Would you support a law that says, okay, if you're under 16, you can't access social media?

Michelle DeLaune:

I think it will be difficult. There are positive things about social media, but there are many, many terrible things, and kids are finding themselves, um, in bad shape.

Senator John Kennedy (R-LA):

I know you say it would be hard. I know it'd be hard. You think it's a wise thing to do?

Michelle DeLaune:

I believe that if the tools are designed properly, there could be benefits.

Senator John Kennedy (R-LA):

Okay. I can't have my, I don't have my glasses on. Yes, ma'am. Your answer please.

Emma Lembke:

Yes, Senator. I have not spent a lot of time thinking about specifically the right age to enter, because I do not think that it addresses the fundamental question we must answer: how to create online spaces that are safer when kids decide to enter. Because I can tell you that...

Senator John Kennedy (R-LA):

Is that a no?

Emma Lembke:

Is that a what you, sorry, Senator.

Senator John Kennedy (R-LA):

Do you think we should, should prevent kids under the age of 16 from accessing social media?

Emma Lembke:

I think that we should spend more time looking at how to make those platforms safer because kids will circumnavigate.

Senator John Kennedy (R-LA):

Yes ma'am.

Kristin Bride:

And I agree with Ms. Lembke as well. I think that safeguards are the way to go. If we look historically at the automobile industry, it was not safe, but we brought in seat belts, airbags, and now it is much safer, and we can do that with this industry.

Senator Dick Durbin (D-IL):

Okay, thank you. Senator Klobuchar.

Senator Amy Klobuchar (D-MN):

Um, thank you very much, Mr. Chair. So thank you so much. Uh, it's been an incredible hearing, and as you know, I'm involved in this issue. I thank Senator Blumenthal for his work, and Senator Blackburn, so many others. Um, so I would agree we need rules of the road. We need rules of the road for everything from the, um, uh, what we're talking about here for kids, to privacy, to competition, um, because there's just no rules of the road, as Senator Kennedy has expressed. Um, we have tried in many ways and passed a number of bills in this committee. I believe one of these days they're gonna start to pass, uh, because, uh, the social media companies have stopped everything in its tracks that we have, uh, tried to do. And I think it is important, I guess I would start with that, um, that they are companies, and they are media corporations, basically.

And I try to explain to people that if you put something online, one person does it, that's bad, that's one thing. Or if you yell fire in a crowded theater, okay, that's on you. But if the multiplex were to take that yelling of fire and put it in all their theaters on an intercom so everyone could hear it, that's a whole 'nother thing. And that's a problem that hasn't been solved when it comes to these companies. They are profiting off the repeating of this information and the spreading of this information. So, Mr. Golin, I just ask you this. In addition to setting the rules of the road that we wanna do, um, when we talk about auto companies and all of these other areas, at some point people have been able to sue them, um, for problems. And right now these companies are completely immune. Do you wanna get at that and talk about your views on that?

Josh Golin:

Yeah, I think that's a huge piece of the equation, is the ability of, uh, parents and, and young people themselves to hold these companies accountable. Um, you know, Kristin talked about her lawsuit being thrown out. Uh, we work with Tiana Anderson, whose 10-year-old daughter, um, died after attempting the viral choking challenge, which TikTok put into her For You feed. It's not something she was searching for. TikTok decided that this was the piece of content that would be most, uh, appealing to her at that time. And, uh, their court case was thrown out of court for Section 230 reasons as well, right? So...

Senator Amy Klobuchar (D-MN):

Okay, I just wanna make that clear. The, the rules are good, but I'm telling you, if you just pretend that they are a, um, loftier class than any other company, that can't be sued for anything, we're never gonna get a lot of these things done. So let's, let's be honest about that. Um, the Respect for Child Survivors Act, um, so this is something Senator Cornyn and I passed. Mr. Prinstein, um, do you agree that it's important for mental health professionals to be involved in interviews of child survivors? This is this idea that, whatever the crime, and I was a prosecutor for quite a while, sexual abuse, whatever, it's important to have, uh, a coordinated effort when it comes to interviewing kids.

Dr. Mitch Prinstein:

Yes, absolutely. There's a clear psychological science around how to do that in safe and appropriate ways.

Senator Amy Klobuchar (D-MN):

Thank you. Um, the, uh, issue of eating disorders, I'll go back to you, um, Mr. Golin. Studies have found that eating disorders have the highest mortality rate of any mental illness. I think that surprises people. I led the Anna Westin Act. And, uh, last year, of course, thanks to Senator Blumenthal and Senator Blackburn, we heard from Frances Haugen, the Facebook whistleblower, about Instagram's own internal research on eating disorders. You talked about that connection, uh, between the internet and eating disorders. Do you wanna quickly comment on that connection and why that should be part of our focus here?

Josh Golin:

Yeah. So what happens is, um, when girls or, or, or anyone really expresses any interest in dieting or dissatisfaction with their body, they get barraged by, um, content recommendations, uh, for pro-eating-disorder content, because that's what's gonna keep them engaged. So we need to create a duty of care, uh, so that these platforms have a, uh, you know, a duty to prevent and mitigate harmful eating disorder content, uh, and not push it on kids. I mean, I think that's one of the really important things, to distinguish between queries where people might be interested in getting some information versus what is being actually pushed in their feed. And frequently it is the worst, most harmful content that's being pushed in their feed.

Senator Amy Klobuchar (D-MN):

Okay. Ms. DeLaune, Senator Cornyn and I did a lot of work on human trafficking, as you know, passed that original bill, um, to create incentives for safe harbor laws. Uh, can you talk about how the internet has changed the way that human traffickers target and exploit kids?

Michelle DeLaune:

Yes, thank you, Senator. Human trafficking, and child sex trafficking in particular, has, uh, certainly been fueled by online platforms and the connectivity between offenders and, and children. Not only does it make it easier for buyers to find children who are, are being, uh, trafficked, but it also allows the imagery of these children to continue to circulate. And that often keeps the victims quiet and, um, silenced in terms of speaking up, because their images are then being, uh, transmitted online for, for potential buyers to, to locate.

Senator Amy Klobuchar (D-MN):

Okay, last question. Um, Mr. Pizzuro, thanks for your work. Um, I have heard heart-wrenching stories of young people who've died after taking drugs, in one case, uh, drugs they bought on Snapchat through messages. A child named Devin, suffering from dental pain, uh, bought what he thought was Percocet, and it was laced with fentanyl. Um, and this was off of Snapchat. As his mom, Bridget, said, all of the hopes and dreams the parents had for Devin were erased in the blink of an eye, and no mom should have to bury their kid. Uh, could you talk about whether or not the social media companies are doing enough to stop the sale of drugs to kids online?

John Pizzuro:

The social media companies aren't doing anything, period. I think that's part of the problem, and that comes to drugs, uh, as well. Uh, there, there's no moderation. Again, they're not looking at things specifically. Uh, they're not looking, again, you can't from a communication standpoint, but that's what they're promoting with social media, the interaction of people. So my opinion really is that we haven't seen anything, and we haven't seen any help from 'em.

Senator Amy Klobuchar (D-MN):

All right. Thank you.

Senator Dick Durbin (D-IL):

Thank you, Senator Klobuchar. Senator Hawley.

Senator Josh Hawley (R-MO):

Thank you, Mr. Chairman. Thanks to all of the witnesses for being here. Ms. Bride, I wanna start with you. I want to particularly thank you for being willing to share your story and Carson's story. I'm the father of three, two boys, and you've lived every parent's nightmare, but thank you for being willing to try and see some good come of that, and for being so bold in, in telling your family's story. I, I want to ask you about one thing that I heard you say, and you've also written it in your written testimony about Carson. You said it wasn't until Carson was a freshman in high school, so about 14, I would guess, that you finally allowed him to have social media, because, and this is what caught my attention, that was how all the students were making new connections. Could you just say something about that? Because that's the experience, I think, of every parent. My kids are, my boys are 10 and 8, and they're not on social media yet, but I know they'll want to be soon, because they'll say, well, everybody else is on it. So could you just say a word about that?

Kristin Bride:

Yes, thank you. We waited as long as we possibly could, and we were receiving a lot of pressure from our son to be involved. I think I hear, and I hear this a lot from other parents, you don't wanna isolate your kid either. And so we felt by waiting as long as possible, talking about the harms, don't ever send anything that you don't want on a billboard with your name and face next to it. That we were doing all the right things and that he was old enough. He was by far the last kid in his class to get access to this technology. Yet this still happened to us.

Senator Josh Hawley (R-MO):

Yeah, that's just incredible. Well, you, you were good parents, and you were a good mother. Incredibly good mother, clearly. This is why I support and have introduced legislation to set 16 years old as the age threshold at which kids can get on social media, and require the social media companies to verify it. I heard your answers down the panel a second ago to Senator Kennedy. I just have to say this as a, as a father myself: when you say things like, well, the parents really ought to be educated, listen, the kid's ability, and I bet you had this experience, Ms. Bride, the kid's ability to figure out how to, how to set what's on this phone... My, my 10-year-old knows more about this phone than I know about it already. What's it gonna be like in another four years, or five or six years, like your son, Ms. Bride? So I just say, as a parent, it would put me much more in the driver's seat if the law was you couldn't have a phone, sorry, you couldn't get on social media till 16. I mean, that would help me as a parent. So that's why I'm proposing it. Parents are in favor of it. I got the idea from parents who came to me and said, please help us. You know, please help us. And listen, I'm all for tech training, it's great, but I just don't think that's gonna cut it. So I've introduced legislation to do it. Let's keep it simple. Let's just, let's put this power in the hands of parents. I'd start there. Second thing, Ms. Bride, you brought suit against Snapchat and others, and I've got your lawsuit right here. And you were barred by Section 230, and you testified to that effect. They just threw it all out, right? Mm-hmm. <affirmative>, the court threw it all out.

Kristin Bride:

Right. And it wasn't, the lawsuit was not about content. It was about the company's own policies, yeah, that lured my son in to think that this product, this app, was safe, this anonymous app, that they would monitor for cyberbullying and reveal the identities of those who do. So it had nothing to do with content.

Senator Josh Hawley (R-MO):

Yep. And this is why I, I think it is just absolutely vital that we change the law to allow suits like yours to go forward. And if that means we have to repeal all of Section 230, I'm fine with it. I'm introducing legislation that will explicitly change Section 230 to allow suits against these social media companies for their own product design, for their own activities, for their own targeting of kids, for them to be sued for that. And it will allow you and every other parent, Ms. Bride, to get into federal court. We will create a federal right of action. Because here's what I've decided. Listen, I'm a lawyer, former attorney general. I believe in the power of courts. And what I've decided is you can fine these social media companies to death. The FTC fined Facebook what, a billion dollars or something, a couple years ago.

They didn't change their behavior at all. They don't fear that. What they will fear, though, is they fear your lawsuit. That's why they fought it so hard. They don't want parents suing them. They don't wanna be on the hook for damages, double damages, treble damages. Well, they should be. And if we give the power to parents to go into court and say, we are gonna sue you, they will fear that far more than they fear some regulator here in Washington, DC, who, by the way, is probably looking to get a job with that same company when they rotate off their regulatory panel. Cuz that's what happens. All the regulators here in DC, they go to work for these tech companies as soon as they're done here. Well, enough of that. Let's put power into the hands of parents. Allow you, Ms. Bride, and every other parent in America who has a grievance here to get into court and sue these people and hold 'em accountable.

And I'd say the same thing about child sexual exploitation material. Let's let parents sue. And I will introduce legislation that will allow any parent in America who finds child sexual exploitation material online to go sue the companies for it. If they knew or should have known, the companies, that they were hosting this material, let's let 'em sue 'em. I tell you what, if these companies think they're gonna be on the hook for multi-hundred-million-dollar or more fines and damages from multiple suits all across the country, they'll change their act. They'll get their act together real quick. So my view is, enough of this complicated regulatory this, regulatory that. Just give the American people and American parents the right to get into court and defend their kids and to defend their rights. And if we do that, I think we'll see real results.

Last thing, Mr. Chairman, I know I'm going long here, but I just wanna say this. We have these hearings every so often. I love these hearings, they are great. Everybody talks tough on the companies, and then later on, watch, we'll have votes in this committee. Real votes. And people have to put their names to stuff, and oh, lo and behold, when that happens, we can't pass real tough stuff. So I just say this to my colleagues: this has been great. Thank you, Mr. Chairman, for holding this hearing. This has been great. But it's time to vote. It's time to stand up and be counted. I've been here for four years. It's been four years of talk. The only thing we've gotten done on Big Tech is TikTok, which we finally banned from all federal devices. That's the only thing of any significance we have done on Big Tech. That has got to change. And I wanna thank all of you for being here to help galvanize that change. Thanks for indulging, Mr. Chairman.

Senator Dick Durbin (D-IL):

Thank you. Senator Hawley. Senator Welch.

Senator Peter Welch (D-VT):

Um, you know, this is a pretty, there's a lot of heartache in this room, uh, and you've lived it, and I just wanna acknowledge that. And what you've lived is every parent's fear, uh, and this dilemma that we have, uh, if there's an easy solution to it, maybe the lawsuits, uh, as is being proposed. Uh, if there was an easy solution, we'd get it. You know, I wanna talk to you, uh, uh, Emma, just if I can, about this question of whether we can have an age limit. It's appealing, but is it practical?

Emma Lembke:

Thank you, Senator, for your question. I have not spent a lot of time thinking through the specific ages at which kids should go on social media. I think looking at age verification is crucial in understanding how to build a productive solution. But to your point, I think the question we really have to ask is, when children who know more than most parents enter these online spaces, how are they protected? Because we have seen time and time again that no matter the bans, kids find a way in. Right?

Senator Peter Welch (D-VT):

So they'll find a way in mm-hmm. <affirmative>. And you know, what we're hearing from you, you lost your son. Uh, the child sexual exploitation. I mean, it's horrifying. And these are the examples of a system that has really run amok, and it's a system that's legal. But even for those kids who are not caught up in, victimized in child prostitution or bullied into taking their own life, there's a mental health crisis. I mean, this is just not good for anybody, and kids, I mean, we were all kids once, and we're vulnerable at that age, uh, to what other people think of us. So I think there is a question here, uh, that is raised by Senator Hawley about how do we have responsibility at the point of entry? And that is the tech companies. And they've got a business model where they don't necessarily publish it, and of course that was Section 230, but they amplify it.

Senator Klobuchar, in her own Klobucharian <laugh> way, was able to express it. Uh, and that's where the business model, uh, is sustaining this effort on the part of Big Tech. Because the more clicks they get, uh, the more advertising revenue they get. Uh, you know, one, uh, question I have is whether it is time for us to create a governmental authority. That idea gets dismissed oftentimes. But when we had previous examples, like the lack of seat belts, it was the National Highway Transportation Board that was looking out after the public interest. When we had a lot of securities fraud in the thirties, we had the Securities and Exchange Commission. It's very tough here in Congress to come up with a one-off, especially in tech, because they just keep moving ahead, and whatever we do to try to, uh, deal with the behavior of kids, they're kids and they're gonna get on that platform. Uh, you wanna say something, uh, Doctor, but one of the proposals that Senator Bennet made and I made in the House was to have a digital authority that had some authorization from Congress. Its charge was to protect the public interest, to look at the real world about what's happening to real kids and say, Hey, you know, this may be legal, but it ain't right. And we've gotta do something. Go ahead, Doctor.

Dr. Mitch Prinstein:

Thank you. I, I appreciate your comments. I just wanted to mention, an age limit is only going to be useful if there's some way to make sure that kids below that age can't get on. Remember that kids' brains are not fully matured at the age of 16. We cannot say that everything that's happening on social media now would be safe for kids at 16. In fact, please be aware that this is the time when most kids are now starting to get autonomy, driver's licenses, and the things they're seeing online are changing the ways that they're understanding what is risky versus not. Giving kids free rein to that content just before they get in a car and drive far away from their, their parents might actually be, uh, shortsighted.

Senator Peter Welch (D-VT):

Thank you. Ms., Ms. Bride, do you wanna offer anything? After all you've been through, and thank you. I share, I think, the sentiment all of us have, it's so inspiring to see a parent try to turn tragedy into something good in the memory of her son. Thank you.

Kristin Bride:

Thank you, Senator. I would like to see a combination of both. I would like to see federal legislation so that these products that we know are dangerous get reviewed before they're released to American children. In the example of my son, we saw in the past that the anonymous apps led to cyberbullying and suicides. Why were two other companies able to put out the same product? Mm-hmm. <affirmative>. And on the other side of it, when things go wrong, yes, I would like to see Section 230 reform so that we can hold them accountable. But it should not take grieving parents filing lawsuits to change what's happening, because it's too late for us. Right.

Senator Peter Welch (D-VT):

Thank you. Thank, thank you very much. I yield back, Mr. Chairman.

Senator Dick Durbin (D-IL):

Thanks Senator Welch. Senator Blumenthal has a question.

Senator Richard Blumenthal (D-CT):

I have. Uh, thanks, Mr. Chairman. I'll be very, very brief, and again, my thanks to all the members of the panel and all of the folks who have come to attend. I share Senator Hawley's frustration and impatience, as you may have gathered. Uh, and I feel that sense of outrage at Congressional inaction. And I know, Ms. Bride, you were part of our effort during the last session, very, very much involved, as were many of the parents who are here today and others who are perhaps watching. And my question to you, and perhaps to Emma Lembke, is what did that failure to act mean to you personally?

Kristin Bride:

Thank you, Senator. It was extremely disappointing. There was so much momentum. I made trips along with my fellow moms that are in the written testimony today to Washington several times. It is so difficult to tell our stories of the very worst day of our lives over and over and over again, and then not see change. We're done with the hearings, we're done with the stories. We are looking to you all for action. And I am confident that you can all come together and do this for us and for America's children. Thank you.

Senator Richard Blumenthal (D-CT):

Ms. Lembke. You are, uh, part of a generation that has a right to expect more from us.

Emma Lembke:

Yes, Senator. You know, I got on Instagram at the age of 12, and I sit in front of you all today as a 20-year-old. About eight years down the line, I still see and hear of the harms that I experienced eight years ago. And what I will say to this body is that those harms will only increase from here. The mental health crisis for young people that we are witnessing will only continue to rise. So we cannot wait another year. We cannot wait another month, another week, or another day to begin to protect the next generation from the harms that we have witnessed and heard about today.

Senator Richard Blumenthal (D-CT):

Thanks, Mr. Chairman.

Senator Dick Durbin (D-IL):

Thank, thank you, Senator Blumenthal. Thanks to the panel. I don't know if any or all of you realize what you witnessed today, but this Judiciary Committee crosses the political spectrum, not just from Democrats to Republicans, but from real progressives to real conservatives. And what you heard was a unanimity of purpose. And that's rare. In fact, it's almost unheard of. Uh, and it gives me some hope. Now, we have our own problems that have to do with this institution that I work in, in terms of when things are appropriate, how to bring them up, and how to deal with the rules of the Senate. Not, uh, an easy responsibility, a challenging responsibility. But I think the urgency of this issue is going to help propel us past some of these obstacles. One of them is, uh, a jurisdictional issue which relates to the Senate Commerce Committee, which, Senator Blumenthal can tell you, has a major piece of the law that we've discussed today.

And we, of course, are on the Judiciary side, the criminal side of it. We have a piece of it as well. The question is whether there's any way to bring them together. I think there is. There's certainly the will from Senator Cantwell, the chair of the Commerce Committee. I've spoken to her personally. And what I'd like to promise you is this: we are going to have a markup. Now, that doesn't sound like much, but it is a big promise. It means that we are gonna come together as a Judiciary Committee and put on the table the major pieces of legislation and try to decide as a committee if we can agree on common goals and common efforts to reach those goals. I think we can do this, just sensing what I heard today. And I think as a father and grandfather that we must do it.

We must do it. Ms. Bride, and others who have come here because of their passion for the children that they've lost, it makes a difference. As painful as it is, it makes a difference. And Ms. Lembke, good luck at the Hilltop with Washington U, but you've done a great service to our country by coming here today. And to the others, thank you for sharing this information. Now it's our turn. We've gotta get down to work and roll up our sleeves. It won't be the bill I wanna write. It won't be the bill you wanna write. But if it is a step forward to protect children, we're gonna do it. We have to do it. We have no choice. The hearing record's gonna remain open for a week for statements, uh, to be submitted. And you may receive some questions, which I ask you to respond to promptly. I thank you all for coming today, and your patience and determination to do good, but do well, by our children. I thank the witnesses, and the hearing stands adjourned.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
