Pair of Child Online Safety Bills Advance Out Of House Energy And Commerce Committee
Gabby Miller / Sep 18, 2024

A pair of online privacy and safety bills passed out of the House Energy and Commerce Committee by voice vote during a full committee markup on Wednesday. The Kids Online Safety Act (H.R. 7891) and the Children and Teens’ Online Privacy Protection Act (H.R. 7890) were two of more than a dozen bills up for review. A combined version of the kids’ safety bills, the Kids Online Safety and Privacy Act (KOSPA), cleared the Senate in July by a 91-3 vote, less than a week before Congress broke for August recess.
House Energy and Commerce Committee Chair Cathy McMorris Rodgers (R-WA), who has led the charge on comprehensive data privacy and kids’ online safety legislation on the House side, opened the markup promising “a new era on the internet” defined by accountability and safety. “We will join together in a bipartisan way on behalf of the millions of parents, grandparents, and kids across the country to say enough is enough,” Rep. McMorris Rodgers said. KOSA is intended to “provide both kids and parents the tools they need to better protect against serious online threats to children’s health and emotional well-being,” she added.
The Senate version of the Kids Online Safety Act, or KOSA, would “require social media platforms to provide minors with options to protect their information, disable addictive product features, and opt-out of algorithmic recommendations—and require platforms to enable the strongest settings by default.” The Children and Teens’ Online Privacy Protection Act, commonly referred to as COPPA 2.0, amends the Children's Online Privacy Protection Act of 1998 to strengthen online protections for users under the age of 17.
In advance of the House Committee’s markup, Meta introduced “Instagram Teen Accounts,” or what the company is pitching as “a new experience for teens, guided by parents.” It automatically places teens under 16 into accounts with stricter “built-in protections” that can only be changed with a parent’s permission. The move comes amid mounting legislative pressure, both at the federal and state levels, to create a safer experience online for children.
Both the House and Senate versions of KOSA and COPPA 2.0 have seen years’ worth of modifications, including changes around age verification, covered ages, and an explicit list of harms that platforms must take reasonable steps to mitigate. Accordingly, an amended version of KOSA was released the evening before the markup. The new version follows a similar structure to Title I of the Senate’s KOSPA bill, albeit with some key differences. These differences, including weakened duty of care language that removes obligations to protect children’s and teens’ mental health, will further add to the contentious debate around the bill as it moves to consideration by the full House.
This congressional session will be Chair McMorris Rodgers’ last. An open question remains whether her retirement will also bring the fight for stronger data privacy protections in the House to a close.
Related Reading
- KOSA is Good Tech Policy, But the House Has an Opportunity to Make it Even Better
- Child Online Safety Law Clears the US Senate, But Faces Uncertainty in the House
- Washington’s Big Week for Child Online Safety
Below is a lightly edited transcript of the committee marking up KOSA (H.R. 7891) and COPPA 2.0 (H.R. 7890). Please refer to the official video when quoting from the markup.
Rep. Cathy McMorris Rodgers (R-WA):
Chair calls up H.R.7891 and asks the clerk to report.
Clerk:
H.R.7891, a bill to protect the safety of children on the internet. Be it enacted-
Rep. Cathy McMorris Rodgers (R-WA):
Without objection, the first reading of the bill is dispensed with, and the bill will be open for amendment at any point. So ordered. Does anyone seek to be recognized? Chair recognizes the chairman of the subcommittee, Mr. Bilirakis, for five minutes on the bill-
Rep. Gus Bilirakis (R-FL):
Thank you, move to strike the last word.
Rep. Cathy McMorris Rodgers (R-WA):
... and the prime sponsor of the bill.
Rep. Gus Bilirakis (R-FL):
Thank you. I'd like to say it has been an incredible honor and privilege. Thank you for giving me this honor, Madam Chair, I know this is an issue you deeply care about, all of us do. So, thank you for giving me the honor and privilege to be able to lead and be the sponsor, the Republican sponsor, of the Kids Online Safety Act, KOSA, long overdue legislation to reform and finally hold big tech accountable for its failures to keep children safe on the internet. The sad reality is that many of our young users have been besieged by harmful conduct online, from sexual exploitation, to illegal drug sales, promotion and threats of violence, dangerous acts that lead to suicide or death, and an unprecedented mental health crisis, particularly among teens. Much of this is due to the fact that big tech's business models rely on products fed by advertisers that are designed to keep kids addicted to their screens and accounts longer than ever.
I'm so saddened by the stories we've heard about tragedies of families whose lives were ripped apart, and parent survivors whose children were taken far too soon after exposure to harms online so vile and repulsive they would make your stomach turn. Access to illegal drug sales, harassment and online sextortion, twisted and dangerous games and challenges that cause kids to pass out and die are just a handful of the countless harms that are perpetuated online, and we must do everything we can to prevent any more tragedies. I want to thank and commend the bravery and advocacy of parent survivors and families, such as the Smill family, the Malik family, the Bride family, the Mitchell family, the Minor family, and too many others who have turned these tragedies into action. And thank you very much for attending today's markup, we appreciate it so much, and thanks for advocating on behalf of your children's memories.
These parents are pleading with Congress to take action so that other families do not have to suffer unfathomable loss. The time to move this bill forward is now. We must continue to fight to put the necessary guardrails in place and empower parents and families to keep kids safe online. We must force big tech to avert and alleviate these harms wherever possible, and stop wrongful deaths. KOSA would be a significant step forward to accomplish these goals, and I ask my colleagues to support this bill in committee today. H.R.7891 would provide kids and teens with default safety settings and safeguards, turn off data-driven algorithms, and provide parents with tools to manage and view account interactions, track their time, limit purchases, and address harmful usage. We have continued to take a significant amount of feedback on this bill, and I've offered [inaudible] that I believe substantively responds to many concerns about speech regulation, censorship, and overreach.
Many may continue to express concerns on both sides of the aisle about unintended consequences, but we also know this is not the end of the road, simply an important step in the process to get the bill closer to law. I have been so grateful to work with my colleague and partner on this legislation, Representative Kathy Castor. I commend her and thank her and her team for their tireless efforts on behalf of children. Neither of us will rest until ... And I know the chairman and the ranking member, I just appreciate everybody working together on this bill. It takes sacrifice, it takes compromise to get things done. And again, my goal is for this bill to become law. And let's do the right thing and vote yes on the Kids Online Safety Act. We must protect our children online, and hold big tech accountable. With that I yield back, Madam Chair.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman yields back, Chair recognizes Ms. Castor for five minutes, the Democrat co-lead on this legislation.
Rep. Kathy Castor (D-FL):
Thank you, Madam Chair. I move to strike the last word, and I want to start by thanking Chair Bilirakis for your partnership here on the Kids Online Safety Act. But I especially want to thank the parents and the wide range of advocates, from the pediatricians to the nation's surgeon general, to the families who have struggled with a child with an eating disorder, or has suffered bullying, anxiety, discrimination, or suicide. To the student advocates, like Design It For Us, who have urged us to move this forward, thank you for sharing your experiences and urging Congress to act. Now, done right, the Children's Online Safety Act will make the world safer for young people, because big tech platforms and social media companies would have a duty to design their products not to harm kids. Big tech platforms designed their products in a deceptive manner, they deployed manipulative features to addict young users.
They do so knowing that our kids' mental, physical, and social well-being is suffering, but they value their profits more. Yesterday, Meta announced changes to their apps for kids, the day before this hearing. They should have done this years ago. We cannot wait for these companies to do the right thing. Congress must act, especially given that last year America's surgeon general published an advisory on the impacts of social media, calling it out as an important contributor to our nation's youth mental health emergency. Dr. Murthy said that adolescents who spend more than three hours a day on social media face double the risk of anxiety and depression symptoms, and the average daily use in this age group as of the summer of 2023 was 4.8 hours. He said legislation from Congress should shield young people from online harassment, abuse, and exploitation, and from exposure to extreme violence and sexual content that too often appears in algorithm-driven feeds. The measures should prevent platforms from collecting sensitive data from children, and should restrict the use of features like push notifications, autoplay, and infinite scroll, which prey on developing brains and contribute to excessive use.
So, after years of hard work and examination by this committee, I was proud to have introduced KOSA with Chair Bilirakis. As originally proposed, KOSA would provide young people and parents with the tools, safeguards, and transparency needed to stay safe online, and hold the big tech companies accountable. It requires the platforms to take reasonable measures to prevent and mitigate online harm to minors, provides kids, teens, and parents the opportunity to turn off the data-driven algorithms, and requires those platforms to provide parents with the tools to help manage a minor's use of the platform and safety settings, limit purchases, address harmful usage, and make those design features the default setting.
All of this was passed in a legislative miracle this year in a closely divided Congress. We all celebrated when the United States Senate passed KOSA 91 to 3. That could have been the easy starting point for this committee, and that's in essence the version that Chair Bilirakis and I introduced. Over the past couple of weeks, Mr. Bilirakis and I have worked with stakeholders to bring an [inaudible] to this committee in the spirit of moving this forward and protecting our kids. I regret that this [inaudible] is a weakened version from what passed the United States Senate, but I think now, and I appreciate what Mr. Bilirakis has acknowledged, that there is still work to be done on this bill, and I look forward to working with his team to do so. See, we cannot abdicate our responsibility that we have to our kids in America. We can't allow unintended consequences to creep in because there was politics played with KOSA here at the 11th hour.
It's unconscionable that big tech is pushing our kids to spend unhealthy amounts of time on their devices. Pro-suicide online content is leading kids to die by suicide, and I think it's important today to move it forward with the promise and acknowledgement that we ... I don't know that I could support this version if it comes to the House floor in this manner, but I trust Chair McMorris Rodgers and her leadership, I trust Chair Bilirakis and his leadership, I know where their hearts are. I know how hard they've worked to get us to this point, and I think in the spirit of what we need to do as responsible legislators, we should go ahead and move this forward today with the understanding that we probably need to move towards the Senate version of the bill and not go backwards at this time. With that, I yield back the balance of my time. Thank you.
Rep. Cathy McMorris Rodgers (R-WA):
Gentlelady yields back. I recognize myself for five minutes to strike the last word on the bill. Today I'm thinking of Matthew from Maryland, Nylah from Pennsylvania, Gavin from North Carolina, Anna from Colorado, Ethan from Washington State. All of these children are no longer with us because of the dangers posed by the current state of social media and our online ecosystem. Unfortunately, these stories are not isolated incidents; there are thousands of stories just like these. These are some of the stories that we have collected, and I'm going to pass these around today, and I'm going to urge the committee members to just read through these stories. Today is about them. It's about protecting the millions of other children in the United States who are facing the same online threats, the next Matthew, the next Anna. And I ask unanimous consent to enter into the record these stories from the parents of children harmed across the country. Without objection, so ordered. Right now teenagers are spending on average nearly five hours a day scrolling through social media.
The platforms developed by these companies are specifically designed to get kids addicted, and they've been used to target children with content that leads to dangerous and often life-threatening behaviors. There is currently no accountability for the harms our children are experiencing as a direct result of their use of these platforms. The American people, parents, are exhausted and losing hope. They feel like they've tried everything possible to solve this problem on their own with no success, and they're looking to their elected leaders to step up and act. That's why we're here today considering the Kids Online Safety Act, KOSA, led by Reps. Bilirakis, Bucshon, Castor, Houchin, Schrier, and others, to answer their cry for help. KOSA will provide both parents and kids the tools they need to better protect against serious online threats to our children's development, their mental health, and ultimately their safety. Never again should there be a choking challenge, or a Benadryl challenge, or an inhaling toxic fumes challenge.
These are viral trends that have ended too many young lives too soon. KOSA will ensure high impact online companies create and implement design features that prevent and mitigate harms to minors. We'll also ensure covered platforms provide easy-to-use safeguards for minors. And when a platform knows a user is a minor, the platform will ensure that the most protective safeguards are enabled by default. A covered platform will also provide tools for parents to help manage their child's use of covered platforms, and be required to be more transparent with the American people about the risks their platforms pose to minors, and the steps they're taking to mitigate them. We must build a better future for our children, they are our future. Throughout this process, the bill sponsors have worked in good faith despite plenty of policy disagreements throughout the process, and I want to thank them for their leadership and their hard work to get KOSA to where it is today.
I also want to thank all of the parents from across the country, many who are here, who have made their voices heard and demanded their children be protected. We're just a few years away from the 250th anniversary of our great nation. We are only able to celebrate such a milestone because throughout our history, brave, courageous patriots came together, made hard decisions, and did whatever was necessary to serve, protect, and defend the American people and our nation's values. I urge my colleagues to join the ranks of those patriots, have courage to make hard decisions, and do what is in the best interest of the American people, the best interests of your children, and your grandchildren, and support the Kids Online Safety Act. I yield back. Further discussion. For what purpose does Mr. Ruiz-
Rep. Raul Ruiz (D-CA):
I move to strike the last word on the bill.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman's recognized for five minutes.
Rep. Raul Ruiz (D-CA):
I believe it is crucial to protect our children from the dangers of social media. As a doctor, and more importantly a father, I'm deeply concerned about the potential risks social media may have on my two young daughters. The current data is alarming. Research shows that excessive use of social media platforms is linked to a 25% increase in anxiety and depression among teens, with 40% of youth reporting feelings of isolation. This is especially concerning as the mental health crisis among youth has risen by 57% over the last decade, due to ongoing exposure to harmful content, cyberbullying, and the pressure to conform to unrealistic expectations set by influencers, peers, and others. The Kids Online Safety Act, as introduced in the Senate, took an important step by requiring social media companies to implement design features that protect against mental health harms, specifically anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors.
This language is critical as it directly addresses the mental health risk that social media poses to our children. However, this version of KOSA that is being put forward in the House today in a last-minute move, removes these critical protections. The House bill only requires social media companies to guard against physical violence and harassment, that's it. Not anxiety, not depression, not eating disorders, not substance use disorders, and not suicidal behaviors. This means the House version would not prevent platforms like Snapchat and Instagram from promoting videos and content that contribute to anxiety, depression, eating disorders, substance use disorders, or suicidal behaviors. This is the very content that has already devastated so many families, and claimed the lives of far too many kids. And most importantly, this version is not what parents of children who have died from suicide, fentanyl poisoning, or accidental death caused by dangerous social media platforms want. In fact, I would like to ask unanimous consent to put into the record a letter from 97 parents who lost a youth to drug overdose, suicide from depression or bullying, who oppose this version.
Rep. Cathy McMorris Rodgers (R-WA):
Without objection, so ordered.
Rep. Raul Ruiz (D-CA):
For this reason, and Chairwoman, for Matthew, Nylah, Gavin, Anna, and Ethan, I cannot in good conscience support a bill that fails to address these mental health harms. By excluding these provisions, the House version falls short of protecting our children from the most dangerous and pervasive threats posed by social media. If this passes without restoring these crucial protections, we are leaving children vulnerable to the same harmful content that continues to prey on their mental well-being. That said, I believe this committee can do better, and I welcome the opportunity to continue working on and ultimately voting for a bill that not only addresses physical harms, but also takes meaningful steps to protect the mental health of our children. Indeed, our children do deserve better. And with that, I yield the remainder of my time.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman yields back, further discussion. For what purpose does Mr. Crenshaw seek recognition? Gentleman's recognized for five minutes.
Rep. Dan Crenshaw (R-TX):
First, I want to say I will support this bill. It's not going to sound like it from these remarks, but I do support it. It has some good provisions. It gives parents tools that give them more control over their children's social media use, that's a great thing. Provision that calls for a study on how we might implement age verification requirements, that's a good thing. But the rest of it, not so much. So, I vote for this reluctantly. I want to raise some important concerns about the rest of this bill, and I hope I end up being wrong about these concerns, I really do. But at its core, this bill hands over authority to the FTC to regulate design features on social media platforms. So, what does that actually mean? According to the bill, it means things like scrolling time, use of badges, personalized recommendation systems, et cetera.
This is all very much geared towards regulating the algorithms of these platforms. And I want to make something clear for anyone who knows anything about social media, you can't manipulate algorithms without also manipulating content. Algorithms and content feed off of one another, you can't separate the two. So, if the FTC decides that a design feature causes "serious emotional distress," which is by the way a subjective definition that's subject to change, how does that not end up implicating content moderation? It will. So, a little bit of a warning to everyone, and I hope I'm wrong, is that we might be back in here sooner than later doing hearings on how the FTC, or social media companies, have conducted very serious content throttling and censoring in order to avoid the liability that this new law would impose. I mean, doesn't all political speech induce some kind of emotional distress for those who disagree with it?
I say things here all the time, and my colleagues show severe emotional stress over it. Now, don't get me wrong, we all agree on one thing, that the intent here is to protect our kids. But that leads to my second concern actually, which is that this bill gives false hope to parents. I'll explain this as simply as I can. If my concerns above don't come true, then that probably means that the FTC and social media companies took a very light touch approach to this law, which means our kids' safety is still threatened. It means an FTC regulator really has no idea how to regulate algorithms, and I don't blame them. I don't see how they can effectively do it. And as someone who has probably been on social media longer than anyone in this room, I really don't see how changing algorithms is going to save your kids, either from harmful content or online bullying.
So, I do not want to sell parents a false bill of hope here. Don't get me wrong, I think there's a serious problem with kids on social media, and I think we should actually take it seriously. I think the evidence is clear that social media is a net harm for kids, but I also believe in simple common-sense solutions for complex problems. It's why I filed an amendment modeled after what Florida did, a straightforward ban on social media for kids under 16, with age verification. That's a cleaner, more direct approach, and it doesn't require giving the FTC any more power than it already has. If we think that social media has become so dangerous for our kids, and I think we seem to agree on that, then we should actually ban it.
We should actually ban the use of it for kids under 16. I don't want my daughter on it, but I also don't want her to be isolated by her peers when they're all on it and she isn't. So, maybe Florida got it right, this time. We're not voting on my amendment. I will not offer it today, but I want to make it clear that there is a model we can follow, and I believe we'll be back in here in the not-too-distant future talking about this again, and we will be in need of simpler solutions that really address the problem. With that, I yield back.
Rep. Brett Guthrie (R-KY):
Would the gentleman yield? Would the gentleman yield?
Rep. Dan Crenshaw (R-TX):
I yield.
Rep. Brett Guthrie (R-KY):
I just needed 30 seconds. I just want to say I appreciate your comments, I appreciate Chair Rodgers and the hard work that she has done, Representative Bilirakis and Wahlberg, and we all have the same goal and we want to ensure our children are protected from harmful content online. And I want to let the committee know I stand ready to work on any outstanding issues, and want to get this across the finish line in a way that protects our children. So, I will yield back to my friend from Texas, thank you for yielding.
Rep. Dan Crenshaw (R-TX):
And with that, like I said, I will support the bill, but I want to address those concerns, and I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman yields back, further discussion. For what purpose does Ms. Schrier seek recognition?
Rep. Kim Schrier (D-WA):
Thank you, Madam Chair, and I ... Oh, to strike the last word.
Rep. Cathy McMorris Rodgers (R-WA):
Gentlelady's recognized for five minutes.
Rep. Kim Schrier (D-WA):
Madam Chair, I so appreciate your really heartfelt comments, and I am a proud co-lead of the original version of the Kids Online Safety Act, and I want to thank Representative Bilirakis, Representative Castor for their work, and I also want to thank the parents whose children were tragically victims of online dangers for so bravely sharing their stories with all of us, and helping us craft the original bill. Particularly as a pediatrician and the mom of a teenager, I believe it is so essential to pass legislation to protect children online. However, I, like many of my colleagues, am deeply concerned about the most recent last-minute changes made to this bill, particularly to the duty of care standard. Strong duty of care language is necessary to make sure this bill truly has teeth, and will hold social media companies accountable and protect kids online.
I know my colleagues Mr. Bilirakis and Ms. Castor have worked tirelessly to get this bill to passage in committee today, and I want to thank them for their dedication to this work and to all of our families. But there's more work to do, and I look forward to continuing these discussions and strengthening this bill before it passes the House. I would like to read the letter, given that I have time, from the Social Media Victims Law Center that was already entered into the record by my colleague Dr. Ruiz, that highlights these concerns that must be addressed. Here is the letter. "Dear Chair McMorris Rodgers and Ranking Member Pallone, we are parents of children who have died from suicide, fentanyl poisoning, or accidental death caused by dangerous social media platforms, writing to express our opposition to the amendment to H.R.7891, the Kids Online Safety Act, scheduled for markup today, September 18th.
For the past three years, we have urged Congress to protect kids from social media platforms that maximize advertising revenue through addictive algorithms, exploiting adolescents' underdeveloped brains by deluging them with materials that promote depression, anxiety, suicide, and eating disorders, and that connect them to social predators and drug dealers. Having suffered the worst loss a parent can endure, our singular goal has been to protect other kids from suffering the same fate that befell our children, and prevent other families from experiencing the unrelenting grief that we will endure for the rest of our lives. We were grateful for the Senate's passage of the Kids Online Safety Act with an overwhelming 91 votes, and are hopeful that the House would follow this bipartisan example. We were appalled to learn that the proposed amendment to H.R.7891 removes the language that would impose on social media companies the duty of care to protect against the very harms that our children sustained.
The Senate version required social media companies to implement design features that prevent the following mental health harms, anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors. The House bill eliminates this language, dramatically reducing the harms social media companies owe a duty to prevent and mitigate. Social media has caused a 156% increase in suicide among 12 to 16 year olds, a 350% increase in opioid deaths in minors, a 40% increase in hospital admissions for mental health harms, and skyrocketing rates of anxiety and depression. 91 Senators voted to require social media companies to protect kids from these harms. As parents, we cannot support legislation that waters down this solemn commitment. We urge you to restore the duty of care language passed by the Senate bill, and move swiftly to enact this legislation, and stop the carnage being inflicted on America's kids by dangerous social media platforms. Sincerely," and this now has 97 signatures from parents. With that I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
The Gentlelady yields back. For what purpose does Dr. Joyce seek recognition?
Rep. John Joyce (R-PA):
Madam Chair, I move to strike the last word.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman's recognized for five minutes.
Rep. John Joyce (R-PA):
Thank you, Chairwoman Rodgers, and thank you to the sponsors for their work on this landmark piece of legislation, H.R.7891, the Kids Online Safety Act. As a doctor who has treated numerous pediatric patients, I understand how critical it is that we take steps to protect our children online. This bill is a meaningful and vital step towards ensuring that we are shielding our children from the internet's most dangerous, vicious corners, and protecting their mental health. In 2022, Nylah Anderson, from my home state of Pennsylvania, was shown the blackout challenge videos on TikTok, which encouraged children to choke themselves to the point of passing out. Through this dangerous algorithm, and TikTok's so-called inability to take down harmful content, Nylah was shown videos that encouraged deadly self-harm, and the viral content continued.
Sadly, Nylah lost her life while attempting to replicate what she had seen online. It has been reported that 10 children have been killed this way, and that the challenge had led to numerous hospitalizations. It's clear from the data that we've seen, that children should not have unfettered access to the internet. We must safeguard them from these dangerous fads, eating disorder content, and other toxic posts that have been perpetrated on social media. It is time for us to take the necessary steps to keep children like Nylah safe. Actually, it is time to work to keep all children safe online. I urge my colleagues to vote in favor of this bill, the Kids Online Safety Act, and Madam Chair, I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman yields back, further discussion. For what purpose does Ms. Craig seek recognition?
Rep. Angie Craig (D-MN):
Madam Chair, I move to strike the last word.
Rep. Cathy McMorris Rodgers (R-WA):
The Gentlelady is recognized for five minutes.
Rep. Angie Craig (D-MN):
My colleagues, we're here today because we all want to keep our kids safe. That's the bottom line. And this summer, the Senate passed a bill with overwhelming bipartisan support that will do just that, keep our kids safe. I'd like to begin by stating that I'm a proud co-sponsor of the Kids Online Safety Act. Yesterday morning, I was prepared to vote in the affirmative, yes, on this particular bill. It's with a heavy heart that I share my deep concern with today's changes to the bill, including changes around the duty of care provision. I heard this morning from Bridgette Norring, who was in my office last week about KOSA, the mother of Devin Norring, who died at age 19 of fentanyl poisoning from a fake pill that he bought on Snapchat. She was a strong advocate for this bill. We worked hand in hand to get it here today, and it breaks my heart that we are where we are today.
It's devastating to her and to me that the changes made to this bill at the last minute, dropped with one day's notice to members of this committee, will not adequately hold social media companies accountable or, in fact, keep our kids safe online. We have in front of us a Senate bill that passed 91 to 3. Folks, we all know that that just doesn't happen anymore. The Senate bill could have protected Devin, and if we pass it here in the House, it will help protect countless other kids just like him, and just like so many kids in this room today. I am determined to pass a bipartisan bill that will withstand constitutional challenges, and today's changes to the duty of care make that mission much, much more difficult.
We owe it to families like the Norrings and those in the room today to get this done in a way that will protect kids like Devin. I commit to Bridgette and every family in this room to continue fighting for a bill that will get this done, until we can protect kids all across America. My colleagues, we don't have to be conflicted here today. 91 to 3. We don't have to cave to social media companies. The duty of care provision is critical. I trust that we will get this done in the end with this provision, but it's a damn shame that we're sitting here today in the situation that we're in. And with that, Madam Chair, I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
Gentlelady yields back, further discussion. For what purpose does Mr. Duncan seek recognition?
Rep. Jeff Duncan (R-SC):
Move to strike the last word.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman's recognized for five minutes.
Rep. Jeff Duncan (R-SC):
Thank you, Madam Chair. It's been a work in progress. We've heard very emotional testimony in subcommittee from people who have experienced a lot of things through social media platforms that have affected them in different ways. And I plan on supporting this bill today, but I do have serious concerns about some of the language in there, and I hope we can address it before this bill sees the light of the House floor, specifically the language where social media platforms can mitigate content that causes serious emotional disturbance. Who defines what that is? Is it the social media platform? Is it the FTC? Is it government in general? I believe in clarity in language and legislation, and I believe this language is very vague, arbitrary, and ambiguous. So, I would hope the committee would work on trying to clarify that. And anyway, with that, I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman yields back. Further discussion? Oh. For what purpose does Ms. Eshoo seek recognition? Gentlelady's recognized for five minutes.
Rep. Anna Eshoo (D-CA):
Thank you, Madam Chairwoman. I think like all members of the committee, we've been listening very, very closely to one another. I don't know how many members listened to the hearing that was held in the Senate on this issue, with the hearing room packed with families, with families of the children that have been deeply, deeply affected in such negative ways. I had occasion to keep following a thread and it took me to a site that actually instructed young people how to commit suicide. So I mean, if that's not an impetus for us, then I don't really know what is. It even caused Mark Zuckerberg, my constituent, to stand up and to try to apologize to parents. That doesn't mean that much unless there's a law that deals straight away with this.
The bill is a good bill. There's the support, which was just stated, about what the Senate has done is highly instructive to us. But I do think that in some very important areas that this bill needs to be strengthened. There's no doubt in my mind that this bill is going to be conferenced. And if everybody votes for it, there's no leverage to improve the bill. The authors of the legislation before us today acknowledge that the Duty of Care section of this bill really needs to be improved. There are whole communities of interest in the country that can be negatively impacted by this who are not small children, but because of who and what they are, the law would go against them.
So I opt for leverage at a conference, and that's why I'm going to vote against the bill, not because it's a bad one, it's a good one, but because, very importantly, it needs to be improved in key sections. The AINS has been, in my view, hastily introduced with, again, unclear language about critical provisions, specifically what an inherently dangerous act is.
So, that's where I am, my colleagues. And I thank the members that have brought this forward. I think you've done very, very good work. I think it still needs some more work to be strengthened, because I think that not only parents but the children of our country deserve that. And I want the best bill to go forward. And with that, I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
Lady yields back. For what purpose does Mr. Morgan seek recognition? Griffith.
Rep. Morgan Griffith (R-VA):
Speaking to the measure.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman's recognized for five minutes.
Rep. Morgan Griffith (R-VA):
I thank you Madam Chair. I'm going to agree with some of what Ms. Eshoo said. Her words were spoken like a true legislator who's been around a while. That said, however, I come down on the opposite side. Here's my fear. We have no leverage if we don't get a bill out of this committee. We have no leverage if we don't work to improve the bill and get it off the House floor. I'm open to some suggestions that I've heard in the debate today. I've heard Mr. Crenshaw's comments. I heard comments from our colleagues on the other side. But what we have done consistently in this subject area is we have punted the ball down the road for another day.
I don't think this bill's perfect. I have no embarrassment if we have to come back in and tweak it, improve it, or do whatever we have to do in a year or two years. I agree with Mr. Crenshaw, we're probably going to have to do that. But until we pass a bill and see what works and what doesn't work, we're still just going to be debating the concepts. This is a new area. We're going to have to go out. We're going to have to pass laws. We're going to have to pass laws then to amend or repeal the laws that we already passed in this arena. That's legislating.
And so, while I recognize what Ms. Eshoo said is appropriate, that you want to have leverage in the conference, you want to have leverage to do good things, you have to have a vehicle to get it to conference. You have to have a concept to get it there. I think this is a good bill. Can it be improved? Absolutely. I've already spoken with Mr. Bilirakis about an amendment, that as I was reading through it I realized there's a little amendment that needs to be made. That's what we do. We get it out of committee. Hopefully it'll get to the floor. We go from there.
I would say that I am going to propose later this week to the Rules Committee that we have a process where a bill comes out of the committee of main jurisdiction, that Rules has a shot clock before they send the bill to the floor. There's a couple of things they can do, re-refer to a separate committee, et cetera. But at this point, there's a fear that if we don't come out of here with at least a decent vote, leadership's not going to bring this up. Certainly not going to bring it up before the upcoming election, but the time that this bill is likely to come up or has a chance to come up is subsequent to the election. But if we come out of here with a vote that indicates that we are namby-pamby on this particular bill or weak on this particular bill, we will never have the opportunity to negotiate with the Senate or those who want to improve the bill.
So I'll be voting for it, and look forward to working on trying to get the right bill long term. Can't do that if we don't start somewhere. This is the-
Rep. Anna Eshoo (D-CA):
Will the gentleman yield?
Rep. Morgan Griffith (R-VA):
... beginning of that path.
Rep. Cathy McMorris Rodgers (R-WA):
Will the gentleman yield? Will the gentleman yield?
Rep. Morgan Griffith (R-VA):
I will yield to the gentlelady.
Rep. Anna Eshoo (D-CA):
Thank you very much. I listened very carefully to what you said, and what I'd like to re-emphasize is this: there's been a lot of haste on this. Otherwise, I would have offered amendments today on how, in my view, to strengthen the bill. But that opportunity is... I mean, it's not here. It's not in front of us. And I think that it's important for everyone to understand that. The last thing I would ever be described as is namby-pamby. There are a lot of things that can be said about Anna.
Rep. Morgan Griffith (R-VA):
Let me be clear. Reclaiming my time. Let me be clear. I never intended to indicate that the gentlelady was namby-pamby. I think we as a body are taking that approach consistently over the years on both sides of the aisle. Because we can't get the perfect bill, we reject the good. I see this as the first step in getting something done that's positive for the children of the United States of America. I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
Would the gentleman yield to the chair?
Rep. Morgan Griffith (R-VA):
I will yield to the chair.
Rep. Cathy McMorris Rodgers (R-WA):
I appreciate the gentleman's comments. This has been a very long process. There are amendments to the bill today. We have held numerous hearings. Congress has been working on acting for years, maybe decades. It's been 26 years since Congress has acted. We've held hearings. We've brought in the CEOs. We've had numerous bills introduced. It's been a very long process. But yet today, as we sit here, there is no bill, there is no legislation, there's no law on the books to protect children online. There is nothing. There's no recourse.
As I mentioned earlier, John Dingell once said... I mentioned this earlier this year, "There hasn't been a perfect law since Moses came down from the mountain."
I agree with some of the concerns that have been raised. I'm committed to continuing to work, but the tactic of big tech and the opponents of this legislation has been to make the perfect the enemy of the good. It's easy to come up with a reason to vote no. We all know that. If you've been in Congress, you know it's easy to vote no. The hard thing is to vote yes. I urge my colleagues to vote yes today. Let's take action on behalf of kids online all across this country. I yield back to the gentleman.
Rep. Morgan Griffith (R-VA):
I yield back to the chair.
Rep. Cathy McMorris Rodgers (R-WA):
Further discussion? Gentleman... Well, the chair recognizes the gentleman from Florida, Mr. Bilirakis, for the purpose of offering the AINS.
Rep. Gus Bilirakis (R-FL):
Thank you, Madam Chair. I guess we should read the amendment. Is that correct?
Rep. Cathy McMorris Rodgers (R-WA):
Offer the amendment. Yes, that'd be great.
Rep. Gus Bilirakis (R-FL):
Okay.
Rep. Cathy McMorris Rodgers (R-WA):
Clerk will report the amendment.
Rep. Gus Bilirakis (R-FL):
At the desk.
Clerk:
Amendment in the nature of a substitute to H.R.7891, offered by Mr. Bilirakis, strike all after the enacting clause and insert the following.
Rep. Cathy McMorris Rodgers (R-WA):
Clerk will... Without objection, the reading of the amendment will be dispensed with. The gentleman, Subcommittee Chairman Mr. Bilirakis, is recognized for five minutes in support of the AINS.
Rep. Gus Bilirakis (R-FL):
Thank you so very much. I have an amendment at the desk in the nature of a substitute. I'm urging my colleagues to vote yes on this legislation and ask for their support on this bipartisan AINS. I will tell you folks, big tech wants you to vote no. They want you to kill this bill. We're moving this bill forward through regular order. So ultimately, Representative Castor and I have worked on this bill and we welcome suggestions on how to improve the bill. As a matter of fact, I talked to Morgan, as he said, Representative Griffith just now and I told him that I would work with him and more than likely include his amendment to the bill before it gets to the floor. So let's move forward. Let's play... I'm an SEC guy, but let's play Big Ten football and move those chains forward. Okay? That's how you get legislation done.
Listen, I know most people... Some people don't like it on the right, some people don't like it on the left, but it's just like our chairperson said, it's easy to vote yes or no. It's easy to vote no on a particular bill. But we have a reputation, Representative Castor and I, at looking at both sides. We're very open-minded. And we want to improve this bill, because ultimately all we want to do is save our children's lives. And I know that it's been detrimental. And this is a positive bill. Sure, can it be improved? Absolutely. And we pledge to you that we will work with you to get this done.
So, I'm urging my colleagues to vote yes on the AINS. As I stated before, we have taken a significant amount of feedback on H.R.7891 and I certainly appreciate those who are earnestly trying to get the bill. And you're working in good faith. The groups, most of the groups, understand, they know that we're not going to get a hundred percent of what we want, particularly during this markup. But we've got to move forward, because we all, again, we know what the ultimate goal is, and the ultimate goal is to save these children's lives.
I have heard concerns again from groups and members on both sides of the aisle, conservatives and liberals alike. The leadership, working with the leadership, the ranking member of course, concerned parents, who matter the most, and other stakeholders. In response and in order to strike a balance, I'm proud that we came to a bipartisan agreement on this AINS with co-author, and I give her so much credit, Kathy Castor. This new language tightens and updates the text and attempts to remove ambiguity and vagueness, instead making clear that this bill is not about constitutionally protected speech or content moderation. This bill, just like the Senate bill, is about the behind-the-scenes design codes that cause and enable harmful conduct on the internet. My amendment makes crystal clear that the design features definition doesn't hinder anyone's individual First Amendment rights to protected speech or their ability to post specific viewpoints online. Instead, it prohibits government enforcers from weaponizing this based on any user's individual point of view.
We also made modest but important changes to the Duty of Care section through new definitions to ensure that this bill survives the scrutiny that the courts will apply. And you know they are going to file lawsuits. So we've got to take that into consideration, folks. These address bipartisan concerns that an activist government enforcer could interpret the language to weaponize speech and content. We wanted to ensure that this doesn't happen. We want this to become law. And again, if we have to come back next year and amend, I'm certainly open to it too.
And that the duty for companies should be to prevent harmful conduct and content we know to be unconstitutional. Sexual exploitation, drug and illegal sales marketing, physical violence are all harms that the government has a compelling interest to stop, and we do it with this AINS.
My amendment also updates definitions in the text that look to build upon existing federal law and regulations, such as serious emotional disturbance, which is a long-recognized term, by the way, that SAMHSA and HHS use in the public health space to describe children with mental illnesses that lead to functional impairment, limiting their daily lives. So in other words, there were questions on that, and this is already in law under SAMHSA.
We set a new focus on inherently dangerous activities such as suicidal and viral challenges. We know these are often unconstitutional in nature, and promoting them directly endangers someone's safety and too often leads to wrongful death. I don't have much time, do I?
Sadly, we know big tech will pour millions into their lawsuits to tear the law down in the courts. They've said so as recently as yesterday when they read my amendment, and are trying to stop us from moving forward. Unfortunately, many valiant efforts of state governments looking to protect kids' privacy and safety online have not survived such challenges.
So let's work on this bill together for the good of our kids. Thank you. And I yield back, Madam Chair. Sorry for taking up so much time. This is such a very important issue. Thank you.
Rep. Cathy McMorris Rodgers (R-WA):
Thank you. Gentleman's time has expired. Further discussion? For what purpose does Ms. DeGette seek recognition?
Rep. Diana DeGette (D-CO):
I move to strike the last word.
Rep. Cathy McMorris Rodgers (R-WA):
Gentlelady is recognized for five minutes.
Rep. Diana DeGette (D-CO):
Thank you, Madam Chair. The Energy and Commerce Committee is the oldest committee in the U.S. House and it's the most storied committee. And it is our job as this committee to pass legislation which will improve the lives of our constituents. And I know all of us on both sides of the aisle feel strongly we need to protect our children. I know I do and we all do. And I think that our mantra should be like our doctor colleagues here say, but from a legislative perspective, which is, "First do no harm." This committee should first do no harm when we are trying to tackle these very difficult and important jobs. And I sadly feel that with this AINS, we are sorely failing on that and we are failing the children of this country.
The first thing I want to say, I respect the sponsors of this bill so much, particularly Ms. Castor and Mr. Bilirakis, who I work with so much. But I do resent Mr. Bilirakis insinuating that if you oppose this AINS that you were in the pocket of big tech. Far from it. And I know that was not his intention. We're all trying to work together on this.
The chair said that we had numerous hearings on this bill going on for months. And that is true, and I appreciate that. And the chair also said that we've tried to bring this bill up several times. That is also true. However, the AINS was received, the language of this AINS was received at noon yesterday. And when we all frantically tried to review the language of this AINS, we found it could actually have unintended consequences that are going to do more harm to our children than passing this bill.
Mr. Bilirakis says, "Well, that's okay. We're just legislating on the fly here, so maybe we can fix it before we go to the floor. And Mr. Griffith might have some ideas and other people might have some ideas." This is not the way to legislate, folks. We should have had this weeks ago. We could have worked on it in a bipartisan way.
I'm going to talk specifically about some of the problems. The Senate passed a bill 91 to 3. I want to know why we couldn't have taken the Senate bill. Why did we have to do this at the eleventh hour? So let me talk to a couple of specific issues. Some of my other colleagues have already mentioned them.
The revised language eliminates or restricts a high-impact company's duty of care to reasonably prevent and mitigate the following harms to minors from its design features: anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors; cyberbullying and discriminatory harassment; promotion and marketing of tobacco products, gambling, or alcohol, unless the company knows that the user's a minor; compulsive usage, unless the company knows the user's a minor; and promotion of acts that are likely to cause serious bodily harm, serious emotional disturbance, or death, unless the promoted act is inherently dangerous.
Section 102(a)(4) applies this duty of care to promotion of acts that are, quote, "likely to cause serious bodily harm, serious emotional disturbance, or death," but only if the promoted acts are inherently dangerous. This restriction excludes promotion of acts that are not inherently dangerous but nonetheless are likely to result in serious bodily harm, serious emotional disturbance, or death.
I could go on. But what I want to say is, Madam Chair, if you really want to pass a bill, if we really want to pass a bill that protects our kids, let's just not pass this bill and hope things work out before we get to the floor. Let's pull this bill off the table today. Let's spend the next couple of weeks working to fix this language and then let's come back, let's pass the bill, let's go to conference with the Senate. I remember that because I've been in Congress for a long time, back in the day when we had regular order, remember that? Well, we could do that. We could go to conference with the Senate, we could come up with a bill that protects our children, send it to the president's desk and we would all stand there proudly working together on behalf of the children of America. I yield back.
Rep. Gus Bilirakis (R-FL):
Will the gentlelady yield, please?
Rep. Diana DeGette (D-CO):
Yes.
Rep. Gus Bilirakis (R-FL):
Yes. And no way did I intend that anyone on the other side of the aisle is in the pocket of big tech. I just wanted to remind you that big tech wants you to vote no on this particular bill. It kills the bill-
Rep. Diana DeGette (D-CO):
But that doesn't mean we should vote yes. And I yield back.
Rep. Gus Bilirakis (R-FL):
That's all I'm saying. That was just a reminder of where they are.
Rep. Diana DeGette (D-CO):
Thank you. Reclaiming my time. That doesn't mean we should vote yes, and I yield back.
Rep. Raul Ruiz (D-CA):
No, yield to me.
Rep. Diana DeGette (D-CO):
I yield to Congressman Ruiz.
Rep. Raul Ruiz (D-CA):
Yeah. So, big tech wants us to vote for this bill because the Senate bill is better than this bill. The Senate bill will actually protect and safeguard against anxiety, eating disorders, substance use disorders, depression, and suicide. This bill doesn't, so big tech is advancing this bill. And with that, I yield back.
Rep. Diana DeGette (D-CO):
And I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
The gentlelady yields back. Further discussion? For what purpose does Mr. Walberg seek recognition?
Rep. Tim Walberg (R-MI):
Strike the last word.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman's recognized for five minutes.
Rep. Tim Walberg (R-MI):
Thank you. This is a great debate and argument. I think the reality is that we do not want to not have a part in this process. And I would contend, and I guess I take the opportunity to respond here because I know the next bill, COPPA, there could be some of the same challenges made, that we're moving this quickly. That's absolutely false. We've been working on this for years, literally years, KOSA, COPPA, trying to protect kids, trying to protect parents and families, trying to say to big tech, "We need you, but we need you to be there in a way that protects our kids."
Yes, this has come quickly, but it's also not as coming quickly as it could have because of efforts to try to get some of our Democrat colleagues' amendments in. But we also know that the best thing that could happen for big tech right now is this to die here and to run out of time. And then they can major on the Senate proposal, making sure that it doesn't come to fruition here in the House. We need the opportunity to negotiate.
And yes, there's some things that I'm sure Representative Bilirakis, myself, and others want to see the opportunity to change before we get to a final vote in the House. But if we stop here, that takes the pressure off those that don't want anything to change. And I'm not saying anything about my colleagues who have expressed opposition to this bill. I get some of that. But I've also been here long enough to know that the greatest failure that we can have is to do nothing that moves something positive forward to give us a further chance. As the old Army Ranger statement goes, "Forward motion always allows other opportunities to develop." And I think that's what we want. I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
The gentleman yields back. Further discussion?
Rep. John Sarbanes (D-MD):
Madam Chair.
Rep. Cathy McMorris Rodgers (R-WA):
For what purpose does Mr. Sarbanes seek recognition?
Rep. John Sarbanes (D-MD):
To speak on the bill.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman's recognized for five minutes.
Rep. John Sarbanes (D-MD):
I just want to thank Representatives Castor and Bilirakis for their tireless work on this. I know that their efforts have been heartfelt. The desire to reach some kind of compromise that can move the bill forward is heartfelt. I do feel like I want to reject the notion, however, that, and this was suggested by Chairwoman McMorris Rodgers and by Subcommittee Chair Bilirakis, that to vote no on this is easy. It's not easy to vote no on this. If you share all the concerns that we've talked about today, you very much want to keep this thing moving forward. But if someone casts a no vote, that's not an easy vote for them today, in this broader context. That's a vote designed to register significant anxiety about the substance of the bill, and in a sense, to reserve that position for the process as it moves forward. So I just thought it was important, even as I acknowledge the work that's been done here, to put that on the record. And with that, I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman yields. Further discussion? For what purpose does Ms. Kuster seek recognition?
Rep. Ann Kuster (D-NH):
Madam Chair, I move to strike the last word.
Rep. Cathy McMorris Rodgers (R-WA):
Lady's recognized for five minutes.
Rep. Ann Kuster (D-NH):
I've listened very intently to this debate, and I think our intentions are clear and the same, but I think this process raises more issues for me. So I support the Senate-passed version of KOSA. It's a good bill, it will protect kids, and it passed virtually unanimously. But the bill before us is not that bill. This bill undermines tech platforms' duty of care to protect children. And I appreciate all of my colleagues who have worked so hard on this issue. So because I support the goals of KOSA, I just want to be clear to my constituents, I'm going to vote no and work to improve this bill so that it actually will protect children. Thank you. I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
The gentlelady yields back. Further discussion? Are there any amendments to the AINS? For what purpose does Mr. Obernolte seek recognition?
Rep. Jay Obernolte (R-CA):
Madam Chair, I have an amendment at the desk entitled Obernolte 83.
Rep. Cathy McMorris Rodgers (R-WA):
The clerk will report.
Clerk:
Amendment to the amendment in the nature of a substitute to H.R. 7891 offered by Mr. Obernolte-
Rep. Cathy McMorris Rodgers (R-WA):
Without objection, the reading of the amendment is dispensed with. The gentleman is recognized for five minutes in support of his amendment.
Rep. Jay Obernolte (R-CA):
Thank you, Madam Chair. I want to be clear at the outset I strongly support the objectives of this bill. I raised two sons who are now young adults, but raised them in the smartphone era, and I watched them go through the struggles that they had with social media. I've also sat through several committee and subcommittee hearings on this bill, and the testimony from parents who have lost children due to some of these harms was very difficult to sit through and resonated incredibly deeply with me. And I feel we have an obligation to do something to prevent these harms from occurring in the future.
However, I have a serious concern with the way that this bill is constituted. This bill establishes a duty of care that is very vaguely and poorly defined. And I'll read you the text here, that the Duty of Care in section 102 says, "A high impact online company shall create and implement its design features to reasonably prevent and mitigate the following harms to minors." And then it lists the harms. That's it. That's all it says.
So we're creating a duty of care and then we're not defining what it means to have met that duty of care. Essentially, that is abdicating our responsibility as legislators because we're punting the interpretation of the duty of care to the judicial branch. And what we're enabling is years and years of litigation where an army of lawyers from big tech is trying to define what a duty of care is and what reasonable actions should have been taken.
I had the unfortunate experience several years ago when I served in state government of authoring a bill that was insufficiently specific in its language. And that bill was signed into law and I got to, for years and years afterwards, watch rooms full of lawyers arguing about what the intent of the author was. And of course it doesn't matter what my intent was, what mattered was what was in the bill. And they're still arguing about it today. I don't want that to happen with this bill.
What I think should happen is that we should be very specific about what we mean when we say duty of care. And I'll give you an example. When this bill was heard in subcommittee, we heard some incredibly moving testimony from personal injury lawyers who had represented parents in lawsuits against big tech. And I asked them, "What should these social media companies have done?"
And there was one instance of a young person who had been bullied online, the content had not been taken down, and he subsequently took his own life. It was heartbreaking. And I asked the lawyer, "What should have been done?" And he said, "Well, the social media company, when they've been notified that a post had been made that was bullying in nature, they should have removed the post within a certain amount of time." And I said, "Perfect. That's perfect. Let's put that down."
There's not a person on this dais that would disagree that a social media company, when notified that a bullying post has occurred, should within, let's say, 24 hours take that post down. Not a person on this dais would disagree with that. So put that down and say, "A duty of care includes this." And there should be a list of 30 or 40 or 50 of these different things in here. And that way, we won't be arguing about what reasonable means and whether or not the burden of the duty of care has been met.
Let me give you a hypothetical that I worry about, and this is really not that far-fetched. So say we pass the bill as written and it's signed into law, and say a similar occasion occurs in the future that we just talked about, where a young person is bullied online, a social media company is notified. Perhaps they take that post down 48 hours after it was made, but the young person then subsequently takes their life. And a lawsuit ensues where the parents are seeking accountability from the social media company.
... about whether or not 48 hours was reasonable. And can you imagine the courtroom where Big Tech has an army of lawyers sitting there and the parents only have a personal injury attorney? That is not an equal distribution of resources and Big Tech is going to win that nine times out of 10. I don't want to see that happen. So, this amendment does not completely solve that problem. Frankly, this problem can only be solved by the sponsors of the bill working with the stakeholders, but this makes some minor changes to the duty of care in the hopes that we can continue solving this problem as we go through the process of getting this bill passed on the floor. So, I urge adoption of my amendment and I urge the sponsors of this bill to work with the stakeholders and be more specific about this, so some of these problems do not occur in the future. Madam Chair, I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
The gentleman yields back. Further discussion? For what purpose does Mr. Bilirakis seek recognition?
Rep. Gus Bilirakis (R-FL):
To speak on the amendment.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman's recognized for five minutes.
Rep. Gus Bilirakis (R-FL):
I move to strike the last word. I appreciate the gentleman for offering his amendment, but I must respectfully oppose this amendment and other amendments being offered at this time. I appreciate everyone's genuine attempts to ensure the language is right. In response to Mr. Obernolte's amendment, I must oppose this amendment that I believe would be a significant departure from the current duty of care language in the AINS. We view this bill as a prevention bill and I know my colleagues on the other side feel that way and my colleagues on this side, not just responding to tragedies after they occur. The gentleman is concerned about vagueness and I understand and appreciate the concern, but his amendment offers "reasonable measures." That is even more vague and just seems to give Big Tech an out to do the bare minimum. So, I commit to working with the gentleman, I really do. I know he has a lot of experience in this area, but I must oppose this amendment.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman yields back. Further discussion? Seeing none, the vote occurs on the amendment. All those in favor say aye. Those opposed, nay. Nays have it, the amendment is not agreed to. Are there further amendments? For what purpose does Mr. Carter seek recognition?
Rep. Buddy Carter (R-GA):
Madam Chair, I have an amendment at the desk. It is Carter Georgia 125.
Rep. Cathy McMorris Rodgers (R-WA):
Oh, we do have a Democrat amendment? Okay-
Rep. Buddy Carter (R-GA):
Oh, excuse me.
Rep. Cathy McMorris Rodgers (R-WA):
Let's suspend here. Gentleman yields back. Ms. Trahan, for what purpose do you seek recognition?
Rep. Lori Trahan (D-MA):
Thank you, Chair Rodgers. I have an amendment at the desk labeled Trahan 75.
Rep. Cathy McMorris Rodgers (R-WA):
The clerk will report the amendment.
Clerk:
Amendment to the amendment in the nature of a substitute to H.R. 7891 offered by Ms. Trahan of Massachusetts.
Rep. Cathy McMorris Rodgers (R-WA):
Without objection, the reading of the amendment is dispensed with. Gentlelady is recognized for five minutes in support of her amendment.
Rep. Lori Trahan (D-MA):
Thank you, Chair Rodgers. I want to start by thanking Chair Bilirakis and Representative Castor for their hard work on the Kids Online Safety Act, and I'm especially grateful to the parents and the young people who have so passionately engaged in the legislative process on both sides of the debate to push forward a better, safer vision of social media with lasting protections for our children. This bill has the potential to do a lot of good, and I support its goals to require independent audits for large social media companies, provide more controls to young people and their parents, and require additional disclosures from Big Tech. But as many members of this committee know, I still have a major concern that is not addressed by the bill. As my colleagues know, this bill would not be possible without the exceptionally hard work that independent researchers and journalists have done to expose the failures of big technology companies and the dangers their services pose to users.
And we also know that those same companies are never going to admit those failures and dangers themselves, no matter how many self-reporting disclosure requirements we put on them. Without external accountability, companies and platforms will never change. Outside pressure is what forced Instagram to do the bare minimum yesterday to finally stop waking children up in the middle of the night with push notifications, even as they complained that it would hurt teen engagement. That wouldn't have happened without Frances Haugen blowing the whistle on Instagram's harms to young users and years of independent research and reports. However, what's often overlooked is just how difficult it is to collect the data that researchers and journalists need from social media platforms, because companies block access via their terms of service and wield legal threats like a cudgel to silence critical voices and skirt accountability.
Without the ability to look under the hood of these companies in a way that protects users' privacy and, by the way, the platforms' IP, we are effectively flying blind when trying to understand what they're actually doing. That should be unacceptable to each and every one of us who are concerned about the way these companies interact with our children at such an individualized level. Unfortunately, this draft of KOSA, which was overhauled and released just 24 hours before this committee takes a vote, has struck the last remnants of the section protecting researchers and journalists who do this important work. That's particularly disappointing because protecting independent researchers has historically been a part of the bill for years and it was one of the surgeon general's key recommendations for protecting children online.
It's for that reason that I have offered this amendment, to reestablish the bill's legal safe harbor to protect researchers and journalists from undue liability when Big Tech companies try to silence them. Now, I'm grateful to Chair Bilirakis for his commitment to work on legal safe harbor provisions, and I plan to withdraw my amendment, so that we can work together with Representative Castor to find a way to include this proposal. Now, I sincerely hope that the sponsors continue to work to improve the duty of care, to address the deep concerns raised today, and bring it closer to the Senate version, and pass a final package that creates the strongest possible protections for our kids when they're online. So, I thank the chair and I'll withdraw my amendment.
Rep. Cathy McMorris Rodgers (R-WA):
The gentlelady has withdrawn her amendment. At this time, we're going to take a break for lunch and votes on the House floor. The committee stands in recess. Subject to the call of the chair, we will reconvene promptly after floor votes.
Committee will come to order. Committee will continue consideration of HR 7891. The gentleman from Georgia, Mr. Carter, has an amendment at the desk.
Rep. Buddy Carter (R-GA):
Madam chair, I have an amendment at the desk. It is Carter GA 125.
Rep. Cathy McMorris Rodgers (R-WA):
The clerk will report.
Clerk:
Amendment to the amendment in the nature of a substitute to HR 7891 offered by Mr. Carter of Georgia.
Rep. Cathy McMorris Rodgers (R-WA):
Without objection, the reading of the amendment is dispensed with. Gentleman's recognized for five minutes in support of his amendment.
Rep. Buddy Carter (R-GA):
Thank you Madam Chair. Madam Chair, joining me in this bipartisan amendment are Representative Schrier and Representative Miller-Meeks. This amendment would insert HR 5778, Sammy's Law, which gives parents the choice to use third-party safety software to protect their children from harmful situations on social media. The bill is named after Sammy Chapman, who at the age of 16 was approached by a drug dealer on Snapchat and was unknowingly and unnecessarily a victim of fentanyl poisoning. If his parents and many other parents had been able to use a third-party safety app and been alerted to this kind of activity, his tragic death may have been prevented. Our children are being victimized by bad actors on social media. There are far too many parents who have tragically lost their children because of this malicious activity facilitated by social media. As a father and a grandfather, my heart aches for Sammy's parents and any others who have lost a child too soon.
No parent should have to endure that pain. I'm glad this committee is taking action to help protect kids online and I'm proud to co-sponsor both KOSA and COPPA 2.0. I believe Sammy's Law is an essential complement to these important pieces of legislation, especially KOSA, which would require third-party integration if proven to be an effective intervention. These third-party safety apps are proven to identify drug abuse, human trafficking, mental health crises, eating disorders, and other similarly harmful situations before they endanger our children, so that they can get the help they need. Social media platforms do not currently permit these life-saving tools and KOSA, absent Sammy's Law, would not accomplish this necessary change. Children spend hours and hours on social media, and we know that the youth who report frequent social media use are more likely to experience poor mental health. There's a silent epidemic happening in almost every home in America, and we have to take this seriously.
Unfortunately, there's resistance to moving this bill forward, despite the legislation adding critical protection for children while also increasing privacy for them. Preventing parents from using these types of safety tools will only lead to more deaths. We cannot allow drug dealers to have more access to our children than parents. I want to repeat that. We cannot allow drug dealers to have more access to our children than parents. I urge committee leadership to continue to work with us on getting this bill across the finish line to help save lives. My staff stands ready to work with yours to coordinate demos of the software to fully grasp how critical this technology is to protecting our youth. I can't think of a single reason that anyone, anyone would oppose this bill, and I welcome any comments from other committee members. Explain to me why you're more afraid of parents than fentanyl. Please explain that to me. I look forward to moving Sammy's Law forward and ensuring these tools are more widely accessible. Thank you, Madam Chair. And if no other members wish to speak on the amendment, I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman yields back. For what purpose does Ms. Schrier seek recognition?
Rep. Kim Schrier (D-WA):
Madam Chair, I move to strike the last word.
Rep. Cathy McMorris Rodgers (R-WA):
Gentlelady is recognized for five minutes.
Rep. Kim Schrier (D-WA):
Well, I'll be brief. I just want to thank my colleagues, Mr. Carter, Ms. Wasserman Schultz, and Dr. Miller-Meeks for their continued work on this really important legislation that I support. This committee's work on the Kids Online Safety Act and the Children's Online Privacy Protection Act is critical and I believe that Sammy's Law complements and expands that work, giving parents the tools they need or more tools they need to be able to protect their kids, and as you just heard, save lives. We have been working diligently to ensure that all members of this committee feel comfortable with the language of our bill and clear about its intent. And we will continue to work with all members and leadership of this committee to do so. And I hope we can come to a conclusion and agreement there, and I'm going to yield the rest of my time back to my colleague, Mr. Carter.
Rep. Buddy Carter (R-GA):
Madam Chair, I appreciate my colleague yielding to me. I'd like to continue with comments if there are any, so I'll yield back.
Rep. Cathy McMorris Rodgers (R-WA):
Yield back. Is there further discussion? Seeing none, does the gentleman still wish to withdraw?
Rep. Buddy Carter (R-GA):
Yes, ma'am. I will withdraw it, but Madam Chair, if I may, I certainly hope that we can continue with discussion on this. This is important. Everybody here knows that. Explain to me how we can withhold information from parents.
Rep. Cathy McMorris Rodgers (R-WA):
Okay, yeah. Okay, so reclaiming my time.
Rep. Buddy Carter (R-GA):
Yes.
Rep. Cathy McMorris Rodgers (R-WA):
Yes. So as the chair, I will yield now to the gentleman to withdraw his amendment.
Rep. Buddy Carter (R-GA):
And I withdraw my amendment.
Rep. Cathy McMorris Rodgers (R-WA):
Well, I'll yield to Mr. Pallone then.
Rep. Frank Pallone (D-NJ):
Thank you. Before we proceed, I just wanted to say that I've listened to the remarks of most of my colleagues. Well, I listened to the Republicans, too, but I was paying particular attention to the Democrats who spoke today, and I just want to echo what many of them said. I obviously firmly support protecting kids online, but I have serious concerns that this bill will not have its intended effect and could have unintended consequences. The stakeholders and advocates, many of whom spent years advocating for stronger online safety, deserve an opportunity to thoroughly vet and provide feedback on this amended legislation, which they haven't had because we didn't get these changes until yesterday afternoon. So at this time, I can't support the legislation in its current form, Madam Chair. I just wanted to express that. I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman yields back. Chair yields back. Are there further amendments? Okay, seeing none, the vote... Okay. For what purpose does Ms. DeGette seek recognition?
Rep. Diana DeGette (D-CO):
No, I don't...
Rep. Anna Eshoo (D-CA):
Eshoo.
Rep. Cathy McMorris Rodgers (R-WA):
Or I mean Eshoo. What purpose does Ms. Eshoo seek recognition?
Rep. Anna Eshoo (D-CA):
I move to strike the last word, Madam Chair.
Rep. Cathy McMorris Rodgers (R-WA):
Gentlelady is recognized for... Well, yeah. Gentlelady is recognized for five minutes.
Rep. Anna Eshoo (D-CA):
Thank you, Madam Chairwoman. We've had an extended discussion on the legislation, actually extended discussion and work for many years on this, and I fully comprehend the weightiness of the issue. I know that and I want to compliment our two colleagues that have brought this forward, but I also want to point out that there really are some very critically important areas that are not being included in the AINS for this legislation. Republican and Democratic members of the committee have stated one of the main reasons that we need legislation. What needed to be included, mental health disorders, anxiety, depression, eating disorders, substance abuse disorders, suicidal behaviors, and suicides that have taken place, none of that is in the legislation, and I think that that is a big hole. I think it's essentially a loophole that a Peterbilt truck can drive through. So, it diminishes this overall effort, and the entire purpose of the legislation is to protect children, but it eliminates all the key areas there.
So, I don't want to support a bill that disengages, removes some of the biggest reasons in terms of experience of what's taking place in the country today and will continue to. Are there other parts of the bill that are good? Yes, they are, but after all of these years and efforts and what's taking place in the country, how we hold these Big Tech companies accountable, it doesn't do it. It doesn't do it, my friends. So, I feel compelled to say this out loud, you heard me before, and I'm so sorry that it is what it is, but it is a lesser, much diminished effort on a critically important issue. So with that, I yield back my time, Madam Chairwoman.
Rep. Cathy McMorris Rodgers (R-WA):
Gentlelady yields back. Are there further amendments? Are there further amendments? The vote now occurs on approving the AINS. All those in favor say aye.
Members:
Aye.
Rep. Cathy McMorris Rodgers (R-WA):
Those opposed, no.
Members:
No.
Rep. Cathy McMorris Rodgers (R-WA):
The ayes have it. The AINS is adopted. The question now occurs on adopting H.R. 7891 as amended. All those in favor say aye.
Members:
Aye.
Rep. Cathy McMorris Rodgers (R-WA):
Those opposed, nay.
Members:
Nay.
Rep. Cathy McMorris Rodgers (R-WA):
The ayes have it. The bill is adopted. Chair calls up HR 7890 and asks the clerk to report.
Clerk:
HR 7890, a bill to amend the Children's Online Privacy Protection Act of 1998 to strengthen protection-
Rep. Cathy McMorris Rodgers (R-WA):
Without objection, the first reading of the bill is dispensed with and the bill will be open for amendment at any point, so ordered. Does anyone seek to be recognized on the bill? For what purpose does a gentleman from Michigan seek recognition?
Rep. Tim Walberg (R-MI):
I have an amendment at the desk.
Rep. Cathy McMorris Rodgers (R-WA):
The clerk will report.
Clerk:
Amendment in the nature of a substitute to H.R. 7890 offered by Mr. Walberg of Michigan. Strike all after the enacting clause and insert the following: Section 1, short title, table of contents-
Rep. Cathy McMorris Rodgers (R-WA):
Without objection, the first reading of the amendment is dispensed with. The gentleman's recognized for five minutes on his amendment.
Rep. Tim Walberg (R-MI):
Thank you, Madam Chair. The Children and Teens Online Privacy Protection Act, or COPPA 2.0, is a bipartisan, bicameral bill that will modernize and strengthen children's privacy protections for the 21st century and better protect kids and teens online. The bill is a product of years of work and hundreds of stakeholders, including parents, teachers, teens, medical professionals, privacy advocates, industry, and so many more, and I want to take this time to thank all of you for your help and advocacy. Thank you to my friend and co-sponsor, Representative Castor, for her work and that of her team and stalwart dedication to children's online privacy and safety. I also want to thank Senators Markey and Cassidy for creating and leading this legislation to a 91 to three vote in the Senate. We would not be here without them, and that's hard to say as a member of the House.
COPPA 2.0 raises the age of protection to 16, updates the outdated knowledge standard that has allowed Big Tech to ignore when children are on their platforms, bans targeted advertising to minors, creates meaningful data minimization practices, and gives parents and teens new tools to access, correct and delete their data. The ban on targeted advertising is particularly important. It not only disincentivizes social media platforms from collecting data on minors, and keeping kids online with addictive algorithms, but it also prevents the many examples of truly harmful ads from reaching them. The non-profit Tech Transparency Project conducted several experiments where they submitted advertisements to Meta for drug parties, alcohol, and pro-eating disorder content to be targeted specifically to 13- to 17-year-olds. These ads were approved by Meta 100% of the time within five minutes of being submitted. Recently, my colleagues and I also sent a letter after hundreds and hundreds of ads were found on Meta's platform directing people where to buy illicit drugs. This is what we're up against.
We've heard countless heartbreaking stories from parents who have lost children to drugs, depression, suicide, and eating disorders stemming from what they saw or interacted with online. In January, Mark Zuckerberg himself apologized for Big Tech's role in this crisis. We cannot let bad actors continue to reach kids and teens, especially through ads that line Big Tech's pocketbooks. The amendment in the nature of a substitute makes a few helpful and clarifying changes to the legislation. It updates the knowledge standard to the tiered model that is more representative of what we know about how companies interact with minors, while still holding Big Tech accountable to the highest level. It gives more tools for parents to know and manage how their teen's data is being treated online, while still maintaining privacy for the teenager and the ability to also control their own information. The internet can be a complicated and hazardous place to say the least.
We should allow parents the option to help their teens navigate it. The AINS is a compromise and I want to thank Representative Castor, Chair Rodgers, and our leadership for working with us so diligently to get it here. I will admit there are still things we want to work on. First, I have some concerns about the current preemption standard. We see the importance of giving some flexibility to states that are doing good work to protect their young people, but we also need to make sure we're not creating an unmanageable race-to-the-top patchwork.
Work on the knowledge standard will also continue. It's important that we get this right. Despite this, when it comes to protecting kids, every step forward is a step in the right direction. I understand that Ranking Member Pallone has his reservations about passing protections specifically for children, and I appreciate his ongoing work with Chair Rodgers on a comprehensive privacy bill. But to quote the ranking member, "I cannot agree with those that claim that consumers should not get privacy protections anywhere, because they cannot get them everywhere." Minors are the most vulnerable to dangerous and manipulative practices by Big Tech. The time is now to give them and their parents more tools to protect themselves online. I encourage my colleagues to support the AINS and the underlying bill and I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
Further discussion? Chair recognizes Ms. Castor for five minutes to strike the last word.
Rep. Kathy Castor (D-FL):
Thank you, Madam Chair, and thank you, Representative Walberg, for being my partner on Children and Teens' Online Privacy. You've been remarkable to work with and I appreciate your words, and I agree with you that I wish we were here on the broader privacy bill that Chairwoman Rodgers and Ranking Member Pallone in this committee had worked on for a number of years, but it's an emergency that we act to protect kids online. And the good news for this committee is that COPPA 2.0 is generally intact from the Senate version that was voted 91 to three. So, hopefully we won't get into a detailed debate here too much on this. But remember, the difference is that the Kids Online Safety Act was pointed at the design, how apps and websites and social media are designed to keep kids addicted, whether it's the notifications or the likes or how we want to put in place default settings that keep kids safe, and that's the default.
COPPA comes at it from the privacy standpoint, so that these online apps and websites cannot collect the personal private data on children. We adopted a law in 1998, the first version of COPPA, and think of how sophisticated all of the tech platforms have gotten over 20+ years. I often use the analogy that if there was someone standing outside of your child's window at your house and they were watching where they go, what they do, what they eat, who they talk to, you would call the police. You would be alarmed. Well, that's generally what is happening online right now. The incessant surveillance and tracking of everyone... I'm afraid that we're all too easy to give away our privacy in this country. I hope we can come back to the larger privacy bill, but we're here today to protect kids and young people across America from the physical and mental harms being inflicted by these online platforms and social media.
They use that information to manipulate and exploit us, and this is especially harmful to young people and children and adolescents. The American Academy of Pediatrics says that data collection from children and teens on a vast scale allows companies to monitor, track, and target young people with advertisements, and content that exploit their developmental vulnerabilities for commercial gain. Research indicates that the use of data to target children and adolescents with highly personalized behavioral advertising and user experiences is not developmentally appropriate, because they have not yet developed the mature critical thinking skills or impulse inhibition, and countless families have relayed the stories to us. All of us know this. Parents who have lost their children to cyber bullying, eating disorders, dangerous challenges, and exploitation. This is an emergency and it is a crisis. It demands action, and again, I want to thank Representative Walberg for being a champion and a great partner on this.
Here are some of the important updates to the current law. New protections for minors aged 13 to 16 years old to bar platforms from collecting data on them, a ban on targeted advertisements that companies use to prey on young users. We create an eraser button to allow children and teens to delete that information that has been gathered on them. It adds additional insight for parents on this data. It revises the current law's knowledge standard to close a loophole that allows platforms to ignore kids online. Young people are ripe targets. We know this. In fact, the New York Times released a report earlier this summer, which demonstrated how companies like Meta are using every method possible to keep teens addicted, so that they can maximize their profits. We already knew this from Frances Haugen's testimony to this committee, but this recent story confirms that Mark Zuckerberg has directed executives to focus on getting teenagers in particular to spend more time on the platforms with an overall company goal of total teen time spent.
Internal documents show that teenagers are a core part of Meta's growth strategy to the detriment of their own health. They're a top priority for the company. Almost half the teens in the US have experienced bullying or harassment. Between 2010 and 2019, teen depression rates doubled with teenage girls seeing the sharpest increase. We know these statistics. It's critical that our laws evolve to tackle the problem, and that is what we are doing today to give young people a fighting chance to allow them to protect their childhood, to allow parents to protect their kids. They've been over a barrel for way too long to these companies that elevate their profits over the best interests of our kids. So thankfully, this is largely intact from the 91 to three Senate version, the bipartisan bill. Our kids can't wait for action. I truly appreciate the collaboration of the entire committee and working in a bipartisan fashion on this. Let's move COPPA 2.0. Thank you, and I yield back my time.
Rep. Cathy McMorris Rodgers (R-WA):
Gentlelady yields back. I recognize myself for five minutes to strike the last word and speak on the bill. We have heard far too many stories of children and teenagers who have suffered terrible harms, and in some cases death, because of the current state of social media and the internet. The Children and Teens' Online Privacy Protection Act, COPPA, is another critical piece of legislation we're considering today to protect our children online. This legislation provides important updates to a law that was first passed more than 26 years ago and addresses the realities and threats facing kids and teens online in the modern internet ecosystem. Specifically, this legislation would update the Children's Online Privacy Protection Act of 1998 by expanding the privacy protections for children to include teens under 17, banning targeted advertising to children and teens, and allowing children or teenagers to delete their personal information when technologically feasible.
This legislation will also put more pressure on large social media companies to be good stewards of their platforms. We've heard the consequences of the lack of current privacy protections and targeted advertising from a young woman named Ava, who testified before our committee earlier this year. She bravely shared her story, how social media companies could prey on her vulnerabilities, leading her into a downward spiral that compromised her wellbeing. She told us, "How was I, a 14-year-old child, supposed to understand that social media platforms would use my age, location, and gender to target me with advertisements designed to instill insecurity?" There are far too many children and teens across the country with a similar story to Ava's. For them, we must act. I'd like to thank the bill's sponsors, Representative Walberg, Representative Castor for all their hard work and leadership. I look forward to advancing this out of committee and getting COPPA signed into law as soon as possible. I yield back. Does anyone else seek recognition on the bill?
Rep. Frank Pallone (D-NJ):
[inaudible] the amendment.
Rep. Cathy McMorris Rodgers (R-WA):
The amendment? The AINS?
Rep. Frank Pallone (D-NJ):
No, [inaudible] the amendment.
Rep. Cathy McMorris Rodgers (R-WA):
Okay. Anyone seeking to speak on the bill? Mr. Bilirakis, for what purpose do you seek recognition?
Rep. Gus Bilirakis (R-FL):
To strike the last word.
Rep. Cathy McMorris Rodgers (R-WA):
Gentlemen is recognized for five minutes.
Rep. Gus Bilirakis (R-FL):
Thank you. I'm speaking on the AINS and the underlying bill. Madam Chair, I move to strike the last word. As we know, our country faces a youth mental health crisis, particularly amongst our teens, unfortunately. Big Tech is only putting fuel to the flame on this and using the collection of kids' data to do it. I'm proud to support HR 7890, COPPA 2.0. My friend and vice chairman of the subcommittee-
My friend and vice chairman of the subcommittee, Mr. Walberg's bill and of course Ms. Castor's bill and I appreciate her efforts on the previous bill as well. So again, I appreciate Mr. Walberg and Kathy's efforts and the effort to ensure we are strengthening privacy for children, and of course, I support the amendment.
So, we know the Children and Teens' Online Privacy Protection Act, and this is 2.0, would build on COPPA, the original legislation, by prohibiting internet companies from collecting personal information from users who are 13 to 16 years old without their consent. Ban targeted advertising to children and teens. Revise COPPA's actual knowledge standard to close the loophole that allows big tech platforms to ignore kids and teens on their sites. Create an eraser button by requiring companies to permit users to eliminate personal information from a child or teen when technologically feasible. And it will establish data minimization rules to prohibit the excessive collection of children and teens' data.
Again, I want to commend Mr. Walberg and Ms. Castor for their tireless efforts for many years to get this bill here where we are today. I think we're accomplishing quite a bit today, folks. I'm more than happy to support the AINS, of course, and the underlying bill and I ask my colleagues to vote yes. I yield back, thank you.
Rep. Cathy McMorris Rodgers (R-WA):
The gentleman yields back. Further discussion? Are there any amendments to the bill?
Rep. John James (R-MI):
Madam Chair, I move to strike the last word, or I have an amendment at the desk.
Rep. Cathy McMorris Rodgers (R-WA):
I'm going to recognize Mr. James to offer his amendment. Clerk will report.
Clerk:
Amendment to the amendment in the nature of a substitute to H.R. 7890 offered by Mr. James of Michigan. Add at the end of the bill-
Rep. Cathy McMorris Rodgers (R-WA):
Without objection, the reading of the amendment is dispensed with. Gentleman is recognized for five minutes in support of his amendment.
Rep. John James (R-MI):
Thank you, Madam Chairwoman. Countless studies, gone over in multiple congressional hearings, have made clear that social media is not only addictive for children but also harmful to their mental acuity and overall health. Social media has been labeled the primary culprit and for good reason.
I've been saying for years that Facebook is the Philip Morris of our time. Frankly, kids' addiction to these services is only going up. 84% of children say that YouTube is their primary source of video content. Teens and tweens' use of smartphones driven by social media went up 17% in just a two-year timeframe and the list goes on and on. Even worse, big tech has no intention of fixing the issue and is becoming more strident in targeting its services to kids and teens. Where do they get this bravado? From bad legal precedent? Perhaps. Several state bills that sought to curtail big tech's influence over kids have hit a legal wall.
Generally, the First Amendment allows online child protection measures so long as the measure doesn't infringe on adults' speech. However, the primary issue for many of these state bills falls on one fundamental question: how do these platforms determine the age of the user?
The Supreme Court in Reno v. ACLU requires that the law provide an effective way for the platform to determine the identity or the age of a user who is accessing third-party material. In short, age verification is key. But even when applying age verification requirements to websites, courts have still overturned those laws, so what now?
Well, the court in NetChoice v. Griffin gave us a path forward: go through the app stores. Judge Timothy Brooks noted that Apple and Google's app stores already set age-related content restrictions for those applications, filter online content, and control privacy settings. That makes perfect sense. Going through Apple and Google would leverage tried-and-true policy prescriptions and make the stores age-gate the addictive or harmful products.
For instance, let's take a look at what we do in our normal lives. When a person walks into a convenience store, we require that store to check ID when they purchase cigarettes or alcohol or lottery tickets. We also hold that store liable when kids access those products improperly, not necessarily the suppliers of the product. In other words, we don't rely on Marlboro or Budweiser to ensure that kids aren't purchasing their products. We hit them up for Joe Cool and Joe Camel and the Marlboro man, but not for actually selling the products. We hit up CVS and 7-Eleven and the supermarkets to age gate. The app ecosystem should be no different from what has already been proven out in the private sector.
Moreover, the two major app stores, the Apple App Store and the Google Play Store, are under respective consent decrees from the FTC to ensure that children are not purchasing items on their app stores without parental consent; hence, both theoretically must have age verifiers in place to comply with the FTC orders to begin with. Better yet, because Apple and Google already have the ages of their app users, users would not need to provide more personal information to apps like Instagram or Snap to verify their ages.
Indeed, all app store providers would need to do is send a signal to developers when they suspect a child is using their app or service without burdening the user by requiring more personal data. All Apple or Google would have to do is give the developer a thumbs up or thumbs down when a social media app asks to verify whether the device is owned by an adult or a child. If we require websites to perform age verification, they would undoubtedly ask you to upload a picture of your ID or have you provide that data to another third-party vendor. Going through the app store eliminates that invasion of privacy.
It also completely undercuts social media's First Amendment concern over imposing age restrictions on their services because now, as the court in Reno required, social media companies will have an effective way to verify a user's age. Frankly, going through the app stores is a no-brainer. I filed an amendment that seeks to address these concerns and ensure that we hold app stores like Apple and Google accountable when they fail parents and kids the same way a bar or a restaurant or a convenience store might.
But even despite that, and my reservation about the way it's currently written, I'm going to withdraw my amendment and not seek a vote today on the good word of my friend Mr. Walberg and our staff that they will continue working with me to further improve this bill, and work with me on my Protecting Children on Social Media bill, which is bipartisan, and also the App Store bill that I'm helping to lead, as there is also a Senate counterpart.
I believe that this will lead to further good productive discussions on how we can hold these trillion-dollar giants that own these app stores also and equally accountable to the people that we were sent here to protect in the first place, our children. And with that, Madam Chairwoman, I yield.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman withdraws his amendment. Are there further amendments? Chair recognizes Mr. Pallone.
Rep. Frank Pallone (D-NJ):
Thank you Madam Chair. My amendment at the desk is the amendment labeled H7890_ANS_01.
Rep. Cathy McMorris Rodgers (R-WA):
The clerk will report the amendment.
Clerk:
Amendment in the nature of a substitute to H.R. 7890 offered by Mr. Pallone, strike all after the [inaudible].
Rep. Cathy McMorris Rodgers (R-WA):
Without objection, the reading of the amendment is dispensed with, gentleman is recognized for five minutes in support of the amendment.
Rep. Frank Pallone (D-NJ):
Thank you. Madam Chair, as textbooks and notebooks have been replaced by tablets and smartphones, threats to kids' privacy have reached dire levels. That's because data is big tech's most valuable commodity and kids' personal information is a high-value target.
For children and teens, every click, like, share and post can be tracked and aggregated into detailed dossiers revealing that kid's preferences and beliefs, and such invasive surveillance practices can be monetized through targeted advertisements and addictive and damaging design features that maximize screen time and engagement. And the result? Big tech's revenue is up at the expense of our children.
We can't continue to allow our nation's kids to be victims of such invasive and abusive data practices. For the past two Congresses, I partnered with Chair Rodgers to draft transformational privacy legislation that puts people back in control of their personal data, curbs data abuses by big tech, reins in the shadowy world of data brokers, and provides important protections to keep kids safe online. And our effort culminated in the introduction of the American Privacy Rights Act or APRA, which includes much stronger privacy protection for children than COPPA 2.0 does.
Now, like COPPA 2.0, APRA bans targeted and first-party advertisements to kids and requires verifiable consent before transferring a kid's or teen's personal information. APRA establishes an eraser button requiring covered entities to delete upon request any content or information of a child or teen and grants teens and the parents of children the right to access, correct, delete and port the kid's personal information. States are permitted to exceed these protections so long as such laws are more privacy protective for children and teens.
APRA establishes other vital provisions not included in COPPA 2.0. APRA protects all kids' information, not just information collected from a child or teen online. That's because parents, schools and data brokers retain and share sensitive information about kids and that's information that can be collected and aggregated. Without protecting all kids' information regardless of the source and whether it was shared online by a kid, we'll not close the pipeline of children and teens' most sensitive data freely flowing to data harvesters.
APRA also establishes strong data minimization requirements limiting the amount of personal information entities collect, process, retain or transfer to only what is necessary to provide the products and services being requested. It's a simple principle. The use of a product or service does not give companies the license to build detailed dossiers on all customers, including our children and teens.
APRA also cracks down on the shadowy world of data brokers who, without having a direct relationship with their subjects, collect and sell Americans' most sensitive personal information for profit. It grants us all the right to require data brokers to delete and stop the collection of our information and allows parents to do so on behalf of their children.
APRA requires covered entities and service providers to proactively identify and mitigate privacy risks with respect to children and teens, permits teens and the parents of teens to withdraw previously provided consent and prohibits companies from using dark patterns to trick users into agreeing to more privacy-invasive practices. And APRA also ensures that the FTC is properly equipped, empowered, and staffed to bring enforcement action against those who profit from the misuse of the personal information of our nation's youth.
Now, I filed an amendment in the nature of a substitute that would strike and replace the text of COPPA 2.0 with a modified version of APRA. The amendment is nearly identical to the text of APRA that was considered before the full committee last June. But unlike that version of APRA, my amendment does not include a private right of action or expressly preempt state law. It's no secret to those who have been following the privacy debate that preemption and private right of action are two of the most hotly debated issues. Make no mistake, if comprehensive privacy legislation were easy, it would've already been enacted. Such monumental legislation requires careful deliberation, meaningful engagement, and tough compromises.
Now, I filed, Madam Chair, this amendment with the goal of sparking renewed bipartisan conversation about how to find common ground and support legislation that will establish a strong privacy foundation that we can build on. I intend to withdraw the amendment today, but I firmly believe that we must continue this fight to get comprehensive privacy protections into law. And our charge is clear, the American people overwhelmingly want more control over their personal information and it's time we answer the call of the American people and pass the American Privacy Rights Act. And with that, Madam Chair, I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
The gentleman withdraws his amendment.
Rep. Frank Pallone (D-NJ):
I do ask that it be withdrawn, Madam Chair.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman withdraws his amendment.
Rep. Frank Pallone (D-NJ):
I have others.
Rep. Cathy McMorris Rodgers (R-WA):
I'm going to recognize myself briefly to strike the last word. I want to thank Ranking Member Pallone for being a trusted partner over the years as we've worked to establish a national data privacy standard. I believe now more than ever we need a strong comprehensive data privacy law in this country and I will continue to advocate and fight to give people the right to control their personal information.
Big tech pivoted from going after APRA to going after the kids' bills in the markup today, and we can't let them continue to win. I will continue to look for any opportunity to protect American privacy rights. I appreciate you withdrawing today, but the fight continues. I yield back, and recognize Mr. Pallone for another amendment.
Rep. Frank Pallone (D-NJ):
Well, thank you for those comments, Madam Chair. Another amendment I have is to the Walberg AINS and it's called data_MIN_01.
Rep. Cathy McMorris Rodgers (R-WA):
The clerk will report.
Clerk:
Amendment to the amendment in the nature of a substitute to H.R. 7890 offered by Mr. Pallone. Page 17, strike lines five through 15 and insert the following-
Rep. Cathy McMorris Rodgers (R-WA):
Without objection, the first reading of the amendment is dispensed with. Gentleman is recognized for five minutes on his amendment.
Rep. Frank Pallone (D-NJ):
Thank you, Madam Chair. And this is about data minimization, which I think is so important in the context of any privacy bill. Robust data minimization limits the amount of personal information entities collect, process, retain and transfer.
In the context of COPPA 2.0, strong data minimization requirements would ensure that online services and apps used by kids limit the information about kids collected by those companies, and the use, retention, and sharing of the information that companies do collect, to what is necessary to provide the requested product or service. It would put an end to video games collecting kids' geolocation for one purpose and then using it for another. It would stop apps targeted to toddlers from collecting massive amounts of kids' information and then selling that information to data brokers so that they profit.
Unfortunately, the existing provisions of COPPA 2.0 do not include strong data minimization requirements. Instead, COPPA 2.0 currently offers vague language allowing operators to collect kids' personal information if, and I quote, "consistent with the context of a particular transaction or service or the relationship of the child or teen with the operator," unquote. But sadly, some people would likely argue that the invasive over-collection of children and teens' personal information is consistent with the context of online transactions and the relationship between kids and online operators.
So, COPPA 2.0 would also allow operators to retain the personal information of a child or teen for no longer than is "reasonably necessary," a vague standard that may be exploited, in my opinion, by big tech and their arsenal of lawyers to allow these companies to continue business as usual. Even more striking, COPPA 2.0 does not even include vague data minimization requirements on the use or disclosure of the information collected from kids. Instead, companies that collect kids' information for one purpose are not prohibited from using it for whatever purpose they deem fit, even if doing so presents privacy risks or other harms to minors.
So, my amendment strengthens the data minimization provision in COPPA 2.0, creating a bright line standard prohibiting operators from collecting, using, maintaining or disclosing the personal information of a child or teen if not necessary to provide a product or service. It includes a few common sense exceptions to that standard to allow companies to comply with the other provisions of COPPA 2.0, adhere to other federal or state laws and to protect against child exploitation by providing information to the National Center for Missing and Exploited Children.
It's a simple principle. Let's require that companies only collect, use, maintain, and disclose the personal information of children and teens when necessary. This is consistent with our larger privacy bill. So, if you ask any parent of a child or teen with a smartphone, laptop or tablet, they will say that we're not doing enough to protect kids' personal information, and I urge my colleagues to seize this opportunity to protect children and teens' personal information and support my amendment. And with that, I yield back, Madam Chair.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman yields back. I recognize myself for five minutes to strike the last word on the amendment. I want to thank Mr. Pallone for offering his amendment and raising this important issue. Data minimization is something that I care deeply about. Right now, Americans have no say in what companies can collect from them, what they can do with that information, who they can share it with and how they can manipulate and monetize their data. Too much information is being collected about us, period. That is why this committee has spent so much time tirelessly working on comprehensive data privacy legislation. The legislation that we worked on included strong data minimization provisions, including many compromises.
I'm fully committed to continuing to work together to find a path forward to achieve this shared goal for every American. And I want to thank Mr. Walberg and Ms. Castor for including limitations on collection, an important data minimization provision. While it's a positive step forward, every American deserves to have data minimization protections. I ask the Ranking Member to withdraw his amendment and continue to work with me to accomplish this shared goal of protecting every American's privacy. And I would yield to the Ranking Member for a reply.
Rep. Frank Pallone (D-NJ):
I am willing to withdraw it, Madam Chair. I think we both realize that we'd like to see better protections not only in COPPA 2.0, but also the larger bill. But because of your request, I'll withdraw the amendment at this time.
Rep. Cathy McMorris Rodgers (R-WA):
Without objection, the amendment is withdrawn. The Chair recognizes members for further amendments. Are there any other amendments? Mr. Pallone?
Rep. Frank Pallone (D-NJ):
This one is just data brokers under my name.
Rep. Cathy McMorris Rodgers (R-WA):
The Clerk will report the amendment.
Clerk:
Amendment to the amendment in the nature of a substitute to H.R. 7890 offered by Mr. Pallone. Page 13, line two, strike-
Rep. Cathy McMorris Rodgers (R-WA):
Without objection, the reading of the amendment is dispensed with, the gentleman's recognized for five minutes in support of his amendment.
Rep. Frank Pallone (D-NJ):
Thank you, Madam Chair. So, this amendment prohibits shady data brokers from amassing troves of sensitive personal information of our children and teens and selling such valuable information to the highest bidder. Data brokers are businesses that aggregate and sell vast amounts of Americans' sensitive information for profit.
Data brokers frequently collect Americans' sensitive information from a variety of sources, including mobile applications, online services, other data brokers, and public government and business records. Data brokers have collected and stored billions of data elements on nearly every consumer in the United States, including information about children and teens. And these data elements capture incredibly sensitive information, such as data about an individual's online search habits, purchasing history, height and weight, ethnicity, religious affiliation, and travel patterns.
With this information, data brokers and their customers can make inferences about an individual, including that person's interests, health, purchasing preferences, financial well-being, and educational attainment. Consumers, especially children and teens, are largely powerless to stop this data harvesting and transmission. It's particularly outrageous that data brokers are amassing information and selling information about our kids.
So, my amendment puts a stop to these invasive and abusive practices with respect to our nation's children and teens. It prohibits data brokers from collecting, using, maintaining, or transferring the personal information of children and teens. Earlier this year this committee led the change, the charge, I should say, in drafting and enshrining into law the Protecting Americans' Data from Foreign Adversaries Act and this crucial law prohibits data brokers from selling sensitive personal information of Americans to China, the Russian Federation, North Korea and Iran.
I'd like to build on those bipartisan efforts and close off the pipeline of kids' information freely flowing to data brokers, so I urge my colleagues to support this amendment, and with that, I yield back, Madam Chair.
Rep. Cathy McMorris Rodgers (R-WA):
The gentleman yields back. I recognize myself for five minutes to strike the last word on the amendment. I want to, again, thank Mr. Pallone for offering his amendment and raising awareness of this important issue.
This amendment mirrors the text of the Protecting Americans' Data from Foreign Adversaries Act, an important bill to protect Americans that is now law. But that law was passed in the context of trying to prevent Americans' personal data from being exploited by our adversaries and the national security risk that poses. I certainly support putting more restraints on what data brokers can do with data, but I believe that this amendment needs more work to ensure that we don't have unintended consequences like those we have considered when working on other privacy legislation. I ask the Ranking Member to withdraw the amendment. Yield to... Oh, okay. Okay. Okay. I'll yield back. Further discussion on the amendment? For what purpose does Ms. Trahan seek recognition?
Rep. Lori Trahan (D-MA):
Madam Chair, I move to strike the last word.
Rep. Cathy McMorris Rodgers (R-WA):
The gentlelady is recognized for five minutes.
Rep. Lori Trahan (D-MA):
Thank you, Madam Chair. Covertly amassing and selling troves of children and teens' most sensitive personal information shouldn't be a lucrative business model, it should be illegal. Ranking Member Pallone's amendment draws a clear line. Any entity whose product or service is the sale of kids' personal information that it did not collect directly from each kid is a data broker. The amendment prohibits those data brokers from collecting, using, maintaining, or transferring the personal information of children or teens. That's common sense.
And if you have any doubts, just ask a parent. Ask a parent if they're comfortable with an entity that they do not know covertly tracking their kids' every click, like, share and post online. Ask a parent if they're comfortable with those entities making invasive inferences about their kid, including inferences about that kid's travel patterns, academic prowess, health, personal interests and financial well-being. Ask a parent if they're comfortable with those entities then selling detailed profiles about their kid's interests and personal information to the highest bidder.
As a mom of two young daughters, I know what my answer is. Hell, no. Our kids' personal information is not a commodity and it must be protected. Earlier this year, the House Energy and Commerce Committee championed legislation prohibiting data brokers from transferring the sensitive personal information of Americans to China, Russia, North Korea and Iran. We did so because the threat to our national security and to the security of every American is too dire to allow such sensitive information about US citizens to fall into the hands of our adversaries.
That bill became law, and that law's definition of data broker is the same as the amendment's data broker definition. Should the definition of a data broker be narrowed, Congress would be allowing data brokers prohibited from transferring Americans' data to foreign adversaries to transfer the personal information of kids to everyone else. It makes no sense, and I cannot and I will not support that. Ranking Member Pallone's amendment is the right approach. We must be tough on data brokers to protect children and teens, and I urge my colleagues to support this amendment. I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
The gentlelady yields back. Further discussion? All those in favor say aye.
Rep. Frank Pallone (D-NJ):
You want to...
Rep. Cathy McMorris Rodgers (R-WA):
Oh, with no further discussion, well, Ranking Member, let's see here. Does the Ranking Member desire to withdraw his amendment?
Rep. Frank Pallone (D-NJ):
At your request, I will withdraw, Madam Chair, because again, I know you're very sympathetic to this and to expanding it. I mean, the point we're trying to make is that a lot of these things should be expanded and included in anything we do with kids' privacy, in the same way that we would do it for data brokers, in this case with our foreign adversaries, or minimization with respect to kids. But at this time, I will withdraw the amendment.
Rep. Cathy McMorris Rodgers (R-WA):
The gentleman withdraws his amendment. Are there further amendments? Seeing none, the question now occurs on the AINS-
Rep. Frank Pallone (D-NJ):
Madam Chair, can I make a statement on the Walberg bill, to strike the last word on the amendment?
Rep. Cathy McMorris Rodgers (R-WA):
Yes. Gentleman is recognized for five minutes to strike the last word.
Rep. Frank Pallone (D-NJ):
I'm sorry, Madam Chair. I wanted to speak on the Walberg amendment, or the AINS, so to speak, before we vote on it, if I could. And the reason for that is that I believe the amendment in the nature of a substitute, while it grants parents the power to access, correct, delete and obtain their teenager's information, allows that even against the teen's wishes. That means that in a bill purportedly providing more privacy protection for teens, Congress is creating, in my opinion, a back door by which parents can snoop on their teen's every click online.
And I think the provision ignores the fact that teens have a right to privacy as well. It also ignores the fact that not sharing every bit of personal information with your parents is a natural part of transitioning into adulthood. And while I encourage parents and teens to engage with each other about the teen's online activities, I don't think Congress should insert itself into those family decisions.
We also must consider those parents who are not acting in the best interest of their teenagers. Let's not forget there are horrible instances of parents abusing their children. For teen victims of abuse or neglect, the internet can become a safe haven providing them with vital information and a sense of community. The proposed changes to COPPA 2.0 allowing parents to access their teen's online information would deny such teens the privacy that is necessary to protect their health and well-being and would give these abusive parents even more control over their teenaged children.
So, for me, this is a deeply concerning issue with the revised bill under the amendment. I've long advocated for stronger privacy protections for kids and teens and have offered amendments today to try to strengthen this bill with important provisions included in APRA, but without these vital protections reflected in the text and with the new language granting parents the right to control their teens' personal information, I cannot support the AINS, and with that I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman yields back.
Rep. Frank Pallone (D-NJ):
Can I ask a question of the chair? As I said... Is that okay? All right. I just mentioned I can't support the bill in its current form, so I want to only ask that you work with me on the various amendments that I've mentioned today before this bill goes to the floor and that we have further discussions if we could.
Rep. Cathy McMorris Rodgers (R-WA):
Yes. Yeah, there will be more discussions, more work. The work continues. You have my commitment. We're going to do everything we can to get them on the president's desk.
Rep. Frank Pallone (D-NJ):
All right, thank you, Madam Chair.
Rep. Cathy McMorris Rodgers (R-WA):
Okay, gentleman yields back. Chair recognizes Mr. Latta for five minutes on the AINS.
Rep. Bob Latta (R-OH):
Well, thank you, Madam Chair, and I'd like to yield my time to the bill sponsor, the gentleman from Michigan.
Rep. Tim Walberg (R-MI):
Thank my friend, and I appreciate the Ranking Member's concern that we get this right and I think we're all committed to doing that, but we want to move forward as well so we can have those opportunities.
The AINS includes changes to the underlying bill to allow both parents and teens to access, correct, and delete the data collected on the teen. To be clear, teenagers still have the ability to consent for themselves and enter websites without parental permission, preserving their privacy and access online. It does not mandate that a parent see everything or know every account their teenager has. When we've spoken about this bill, [inaudible] and others, we've all said how minors do not have the same cognitive capabilities to safely spend time online as adults.
This is an important tool to allow parents to help their minor children navigate the internet and protect themselves. To the concern that the teen's or parents' opinions could differ on deletion, well, I mean, that's parenthood, and most parents want to do the responsible thing with their kids. Certainly there are parents that don't, but they're in the minority. And at some point we must allow parents to raise their children as they see fit. We do not treat teenagers like adults anywhere else. They can get a job or sign up for driver's ed, but in most cases, not without a parent or guardian sign-off.
This language is a very reasonable compromise. It preserves teen privacy while giving both the teenager and parents the tools to see how their data is treated online and the opportunity to address it. When it comes down to it, teenagers are still kids. They do have different needs and capabilities than younger children, and we've reflected that clearly in the bill in the AINS where they are treated differently. But just like driver's ed, it's essential that we have parents in the passenger seat to help them navigate the confusing roads of today's internet and that, I think, we would agree on. And so, I hope we can move this forward and I yield back.
Rep. Cathy McMorris Rodgers (R-WA):
Gentleman yields back. Question now occurs on the AINS, all those in favor say aye. Those opposed nay. The ayes have it, the AINS is adopted. Question now occurs on approving H.R.7890 as amended. All those in favor say aye. Those opposed nay. The ayes have it. H.R.7890 is adopted.