
US House Subcommittee Advances 18 Child Online Safety Bills

Justin Hendrix / Dec 13, 2025

The Subcommittee on Commerce, Manufacturing, and Trade of the US House of Representatives Energy and Commerce Committee held a markup on Thursday, December 11, 2025, in the John D. Dingell Room in the Rayburn House Office Building in Washington, DC.

On Thursday, the United States House Committee on Energy and Commerce Subcommittee on Commerce, Manufacturing, and Trade held a markup to consider bills related to child online safety and privacy. The subcommittee advanced all 18 bills under consideration to the full committee, including the Children and Teens' Online Privacy Protection Act (known as COPPA 2.0) and the latest version of the Kids Online Safety Act (KOSA).

The subcommittee’s chairman, Rep. Gus Bilirakis (R-FL), said the session was a response to an “online epidemic” facing children. While lawmakers in both parties agreed on the need to address online harms, there was substantial disagreement, particularly from Democratic members, over preemption, enforcement, and the influence of Big Tech on the legislation.

For instance, in his opening remarks, Rep. Frank Pallone (D-NJ), the full committee’s ranking member, said “these bills put a ceiling on kids’ privacy and safety by stopping states from doing more to protect kids.” He pointed to what he called “broad preemption language” in the bills that could “wipe out” existing data privacy and product liability laws.

“I cannot support the current versions of the Kids Online Safety Act or COPPA 2.0 that forever close the door on greater state protection for kids,” said Pallone. COPPA 2.0 and KOSA advanced on a party line vote.

Concern about industry influence was expressed by members of both parties. In one particularly charged moment, Rep. Jan Schakowsky (D-IL) said she was “furious” that Big Tech had not done more to protect children. “I'm sick of what you've been doing,” she said, addressing the industry. “I think you've been contributing to this, and I've had it.”

Schakowsky was not the only lawmaker to express fury at the industry. Perhaps the most explicit accusation of lobbying influencing the subcommittee’s priorities came from Rep. Kat Cammack (R-FL) during debate on the Algorithmic Choice and Transparency Act. Cammack expressed dismay that the subcommittee did not also consider the App Store Freedom Act, which she said would protect kids by encouraging a more competitive app market.

“I am furious that this bill has not been included in today's markup and it's because of the pressure of Big Tech,” said Cammack. “I will not bend the knee to Big Tech and no one on this committee should either.”

Rep. Lori Trahan (D-MA) joined Cammack in expressing concern, noting that Apple lobbyists were on Capitol Hill just the day before. The full committee chairman, Rep. Brett Guthrie (R-KY), defended the exclusion, suggesting it was not due to Apple’s influence.

“Hopefully you think I'm an honest person,” said Guthrie, indicating the bill was left out because it primarily concerns competition issues rather than child safety.

The 18 bills considered during the markup included:

  • HR 6290, the Safe Social Media Act
  • HR 6259, the No Fentanyl on Social Media Act
  • HR 6289, the Promoting a Safe Internet for Minors Act
  • HR 6437, the Kids Internet Safety Partnership Act
  • HR 5360, the AI Warnings And Resources for Education (AWARE) Act
  • HR 6499, the Assessing Safety Tools for Parents and Minors Act
  • HR 2657, Sammy's Law
  • HR 6265, the Safer Guarding of Adolescents from Malicious Interactions on Network Games (GAMING) Act
  • HR 6273, the Stop Profiling Youth (SPY) Kids Act
  • HR 6253, the Algorithmic Choice and Transparency Act
  • HR 6489, the Safeguarding Adolescents From Exploitative (SAFE) Bots Act
  • HR 1623, the Shielding Children's Retinas from Egregious Exposure on the Net (SCREEN) Act
  • HR 6257, the Safe Messaging for Kids Act
  • HR 3149, the App Store Accountability Act
  • HR 6333, the Parents Over Platforms Act
  • HR 6292, the Don't Sell Kids' Data Act of 2025
  • HR 6484, the Kids Online Safety Act
  • HR 6291, the Children and Teens' Online Privacy Protection Act

What follows is a lightly edited transcript of the markup session. Reference the video of the session when quoting.

Rep. Gus Bilirakis (R-FL):

The subcommittee will come to order. The Chair recognizes himself for an opening statement. Good morning and welcome to today's subcommittee markup, another important step in advancing legislation to protect children online. Our children are facing an online epidemic. Around 95% of teenagers use social media, and far too many have been cyberbullied or faced other harms online. This issue is personal. We have parents on both sides of the aisle, and all of us represent families back home who have unfortunately been affected. In my home district, in the Tampa Bay area and also the Nature Coast, we lost 16-year-old McKenna Brown after relentless online bullying. Unacceptable. Her tragedy and the heartbreak of her family is a constant reminder of why we must act. It's because of countless stories like this that we're here today. We're considering nearly 20 bills today that together form a comprehensive strategy to protect kids and teens online.

Our approach is clear: protect kids, empower parents, and future-proof our legislation. As new risks and technologies emerge, families deserve clarity, parents deserve control, and bad actors must face real consequences. The Kids Online Safety Act, or KOSA, is among these bills. Of course, a strong proposal with concrete safeguards and obligations for companies, and one I'm very proud to lead, but no single bill is a complete solution. These proposals work together, complementing and reinforcing one another to create the safest possible environment for children. There is no one-size-fits-all bill to protect kids online, and our plan reflects that. Parents must be empowered to safeguard their children online. Just as a parent can guide their kids' activities at home and in school, they should be able to guide their children's activities online. Our bills ensure parents have the tools and information they need to keep their kids safe.

In a modern, increasingly complex environment, the status quo is unacceptable, and we are here to change that and ensure meaningful consequences for platforms that fail to protect our kids. Today is about progress. It's about setting aside differences and doing right by the families who are counting on us. I'm confident that working together we can advance meaningful bipartisan solutions that give parents peace of mind and make the online world safer for every child. I appreciate the hard work of my colleagues and I look forward to continuing this effort in the days ahead. With that, I yield back, and I'll recognize the ranking member of the subcommittee, Ms. Schakowsky, for her five minutes for an opening statement.

Rep. Jan Schakowsky (D-IL):

Thank you, Mr. Chairman. But I want to tell you from the get-go, I am furious. I think the legislation that has been proposed is terribly inadequate. We're leaving out states who want to participate to try and help the children. We want to listen carefully to the parents, once again, parents who are here who have sadly seen their children die because of what has happened. I'm so angry at Big Tech, who continues to allow the kinds of things that are happening to happen, and we have to do better. There is no question about it. And so among the things that I want to say is that we do have to allow states to participate. It doesn't mean anything if we take out all the states who have been working hard, and I think that that is a major lack in what is happening right now. We know the children are in fact dying, and I don't understand how the Big Tech companies continue to participate in allowing things and not prohibiting the kind of activities that we have been seeing.

I think that it is wrong right now for us to continue with the legislation that is on the page today that we are talking about, because that leaves out the big issues that we are fighting for. I want to add my thanks to the two women that are here; they are spokespeople for hundreds and hundreds of people who are not protected, and we see that Big Tech is part of allowing these things to happen. So we have a number of suggestions that we would make to be able to include states and to make sure that our families are having their voices heard loud and clear. What else? Okay, I think I've covered about everything, but I just want to tell you how furious I am that we have not seen more, particularly from Big Tech, who is the big player when it comes to worrying about whether our children are going to be safe. I'm sick of what you've been doing. I think you've been contributing to this, and I've had it, and we're going to continue to fight to make sure the safety nets are there for our kids. I'll yield back right now, but I have a lot to say.

Rep. Gus Bilirakis (R-FL):

Gentlelady yields back. The Chairman recognizes the Chairman of the full committee, Mr. Guthrie, for his five minutes for an opening statement.

Rep. Brett Guthrie (R-KY):

Thank you, Chairman Bilirakis. I appreciate you holding the markup today. We are here to take a decisive step toward fulfilling our responsibility to ensure that we have comprehensive solutions to protect our children and our teens online. The data is clear, and it's alarming. Nearly half of US teens now report being online almost constantly, and children who spend more than three hours a day on social media face double the risk of developing serious mental health outcomes, including depression and anxiety. As we discussed in our legislative hearing last week, there is no single solution to solve this very complex problem, and that is why there is a series of bills before us today, not one piece of legislation, but a very serious series of bills that have been carefully crafted to put forth a set of solutions designed to fit together like pieces of a puzzle.

And these bills have several themes. First, we're focusing on safety by default, ensuring that platforms are designed to protect kids from the start rather than placing the burden on a kid or their parent to navigate a complex set of safety standards. Second, we modernize COPPA to extend existing privacy protections to teens. And third, we're empowering parents with the tools they need to oversee their children's digital lives, whether it's approving app downloads or managing privacy and safety settings. We have been intentional and thoughtful in shaping these proposals to ensure strong protections for kids while maintaining legal durability and effectiveness. And we remain committed to working with parents, industry experts, and other stakeholders to continue to strengthen this package. I look forward to continuing to work with the ranking member and all of our colleagues to deliver a bipartisan framework that provides certainty for families while ensuring the safest possible environment for our kids. I'll yield back.

Rep. Gus Bilirakis (R-FL):

Gentleman yields back. Now I recognize the ranking member of the full committee, Mr. Pallone for his five minutes.

Rep. Frank Pallone (D-NJ):

Thank you, Mr. Bilirakis. Today we're marking up 18 bills intended to protect kids online, and I share this important goal. We all want our kids and teens to be safe online. The internet has profoundly changed how we and our children connect with others, learn about the world, and participate in society. Our ever more connected lives have created new communities, shared interests, and entirely new ways to communicate. However, we have also seen reckless practices by internet platforms to surveil our children and exploit vulnerabilities to turn a profit and promote engagement regardless of the consequences. To protect our kids, Congress, alongside parents, educators, and states, has to act to ensure that these risks are addressed. So Mr. Chairman, I've long said, and I still believe, that comprehensive federal privacy legislation is the best way to protect kids online as well as to protect all Americans. But in the absence of a data privacy bill, I do support common sense bipartisan proposals that protect kids while respecting all online users' privacy, safeguarding kids in unsupportive households, and maintaining flexibility to address emerging harms.

So I'm pleased that we will be marking up my bill, the Don't Sell Kids' Data Act, which will prevent shadowy data brokers from selling minors' data and allow parents and teens to request the deletion of any data already in the hands of brokers. The data on our kids that is collected, processed, and sold by data brokers fuels invasive ads and compulsive design features without regard to the harm suffered by children and teens who are still developing critical thinking and judgment. And my bill will turn off this flood of data being collected and sold about our kids. I also appreciate the many bipartisan bills Chairman Bilirakis included in the markup today, which show that we can work across the aisle to protect our kids. I appreciate the good faith collaboration we have had and continue to have on these bills, and I'm pleased that two additional bills are now bipartisan, the SPY Kids Act and the Safer GAMING Act.

Unfortunately, Mr. Chairman, I continue to have concerns with several other bills we're considering, and it is clear more work is needed. Most concerning is that these bills put a ceiling on kids' privacy and safety by stopping states from doing more to protect kids. That's exactly the opposite of what our kids need, with the speed that technology changes and with the new harms that they're encountering. I can't support the current versions of the Kids Online Safety Act or COPPA 2.0 that forever close the door on greater state protection for kids. The broad preemption language in these bills and others would wipe out existing state laws on the books that keep kids safe, like state data privacy and product liability laws. This could shut parents who have lost children out of the courtroom and shield tech companies from responsibility for harms that they inflict. I'm also concerned that other bills we're discussing place the onus on overburdened parents to protect their kids rather than making tech companies step up and do more.

Parents play an important role in keeping kids safe online, but platforms cannot hide behind parental tools to avoid making their underlying products safer. Moreover, for kids in unsupportive, neglectful, or even abusive households, there could be real-world harms from allowing parents complete access and control over their teen's existence online. And finally, I'm very concerned that some of these bills will move us in the wrong direction in the fight for online privacy. I have concerns about bills that mandate third-party access to children's data or require an adult or kid to provide additional sensitive data, like a government ID or biometrics, before they can access content, send a message, or download an app. These bills would address the harm of tech companies' reckless data practices by giving even more data to even more tech companies, and that does not solve the underlying problem. Before I close, Mr. Chairman, I must touch on the Federal Trade Commission, the agency tasked with implementing and enforcing many of the bills in this markup. When President Trump attempted to illegally fire the FTC's Democratic Commissioners, he made our children less safe online. I urge my Republican colleagues to empower the FTC to protect kids and teens by joining Democrats in standing up for a bipartisan and truly independent Federal Trade Commission. With that, Mr. Chairman, I yield back the balance of my time.

Rep. Gus Bilirakis (R-FL):

Gentleman yields back. The Chair reminds members that pursuant to the committee rules, all members opening statements will be part of the record. Are there further opening statements? Ms. Trahan, you're recognized for three minutes for your opening statement.

Rep. Lori Trahan (D-MA):

Thank you Mr. Chairman. Protecting our kids when they're online is one of the most urgent responsibilities we face. It is a crisis for families in every corner of this country and it demands urgent action from Congress. That's why I am so frustrated that today's markup fails once again to meet the seriousness of what families are living through. Some may point to the number of bills on today's agenda as evidence of progress. I'm not one of them and I urge my colleagues not to be fooled. When I look at this list of legislation, I don't see a focus on platform design. I don't see comprehensive privacy and I don't see market reform. But what I do see is a continuation of a status quo where Big Tech continues productizing its users, insulating itself from competition and shifting the responsibility for protecting children onto parents who are stretched thinner than ever.

And it would be bad enough if these bills simply failed to move us forward, but they actually take us back. They undermine the work that state legislators and state regulators are already doing to protect kids. Mr. Chairman, when you combine weak standards with a wide, low federal ceiling, you get worst-of-all-worlds legislation, like the new version of KOSA. I'm proud to stand with advocates and our colleagues on this committee who oppose this legislation. And because we're here together with the very families this markup was supposed to serve in the audience, I want to offer a better path forward. It's a tech policy agenda focused on real systemic reform, reform that targets the core practices, business models, and market dominance of Big Tech that perpetuate harm, especially to young people. First is antitrust reform. When giant corporate monopolies shut out competition, they shut out the innovation that serves the public interest.

We need to break up monopolies just as Republican Teddy Roosevelt did over a century ago. We need to curtail vertical integration, and we need to require interoperability and data portability. Our competition laws must be updated so that parents and consumers have real choices and so that smaller companies have a fair chance to succeed. Second is comprehensive federal privacy and online safety standards that finally disrupt Big Tech's predatory harvesting, deployment, and sale of Americans' private data. State legislators have repeatedly outpaced Congress in regulating digital platforms, a fact that should embarrass us all. Even when Congress does act, it's often in ways that entrench the status quo rather than improve it, which is exactly what we are seeing today. Comprehensive privacy and online safety legislation are opportunities for bipartisan, common sense progress. The third component is simple: independent and well-funded enforcement. Laws without enforcement are just words on paper, and parents across this country have seen enough empty letters from Congress. We need to preserve the independence of our regulators and fund them to do their jobs. That is the three-prong strategy that I put forward to this committee, and I urge us to pursue it. I steadfastly remain ready to work in a bipartisan way to deliver the reforms that American families and consumers deserve, and I yield back my time. Thank you.

Rep. Gus Bilirakis (R-FL):

Gentlelady yields back. The Chair calls up... I'm not sure if there are any other opening statements. Yeah, there are. Ms. Clarke, you're recognized. I don't think we have anybody on our side. So Ms. Clarke, you're recognized for three minutes.

Rep. Yvette Clarke (D-NY):

Thank you, Mr. Chairman, and good morning. And good morning to our ranking member, and thank you both for holding this markup today. Last week during our legislative hearing on kids online safety, I raised my concern over what I see as purely lip service and the weakest attempts at keeping people safe online. My colleagues on the right are sitting here today pretending to take action to keep kids safe online while they simultaneously dismantle the FTC and entertain the idea of an AI moratorium or preemption of state AI laws. I cannot make it any clearer that it would be detrimental for Congress to put any kind of ceiling on kids' safety and privacy through a moratorium or preemption. As we know, technology changes every single day. When new tech emerges, so do new harms and scams, and frankly, right now I don't think we are doing our best to stop enabling scammers and giving handouts to Big Tech.

The best solution to protect our kids online is a federal data privacy standard that we can build off of and include additional protections for children and teens. So far this year, all 50 states introduced legislation on AI while Republicans in Congress offered no federal alternative other than preemption. As I've said before, it is no mistake that the only Republican proposal frees AI and Big Tech companies from all oversight. Without a comprehensive federal standard, existing state laws are the only legislation keeping kids safe online right now. The next steps are clear, excuse me. To properly protect Americans online, we need a fully staffed FTC, we need to get over this preemption fad, and we need to start working on a comprehensive federal privacy standard. I implore my colleagues on both sides of the aisle to take the time ahead of a full committee markup to put the partisan back and forth to rest and get serious about a federal privacy standard. Until then, no one is safe online. With that, Mr. Chairman, I yield back.

Rep. Gus Bilirakis (R-FL):

Gentlelady yields back. Any further opening statements? Alright, we'll get started here. The Chair calls up HR 6290 and asks the clerk to report.

The Clerk:

HR 6290, a bill to require the Federal Trade Commission to conduct a study regarding social media use by teens.

Rep. Gus Bilirakis (R-FL):

Without objection, the first reading of the bill is dispensed with and the bill will be open for amendment at any point. Does anyone seek to be recognized on the bill? Yes. Mr. Bentz, you're recognized.

Rep. Cliff Bentz (R-OR):

Thank you, Mr. Chair. Today we consider HR 6290, the Safe Social Media Act, a straightforward bipartisan report bill that directs the Federal Trade Commission, in coordination with the Department of Health and Human Services, to study the real-world effects of social media use on children and teenagers. All of us, and of course parents, teachers, and mental health professionals, are increasingly worried about what these platforms know about our kids, how that information is being used, and what constant exposure to social media is doing to their mental health. But Congress is operating without a full understanding of the facts. This bill simply asks the FTC and HHS to answer foundational questions we should all agree must be asked. First, what are the documented mental health impacts, including anxiety, depression, bullying, and body image concerns? Second, how often are young people on these platforms each day, and how does that differ by age?

Third, what personal information are platforms collecting from minors? Fourth, how is the information being used or shared? And fifth, what are the harmful effects that are linked to extended or addictive use? This bill allows Congress to build policy on evidence rather than anecdotes on data rather than assumptions. If we want to protect children online, the first step is knowing the full scope of the problem. The Safe Social Media Act will give us facts we need to craft targeted solutions that keep kids safe without stifling innovation. I urge my colleagues to support this common sense measure and I look forward to working together as we work to improve the digital future for America's children. Thank you, Mr. Chair. Yield back.

Rep. Gus Bilirakis (R-FL):

The gentleman yields back. Now I recognize Dr. Schrier for her five minutes to speak on the bill.

Rep. Kim Schrier (D-WA):

Thank you. Thank you, Mr. Chairman. I am so pleased to be taking on the issue of kids and social media, and I am so pleased to see my bill, the Safe Social Media Act, considered in this markup today. This bill will compile valuable information about kids' and teens' social media use. We know that they are facing so many risks online and that social media use is directly impacting their development, their social lives, their mental health, and their education. Between FTC and HHS, the study that this bill mandates will authoritatively assess the health, technological, and educational impacts of kids' lives online. And critically, this bill will also study the personal information that is being collected by social media platforms and how that information is then used by those platforms. It can be difficult to understand and appreciate just how much information from minors is collected and how it is utilized by Big Tech.

And this study will provide a clearer picture. It'll also document the social media use patterns and habits of minors, including across different age groups, and it'll highlight the mental health impacts of social media and how that correlates to the number of hours of use. Social media isn't just dangerous because of the content that kids are interacting with. It's also dangerous because of the habits and the addictive behaviors that it encourages. A better understanding of how extended use and the addictive nature of social media impacts minors can also really help us crack down on predatory social media platforms and designs and insist on even more study and protection of kids. Collecting and assessing more of this data on our kids' social media use can only help us better protect them online and also inform them and their parents. That's why I'm so proud to have worked on this bill with Representative Bentz, and I encourage my colleagues to support it. I yield back.

Rep. Gus Bilirakis (R-FL):

Gentlelady yields back. Anyone on this side of the aisle wish to speak on the bill? Anyone on the Democrat side wish to speak on the bill? Okay. Seeing none, we'll ask now: anyone wish to offer an amendment on the particular bill? Seeing none, the question now occurs on forwarding HR 6290 to the full committee. All those in favor say aye. Aye. All those opposed? The ayes have it. The bill is agreed to and forwarded to the full committee. The Chair now calls up HR 6259 and asks the clerk to report.

The Clerk:

HR 6259, a bill to require the Federal Trade Commission to submit to Congress a report on the ability of minors to access

Rep. Gus Bilirakis (R-FL):

Without objection, the first reading of the bill is dispensed with and the bill will be open for amendment at any point. So ordered. Does anyone seek to be recognized on the bill? Yes, you're recognized, sir. Mr. Chairman, I move to strike the last word. Thank you, Mr. Evans. Appreciate it. You're recognized.

Rep. Gabe Evans (R-CO):

Thank you, Mr. Chairman. Today I speak in strong support of my bipartisan bill geared toward helping combat the scourge of fentanyl that has been impacting American children and teens for far too long. I also want to extend my gratitude to Rep. Dingell for joining me in this important effort and serving as my Democrat co-lead. Before I came to Congress, I spent over a decade as a police officer in the Denver metro area. Beginning during COVID and accelerating through the previous administration, I saw firsthand the devastating impact that fentanyl has on our communities, especially young children who are exposed to this poison via social media. In fact, in my first full year out of uniform, in 2023, Colorado logged the second highest youth overdose rate in the nation, with 75% of those overdoses associated with fentanyl.

As the father of two boys, elementary to high school ages, this is personal to me, and while overdose trends are starting to improve somewhat, this threat is far from over. Just last month, law enforcement agencies led by the DEA seized 1.7 million counterfeit fentanyl pills south of Denver and 12 kilograms of powdered fentanyl. This amount is enough to kill 6.8 million people, every man, woman, and child in the state of Colorado. We've heard far too many tragic stories about young, impressionable Americans who thought they were buying something like Xanax or oxycodone from someone over apps like Snapchat, only to ingest a counterfeit pill with a lethal amount of fentanyl. These poisonings and these overdose deaths have taken far too many lives far too soon. In fact, every 15 days under the previous administration, we lost the same number of Americans to drug overdoses as we lost in the entirety of the September 11th terrorist attacks. As a combat veteran of the global War on Terror and a former cop for 10 years, this fight is personal to me.

Congress must use every tool to combat this scourge. My bill that's being considered today helps get our government on the right path to effectively address the fentanyl crisis and specifically how it impacts and is trafficked to kids on social media. My bill would assess the full scope of this challenge and present a comprehensive set of solutions to inform every stakeholder, from law enforcement agencies to social media to executive branch agencies. It includes common sense guardrails to prevent drug traffickers from being able to exploit this report and adapt their tactics while still being a comprehensive review. As a cop, I saw personally how social media was used by dealers to get poison to our kids. I'm proud to be leading this bill with Congresswoman Dingell, and I look forward to supporting the bill through the legislative process. And I urge my colleagues today to join me in supporting the No Fentanyl on Social Media Act. Yield back.

Rep. Gus Bilirakis (R-FL):

I thank the gentleman for his past service as a law enforcement officer, and he yields back. I now recognize Representative Dingell for her five minutes.

Rep. Debbie Dingell (D-MI):

Thank you Mr. Chairman. I move to strike the last word.

Rep. Gus Bilirakis (R-FL):

You're recognized.

Rep. Debbie Dingell (D-MI):

Thank you. I'm proud to co-lead HR 6259, the No Fentanyl on Social Media Act, with Congressman Evans, and I thank him for all of his hard work. This bipartisan bill directs federal agencies to examine how drug traffickers use social media to target minors, investigate how platform design enables access to fentanyl, and develop solutions to keep these deadly drugs out of children's hands. Every parent in America has heard the stories, and unfortunately too many have lived them. I think all of us want to acknowledge the parents that are here in the committee room today with pictures of their children, and we are committed to trying to make sure that no one else goes through the experience that you have. Teens can purchase what they believe are Percocet or Xanax through major social media platforms, only to receive counterfeit pills laced with fentanyl. Let us be clear about what is happening.

Fake pills are bought through mainstream platforms and are killing our children. One report found that nearly 90% of pills the DEA seized from online drug transactions contained fentanyl. Other reports have documented how dealers openly advertise counterfeit pills to minors using disappearing messages, emojis, and algorithmic recommendations that help connect children to sellers faster than parents or law enforcement can intervene. It has been shown that platform design makes social media a storefront for lethal fake pills. Our No Fentanyl on Social Media Act will give policymakers real data on how platform design contributes to this crisis, such as how platforms help illegal drug sellers hide their tracks, how algorithms push content toward children, and what safety interventions may work. We need to understand these systems so we can hold platforms accountable and craft solutions that save lives. And I know the desperation. I had a sister that became addicted in high school when pill pushing was happening.

Social media wasn't where it is now, but she too died of a drug overdose because we could do nothing to help her, and we here have to do something to stop this. This legislation is an important first step towards protecting children from the fentanyl crisis that is unfolding online. Too many families have already suffered unimaginable loss. We cannot allow Big Tech and social media companies to just keep turning that blind eye without any accountability and say, it's not my responsibility, it's not my problem. It's all of our responsibility. It's all of our problem. Thank you, Congressman Evans, for your leadership on this issue and for your caring. I urge my colleagues to support our bill, and I yield back.

Rep. Gus Bilirakis (R-FL):

The gentle lady yields back. Further discussion on the bill on this side? Anybody further to speak on the bill on the Democrat side? Seeing none, any amendments being offered today on this particular bill? Seeing none, the question now occurs on forwarding HR 6259 to the full committee. All those in favor say aye. Aye. All those opposed say no. The ayes have it and the bill is agreed to and forwarded to the full committee. Okay, the Chair calls up HR 6289 and asks the clerk to report

The Clerk:

HR 6289, a bill to amend the Children's Online Privacy Protection Act.

Rep. Gus Bilirakis (R-FL):

Without objection, the first reading of the bill is dispensed with and the bill will be open for amendment at any point. So ordered. Does anyone seek to be recognized on this particular bill? Yes. Representative Lee, you're recognized for five minutes to speak on the bill.

Rep. Laurel Lee (R-FL):

Thank you, Mr. Chairman. As technology rapidly evolves, so do the risks that kids and teens face online. While the benefits of internet access are undeniable, it also exposes young people to serious harm, including cyberbullying, sexual exploitation, scams, and even access to illegal drugs. The growth of artificial intelligence has also created new challenges like deepfakes and AI-generated child sex abuse material. Today, almost half of teens in America report being online almost constantly, which is double the share from just 10 years ago. With 95% of teens on social media, the impact is clear: heavy social media users are more likely to experience cyberbullying, mental health challenges, and poor sleep. The digital world has become central to childhood; it is where kids learn, play, communicate, and form habits. It is also where predators operate, illicit markets reach children directly, and harmful content is only one click away.

Parents across the country are telling us the same thing. They want to protect their children, but the threats are complex, fast moving, and often invisible until harm is already done. This is why I was proud to introduce HR 6289, the Promoting a Safe Internet for Minors Act, with my friend and colleague from the great state of Florida, Mr. Soto. I want to also thank the Chairman, Chairman Bilirakis, and Chairman Guthrie for including it in today's markup. This legislation responds to that reality by strengthening education, transparency, and access to reliable safety tools so families are not left guessing and children are not left exposed. At its core, this bill directs the Federal Trade Commission, working with law enforcement, medical professionals, nonprofits, states, and industry, to carry out a national public awareness and education campaign that is focused on online safety for minors. That campaign includes best practices for identifying and protecting children for parents, educators, platforms, and minors; conducting nationwide outreach and education; ensuring up-to-date information sharing; and expanding access to publicly available online safety resources.

A central focus of this bill is parental empowerment. It emphasizes the effective use of parental controls, safety safeguards, and tools that allow families to make informed decisions about their children's online activity. Parents should not have to decode complex systems or rely on guesswork to protect their kids. This bill helps ensure that they have practical, usable resources that reflect modern threats, not yesterday's internet. The bill also squarely addresses serious dangers facing minors online, including cybercrime, access to narcotics and illegal substances, gambling, alcohol and adult content, and compulsive online behavior that damages physical and mental health. Finally, the bill includes an important accountability component. The FTC must submit annual reports to Congress for 10 years describing the activities carried out under this program, which ensures continuous oversight, data-driven adjustments, and sustained congressional engagement. As threats evolve, Florida families and American families are counting on our commitment on issues that involve child safety, parental authority, and public trust. That is what this bill reflects. HR 6289 is a measured, practical step to strengthen online safety for minors by empowering parents, informing communities, and closing the gaps predators exploit. Thank you, and I yield back the balance of my time.

Rep. Gus Bilirakis (R-FL):

The gentle lady yields back. I now recognize Representative Soto from the great state of Florida for his five minutes to speak on the bill.

Rep. Darren Soto (D-FL):

Thank you, Mr. Chairman. From the great state of Florida, along with my colleague Representative Lee, thank you for working with us. Thank you, Chairman, for working with us on this. Parents have had it up to here on what's happening to kids online. We do need to empower parents, and you're going to hear this throughout the day today, and protect our kids online. Information on threats is absolutely critical so that we can shield kids from adult content, online predators, scams, and addictive overuse of social media and internet use in general. So the Promoting a Safe Internet for Minors Act is an important step towards that, making sure that parents, educators, platforms, and minors in general have the best practices and warnings on scams, on predators, on any negative trends impacting young people online, and it makes sure the FTC is doing its job to save lives and to protect folks from all these different dangers.

It is important, though, that we recognize that the FTC right now is fighting with one hand behind its back, because we have only three of five members that are still on the FTC. Unfortunately, President Trump unlawfully terminated the two Democratic members. This undermines all the efforts that we have today, because it's hard to have laws mean a lot if there are no cops on the beat. So I do encourage our committee to continue to work to make sure we have a fully funded and fully present FTC. We also need to make sure we're not having 10-year moratoriums. I'm pleased to see that that effort has struggled. It wasn't in the NDAA; it passed out of the House but then didn't make it into the final big ugly law because of the Senate. So we know this is not the way. The way is making sure we work on these bipartisan bills today and the bills that still have work to do.

We need to continue to negotiate. In my home state of Florida, they've done some great work. Who would've thought, right? Sometimes I don't always agree with what the Florida legislature does, but in this case, HB 3 is a law banning children under 14 from opening accounts on social media and requiring parental approval for young people 14 and 15 years old to create their own accounts. This is something that we want to make sure we're protecting, some aspect of states to be able to have some work in this area, even as we work on preemption laws. We see the stories if that doesn't happen locally. In Orlando we had a young man, Sewell Setzer III, a ninth grader at Orlando Christian Prep, who died by suicide last year at 14 after being prompted by a chatbot to take his life. These are the types of stories that break our hearts, and it's the reason why we want to make sure parents have all the best information. That's what this bill does, and it's an important step forward to get parents, educators, and other folks the info they need to protect our kids online, and with that I yield back.

Rep. Gus Bilirakis (R-FL):

The gentleman yields back. I want to thank him also for clarifying that the moratorium was not in the NDAA, nor is it here today as well. So anyone else wish to speak on the bill? Seeing none, any amendments offered on this particular bill? Seeing none, the question now occurs on forwarding HR 6289 to the full committee. All those in favor say aye. Aye. All those opposed say no. The ayes have it and the bill is agreed to and forwarded to the full committee. Alright, moving along here. The Chair calls up HR 6437 and asks the clerk to report

The Clerk:

HR 6437, a bill to direct the Secretary of Commerce to establish the Kids Internet Safety…

Rep. Gus Bilirakis (R-FL):

Without objection, the first reading of the bill is dispensed with and the bill will be open for amendment at any point. So ordered. Does anyone seek to be recognized on the bill? Representative Fry from the great state of South Carolina, you are recognized for five minutes to speak on the bill.

Rep. Russell Fry (R-SC):

Thank you, Mr. Chairman. When I go home to South Carolina, parents don't ask me about subsections and legal citations. They ask me very basic, very human questions. How can I help my kids stay safe? What can I do when technology changes every single year, every moment? How can I keep up, and what should I do? They're not anti-technology. They use these tools every day. They just want to know that as the internet evolves, the people who built it, the people who study it, and the people who live with its consequences are working together to keep kids safe. That's why I introduced the Kids Internet Safety Partnership Act, or KISPA. It's a bipartisan bill that I'm proud to lead with Representative Landsman, and it's designed to do one thing well: bring the right people into the same room to develop practical, age-appropriate best practices for kids online. It turns anecdotes into evidence.

It asks the partnership to map the real risks kids face online, the real benefits they get from staying connected, and the real impact of different safety tools, not in the abstract, but by age group and by design feature. And then it has to publish that work in the open, on a regular schedule, so that parents can see it. That means that when this committee writes or refines other bills in the future, we're doing so with the best available evidence. We have a shared playbook, and it's been shaped by families, experts, state Attorneys General, and by the companies who have to implement it. I urge everyone to vote yes on this very important bill. With that, Mr. Chairman, I yield back.

Rep. Gus Bilirakis (R-FL):

Thank you. The gentleman yields back. Any further discussion on this particular bill? Any amendments being offered on this particular bill? Seeing none, the question now occurs on forwarding HR 6437 to the full committee. All those in favor say aye. Aye. Those opposed say no. The ayes have it and the bill is agreed to and forwarded to the full committee. The Chair now calls up HR 5360 and asks the clerk to report

The Clerk:

HR 5360, a bill to direct the Federal Trade Commission to develop and make available to the public educational resources for parents, educators, and minors with respect to the safe and responsible use of AI chatbots by minors, and for other purposes. Be it enacted by the Senate and House of Representatives of the United States of America in Congress assembled. Section 1. Short title. This Act may be cited as the "AI Warnings and Resources for Education Act" or the "AWARE Act." Section 2. AI chatbots and minors. (a) Educational resources. Not later than 180 days after the date of the enactment of this Act, the Commission, in consultation with relevant federal agencies, shall develop and make available

Rep. Gus Bilirakis (R-FL):

Without objection, the first reading of the bill is dispensed with and the bill will be open for amendment at any point. Does anyone wish to speak on the bill? The Chair recognizes himself, and we have an amendment at the desk. Without objection, the first reading of the amendment is dispensed with, and we're open for amendment. So I have an amendment. Report the ANS, the amendment, please.

The Clerk:

Amendment in the nature of a substitute to HR 5360 offered by Mr. Bilirakis of Florida. Strike all after the enacting clause and insert the following section one…

Rep. Gus Bilirakis (R-FL):

Without objection. Yes. Okay. The Chair recognizes himself. I have an amendment at the desk. Yeah. Okay. You gave me the wrong one. Okay. Alright, is there any further discussion on the bill? Okay, no further discussion on the bill. We're going to vote on the ANS. Let's vote on the ANS. All those in favor of the ANS say aye. Aye. All those opposed? Okay, so the ayes have it. The question now occurs on forwarding HR 5360 as amended to the full committee. Those in favor say aye. Aye. Those opposed? No. The ayes have it and the bill is agreed to and forwarded to the full committee. The Chair calls up HR 6499 and asks the clerk to report

The Clerk:

HR 6499, a bill to require the Federal Trade Commission…

Rep. Gus Bilirakis (R-FL):

Without objection, the first reading of the bill is dispensed with and the bill will be open for amendment at any point. So ordered. Does anyone seek to be recognized on the bill?

Rep. Russ Fulcher (R-ID):

Mr. Chairman?

Rep. Gus Bilirakis (R-FL):

The Vice Chairman. Mr. Fulcher, you're recognized.

Rep. Russ Fulcher (R-ID):

Thank you Mr. Chairman. I move to strike the last word.

Rep. Gus Bilirakis (R-FL):

You're recognized.

Rep. Russ Fulcher (R-ID):

My bill, the Assessing and Tools for Parents and Minors Act, directs the FTC to conduct a study on the effectiveness of voluntary industry efforts to protect minors online in areas such as parental controls, age-appropriate content labels, and education efforts. The FTC will also have to produce recommendations for Congress and industry on how to improve safety for minors and their parents. While the industry has pursued many voluntary efforts to protect kids online, at best we have mixed reviews on specific efforts. The goal of this bill is to provide a comprehensive assessment of their work with industry when it comes to online safety standards. This is a bipartisan agreement. I thank Representative Landsman of Ohio for being co-lead with me on this bill. I urge my colleagues to support the bill. I yield back the balance of my time. Thank you, Mr. Chair.

Rep. Gus Bilirakis (R-FL):

The gentleman yields back. Any further discussion on the bill? Are there any amendments being offered to this particular bill? Seeing none, the question now occurs on forwarding HR 6499 to the full committee. All those in favor say aye. Aye. Those opposed say no. The ayes have it and the bill is agreed to and forwarded to the full committee. The Chair calls up HR 2657 and asks the clerk to report

The Clerk:

HR 2657, a bill to require large social media platform providers to create…

Rep. Gus Bilirakis (R-FL):

Without objection, the first reading of the bill is dispensed with and the bill will be open for amendment at any point. So ordered. Does anyone wish to speak on the bill? Dr. Schrier, you're recognized to speak on the bill.

Rep. Kim Schrier (D-WA):

Thank you, Mr. Chairman. I'm proud to speak in support of Sammy's Law, a bill that I've worked on closely with Representatives Wasserman Schultz, Carter, and Miller-Meeks. And I am thrilled today that we will be voting to move this bill forward and give parents more tools to make sure their kids are staying safe online. Sammy's Law is named for 16-year-old Sammy Chapman, who inadvertently, unknowingly bought a fentanyl-laced pill over Snapchat, and that pill killed him. And just a moment ago I saw on the video a whole bunch of parents out here mourning their children, because Sammy is not the only one. Every day, kids in the United States are interacting with dangerous content on social media, and they don't understand the dangers that they're presented with. It can feel impossible for parents to protect their kids online, and that's why Sammy's Law is so important.

My bill would allow third-party watchdog apps to alert parents when their child is experiencing, or at clear risk of experiencing, harm. Examples include suicide, depression, eating disorders, violence, sexual abuse, trafficking, child pornography, sextortion, you know the list. These watchdog apps can only provide enough of the child's data to help the parent understand the immediate risk that their child is facing. This way, parents are alerted when kids are truly in trouble but aren't monitoring their children's every move, because teens do need privacy, but parents also need to know when their kids are getting in trouble and are at risk. These watchdog apps have already been proven to be lifesaving, but currently their scope is limited. Sammy's Law would ensure that parents can use these apps on all social media platforms. Through this bill, we are giving parents a valuable tool that will help protect their kids on social media, intervene immediately, and know when their kids are in danger. I urge my colleagues to vote in favor of this bill, and I yield back.

Rep. Gus Bilirakis (R-FL):

The gentle lady yields back, and Mr. Kean from the great state of New Jersey is recognized for his five minutes to speak on the bill.

Rep. Thomas Kean (R-NJ):

Thank you, Mr. Chairman. I rise in support of Sammy's Law. First, I want to recognize and applaud my colleagues, Representatives Buddy Carter and Debbie Wasserman Schultz, for their tireless leadership on this critical bipartisan legislation to save children's lives. This bill addresses a silent epidemic in homes across America: social media platforms that have become hunting grounds for bad actors. This legislation is named for Sammy Chapman, a 16-year-old who was approached by a drug dealer on Snapchat and tragically died of fentanyl poisoning. Sammy's brave father was here in the committee room last week, along with so many angel parents and advocates, to attend our hearing and to make their voices heard. Tragically, Sammy is not the exception. One in four young people see illicit drugs advertised for sale on social media. Sammy's Law safeguards children by giving parents the option to use third-party safety software to protect their children from harmful situations online.

Currently, many large social media platforms keep their ecosystems closed. This prevents parents from using the very tools that alert them to dangers like drug distribution, human trafficking, or suicidal ideation. This bill ends that practice. It empowers parents by requiring large platforms to allow these third-party safety tools to function, provided a parent or guardian explicitly authorizes them. We are giving parents the tools that they have explicitly asked for to help put them back in control of their kids' online experience. This is a shield, not a weapon. We are focusing on empowering parents against universal harms like fentanyl sales, predatory sexual exploitation, and suicide. Parents know their kids best, and they deserve the option to use these tools if they believe it is necessary for their family's safety. I urge my colleagues to support this important bipartisan bill to put parents back in the driver's seat. I yield back. Thank you.

Rep. Gus Bilirakis (R-FL):

Appreciate it, thank you. The gentleman yields back. Anyone else? Any discussion on this particular bill? I want to thank Representative Carter as well. Yes, you're recognized.

Rep. Debbie Dingell (D-MI):

Thank you Mr. Chair. I move to strike the last word.

Rep. Gus Bilirakis (R-FL):

You're recognized, Ms. Dingell.

Rep. Debbie Dingell (D-MI):

I want to start off by saying that I believe we should be empowering parents to keep their children safe. Online tools that give parents visibility when their children encounter harmful content are lifesaving, the intent behind Sammy's Law is absolutely in the right place, and I know every member of this committee wants to prevent another tragedy. However, I do have concerns that I hope we can address before this bill gets to a full committee markup. As many of us know, not every child comes from a safe or supportive home. Congress must write laws that reflect the real complexity, the realness, the everydayness of American families, including minors living in unsafe or abusive households, LGBTQ teens who may not be supported at home, and young people in states where accessing reproductive or gender-related information may carry legal consequences. We cannot unintentionally put children in harm's way. Also, the bill's mandate for indefinite retention of harm reports would effectively create a permanent digital record of our children's mistakes online, which would compound the real risks they already face.

We need to address privacy legislation someday, sometime. I hope we get there sometime, but we also cannot craft policies that shift responsibility from the platforms to the parents. Parental tools are important, but they cannot be a substitute for requiring platforms to fix the dangerous design features that expose our children to predators, drugs, and harmful content. And we know the stakes: investigations by media outlets over the years have shown platforms' design choices have facilitated everything from fentanyl trafficking to online exploitation. Our young people are navigating all of this while parents are just struggling to keep up. I want to support this legislation, but I can't as it's currently written. I hope, Mr. Chair, that we can work together with you and my colleagues to improve this bill. The underlying goal of this bill is critical, but details do matter, and we've got a responsibility to ensure that our laws do not endanger the very children we are trying to protect. Thank you, Mr. Chair, and I yield back.

Rep. Russ Fulcher (R-ID):

Thank you. Is there any further discussion on the bill? Hearing none, the question now occurs on forwarding HR 2657 to the full committee. All those in favor say aye. Aye. Those opposed? No. In the opinion of the Chair, the ayes have it and the bill is agreed to and forwarded to the full committee. Where are we here? The Chair calls up HR 6265 and asks the clerk to report.

The Clerk:

HR 6265, a bill to require online video game providers to provide certain safeguards for minors and for other purposes. Be it enacted by the Senate…

Rep. Russ Fulcher (R-ID):

Without objection, the first reading of the bill is dispensed with and the bill will be open for amendment at any point. So ordered. Does anyone seek to be recognized on the bill? Mr. Soto, do you have the ANS, Mr. Soto?

Rep. Darren Soto (D-FL):

Yes.

Rep. Russ Fulcher (R-ID):

You're recognized for five minutes to speak on the bill. Oh, sure. Clerk, please report the ANS.

The Clerk:

Amendment in the nature of a substitute to HR 6265 offered by Mr. Soto of Florida. Strike all after the enacting clause and insert the following section…

Rep. Russ Fulcher (R-ID):

Mr. Soto, you're recognized for five minutes.

Rep. Darren Soto (D-FL):

Thank you so much, Chairman. The Safe Gaming Act creates parental controls for chat features on online interactive video games. These controls would allow parents to disable or limit access to chat features for minors. The original bill includes a very broad preemption section that would preempt any state law on video game chatting and kids online. The ANS would lower the applicable age in the bill to under 16; it was previously 18, which could be too restrictive for families with teens of various ages. It also significantly tailors the preemption of state laws to the issue at hand: online video game chat features used by minors. Previously, the preemption was broad and far-reaching. This ANS would clarify that kids under 16 will be opted into the most protective levels of communications safeguards by default, and the default restrictive safeguards can only be changed by a parent. This also gets in line with a lot of states, including the Sunshine State, which has capped that at 16 years of age.

It's the same as our driver's license in Florida. And I'd like to thank Mr. Kean for working with us on the language on this bill, and I look forward to continuing negotiations across the aisle on the preemption issue. Defaulting kids into the most restrictive communications settings ensures their safety while playing games online. These settings can only be changed by a parent, giving parents the tools they need to make informed decisions about their child's safety while gaming. We've seen some terrible examples in Florida of kids being harmed online by adult predators while chatting through video games. This bill is intended to address the concerns frequently brought up by parents about online chat systems and prevent these terrible stories, like one that recently happened in South Florida where an 11-year-old girl was sexually assaulted and her predator got to her through the game Roblox. We're also hearing more horrible stories about this happening, so we really do need to make sure that we get this right, and with that, I yield back.

Rep. Russ Fulcher (R-ID):

Thank you. The gentleman yields. Is there further discussion on the ANS? Mr. Kean from New Jersey is recognized.

Rep. Thomas Kean (R-NJ):

Mr. Chairman, thank you. This amendment is a powerful bipartisan product to address the dangerous harms online gaming chats present to minors. This legislation provides robust protections to keep kids safe on online video game platforms and prevent harmful interactions with strangers and predators. It ensures that every user under the age of 16 cannot participate in chats unless they receive approval from their parents. The knowledge standard in the bill provides clarity for the FTC and State Attorneys General to effectively enforce this bill and to protect American families. I'm grateful to work on this legislation with my friend Mr. Soto from Florida, and I appreciate his willingness to keep refining it as we work towards the full committee markup. He has my commitment to continue working together as well. For many New Jersey families, video games are a daily activity, and as I pointed out at our legislative hearings last week, social gaming poses a very real threat to minors.

The data is alarming. Nearly 70% of teens report playing online games at least weekly. We know that predators often use the shared interest of a video game to build trust with a child. Statistics show that approximately two-thirds of minors say an online-only contact has asked them to move from a public chat into a private conversation on another platform. This vulnerability can lead to child exploitation, and it must be stopped. My bill takes an important step in addressing these harms. The Safer Gaming Act states that if a user is under the age of 16, they should not be participating in chats, particularly with adults, unless a parent explicitly allows it. I've heard too many stories of New Jersey families and families across America who have suffered due to a lack of robust protections for children and for teens. The Safer Gaming Act gives parents the choice and the tools to limit their child's ability to communicate with strangers, disrupting the cycle and protecting users from malicious encounters with strangers. I urge my colleagues to support this amendment and the underlying bill. I yield back. Thank you, Mr. Chairman.

Rep. Russ Fulcher (R-ID):

Is there further discussion on the amendment? Ms. Schrier is recognized.

Rep. Kim Schrier (D-WA):

Thank you, Mr. Chairman. I just want to point out that I'm supportive of this bill and want to make sure that kids are not interacting with strangers or being recruited by strangers. But I also need to mention some of the other harms in these gaming platforms, which include indoctrinating young men, tweens, and teens with white supremacist, antisemitic, racist, and anti-woman sentiments. And I think those things need to be addressed as well as our society becomes more divisive and more hateful. And I would like our committee to take up that issue as well. Thank you. I yield back.

Rep. Russ Fulcher (R-ID):

Is there any further discussion on the amendment? If there's no further discussion, the vote occurs on the amendment. All those in favor signify by saying aye. Aye. Those opposed say nay. In the opinion of the Chair, the ayes have it. The amendment is agreed to. The question now occurs on forwarding HR 6265 as amended to the full committee. All those in favor say aye. Those opposed? No. In the opinion of the Chair, the ayes have it and the bill is agreed to and forwarded to the full committee. The Chair now calls up HR 6273 and asks the clerk to report.

The Clerk:

HR 6273. A bill to prohibit market or product focus research on children and minors…

Rep. Russ Fulcher (R-ID):

Without objection, the first reading of the bill is dispensed with and the bill will be open for amendment at any point. So ordered. Does anyone seek to be recognized on the bill? Ms. Schrier, you're recognized.

Rep. Kim Schrier (D-WA):

Thank you, Mr. Chairman. I have an amendment, the ANS.

Rep. Russ Fulcher (R-ID):

You're recognized. Thank you.

Rep. Russ Fulcher (R-ID):

I'm sorry, I'm sorry. I think we need to call up the amendment. We're not smooth in the Chair today.

Rep. Kim Schrier (D-WA):

Yeah, it's the HR 6273 ANS.

Rep. Gus Bilirakis (R-FL):

Clerk report.

The Clerk:

Amendment in the nature of a substitute to HR 6273 offered by Ms. Schrier of Washington. Strike all after the enacting clause…

Rep. Russ Fulcher (R-ID):

Without objection, the reading of the amendment is dispensed with, and the gentle lady is recognized for five minutes in support of the amendment.

Rep. Kim Schrier (D-WA):

Thank you, Mr. Chairman. I am so glad to be working with my colleague, Representative Miller-Meeks, on this bill to protect kids, again, from Big Tech. Today's kids have already been subject to a huge experiment by tech companies, which have made these social media platforms more and more addictive despite knowing from their own research that this would harm children. Now we are seeing the impacts of social media use on kids in real time, with woefully few protections in place. That is why we are having today's markup. Now, this bill ensures that at least some of the experimenting will end. Our bill will prevent social media companies from basically conducting market research on kids, doing the A/B testing to gather even more information about what appeals to kids. And that information will likely be used to figure out what will keep their eyeballs glued to the screen for even more hours of the day.

Rep. Kim Schrier (D-WA):

These companies have already collected so much information on our kids, on their interests and their preferences and their habits, and we have yet to see any of that information being used to protect kids. But it sure benefits Big Tech. This bill will limit Big Tech's grand experiment that is harming kids by figuring out how to keep them hooked longer. There's still room to improve the text in this bill on preemption to ensure that we're not inadvertently curbing the strength of other laws that can protect kids from Big Tech. We'll continue to work on this issue before a full committee markup. I want to encourage my colleagues to vote for this common sense bill that will limit Big Tech's ability to further trap our kids in addictive cycles online. Thank you. And I yield back.

Rep. Russ Fulcher (R-ID):

The gentle lady yields. Is there further discussion on the amendment? The Chair recognizes the Chair of the subcommittee, Mr. Bilirakis.

Rep. Gus Bilirakis (R-FL):

Thank you, Mr. Chairman. I appreciate it very much. I rise in support of HR 6273, the Stop Profiling Youth and Kids Act, or the SPY Kids Act. I want to thank Representative Miller-Meeks for her leadership on this particular bill. While this language was originally a key piece of KOSA, it deserves standalone attention because of the urgent threat these specific practices pose to our children and teens. This bill addresses a specific, pervasive problem. Major platforms are treating our children and teens not as users to be served, but as test subjects to be studied. Terrible. Right now, platforms conduct market and product-focused research on children and teens to fine-tune their ability to keep young users online longer. Unfortunately, they use specific design features like infinite scrolling, autoplay, and appearance-altering filters to hook children, and then they study the data to make those hooks even harder to escape.

Rep. Gus Bilirakis (R-FL):

This research allows companies to profile youth and engineer their platforms to maximize profit at the expense of a child's mental health and wellbeing. The SPY Kids Act puts a stop to this by prohibiting platforms from conducting market or product-focused research on minors, with the only exception being for research that is solely used to improve the privacy, security, transparency, or safety of children and teens on these platforms. We must ensure that the online ecosystem is built for children's safety, not for corporate commercial advantage. And again, this was in the original KOSA, but this is a great bill and it stands alone. We've got to get this across the finish line, folks. It's essential. I urge my colleagues to support this measure to protect American families and stop the profiling of our youth. I yield back, Mr. Chairman. Thank you.

Rep. Russ Fulcher (R-ID):

Thank you, Mr. Chairman. Is there any further discussion on the amendment? Hearing none, if there's no further discussion, the vote occurs on the amendment. All those in favor shall signify by saying aye. Aye. All those opposed, nay. In the opinion of the Chair, the ayes have it and the amendment is agreed to. The question now occurs on forwarding HR 6273, as amended, to the full committee. All those in favor say aye. Aye. All those opposed, no. The ayes have it and the bill is agreed to and forwarded to the full committee. The Chair calls up HR 6253 and asks the clerk to report.

The Clerk:

HR 6253 Bill to require online platforms to disclose policies and provide options…

Rep. Russ Fulcher (R-ID):

Without objection, the first reading of the bill is dispensed with and the bill will be open for amendment at any point. So ordered. Does anyone seek to be recognized on the bill? Ms. Cammack of Florida, you're recognized.

Rep. Kat Cammack (R-FL):

Thank you, Mr. Chairman. I'm pleased to offer our bill, HR 6253, the Algorithmic Choice and Transparency Act. For years, Americans have used digital platforms believing they were getting neutral search results, fair recommendations, and unbiased access to information. But of course, what we have learned over and over, time and time again, is that these platforms quietly deploy algorithms that shape what we see, when we see it, and what we never see at all. And here's the problem. These algorithms are invisible, unaccountable, and entirely outside of the control of the very people that they impact and affect the most: the consumer. HR 6253 is built on two simple principles, choice and transparency. Two principles that conservatives, and quite frankly all Americans, have championed for decades. Two principles that shouldn't just disappear because the marketplace is now digital. First, this bill restores algorithmic choice. Under HR 6253, major platforms must give users the ability to opt out of the algorithm that curates their feed, their search results, or their recommendations.

If someone wants a simple chronological feed, they should have it. If they don't want an opaque system deciding what's important or what's appropriate, they should be free to turn it off. Look no further than TikTok's behavioral algorithm that is designed to increase the intensity and outrageousness of content. Second, it is all about transparency. Platforms must clearly disclose whether their algorithms factor in things like political orientation, ideology, or viewpoint. They must tell Americans when their data is being used to shape the content that they're being shown. And transparency, of course, is the bedrock of trust, and right now trust in Big Tech is at an all-time low. Let me highlight something especially critical. This bill matters profoundly for America's children and the parents that are trying to protect them online. Right now, parents have no meaningful way of knowing how platforms are curating what their kids are seeing online.

Parents know that these algorithms are feeding a child harmful content, amplifying body image issues, promoting risky behaviors, or pushing them into dangerous social spirals. Parents have told us this time and time again, and they feel outmatched by systems that are designed to keep kids engaged and their eyeballs on the screen even when that engagement hurts them. This bill, HR 6253, gives parents control once again. By requiring transparency, we make clear how these algorithms work and what they prioritize. And by requiring choice, we give parents the ability to disable those systems completely and default to a safer chronological experience. This bill also empowers parents to make informed decisions, and it helps families understand what their kids are actually exposed to. If Big Tech is going to influence what children see, then parents have every right to know how, why, and when that influence occurs.

And let me be clear, this bill does not regulate speech. It does not tell platforms what they can say or show. It simply shines a light on the practices that have been hidden in the dark, and it gives families the freedom to opt out of that system. Ultimately, this legislation is about putting Americans, not algorithms, back in the driver's seat. It's about ensuring fairness, transparency, parental empowerment, and the freedom to choose how we experience social media. I encourage, of course, my colleagues to support this bill. But I would be remiss, speaking of freedom and choice and empowering parents and protecting kids, if I didn't mention that there is a critical piece of legislation missing from today's markup. Today, we should be advancing the App Store Freedom Act, which would actually empower parents and protect kids in practice rather than just in words. I am furious that this bill has not been included in today's markup, and it's because of the pressure of Big Tech.

I will not bend the knee to Big Tech, and no one on this committee should either. It is absurd that we have Big Tech, the day before a markup, on the doorstep of this committee putting pressure on us, saying this should not be considered. And guess what? It's about profits. Some of the legislation that we are considering today is about giving more tools to the very same actors that have a horrendous track record in protecting kids and their own employees. Apple's own employees have said so, that they are not in the business of protecting kids. They're in the business of making money. The track record is, as I said, horrendous. In Florida alone, a 13-year-old girl was a victim of a nudify app from Apple's own store that they recommended for four-year-olds, and yet they want to sit here and say that they're protecting kids. That's BS.

And on top of that, they still continue to take profits from all of these apps that they claim are acceptable for kids. This monopoly has no accountability, and there have been no real steps to actually protect kids by either Apple or Google. The App Store Freedom Act is a bipartisan bill with strong support on this committee, and it should be in this markup today. It's a shame that it's not. We will continue to push for this legislation because it is absolutely the only way that we can ensure that kids will be safe online, and it will actually empower parents by allowing them to establish marketplaces instead of being forced to use Apple's and Google's products. We cannot put kids' safety in the backseat just because we want to bend the knee to Big Tech. And with that, Mr. Chairman, I yield.

Rep. Russ Fulcher (R-ID):

The gentlelady yields. Is there further discussion on the bill? Hearing none, the question now occurs on forwarding HR 6253 to the full committee. All those in favor say aye. Those opposed? No. The ayes have it and the bill is agreed to and forwarded to the full committee. The Chair calls up HR 6489 and asks the clerk to report.

The Clerk:

HR 6489, a bill to ensure that providers of chatbots clearly and conspicuously disclose…

Rep. Russ Fulcher (R-ID):

Without objection, the first reading of the bill is dispensed with and the bill will be open for amendment at any point. So ordered. Does anyone seek to be recognized on the bill?

Rep. Erin Houchin (R-IN):

Mr. Chairman.

Rep. Russ Fulcher (R-ID):

Ms. Houchin, you're recognized.

Rep. Erin Houchin (R-IN):

Thank you, Mr. Chairman. I move to strike the last word. Today, kids aren't just scrolling feeds. They're forming real emotional attachments to AI chatbots that can mimic authority, appear trustworthy, and respond at all hours. Some of these systems even suggest they can give medical or therapeutic advice, and many minors don't realize they're talking to software, not a person, which makes it harmful or misleading when they provide guidance and try to be persuasive. In the most serious moments, when self-harm, addiction, or gambling has occurred, there's no guarantee they'll be routed or rerouted to real-world help. Parents are quite simply outmatched, and the status quo is not acceptable. The Safe Bots Act creates clear baseline guardrails. It prohibits AI from impersonating licensed professionals. No chatbot should act like a doctor or a therapist to a child. It requires age-appropriate disclosure, so minors know, "You are talking to AI, not a human, and I cannot provide licensed professional advice."

It mandates crisis hotline information when a minor raises self-harm. It requires reasonable "take a break" prompts after extended interaction, and it directs the NIH to study the long-term mental health effects of chatbots on kids. Over the past several days, I've had constructive conversations with my Democrat colleagues, and we've continued refining the text ahead of the full committee markup. The updates we are preparing directly incorporate bipartisan feedback and will strengthen the bill by addressing the concerns raised on both sides of the aisle. We'll be refining key definitions to ensure the policy targets and the harms are clearly defined, rather than broad or ambiguous categories. We'll be clarifying the scope of restricted content so the focus remains on harmful activities, not legitimate or informational discussion. We'll be improving user safety provisions by adopting a more reasonable timeframe for break prompts and providing clearer preemption language that mirrors existing bipartisan models, preserving state consumer protection and common law claims while offering regulatory certainty. The Safe Bots Act protects innovation while ensuring that children are not misled, manipulated, or put at risk by AI systems pretending to be something they're not. I urge my colleagues to support this legislation, and I yield back.

Rep. Gus Bilirakis (R-FL):

The gentlelady yields back. Is there any further discussion on this particular bill? Any amendments to be offered on this particular bill? Seeing none, the question now occurs on forwarding HR 6489 to the full committee. All in favor say aye. All opposed? No. The ayes have it and the bill is agreed to and forwarded to the full committee. Next, the Chair calls up HR 1623 and asks the clerk to report.

The Clerk:

HR 1623, a bill to require certain interactive computer services…

Rep. Gus Bilirakis (R-FL):

Without objection, the first reading of the bill is dispensed with and the bill will be open for amendment at any point. So ordered. Does anyone seek recognition to discuss the bill? Mr. Chairman? Yes, Mr. Kae, you're recognized. Mr. Goldman? I'm sorry, Mr. Goldman, you're recognized.

Rep. Craig Goldman (R-TX):

Thank you, Mr. Chair. I have an amendment in the nature of a substitute at the desk.

Rep. Gus Bilirakis (R-FL):

You're recognized. No, no, he's got an amendment. The clerk will report the amendment.

The Clerk:

Amendment in the nature of a substitute to HR 1623 offered by Mr. Goldman of Texas.

Rep. Gus Bilirakis (R-FL):

Without objection. The reading of the amendment is dispensed with and the gentleman is recognized for five minutes in support of his amendment.

Rep. Craig Goldman (R-TX):

Thank you, Mr. Chairman. Today, the average child is exposed to sexually explicit online images before they even reach high school. In many ways, the adult entertainment industry has been allowed to operate with impunity, profiting from the addiction of children while hiding behind the excuse that age verification for their websites is too difficult. But Texas has already proven it works. When I served in the Texas House of Representatives, we passed a bill that implemented age verification for websites with sexually explicit content. The industry sued our state immediately. They claimed that they had a First Amendment right to broadcast graphic sexual content to children without checking a single ID. In a win for common sense, the Supreme Court of the United States upheld Texas's law and helped protect children. My amendment today follows the same playbook as Texas. It updates the SCREEN Act to align it with the successful Texas statute and federalizes it across the country.

This amendment requires explicit websites to do what brick-and-mortar stores have done for decades: have an ID check so that minors are not allowed to access this type of explicit content online. The protections that the courts have already upheld for children in Texas should not stop at our border. Every child in America deserves the same consistent standard of safety as a child in Texas has. We must protect children from harmful online content, and we can accomplish this better by updating the SCREEN Act. I urge my colleagues to support this amendment and the underlying bill, and I yield back.

Rep. Gus Bilirakis (R-FL):

The gentleman yields back. Any further discussion on the amendment? Seeing none, we will vote on the amendment. If there is no further discussion, all in favor of the amendment shall signify by saying aye. All those opposed? The ayes have it and the amendment is agreed to. The question now occurs on forwarding HR 1623, as amended, to the full committee. All those in favor say aye. Aye. Those opposed? No. The ayes have it and the bill is agreed to and forwarded to the full committee. The Chair calls up HR 6257 and asks the clerk to report.

The Clerk:

HR 6257, a bill to protect minors from harms associated with ephemeral messaging features and unsolicited direct contact on social media platforms by prohibiting certain…

Rep. Gus Bilirakis (R-FL):

Without objection. The first reading of the bill is dispensed with and the bill will be open for amendment. Are there any amendments? No amendments. Okay. Yes. Dr. Dunn, you're recognized to speak on the bill.

Rep. Neal Dunn (R-FL):

Thank you Mr. Chairman. I move to strike the last word on my bill.

Rep. Gus Bilirakis (R-FL):

You're recognized sir.

Rep. Neal Dunn (R-FL):

Alright, so I urge my colleagues to support this legislation, HR 6257, the Safe Messaging for Kids Act, or SMK Act. As I mentioned during our legislative hearing on these important bills this last week, I've worn a lot of hats in my life. I served in the Army, I spent 25 years in the community as a surgeon, but the title that keeps me up at night and motivates me every day is grandfather. And today we're facing a crisis of our own making. We've placed devices in our children's hands that are more powerful than the computers that sent astronauts to the moon, yet we fail to provide the digital equivalent of seat belts or smoke detectors. Right now, our children are at risk. They're being targeted, groomed, and exploited on social media platforms built with features that allow predators to operate in the shadows. And that's why I introduced the Safe Messaging for Kids Act.

This bill is a direct intervention to stop two specific mechanisms that predators use to hunt our children: ephemeral messaging and unsolicited contact. The problem is disappearing evidence. Let's start with ephemeral messaging, the technical term for messages that vanish automatically after being viewed. A good example of this would be a drug dealer selling fentanyl to a teenager in a school hallway, and the moment the transaction is done, the security camera footage automatically erases itself. This is precisely what's happening online. Predators depend on these disappearing messages, and they erase the trail of grooming, cyberbullying, and illegal drug sales before anyone can detect it. My bill puts an end to this Wild West sort of situation. Under section three of the SMK Act, social media platforms are prohibited from offering auto-deleting messages to any user they know, or deliberately avoid knowing, is under the age of 17.

If you're a tech company and you know your user is a minor, you should not be handing them a tool designed to destroy evidence. Very simple. The solution is restoring parental authority. The second threat is unsolicited contact. Today, strangers can slip into a child's messages as easily as walking up to a kid on a playground. We would never permit that in the real world. We shouldn't allow it online. The SMK Act requires parental direct messaging controls that put parents back in charge. For children under 13, direct messaging must be turned off by default. For teens under 17, parents must have visibility into who's attempting to contact their child. Platforms will be required to alert parents to messages from unapproved contacts and give them the power to approve or deny those requests before communication begins. This is not helicopter parenting, it's responsible parenting, and we're giving families the tools that they need to protect their children.

Mr. Chairman, in medicine, we take an oath to do no harm. For too long, the social media companies have violated that principle. They've engineered features that put kids at risk simply so they can increase their engagement online. The Safe Messaging for Kids Act is a practical, common sense treatment plan that preserves evidence of wrongdoing, restores parental authority, and protects our children from strangers who are hiding in their pockets. I urge my colleagues to support this legislation, and I believe our children are depending on us to do it. With that, Mr. Chairman, I yield back.

Rep. Gus Bilirakis (R-FL):

The gentleman yields back, and I want to thank him. Is there any further discussion on this particular bill? Any amendments being offered on the bill? Seeing none, the question now occurs on forwarding HR 6257 to the full committee. All those in favor say aye. Aye. Those opposed? No. The ayes have it and the bill is agreed to and forwarded to the full committee. The Chair calls up HR 3149 and asks the clerk to report.

The Clerk:

HR 3149. A bill to safeguard children by providing parents with…

Rep. Gus Bilirakis (R-FL):

Without objection, the first reading of the bill is dispensed with and the bill will be open for any amendment. Does anyone seek recognition on the bill? Mr. James from the great state of Michigan, you're recognized for five minutes on this particular bill.

Rep. John James (R-MI):

Thank you, Mr. Chairman. As a father of three, protecting kids online is one of my top priorities and something parents across Michigan raise with me constantly. My bill is simple. App stores should follow the same common sense rules we expect from every small business. If a corner store can't knowingly sell adult or addictive products to minors, then neither should the world's largest digital storefronts. These multi-billion dollar companies already have the means to do this safely while respecting privacy, but they're instead choosing to put profits over protecting kids. This isn't about one company, it's about holding every platform to the same basic standard of responsibility. Parents have a right to know. Kids cannot consent. And no one is above the law. With that, Mr. Chairman, I yield.

Rep. Gus Bilirakis (R-FL):

The gentleman yields back. Any further discussion on the bill? Any amendments to the bill? Mr. Chairman, I have an amendment.

The Clerk:

Amendment in the nature of a substitute to HR 3149, offered by Mr. James of Michigan. Strike all after the…

Rep. Gus Bilirakis (R-FL):

Without objection. The reading of the amendment is dispensed with and the gentleman is recognized for five minutes in support of his amendment.

Rep. John James (R-MI):

Well, Mr. Chairman, I want to thank you, and also I'm grateful to my colleague and friend, Mr. Obernolte of California, for working with me on this bill. His depth of knowledge and his passion for this subject matter shine through in his work for the people of the state of California and for the United States of America. I'm looking forward to continuing my work with you on this bill. This amendment represents a narrow and practical update to the App Store Accountability Act, while at the same time maintaining the core spirit of the bill. My number one goal has always been to empower parents and protect children online. This is what parents want, and it's what we must do today. 88% of parents want app stores to require parental approval before minors can download a new app. As I said in my opening remarks, and I'll say again, I welcome engaging and working with each and every single person here, across the aisle, across the country, who is united in protecting our children, protecting parental rights, and holding Big Tech accountable. Mr. Chairman, I yield.

Rep. Gus Bilirakis (R-FL):

The gentleman yields back any further discussion on the amendment? Mr. Obernolte, you're recognized.

Rep. Jay Obernolte (R-CA):

Oh, thank you very much.

Rep. Gus Bilirakis (R-FL):

For five minutes on the amendment.

Rep. Jay Obernolte (R-CA):

Thank you, Mr. Chairman. I am delighted that we are considering a couple of pieces of legislation today that impose a requirement on app store providers and app developers to carry out age verification when users are minors. I think that's going to have meaningful implications for improving online child safety, and it's wonderful that we're having that discussion today. I'd like to thank my colleague from Michigan for working with me on improving this bill. As most of you know, I spent 30 years as an application developer, and so I want to make sure that the requirements that we impose on app stores and application developers are clear and implementable, and the amendment in the nature of a substitute that is being offered makes a lot of changes that improve the bill in that way. However, there are a couple of problems that remain in the bill, and I wanted to highlight one in particular.

Right now, the bill as written requires even an application that has no content inappropriate for minors to receive age category information from the app store. And this is bad for a couple of reasons. First of all, to impose a requirement that has no benefit is the definition of bad regulation. If an application is something like a weather app that has no content that's inappropriate for a child, there's no reason why the developer should be forced to get that information. But there's a second, more important reason this is a bad idea, and that is that we want to avoid creating repositories of personal information about minors. When you require an application to request this information, you create a lot of opportunities for malicious actors to get access to that information and to misuse it. And we want to make sure we minimize that. So I am going to support the bill today. I'm hopeful that as the bill moves forward here, we will continue to work on it and correct some of these problems. But I thank again my colleague from Michigan for working with me on the bill and for his good work on this issue. I yield back.

Rep. Gus Bilirakis (R-FL):

I thank the gentleman. The gentleman yields back. Further discussion on the amendment? All right, no more discussion. So if there is no further discussion, the vote occurs on the amendment first. All those in favor signify by saying aye. Aye. All those opposed say nay. No. All right. The ayes have it and the amendment is agreed to. The question now occurs on forwarding HR 3149, as amended, to the full committee. All those in favor say aye. Aye. Those opposed? No. The ayes have it and the bill is agreed to and forwarded to the full committee. The Chair now calls up HR 6333 and asks the clerk to report.

The Clerk:

HR 6333. A bill to ensure responsible age…

Rep. Gus Bilirakis (R-FL):

Without objection, the first reading of the bill is dispensed with and the bill will be open for amendment. At any point, does anyone seek recognition on the bill at this particular time?

Rep. Erin Houchin (R-IN):

Mr. Chairman?

Rep. Gus Bilirakis (R-FL):

Yes. Ms. Houchin, you're recognized.

Rep. Erin Houchin (R-IN):

Thank you, Mr. Chairman. I rise in support of HR 6333, the Parents Over Platforms Act, which I co-led alongside Representative Jake Auchincloss. I do want to begin, though, by thanking my colleague, Representative John James, for his incredible leadership on holding app stores accountable, as well as Mr. Obernolte for his input on this important issue. They're absolutely right that the app stores have become the wild, wild west, and Representative James has started a vital conversation about the responsibilities that these companies must bear. We share a unified goal: empowering parents and ensuring the digital ecosystem is safe for our children. The Parents Over Platforms Act seeks to contribute to this effort by establishing a framework of shared responsibility. Our bill requires app stores to institute age assurance practices and pass a secure age signal to apps. But it also ensures that developers are responsible for what happens inside the app once the download is complete.

We know from recent polling that 70% of parents believe protections shouldn't stop at the point of download. They want measures that continuously keep minors safe while they use an app. In keeping with the committee's comprehensive approach to kids' online safety, the Parents Over Platforms Act provides such protection by requiring app stores and developers to share responsibility. It closes the loophole that empowers kids to bypass safeguards and leaves parents without any avenues to protect their own children. I believe this is a common sense solution to online safety that can bring everyone to the table. I look forward to working closely with Representative James, Representative Auchincloss and the rest of this committee to deliver a bipartisan solution that holds app stores accountable. I urge my colleagues to support this measure and I yield back.

Rep. Gus Bilirakis (R-FL):

I thank the gentlelady. She yields back. Anybody else? Any discussion on this particular bill? No further discussion.

Rep. Jay Obernolte (R-CA):

Mr. Chairman?

Rep. Gus Bilirakis (R-FL):

Yes, you're recognized, sir.

Rep. Jay Obernolte (R-CA):

Thank you, Mr. Chairman. Again, this is another effort to solve this problem of age verification for app stores, and I want to thank my colleagues for their good work on this and Representative Auchincloss for working with me on this bill. This bill, while well-intentioned, has a few fundamental problems that I think would need to be solved before it became law. First of all, it includes a provision that forces an app that's intended for adults only to do its own age verification rather than relying on the age verification signal from the app store. And I think this is completely nonsensical. First of all, the app store in most cases is going to do a better job than the application in determining the age category of the user. So I think that there's a lot more opportunity for error if you require each application to do it separately.

But also it feeds into the discussion we were having a minute ago about creating a bunch of different repositories of personal information on minors when we don't need to do that. So if you think about the number of applications that you have on your cell phone, for example, I think the average user probably has a hundred. We don't want to have a hundred different repositories of personal information of minors. We just want to do that once. So I don't think that that's a good idea. It also forces apps to do their own parental verifications, which again is something that I think can be done once at the operating system level by the app store instead of forcing each application to do it differently. And this also would allow another protection, which I think is important. And this was in Congressman James's bill, which is if an application is intended for adults only and does not have verifiable parental consent, it shouldn't even be launched.

We shouldn't force the application to start and then do this age verification. The operating system should block it even from being launched, which you can do if you do parental verification at that level. There's also a requirement for data minimization imposed on app developers here. And without getting into the pros and cons of that, this is an item of active discussion in the data privacy working group. As everyone knows, we are trying to create a single unified federal standard for digital data privacy, and data minimization is a big part of that conversation. I think it's inappropriate to have that discussion here in this bill because it distracts us away from the central issue of protecting kids. And then the last thing that I think could be improved in this bill is that the bill as written right now disallows the use of age category information for any purpose other than the purpose of the act.

And I would say, if we're going to go through the effort of creating a mechanism to pass the age category to an app, and the app wants to use that information to improve the experience for a minor, for example, an application might have no content that's inappropriate for a minor, but it could use the age category to improve the interaction with the user knowing that they are a minor, I think that should be allowed. I think that's something that complements the purpose of the bill. So I'm going to support this bill today moving forward, but I'm hopeful that we can take the time between now and the full committee markup to address some of these issues. And obviously I'm hoping that we can have one unified vehicle here moving forward.

Rep. Erin Houchin (R-IN):

Will the gentleman yield?

Rep. Jay Obernolte (R-CA):

I certainly will, yes.

Rep. Erin Houchin (R-IN):

Thank you. I just want to say I largely agree with Mr. Obernolte on his suggested changes and would look forward to working together as we try to craft a unified solution here aimed at the same goal. He's got great thoughts on how best to do this, and I welcome the input.

Rep. Jay Obernolte (R-CA):

Great. With that, I yield back, Mr. Chair.

Rep. Gus Bilirakis (R-FL):

The gentleman yields back. The question now occurs on forwarding… Mr. Chairman? Yes, yes. Ms. Cammack is recognized on the bill.

Rep. Kat Cammack (R-FL):

Thank you, Mr. Chairman. I would be remiss, in the conversation that we're having today about kids' safety and the issues that have been highlighted with some of these bills, if I didn't again highlight that the App Store Freedom Act would address many of these issues, because our bill would allow parents to actually create a marketplace where they would have apps that they knew were safe for kids. It would allow parents to actually have that freedom and that choice. Whereas today, you don't have the opportunity to build your own marketplace. And instead, we're having discussions about whether or not we should empower Apple and Google to be responsible for age verification for these kids. We're basically asking the fox to guard the henhouse. To the same people who have abysmal track records of protecting kids online, we're now saying, you're going to be responsible for their data and for verifying ages, but you still have to keep using our app store and be using our marketplace.

And by the way, you don't have the freedom to create your own marketplace. And again, this is why the App Store Freedom Act is a kid safety bill and why it should be a part of the markup today. And I think as we're sitting here having the conversation about how do we address some of the shortfalls in these different pieces of legislation, which I believe are good bills and are exceptionally well intentioned, this is why the App Store Freedom Act is so essential to be a part of the conversation. And I would be remiss if I did not include as part of the testimony here today that one of Apple's own safety engineers responsible for fraud protection called Apple, their own employer, the greatest platform for distributing child porn. That same engineer also said that child predator grooming was an under-resourced challenge and an active threat within the app store. Every single bill that we are considering here today is moving things around the edges of the issue. We are playing on the fringe. We need to address the fact that the app store and the marketplace is broken. A monopoly is not going to fix it. We have to break the monopoly and force competition back into the space. That is how we protect kids. So I just thought that was important to mention, and I yield my time to my colleague, Representative Trahan.

Rep. Lori Trahan (D-MA):

I'd like to thank my colleague from Florida for her perseverance and her ability and courage in calling out the fact that the App Store Freedom Act is notably missing from this portfolio of legislation that we're debating today. Look, we're going to have a lot of debates around what is the best path forward. Is it data minimization? Is it parental controls? But really what is missing is we're not giving parents control or choice to choose an app store that is right for their children. So we can say, oh, should we be collecting data at the device level? Should we be doing age assurance or verification at the app store level or the developer level? How about we just enforce the antitrust laws that are on our books and we finally dispense with giving Google and Apple the sole ability to put a store on our children's mobile phones?

This is ridiculous that we're even debating all of this incremental change or progress when we're not really getting at the heart of the things that my colleague, both colleagues from Florida, have talked about so much. And how can we not just imagine a store that is designed for our children, where we don't have to be controlling the app experience or the device UI for our children every single day, but rather we get to go to sleep at night knowing that the app store that we chose, that is curated by experts for children, sits on our children's phone so that they're not exposed to pornography and all the harms that the engineers of Apple have gone on record saying exist for children. I mean, these safeguards are just inadequate. And so until we get to the heart of breaking through this monopoly, this duopoly, on our kids' phones, and debating at a minimum the App Store Freedom Act, then we're not going to have the peace of mind of knowing our children have apps on their phone that are safe for them. Thank you. I yield back.

Rep. Gus Bilirakis (R-FL):

Yield back.

Rep. Lori Trahan (D-MA):

Yield.

Rep. Gus Bilirakis (R-FL):

Oh yes, yes. I'm going to yield to the Chairman of the full Committee. Well, you might want your full five minutes.

Rep. Brett Guthrie (R-KY):

I'm fine, I'll take five minutes. I just want to say that I appreciate what my friend said, and we're going to look at FTC issues, we were looking at the app store marketplace, and we'll look at other things as we move forward on these issues. So the question today is how do we protect children? That was the framework of the hearing, the framework of the meeting. And we did look at putting the app store marketplace bill on this markup. It wasn't because I had a meeting yesterday; that had nothing to do with it. The agenda was already posted before I had a meeting yesterday, so that's just not accurate. But we offered to do that if, just as my friend from Massachusetts said, the app store marketplace would be open only to app stores that marketed to children 16 and younger.

So if the content of the app store was restricted to 16 and younger, we offered to look at moving that bill forward, and we weren't able to come to a conclusion on that. And so I'm just saying we tried, in good faith, to do that, to focus on protecting children in the app stores, and we didn't get there. If you open up the app store marketplace, then anybody can create an app store. And so that's something we need to discuss, and we need to have the opportunity to discuss that. But in the name of protecting children, as my friend from Massachusetts said, we said, okay, let's look at allowing app stores, or mandating that Apple and others have to open their app stores to app stores that specifically focus on 16 and younger, where parents can create the app stores and other people can create app stores, and we weren't able to get an agreement on that. So I yield back.

Rep. Gus Bilirakis (R-FL):

Gentleman yields back. Any further discussion on the bill? Any further discussion? Yes. Ms. Trahan, you're recognized. Thank you. You're recognized for your full five minutes.

Rep. Lori Trahan (D-MA):

And I won't take it all up, but I appreciate that.

Rep. Gus Bilirakis (R-FL):

Yeah.

Rep. Lori Trahan (D-MA):

I appreciate the Chairman's comments. I think from where we sit, you'll have to understand that just the proximity of Apple being on the Hill yesterday, as we're debating this battery of legislation and this bill, which I'm sure is one that's hard fought by Apple and Google, at the top of their list, that it is suspicious that we're not debating it like the others. I mean, look, all the bills that we're debating today, they need work before we go to full committee, but this one isn't even being allowed to be discussed or to be debated, and that just breeds a bit of suspicion. So I would love your commitment to bringing that bill forward, because I do think when we talk about children's safety, we really have to get serious and more comprehensive about it. And often the parental controls that exist are really tough for parents who are struggling to keep up with new technologies, their children's constant demands to download that app, and just the fact that they're working. They should be able to have a store that is curated for their children so that they don't have to keep going back.

That feels like common sense legislation. But the only thing that really is blocking that at the moment is the iron grip that Apple and Google have on these devices. And so I could imagine there also being other stores beyond children, whether those are gaming stores or other curated stores. And we should be having that debate because right now it feels like we're just protecting monopolies that have existed on our smartphones since they've been delivered to our doorstep rather than evolving and making it better and safer for not just children. But like I said, I could think of multiple examples where curated app stores or just more innovation in app stores could exist beyond what Google and Apple offer today. And let me just say competition makes us better. No one should be scared of this. This is something that I think we should be embracing. And with that I'll yield if you wanted to respond on the commitment question.

Rep. Brett Guthrie (R-KY):

And I don't do things suspiciously; I don't have ulterior motives. I feel like I've been pretty upfront with everybody, and hopefully you think I'm an honest person. And so I just don't accept the accusations that you just made. There are other bills on this markup I can tell you Apple was concerned about, but this was focused on children. We wanted to focus on children. We said we would look at the app store marketplace for this hearing, even after Apple weighed in, if it focused on children. So the question of innovation, should we have monopolies, should it be opened up? I know a lot of that's Judiciary. I know this bill's been drafted so it puts it into the FTC. I know the Senate is looking at it in Judiciary. We've looked at this and we will look at it, but in the context of a series of bills focused on protecting children, not looking at the overall marketplace. We need to look at the overall marketplace and ask, is it fair?

Who has control of their device and who doesn't? But that wasn't the context of what we're trying to do, and I carefully believe that. We've had discussions for months between our staffs to come up with these bills that we think, at a very protective level, protect children. But the idea that we didn't put a bill on the markup because somebody from Apple came to see me, there are several bills on this markup that Apple was not happy with. I don't run the committee that way. I hope you don't think I would run the committee that way. We want to get the policy right, and the concept of a bigger monopoly versus focus on … because the example people brought to me is that they have something like this in Europe. Now some people say it's not the exact same thing, but the very first things on the open marketplace were pornography sites that came on. And so let's say you open up Apple's marketplace where all apps can come on; then you've got to start controlling those apps, and those apps moving forward. Maybe we should do that, but we need to have time to figure that out and work it out. And it is a bigger context than just children. It's a context of the marketplace for cell phones altogether.

Rep. Lori Trahan (D-MA):

Well thank you. It would be great to just get a commitment from the Chair. I know this takes time, and I would rather there be a parental control for rolling out an app store than having to do it app by app. And so look, I didn't mean to offend you on the proximity of the meeting and this. I do think that you are an honest person, but I do think that we have to get the Big Tech lobby out of the room and have these conversations so that we make progress for our children. Thank you Mr. Chairman.

Rep. Gus Bilirakis (R-FL):

Gentle lady yields back. I'll recognize Mr. Fulcher.

Rep. Russ Fulcher (R-ID):

Thank you Chairman. I yield my time to the gentlelady from Florida.

Rep. Kat Cammack (R-FL):

Thank you.

Rep. Russ Fulcher (R-ID):

You're recognized.

Rep. Kat Cammack (R-FL):

Thank you Mr. Chairman, thank you to my colleague for yielding the time. And I want to say to the Chairman, I believe you and I fully support you, and I don't believe that people on this committee are acting in bad faith. I think the party that is acting in bad faith is Apple. I'm just going to be very candid and say that is who we should be directing our ire towards, because they're the ones who have the horrendous track record and they have been playing games, candidly. And we see that in the courts. We see that in how they've interacted with the DOJ. Let us not forget that there are 20 plus states that have active lawsuits on the books with them right now. I do think that there is a frustration, certainly amongst members on the committee, and we've expressed this, but I think the record should reflect it.

We want to be open and transparent. We can't ask for transparency in our legislation if we're not willing to do it in practice here on the committee. I think we want to have that commitment of not just a legislative hearing, but a markup on this bill. And I think part of that frustration is driven by the fact that this committee had a witness here talking about the importance of protecting kids online and the whole slate of bills, who sent a letter with a whole coalition of dozens of children's advocacy groups stating that this bill should have been a part of the package, and it wasn't. So it's frustrating that the committee's own witness said that the App Store Freedom Act should have been a part of this discussion and wasn't. I don't think that your mind has been changed in a way that you're succumbing to pressure at all.

I don't think that that is the case. Not at all. I just think that we need to be united in tackling this because as I said earlier, it feels like we're just tinkering around the fringes of the issue. And if we're really going to get to the heart of empowering parents and protecting kids, you have to allow parents the proper tools, which would be a new marketplace. They have to have the ability to establish a marketplace of their own because we can't ask people who have the horrendous track record of protecting data and kids and then say, now you're in charge of it. Even more so with that.

Rep. Brett Guthrie (R-KY):

But we did discuss, in this hearing on this bill, having an open marketplace, restricted to app stores that parents would produce for 16 and younger. We did have that discussion.

Rep. Kat Cammack (R-FL):

But also, some of the suggestions were to change the jurisdiction of the bill entirely as well. So there were a number of redline suggestions that were made that basically undercut the spirit of the bill, including exempting Apple and Google. And I am not interested in introducing legislation that exempts the two worst players in the space.

Rep. Brett Guthrie (R-KY):

I agree with you there. I wouldn't have put forward something that exempted those players.

Rep. Kat Cammack (R-FL):

But I just wanted the record to reflect that. I do believe you. I think that you're operating in good faith and…

Rep. Brett Guthrie (R-KY):

I want to continue, and we need to look at monopoly. A lot of that, Judiciary deals with a lot more than we have in this area. I'll get up to speed and make sure that we understand, and we'll continue to discuss. To commit to say that we're going to have a markup on this date, I can't do that. I just don't know. But I will commit to you: we will honestly pursue this and come to a conclusion. Hopefully we can come together on it.

Rep. Kat Cammack (R-FL):

I appreciate it. With that, I yield.

Rep. Gus Bilirakis (R-FL):

Reclaim my time.

Rep. Russ Fulcher (R-ID):

Yields to the chair.

Rep. Gus Bilirakis (R-FL):

Yields back. And I would like to concur. I was in on the meeting yesterday and I would like to concur with what the Chairman said. I'm not going to get into anything further. We've got to move on. But anyway, now we're going to vote on,

Rep. Erin Houchin (R-IN):

Oh, Mr. Chairman.

Rep. Gus Bilirakis (R-FL):

Yes.

Rep. Erin Houchin (R-IN):

May I have time?

Rep. Gus Bilirakis (R-FL):

You're recognized.

Rep. Erin Houchin (R-IN):

Thank you. I just want to make a note that I am certainly open to all ways to get at protecting kids and putting parents back in control. It's one of the reasons why I'm passionate about working on this issue. And some people, even in this room, I've shared a personal story with, and I certainly carry their stories with me as a member of this committee. But I want to not distract from the point that we have to have safeguards that have age verification, verifiable prevention of kids under 16 getting access to these platforms. And however we get at that, if anything that we're doing does not include age verification and a 16-year-old standard at a minimum for some of these very egregious apps and algorithms, then we aren't getting at the problem. So I just want to back up our Chairman in saying that. I believe that the focus of the committee hearing today is to try to get those age verification-type safeguards in place, and I would certainly be open to expanding the app store. But my focus again has just been on ensuring that kids under the age of 16 are not getting access to these platforms. I can tell you that kids are hacking around parental controls and they're getting access, whether they're on their Chromebooks at school or they're getting access through a friend's device. And we must do better. And I'm grateful for the hearing and the markup today. Would you…

Rep. Brett Guthrie (R-KY):

Yield? Will you yield a second?

Rep. Erin Houchin (R-IN):

I'll yield.

Rep. Brett Guthrie (R-KY):

So one of the things yesterday, of course I'll say they came and said they didn't like your bill, my friend from Florida. But I said, this is what I said, I'll repeat it. Huh? You want me to repeat that? I meant they came and said they didn't care for the marketplace app store bill. I can say that. Yeah, I'll repeat that. But what I said to them, and this is what I said, and I don't know devices that well, trying to transfer stuff the other day I couldn't get my device to work. But what I said is there has to be a way, Apple was in the room, so others as well, that when a parent, a family, buys a phone, they can easily put in that the phone they're giving to someone younger than 16, or 16 and younger, that nothing comes through that phone, ever.

That the parent doesn't get to agree to. And it has to be, again, everyone, the parent, because my kids are older now, I get it, but I know you have a beautiful young lady; she's going to have a phone someday. And so you're here in a meeting and all of a sudden your kid wants to download an app; then you've got to go and check it every time. And surely, to me, the way I would envision it, and it's got to be possible to do that, when Ms. Houchin buys her child a phone, well, you'll know they're older than 16 now, unfortunately for you too. Or you buy your beautiful daughter a phone, and you can put in: this child is under 16, and nothing comes to that phone unless it goes through your phone to get there. And they were saying they have the capability to do that sort of thing and they're doing it.

And then so people say, well, we're going to take care of it. I want to put it in statute that they do it. I want to make sure that it's in law that they do it. And so it's got to be a lot more technical; there's a lot more work to do on it. What I say is we've got to make sure we tell 'em what's possible to do, not what they want to do, but what's possible to do. And so I would love to have a world where there is nothing that goes through to somebody under 16, because most people under 16, I even said this, I'm sure there's 13 or 14 year olds that work or do something, buy their own phone if they have to. But most phones for teenagers are on family plans. And parents are terrified when they hand that phone over to the child.

They can't control everything. But if there's one size fits all, one place at the device level, so that anytime anything goes on their phone, they get a text. And they were even saying, because I asked this, can you password it? And he goes, well, kids are pretty good at guessing their parents' password. So you could actually, if somebody puts in your password on another device, it automatically texts you that something's happening. Those are the protections we need to codify, to make sure that parents have that authority and that ability. And that's where I am. I want to protect kids too. And so those are the things that I walked away from the meeting yesterday going, I think there's a way to do this. And we can't just trust them to do it. We have to codify that they do it. And so that's how I walked out of that. And we need to look at the overall marketplace as well. But in terms of what we're doing today, that's what I took out of that meeting. There's actually a pathway to make sure they do what they say they're going to do, which is protect kids, and make sure they do it.

Rep. Erin Houchin (R-IN):

Mr. Chairman.

Rep. Brett Guthrie (R-KY):

Oh, you're back. I think it's your time.

Rep. Erin Houchin (R-IN):

My time's expired, Mr. Chairman,

Rep. Brett Guthrie (R-KY):

That's Erin's time. So… Oh, okay. Yield back, or yield back to…

Rep. Erin Houchin (R-IN):

I yield back. I yield my time to Rep. Trahan.

Rep. Lori Trahan (D-MA):

Thank you. Thank you Mr. Chairman. I believe after this back and forth that we want the same thing. We want choice for our parents. We want protection for our kids, and there's no reason why we have to wait much longer. We do have a referral on this bill to this committee. As Congresswoman Cammack pointed out, the majority witness spoke very highly in terms of endorsing this bill. I know that kid safety advocates have written to you directly about prioritizing this bill as the best path forward to ensuring that child-safe apps on a child-safe store could be something that we don't dream about five years from now, but is something that we can put into place very quickly. And so, having the debate around it: look, there is a lot of innovation in the space for children's safety, and we don't want to thwart that by maybe defaulting to Apple coming up with a kid-safe store. Common Sense Media, so many others that contact us.

They understand how to curate, how to rate, how to evaluate children's safety in an app and in an app store. And so I do think that we have to break through and sort of think differently about how when we open up that box of our new phone, how we might choose a different experience as it pertains to our children. And so look, I don't know what a commitment on this issue looks like, but I would love to even prioritize this as we go into markup so that we can, there's obviously a lot of interest in this particular bill and I don't want to lose momentum because I think it's not incremental at all. It actually solves the problem. Thank you. I yield back to the congresswoman from Florida.

Rep. Gus Bilirakis (R-FL):

Gentle lady yields back. The question now occurs, and now keep in mind, folks, let me remind you, we're voting on the Houchin bill, HR 6333, to the full committee. Everybody got that? Alright, those in favor say aye. Aye. Those opposed? No. The ayes have it, and the bill is agreed to and forwarded to the full committee. The Chair calls up HR 6292 and asks the clerk to report.

The Clerk:

HR 6292 A bill to prohibit data brokers from collecting, using…

Rep. Gus Bilirakis (R-FL):

Without objection, the first reading of the bill is dispensed with, and the bill will be open for amendment at any point. So ordered. Does anyone seek recognition on the bill?

Rep. Frank Pallone (D-NJ):

I was going to speak on the ANS. Mr. Chairman.

Rep. Gus Bilirakis (R-FL):

Okay, that's great. Anybody else on the bill? Alright, why don't we get to the ANS then? Hold on one second. For what purposes or gentleman seek recognition?

Rep. Frank Pallone (D-NJ):

I move to strike the last word to speak in support of my amendment, which is the ANS, the HR 6292 ANS.

Rep. Gus Bilirakis (R-FL):

The clerk will report the amendment

The Clerk:

An amendment in the nature of a substitute to HR 6292 offered by Mr. Pallone of New Jersey.

Rep. Gus Bilirakis (R-FL):

Without objection, the reading of the amendment is dispensed with and the gentleman is recognized for five minutes in support of his amendment.

Rep. Frank Pallone (D-NJ):

Thank you Mr. Chairman. This is the ANS for the bill, the Don't Sell Kids Data Act. Data brokers collect and sell billions of data points on nearly every consumer in the United States, including children. Kids and teens are largely powerless to stop this mass harvesting and trade of their data. We simply should not allow nameless data brokers to profit off of our kids' data. I continue to believe that we must pass comprehensive privacy legislation that protects the data of all Americans, with heightened protections for our children and teens, because children and teens' personal data should be protected regardless of the source. However, it's particularly outrageous that data brokers, with no direct relationship with their subjects, are covertly amassing and selling the data of our children and teens for profit. Ask any parent if they're comfortable with an entity whom they do not know covertly tracking their kids' every click, like, share, and post online, and ask any parent if they're comfortable with these entities then selling detailed dossiers about their kids' interests and personal information to the highest bidder.

I think you'll receive a clear, resounding answer. Kids' personal information is not a commodity. It must be protected. And this makes it imperative to do what we can now to prevent data brokers from exploiting the personal data of our kids. So my bill, as I said, the Don't Sell Kids Data Act, will prevent shadowy data brokers from selling minors' data and allow parents and teens to request the deletion of any data already in the hands of brokers. Our kids deserve the right to enter adulthood with a clean slate, not a detailed record of their every activity or preference that will follow them throughout their adult lives. Now, I appreciate Chairman Guthrie's willingness to work together towards a solution to this pernicious problem plaguing our kids and on the broader set of bills we're considering today. This amendment in the nature of a substitute that we're considering now will take steps towards a bipartisan compromise that will protect our children. And we should build upon our bipartisan success last Congress on stopping the free flow of all Americans' sensitive data to our foreign adversaries, and recognize that the information of our children and teens requires protection from covert exploitation by data brokers as well. And that's exactly what this bill does. I urge my colleagues to support the ANS and then support the bill. And with that, Mr. Chairman, I yield back.

Rep. Gus Bilirakis (R-FL):

The Chairman, I mean the ranking member, yields back, and the Chair now recognizes Mr. Evans for his five minutes on the amendment. You're recognized.

Rep. Gabe Evans (R-CO):

Thank you Mr. Chairman. And I of course want to start by thanking the ranking member for your initiative and leadership in the kids' online safety space, but I am speaking up on behalf of law enforcement, as a former cop, to express some concerns with this piece of legislation in hopes that this bill can be substantially improved before advancing further in the legislative process. As I've mentioned earlier, I was a police lieutenant for my agency in the Denver metro area for a little bit over a decade, and I personally worked, and know many cops that worked, kidnapping and human trafficking cases involving minors during my time in law enforcement in the Denver metro area. And so I've seen firsthand just how valuable having every single tool available is to aid in an investigation that actually makes a difference in these horrific cases. And it's because of that personal experience that I want to express my concerns that this bill doesn't include any carve outs or reasonable exceptions for law enforcement agencies and their partners, to ensure that we aren't unintentionally handcuffing cops in their mission to keep kids and their families safe. To illustrate this point, I ask for unanimous consent to enter into the record a letter from the National Sheriffs' Association.

Rep. Gus Bilirakis (R-FL):

Without objection. So ordered.

Rep. Gabe Evans (R-CO):

Thank you Mr. Chairman. This letter shares my sentiments in admiring the bill while still seeking improvements. In part, the letter from the National Sheriffs' Association reads: National Sheriffs' Association members depend on third party data and analytics, which supply essential investigative tools needed to do our job. This bill provides no exceptions for our members, their suppliers, and their mission to protect children and investigate crimes against them. And so it's with that spirit and this testimony that I want to express my reservations and ask to work with members of this committee on both sides of the aisle to make meaningful improvements to the legislation through the regular order process. The men and women who work tirelessly to protect our kids and our communities, and our kids themselves, deserve nothing less. So thank you Mr. Chairman. I yield back.

Rep. Brett Guthrie (R-KY):

Will the gentleman yield to me? On your time, Mr. Evans, will you yield to me?

Rep. Gabe Evans (R-CO):

Yes, I yield to the Chairman.

Rep. Brett Guthrie (R-KY):

Thanks. I just want to say thanks to the ranking member for offering this. I think all of us want our children's data protected. I think that it's extremely important, we do, and I know our staffs working together have gotten us where we are today and have really done well working together, and we really appreciate that. But we have heard from NCMEC and others. Everybody on this committee wants to ensure that kids aren't sexually exploited, or exploited. That's why we're doing this. But I know NCMEC, which actually was the first bill, believe it or not, I'm old enough, signed by President Obama that I ever did, was reauthorizing the Center for Missing and Exploited Children, which we ought to set up a tour for people to go see. It's amazing, the evil in the world that takes place, and the brave men and women over there fighting it. And so we just need to make sure that we're not taking away a tool that they have to do that, and we need to explore it more. And so, Mr. Ranking Member, my good friend Mr. Pallone, if you're willing to commit, before we go to full committee we'll explore this more and see exactly how law enforcement uses it, how it's protected, if we need to do a carve out, how do we do it. If we can just work together on that, I'd appreciate it.

Rep. Frank Pallone (D-NJ):

Of course my father was a policeman.

Rep. Brett Guthrie (R-KY):

I know. We all agree. We want to make sure these people are caught. So I'll yield back to my friend from Colorado.

Rep. Gabe Evans (R-CO):

Thank you. And Mr. Chairman of the full committee, yield back to the chairman,

Rep. Gus Bilirakis (R-FL):

Gentleman yields back. Now, is there anybody that wants to speak on this particular amendment on the Democrat side? No. Okay. I am going to recognize myself for five minutes. I won't take five minutes. I just want to reiterate the sentiments of Chairman Guthrie regarding the concerns of law enforcement on this bill. I've met with my sheriffs as well, and they oppose the bill, and I will tell you this, we had this issue come before us last session as well, so I'm glad that the sponsor of the bill is going to work with us on this and maybe get a carve out. I don't think anyone wants to inadvertently make it more difficult for first responders to identify and catch online criminals that prey on our nation's children. I'd like to introduce into the record a letter jointly signed by the Major County Sheriffs of America and the Association of State Criminal Investigative Agencies outlining their specific concerns. Without objection, so ordered. Again, I appreciate the sponsor of the bill working with us on this. It's very, very important. I'd like to work again with the ranking member between now and full committee, and of course the Chairman, to work through their concerns so we can best protect kids without hindering criminal apprehension. I know that's our goal. So with that, if no one else wants my time, I'll yield back.

Rep. Gus Bilirakis (R-FL):

Discussion on the amendment? Yes, you're recognized

Rep. Lori Trahan (D-MA):

Mr. Chair. I have an amendment at the desk labeled Trahan…

Rep. Gus Bilirakis (R-FL):

Okay, well we got to dispense this one first. Oh,

Rep. Frank Pallone (D-NJ):

Hers is an amendment to the ANS.

Rep. Gus Bilirakis (R-FL):

Yeah, amendment to the ANS. Okay, before we vote. Okay. Alright.

Rep. Lori Trahan (D-MA):

We have to pull it up. Will the clerk report the amendment? Specify?

Rep. Gus Bilirakis (R-FL):

Yeah, specify the amendment. So

Rep. Lori Trahan (D-MA):

Oh sorry. Trahan 46.

Rep. Gus Bilirakis (R-FL):

Okay. The clerk will report.

The Clerk:

Amendment to the amendment in the nature of a substitute to HR 6292 offered by Ms….

Rep. Gus Bilirakis (R-FL):

Without objection, the reading of the amendment is dispensed with, and the gentle lady is recognized for five minutes in support of her amendment.

Rep. Lori Trahan (D-MA):

Thank you, Mr. Chair. I'm grateful to Ranking Member Pallone for his leadership in cracking down on shady data brokers who profit from selling kids' data and for giving parents and teens the right to ask that their data be deleted. Data brokers have engaged in reckless and invasive practices that violate our privacy and in many cases pose national security threats. A 2023 study from Duke University showed us just how easy it is for foreign actors to buy sensitive information about American service members. In one example, for as little as 12 cents per person, researchers were able to purchase family histories, credit scores, home addresses, health information, and even political affiliations. All of this was available from foreign data brokers, and the researchers faced almost no vetting. Our children are vulnerable to these same dangers. This is one of the many reasons why, shortly after joining this committee four years ago, I introduced the bipartisan Delete Act.

This legislation would bring real transparency to the data broker industry and give every American the right to have data brokers delete and stop collecting their personal information. While Congress has been slow to move forward on this common sense bill, states like California have stepped up. California passed its own version of the Delete Act and it will soon take effect. Starting in August 2026, Californians will be the first in the nation with access to a single tool that allows them, with one click, to ask hundreds of data brokers to delete their data. Instead of making every person contact each data broker one at a time, California's system follows the model of the bipartisan Delete Act by creating one trusted government website with strong privacy protections. Now, as a parent of two young daughters, I know it is not realistic to file deletion requests with hundreds of data brokers just to make sure that my child's data is not constantly bought and sold. No parent has time for that. My amendment today would strengthen Ranking Member Pallone's bill by adding a national opt-out for kids' data held by data brokers. And it's my understanding that the ranking member will work with me on that issue, so I will withdraw my amendment. Thank you so much.

Rep. Frank Pallone (D-NJ):

Will the gentlewoman yield to me?

Rep. Lori Trahan (D-MA):

Yes, of course, of course.

Rep. Frank Pallone (D-NJ):

I just wanted to thank Ms. Trahan for offering and withdrawing this amendment and I do look forward to working with her on the important issue of bringing transparency to these shadowy data brokers and making it easier for kids and teens to delete their data and yield back to the gentleman.

Rep. Lori Trahan (D-MA):

Thank you. I yield back.

Rep. Gus Bilirakis (R-FL):

Okay, now we're going to vote on the ANS if there's no further discussion. All in favor of the ANS signify by saying aye. Aye. All opposed say nay. The ayes have it and the amendment is agreed to. Now, are there other amendments being offered? Doesn't look like it. The question now occurs on forwarding HR 6292 as amended to the full committee. All in favor of the bill as amended, say aye. Aye. All opposed? No. The ayes have it, the bill is agreed to and forwarded to the full committee. The Chair calls up HR 6484 and asks the clerk to report.

The Clerk:

HR 6484, a bill to protect the safety of minors on the internet and for other purposes.

Rep. Gus Bilirakis (R-FL):

Without objection, the first reading of the bill is dispensed with and the bill will be open for amendment at any point. So ordered. The Chair recognizes himself for an amendment at the desk, the HR 6484 ANS. The clerk will report the amendment.

The Clerk:

Amendment in the nature of a substitute to HR 6484 offered by Mr. Bilirakis…

Rep. Gus Bilirakis (R-FL):

Without objection, the reading of the amendment is dispensed with and the Chair recognizes himself for five minutes in support of the amendment. Ladies and gentlemen, again, KOSA protects kids across America by mandating default safeguards and easy to use parental controls to empower families. KOSA plays a critical role in our overarching strategy. KOSA will broadly protect kids and teens while the other bills before us address particular harms or take specific approaches to help ensure no existing threat is left unaddressed. In many ways, those bills make KOSA even stronger by working alongside it, and I want to thank the staff. They've done an outstanding job on the majority side on this and we're going to continue to work on this bill. It is the foundation and the safety net, with concrete safeguards to keep kids and teens safe. My team and I have spent the past week meeting directly with Angel parents.

Thank you very much for being here in the audience, and their advocates, advocates for this particular bill and for keeping our kids safe. If they're not safe, nothing else matters, folks. And we took feedback to heart. We really did, and we will continue to do so. This amendment incorporates some of their direct feedback, which is informed by the devastating loss of their own children. In response to their input, we added safeguards that prevent a minor's profile from being recommended to an adult user. We restrict sharing of minors' geolocation data and banned the use of design practices that limit parents' ability to benefit from the bill's safeguards and parental tools. We also increased transparency to the public by requiring the audit to include the types of harms reported, not just the number of reports, by requiring certain information in the audit to be made public, and by extending the duration of the Kids Online Safety Council from three years to 10 years with a required report every two years. To the parents who have shared their stories: we heard you and we will continue to hear you.

And these changes were a direct result of your testimony to us, and I'm going to meet with parents afterwards as well. No other feedback is reflected in this amendment except yours, the parents'. We are deeply appreciative of your advocacy. I look forward to continuing to work with parents and other stakeholders on this bill as it moves through the legislative process. And again, I want to thank the Chairman for working with me on this and making it a priority. So I encourage my colleagues to vote yes on this amendment and on the underlying bill, and I'll yield back. I recognize the ranking member for five minutes on this bill, on the amendment. We're on the amendment, on…

Rep. Frank Pallone (D-NJ):

The ANS…

Rep. Gus Bilirakis (R-FL):

We're on the ANS. Yeah.

Rep. Frank Pallone (D-NJ):

Thank you. I have to say at this time I have strong concerns about the broad preemption included in this bill, also in COPPA 2.0 and in other bills before us today. And I would also like to see a stronger knowledge standard that does not let companies look the other way when they know they have kids on their platforms. This bill therefore needs more work and I can't support it in its current form, but my no vote does not mean that we will not continue to work to find a path on this bill and others, including COPPA 2.0. The bottom line is kids' safety is too important to get it wrong, and thank you to you and Chair Guthrie for working with us in good faith on these bills. I know we all want to get this right and that we're going to continue to work across the aisle to do so. So with that, I'd yield back, Mr. Chairman.

Rep. Gus Bilirakis (R-FL):

Yes. Anybody on my side, on the Republican side, that wishes to speak on this, the ANS? Nope. Okay. Now we'll go to the next amendment. The next amendment is, I believe, the Castor amendment. Yes. You're recognized, Ms. Castor, to offer your amendment. Is that correct? Yes.

Rep. Kathy Castor (D-FL):

Yes. Thank you. Thank you, Mr. Chairman. We've gotten to the two most impactful bills now with the Kids' Online Safety Act and the Children's Online Privacy Protection Act, KOSA and COPPA.

Rep. Gus Bilirakis (R-FL):

Excuse me, Ms. Castor. Yeah, we've got, I mean once ...

Rep. Kathy Castor (D-FL):

You want to ...

Rep. Gus Bilirakis (R-FL):

Finish speaking, then you can offer your amendment, and obviously we'll talk to the clerk and have the clerk read the amendment.

Rep. Kathy Castor (D-FL):

Okay. And then this is on you.

Rep. Gus Bilirakis (R-FL):

This is, well we're going to speak on your amendment right now.

Rep. Kathy Castor (D-FL):

Yes.

Rep. Gus Bilirakis (R-FL):

So whether...

Rep. Kathy Castor (D-FL):

You need the clerk to ...

Rep. Gus Bilirakis (R-FL):

Yeah, why don't you have the clerk read the amendment. That's okay.

The Clerk:

Substitute amendment to the amendment in the nature of a substitute to HR 6484 offered by Ms. Castor of Florida. Strike all after the enacting clause and insert the following: Section 1, short title, table of contents. (a) Short title: this act may be cited as the Kids Online Safety Act…

Rep. Gus Bilirakis (R-FL):

Without objection, the reading of the amendment is dispensed with, and I'll recognize Ms. Castor.

Rep. Kathy Castor (D-FL):

Thank you, Mr. Chairman. And I really want to thank the parents and the young people and the advocates who have worked for years to push this committee and this Congress to act to protect kids online. This really does come after many years of hearings and education by members of Congress on how we protect kids from physical and mental harms, from suicide, from bullying. And the larger context here is that we have had strong bipartisan versions of these bills before. Just last year, out of this committee, we passed a strong bipartisan version of KOSA. It matched up with the bill that passed the United States Senate 91 to 3. When does that happen in the United States Senate? 91 to 3. They filed that same bill this year again and it has broad bipartisan support in the Senate. We should match their level of ambition in protecting kids across America.

So we're going to point out some ways to get back to bipartisan support here in the Energy and Commerce Committee and Congress. I heard that in our last hearing on the bill, but more importantly, let's get to versions of the bills that will meaningfully protect kids online. First, Mr. Chairman, your bill does not have a duty of care. Duty of care is foundational to impose an obligation on the social media platforms and Big Tech corporations to design their platforms in a way that avoids foreseeable harm to kids. Generally, corporations and industries must design and assure that their products are safe for consumers. Without a duty of care, that really gives the Big Tech companies a pass on their infliction of mental and physical harm to children. It leaves parents without recourse to pursue a claim when kids are clearly harmed, and it removes any incentive for responsible behavior.

When you have no duty of care, the social media platforms and Big Tech companies will act carelessly without fear of legal consequences. Without a duty of care, you can't enforce the law. Without a duty of care, these companies really will not be proactive in mitigating the risk, and you'll leave huge gaps in the law that will leave them unaccountable. Just think of toys that are on the market today. Those manufacturers have a duty to provide safe products, and the same should be true for tech platforms, but they have special protection under the Communications Decency Act that they shouldn't be able to use as an excuse for their malign behavior. Mr. Chairman, your version also has a weaker knowledge standard. A weaker knowledge standard can lead to significant consequences. It will absolve a party of liability and impact the outcome of cases. It would pose several dangers.

They can claim that they just lacked awareness of the harm. It would allow dangerous, harmful conditions to go unaddressed. They can claim willful blindness, and it will favor these deep pocketed social media platforms. Mr. Chairman, your version also has a really unconscionable preemption standard that we're going to talk more about today. That sweeping preemption standard must be seen for what it is: total immunity for hurting kids. You want to wipe away the important work that's gone on at the state level to hold these tech platforms accountable, and really, you'll not get to a meaningful bill if you keep this preemption standard in the law. We're going to talk more about that today as well. So the duty of care, the knowledge standard, the preemption, all of these must be addressed to work towards a meaningful bill. So we call upon you to not just pass bills today that have a certain title, that kind of wink and nod that we're doing something to protect kids online. They don't. The details really matter here. These protections now really give Big Tech companies the immunity that they desire. It turns your back on these parents, these advocates, these young people today who have suffered such tragedies; it turns your back on kids of the future who are going to be subjected to similar harms. So in the spirit that you've offered to work with us to improve these bills, here are some constructive suggestions today. I yield back my time.

Rep. Gus Bilirakis (R-FL):

Gentlelady yields back. Anybody on the Republican side wishes to speak on the amendment? Seeing none, anyone on the Democrat side wishes to speak? Yes, you're recognized, Doctor, for your five minutes.

Rep. Kim Schrier (D-WA):

Thank you, Mr. Chairman, and thank you to my colleague, Representative Castor, for talking about the improvements needed in this bill. In my career as a pediatrician, I have seen firsthand the impacts of social media on kids. I've seen the rates of anxiety and depression and eating disorders and suicidality dramatically increase in parallel with the rise in social media use. Today's kids, as we've heard over and over today, are bombarded with harmful content online: bullying from peers and strangers, comparing themselves with idealized versions of their classmates, celebrities, influencers, and even drug dealers. But it's not just social media that poses a serious risk to today's kids. It's any app whose goal is to keep kids' eyes on the screen longer and longer. And these apps are designed to encourage compulsive use, whether they are social media platforms, gaming apps, or chatbots. Studies show that teens are not sleeping nearly as much as they should be and that that's been exacerbated by social media and phone use.

In fact, 93% of Gen Zs are staying up too late because of social media, and that is what these social media platforms are designed to do: keep kids hooked and scrolling. But I just have to mention that good sleep is essential. It helps kids focus at school, keep up with their teammates in sports, regulate their emotions, and protect their mental health. And sleep deprivation alone, frankly, can explain some of the depression and anxiety and poor coping skills and inattention of this generation. Social media companies and other apps targeted to minors are not concerned with their sleep habits or critical stages of development or their mental health. They're concerned with their bottom line, and that means keeping kids online longer, selling more ads, and making more money. And that is why it is so critical that we pass legislation to protect kids from the algorithms and the apps that are specifically designed to keep their eyes on those screens as long as possible.

And this watered down version of the Kids Online Safety Act simply does not cut it. The bipartisan KOSA text that I was proud to help lead last Congress would create a duty of care for Big Tech to ensure that they actually prevent or mitigate the harms that they are causing to kids. Big Tech companies design their platforms to maximize the amount of time that kids spend on the screen and in the app. And the bipartisan KOSA text would require those platforms to turn off, by default, the personalized recommendations that target kids individually; they'd have to opt in and not opt out. Importantly, it would force Big Tech companies to change the part of their platform's design that promotes compulsive use of the app for minors. The version of KOSA that we're considering today only requires apps to offer minors the option to limit the design features that promote compulsive use.

The default needs to be the one that protects kids. This is just not enough. We need legislation with teeth that will actually hold Big Tech accountable and will help today's kids break this compulsive and addictive cycle that they are facing online, and frankly that all of us who are using social media are facing right now. We all have an imperative to protect our kids online, and we have abundant evidence that we cannot rely on Big Tech to do that for us. I strongly support Representative Castor's amendment, which would replace this weak text with a bipartisan version of KOSA that the Senate passed overwhelmingly in the last Congress. We have a strong, negotiated, bipartisan KOSA sitting right here that we could pass right now. Kids and parents deserve the best from us, not today's watered down bill. I'm so frustrated that we are ignoring common sense, bipartisan, bicameral bills that are strong enough to actually hold Big Tech accountable and actually get over the finish line to becoming law. As I mentioned in our legislative hearing last week, there's also a common sense bipartisan bill in the Senate that would keep young kids off social media entirely, the Kids Off Social Media Act. It would also protect teens from predatory algorithms. And I urge my colleagues to work with me on moving strong bipartisan bills that keep kids safe and that actually have a chance of getting over the finish line. Thank you, and I yield back.

Rep. Gus Bilirakis (R-FL):

Gentlelady yields back. Anybody on the Republican side wish to speak on the amendment? Democrats? Yeah, Representative Soto, you're recognized for five minutes.

Rep. Darren Soto (D-FL):

Thank you, Chairman. I want to talk a little bit about preemption, as we talked a little bit about earlier as well. In the 2024 legislative session, we saw our home state of Florida adopt HB 3, a law banning children under 14 from opening accounts on social media platforms with addictive features like infinite scroll, autoplay, et cetera. This was a bipartisan bill. We don't always, at least the Dems on this committee don't always, agree with the work product out of the Florida legislature, but this is one that we all came together on, and we saw it was upheld on appeal. And so Florida can now enforce these laws as the fight continues on in court, and there's a potential that some of these broad preemptions found in bills like KOSA and others could jeopardize Florida's ability to enact these critical laws at the state level.

I think when we're talking about how to enforce these key protections across the vast expanses of the internet, you need multiple cops on the beat: the FTC, which we talked about already and which we need to bring back to full strength, state AGs, and in some instances causes of action. You need some ability to coexist to help deal with just how pervasive the internet and social media are in our society. And I think you could look at it as kind of a correlation. If we take a higher duty of care standard on the federal level, one that is more protective than most states, then I think you could look at having more preemption, right? But if the strategy is that it's going to be a floor, like this is the bare minimum on the federal level, we need to give states some ability to be able to work on this stuff. And so I hope some of this may guide us as we're going forward on how we balance out between the duty of care standard and how that affects what kind of preemption we can have in a bipartisan manner to be able to get this done. And I yield back.

Rep. Gus Bilirakis (R-FL):

Gentleman yields back. Anybody on the Republican side wish to speak on the amendment, the Castor amendment? Nope. Democrats? Okay. I'll recognize Ms. Castor.

Rep. Kathy Castor (D-FL):

Okay. I'm happy to withdraw the amendment.

Rep. Gus Bilirakis (R-FL):

You're withdrawing. Okay. And again, I want to work with you on this. We've worked on this bill in the past and we worked on it a couple of weeks ago. And again, I'm not going to turn my back on anyone, you know that, you bet.

Rep. Kathy Castor (D-FL):

I think we can get back to a strong bipartisan version that has broad support.

Rep. Gus Bilirakis (R-FL):

Okay. We'll work on it. Thank you. We're in the process of getting the best bill we possibly can out of committee. Thank you. Any further amendments to the ANS? Ms. Trahan? Ms. Trahan, you're recognized for five minutes please.

Rep. Lori Trahan (D-MA):

I have an amendment at the desk labeled Trahan 44.

Rep. Gus Bilirakis (R-FL):

Yeah, the clerk will report the amendment.

The Clerk:

Amendment to the amendment in the nature of a substitute to HR 6484 offered by Ms. Trahan of Massachusetts.

Rep. Gus Bilirakis (R-FL):

Without objection, the reading of the amendment is dispensed with and the gentlelady will be recognized for five minutes in support of her amendment.

Rep. Lori Trahan (D-MA):

Thank you, and I won't take up all of that time. My amendment creates a safe harbor that allows public interest researchers to study online platforms in a privacy protective way. It makes clear that researchers can collect publicly available information from social media platforms without worrying that they are violating a platform's terms of service. We urgently need to understand how social media affects our society at a time when we face national security threats, growing concerns about the impact of social media on our children's mental health, and the spread of content generated by artificial intelligence. Platforms have a responsibility to make sure that researchers, journalists, and policymakers can study how these systems work. Unfortunately, major social media companies have repeatedly used their terms of service to threaten or intimidate independent researchers. In fact, they have consistently tried to stop researchers from sharing honest findings about the effects of platforms.

My amendment would simply put an end to that. It would protect these important watchdogs from legal risk when large technology companies attempt to silence them. It's also carefully designed to address the privacy concerns that some platforms may try to use as an excuse to block transparency. This amendment is based on the bipartisan Platform Accountability and Transparency Act that was recently reintroduced in the Senate, as well as the research access provisions from the original version of KOSA in the 117th Congress. This is a simple change that supports the goals of KOSA's authors. However, I will withdraw my amendment today with the understanding that the Chair is willing to work with me to address legal safe harbor provisions, as we did when we marked up KOSA last Congress. With that, I withdraw my amendment.

Rep. Gus Bilirakis (R-FL):

Gentlelady withdraws her amendment. Any further discussion on the ANS? Okay, if there's no further discussion, then the vote occurs on the ANS, the amendment. All those in favor signify by saying aye. Aye. All opposed, nay? No. The ayes have it and the amendment is agreed to. Now we're going to vote on the bill as amended. A roll call vote has been requested on the bill as amended, so I will request a roll call on the bill. The clerk will call the roll.

The Clerk:

Mr. Fulcher? Mr. Fulcher votes aye. Mr. Dunn? Aye. Mr. Dunn votes aye. Mrs. Cammack? Mr. Obernolte? Mr. Obernolte votes aye. Mr. James? Mr. James votes aye. Mr. Bentz? Mr. Bentz votes aye. Mrs. Houchin? Mr. Fry? Mr. Fry votes aye. Ms. Lee? Ms. Lee votes aye. Mr. Kean? Aye. Mr. Kean votes aye. Mr. Evans? Mr. Goldman? Aye. Mr. Goldman votes aye. Mr. Guthrie? Aye. Mr. Guthrie votes aye. Ms. Schakowsky?

Rep. Jan Schakowsky (D-IL):

No.

The Clerk:

Ms. Schakowsky votes no. Ms. Castor? No. Ms. Castor votes no. Mr. Soto? No. Mr. Soto votes no. Ms. Trahan? No. Ms. Trahan votes no. Mr. Mullin? Mr. Mullin votes no. Ms. Clarke? Ms. Clarke votes no. Mrs. Dingell? No. Mrs. Dingell votes no. Mr. Veasey? Ms. Kelly? No. Ms. Kelly votes no. Ms. Schrier? No. Ms. Schrier votes no. Mr. Pallone? No. Mr. Pallone votes no. Mr. Bilirakis?

Rep. Gus Bilirakis (R-FL):

Aye.

The Clerk:

Yes. Mr. Bilirakis votes Aye.

Rep. Gus Bilirakis (R-FL):

The clerk will report. Wait, wait. Okay. We have a few. Ms. Houchin, how is she recorded?

The Clerk:

Mrs. Houchin is not recorded.

Rep. Erin Houchin (R-IN):

Aye.

Rep. Gus Bilirakis (R-FL):

Aye. Anyone else?

The Clerk:

Ms. Houchin votes Aye.

Rep. Gus Bilirakis (R-FL):

On this side Mrs. Cammack?

The Clerk:

Mrs. Cammack is not recorded.

Rep. Gus Bilirakis (R-FL):

How is she recorded?

The Clerk:

Ms. Cammack votes aye.

Rep. Gus Bilirakis (R-FL):

Anyone else on the Republican side? We're good. Okay. How about on the Democrat side? Is there anyone that hasn't voted? Okay, I think we're good. Alright. The clerk will report.

The Clerk:

Mr. Chairman on that vote there were 13 ayes and 10 nos.

Rep. Gus Bilirakis (R-FL):

The ayes have it. The bill is adopted. Okay. The Chair calls up HR 6291 and asks the clerk to report.

The Clerk:

HR 6291, a bill to amend the Children's Online Privacy Protection Act of 1998 to strengthen protections related…

Rep. Gus Bilirakis (R-FL):

Without objection, the reading of the bill is dispensed with and the bill will be open for amendment at any point. So ordered. Does anyone seek recognition on the bill first? Yes. Ms. Lee, you're recognized, from the great state of Florida. She's a wonderful member from the Tampa Bay area. You're recognized.

Rep. Laurel Lee (R-FL):

Thank you, Mr. Chairman. I am proud to sponsor HR 6291, the Children and Teens' Online Privacy Protection Act, COPPA 2.0, alongside Mr. Walberg of Michigan. I want to thank Chairman Bilirakis and Chairman Guthrie for including this important bill in today's markup. COPPA was enacted in 1998 and remains the nation's only federal online privacy law designed specifically for children. It created foundational protections, including the requirement that platforms obtain verifiable parental consent before collecting a child's personal information. But the online world that our children and teens navigate today looks nothing like the world of 1998. 95% of teens now access the internet through smartphones, often with no transparency about how their data is collected, shared, or monetized. At the same time, we have seen the explosive growth of social media companies whose business model depends on gathering immense volumes of personal information. These data-driven algorithms and targeted advertising systems have contributed to the rise of harmful, manipulative content that preys on insecurity, fuels addictive behavior, and exacerbates the youth mental health crisis.

Clearly the original COPPA framework, strong as it was for its time, must be modernized to the realities of the 21st century. COPPA 2.0 extends core protections to teens aged 13 to 16, who today are left exposed to data harvesting and targeted manipulation despite being at an age of heightened vulnerability. The bill sets strong, clear limits on how platforms may collect, use, or maintain a minor's personal information. And it strengthens families' rights by allowing parents and teens to delete that information, review it, or correct inaccuracies at any time. This bill does not substitute federal judgment for parental judgment. It empowers parents and teens with stronger tools, clearer rights, and better information. The bill also updates COPPA's knowledge standard so that high impact platforms cannot continue the longstanding practice of looking the other way while children and teens use their services. For too long, companies have benefited from young users while disclaiming responsibility for their safety.

COPPA 2.0 closes that loophole and ensures accountability without imposing unnecessary burdens on smaller companies. And it prohibits targeted advertising to children and teens, who are still developing judgment, impulse control, and resistance to manipulation. This is one of the strongest and most meaningful protections that we can give families. In the absence of congressional action, states have understandably attempted to address these issues on their own, but the result has been a patchwork of different and often conflicting requirements: differences in who is considered a minor, which platforms are covered, what constitutes consent, and how data must be handled. A child's level of protection should not depend on their zip code. Many of these state laws have also faced serious constitutional challenges. A fractured legal landscape only delays meaningful protections and creates compliance burdens that small innovators are least equipped to bear. COPPA 2.0 establishes a single national framework that ensures every child and every teen receives the same clear, robust privacy protections no matter where they live.

And it provides enforcement through the FTC and state attorneys general so families receive protection in real time without years of costly litigation. This framework also supports America's global leadership. If the United States wants to remain the world leader in technology, we must also lead in protecting the privacy and the safety of our youngest users. I am encouraged that the Main Street Privacy Coalition, representing small and local businesses around the country, supports this bill and recognizes the need for a uniform, responsible federal standard. For these reasons, I urge my colleagues to support COPPA 2.0 and give parents, teens, and children the meaningful privacy and safety protections they deserve. Thank you, and I yield back.

Rep. Gus Bilirakis (R-FL):

Gentlelady yields back. Anybody on the Democrat side wish to speak? Ms. Castor, you're recognized for five minutes to speak.

Rep. Kathy Castor (D-FL):

Thank you, Mr. Chairman, members. This is a gutted version of the Children's Online Privacy Protection Act. It really turns the back of Congress on families. What is going on here? There's such a Big Tech friendly preemption clause here that it swallows the entire effectiveness of the Children's Online Privacy Protection Act. That's not right. It flies in the face of our good bipartisan work in the last Congress. Frankly, surprise is not the word; it just feels like a betrayal of all of the hard work that we've done together over the years. As one Republican member said earlier in the hearing, don't bend the knee to Big Tech. That's what this preemption clause in COPPA is.

It's also a slap in the face to everyone that's worked at the state level on kids' privacy. Because the preemption is so broad, covering any law, rule, regulation, requirement, standard, or other provision that relates to the provisions of the bill, you are going to preempt not only children's privacy laws, but you could also preempt other consumer protection and tort claim laws. You could pull the rug out from under many, many lawsuits going on right now to hold tech companies accountable. Certainly that's not what you want to do after all of the pain and harm and mental anguish that's been suffered at the hands of these Big Tech companies. So, Mr. Chairman, again, we are waving our arms and screaming. I see the parents in the audience today, again with pictures of their kids, and I appreciate that you've met with them, but talk is cheap when it comes to Congress, isn't it?

You've got to follow through with meaningful pieces of legislation, and that means not incorporating a preemption clause that simply wipes it away and protects Big Tech. It's like a Big Tech immunity clause, and they already have too much immunity under the law. Utilizing this outdated standard really fails kids, parents, and states. And I think if you talk to a lot of the folks back home in states that have taken the lead while Congress has been absent, they would tell you: don't take away the tools. And don't say that the FTC can do this. We know what's happened with the gutting of the FTC. But the Big Tech companies, they pay fines under COPPA simply as the cost of doing business. Remember, Google a few years ago paid a fine and then they got caught again, and they're simply paying the fines as a cost of doing business.

When you're talking about kids and their harm, you have to have many cops on the beat. Don't take the cops off the beat here. Don't weaken child protection and privacy laws when it comes to KOSA and COPPA. Kids deserve better than the versions of the bills that you've advanced. And I've heard you, and I didn't come in with my hair on fire today, although I certainly appreciated Ranking Member Schakowsky's opening remarks, because it is a bit outrageous. But let's act in the spirit of working together to get back to the strong bipartisan versions of the legislation that we passed out of the committee last year, which is already moving in the United States Senate and which, based upon what I heard in our last hearing, has broad bipartisan support, as I heard again as we discussed all of the other bills today.

I think that we now need to show the intestinal fortitude to stand up for our kids and families across America and make sure that they have adequate protection. I know Big Tech has outsized influence here. They have outsized influence at the state level too. But God bless those states and legislators who have been able to stand up to their lobbying and everything else they do. Let's find the same resolve. Let's work together in that same spirit and come back when we return next year with stronger versions of COPPA and KOSA, incorporating a lot of the good bipartisan ideas that we've heard today. I yield back my time.

Rep. Gus Bilirakis (R-FL):

The gentlelady yields back. Any discussion on the Republican side? Democratic side? No further discussion. The question now occurs on forwarding HR 6291 to the full committee. All those in favor say aye. Aye. Those opposed? No. Okay, the ayes have it. We'd like to have a roll call. Yes, you would like a roll call. A roll call vote has been requested. The clerk will call the roll.

The Clerk:

Mr. Fulcher? Mr. Fulcher votes Aye. Mr. Dunn? Aye. Mr. Dunn votes Aye. Mrs. Cammack? Mr. Obernolte? Mr. Obernolte votes Aye. Mr. James? Aye. Mr. James votes Aye. Mr. Bentz? Mr. Bentz votes Aye. Mrs. Houchin? Mrs. Houchin votes Aye. Mr. Fry? Aye. Mr. Fry votes Aye. Ms. Lee? Aye. Ms. Lee votes Aye. Mr. Kean? Mr. Kean votes Aye. Mr. Evans? Aye. Mr. Evans votes Aye. Mr. Goldman? Mr. Goldman votes Aye. Mr. Guthrie? Aye. Mr. Guthrie votes Aye. Ms. Schakowsky?

Rep. Jan Schakowsky (D-IL):

No.

The Clerk:

Ms. Schakowsky votes No. Ms. Castor? No. Ms. Castor votes No. Mr. Soto? No. Mr. Soto votes No. Ms. Trahan? No. Ms. Trahan votes No. Mr. Mullin? Mr. Mullin votes No. Ms. Clarke? Ms. Clarke votes No. Mrs. Dingell? Mrs. Dingell votes No. Mr. Fry? Ms. Kelly? Ms. Kelly votes No. Ms. Schrier? Ms. Schrier votes No. Mr. Pallone? Mr. Pallone votes No. Mr. Bilirakis? Mr. Bilirakis votes Aye. Mrs. Cammack is not recorded.

Rep. Kat Cammack (R-FL):

How am I being recorded, Mr. Chairman?

Rep. Gus Bilirakis (R-FL):

How is Ms. Cammack recorded?

The Clerk:

Mrs. Cammack is not recorded.

Rep. Kat Cammack (R-FL):

Yay.

The Clerk:

Mrs. Cammack votes Aye.

Rep. Gus Bilirakis (R-FL):

She votes Aye. Anyone else on the Republican side? How about the Democratic side? Anybody wish to be recorded? We're good. All right. The clerk will report.

The Clerk:

Mr. Chairman on that vote, there are 14 ayes and 10 nos.

Rep. Gus Bilirakis (R-FL):

So the bill passes. Without objection. Staff is authorized to make technical and conforming changes to the legislation approved by the committee today. So ordered, without objection, the committee stands adjourned. Thank you.
