Transcript: House Hearing on “Legislative Solutions to Protect Kids Online and Ensure Americans’ Data Privacy Rights”
Ben Lennett / Apr 20, 2024

On Wednesday, the US House of Representatives Energy and Commerce Subcommittee on Innovation, Data, and Commerce held a hearing: “Legislative Solutions to Protect Kids Online and Ensure Americans’ Data Privacy Rights.”
The hearing comes on the heels of an active period of legislative debate and action by Congress on privacy and children’s online safety. Last month, after several years of lawmakers expressing concerns over the potential influence of the Chinese government over TikTok, the House of Representatives voted overwhelmingly to force ByteDance to divest TikTok or face a ban from app stores in the US. Then, last week, Senator Maria Cantwell (D-WA) and Representative Cathy McMorris Rodgers (R-WA) unveiled draft legislation for The American Privacy Rights Act of 2024 (APRA). Finally, in the same week, a bipartisan group of representatives, including Rep. Gus Bilirakis (R-FL), chairman of the Subcommittee on Innovation, Data, and Commerce, introduced a House version of the Kids Online Safety Act (KOSA).
The hearing featured six expert witnesses:
- Ava Smithing, Director of Advocacy, Young People’s Alliance
- The Honorable Maureen K. Ohlhausen, Co-chair, The 21st Century Privacy Coalition
- Katherine Kuehn, Member, Board of Directors and CISO-in-Residence, National Technology Security Coalition
- Kara Frederick, Director, Tech Policy Center, The Heritage Foundation
- Samir C. Jain, Vice President of Policy, Center for Democracy & Technology
- David Brody, Managing Attorney, Digital Justice Initiative, Lawyers’ Committee for Civil Rights Under Law
Key Takeaways
Below are key takeaways from the hearing, including topics and debates that are likely to influence and shape the discussion and determine whether Congress succeeds in enacting data privacy rights and protecting kids online.
Optimism to pass comprehensive privacy legislation. One clear theme that permeated the hearing was a sense of optimism among the committee members about the prospects for Congress to finally pass comprehensive privacy legislation. Rep. Jan Schakowsky (D-IL) said in her opening remarks, “I am really happy today because we are finally getting back to the bipartisan business of protecting consumers online.” In his opening statement for the hearing, Rep. Frank Pallone, Jr. (D-NJ) also offered a hopeful message, “I’m optimistic that we'll be able to get comprehensive privacy legislation across the finish line.” The witnesses at the hearing also appeared to share this feeling. When Chairman Bilirakis asked the panel if this was “the best chance we have to get something done on comprehensive data privacy,” they all answered in the affirmative.
Preemption might be the biggest barrier to passing the American Privacy Rights Act. In June 2022, the American Data Privacy and Protection Act (ADPPA) was introduced in the House and quickly moved to pass the full Energy and Commerce Committee in July 2022 by a 53-2 vote. Then it stalled and never made it to the floor for a vote, likely held up by then-House Speaker Nancy Pelosi (D-CA) due to California members' concerns that the law would preempt states’ data privacy laws. Preemption and other obstacles might also doom the APRA. Finding a workable consensus on preemption will be the linchpin for any privacy bill passing Congress. As Samir C. Jain, Vice President of Policy at the Center for Democracy & Technology, offered in response to a question about preemption, “We would probably say we should let the federal privacy law set a floor and then let states provide additional protections, but understand that that isn't going to be the way that this comes out if we're going to get this passed.”
Congress may be finally moving beyond notice and consent as the focus of privacy policy. After more than a decade, the dialogue at the hearing suggests that the members of this subcommittee, and possibly policymakers in general, are shifting their focus away from relying solely on a notice and consent framework as the primary means of protecting internet privacy.
Rep. Kat Cammack (R-FL) summed up the failure of this approach well, noting, “People fail to realize that everything is being tracked, and we're consenting to this in those ridiculous long terms of service that nobody reads.” Fortunately, this realization is also finding its way into legislation. Rep. Pallone noted in his opening statement that “the American Privacy Rights Act discussion draft today adopts many of the key pillars of the ADPPA with data minimization rather than notice and consent as its foundation.” Indeed, the term ‘data minimization’ was uttered as many as 28 times during the hearing, which is considerable progress, given that few in Congress were probably aware of the concept before the introduction of the ADPPA in 2022.
TikTok did not feature prominently in the hearing. Among the key issues noted in the Subcommittee’s memo ahead of the hearing was “What dangers persist to Americans if the Chinese Communist Party (CCP) is able to easily access their sensitive information?” Yet TikTok was barely mentioned in the hearing, and only one member brought up the issue. Rep. Jeff Duncan (R-SC) asked, for whom “should we be protecting the data of American citizens? Who's the greatest threat here? Is it Russian hackers, Communist Chinese Party, social media companies, other big American companies, identity thieves, predators?” In response to this line of questioning, Kara Frederick, Director of the Tech Policy Center at The Heritage Foundation, acknowledged the TikTok threat, noting that “the low-hanging fruit right now is the Chinese Communist Party,” while others on the panel pointed to the big tech companies and data brokers. As many critics of the House TikTok bill argue, even if forced divestment cuts off the supposed access to the platform’s user data by the CCP, it (and other foreign governments) could gather sensitive information on Americans through other means, including buying it from data brokers.
The Subcommittee views privacy and kids' online safety as companion policies. The fact that the subcommittee included proposed legislation focused on child online safety, such as the Kids Online Safety Act (KOSA), and privacy, such as APRA, indicates that the Chairman and others viewed these discussions as related. Last week, a House version of KOSA was introduced, as a revised version of the bill in the Senate gained further momentum (reaching 67 co-sponsors). In addition, the subcommittee included Ava Smithing, director of advocacy at the Young People’s Alliance, as a witness in the hearing. Smithing’s testimony made the connection between the collection of personal data and how that data can then be exploited to cause significant and direct harm to young women by promoting eating disorder content and ads. Multiple members asked Smithing to articulate the connection between privacy and policies that seek to address design features.
What follows is a lightly edited transcript. Please refer to the hearing video when quoting the speakers.
Rep. Gus Bilirakis (R-FL):
Good morning everyone. The subcommittee will come to order. The chair recognizes himself for an opening statement. Again, good morning, and welcome to today's legislative hearing to examine solutions to protect kids online and safeguard Americans' data privacy rights. First, I want to welcome our new subcommittee members, Representative Obernolte and Representative James. I'm not sure they're here somewhere. Okay, very good. Excellent. Alright, perfect timing. Perfect timing with business acumen. They have a lot of technical expertise and are very knowledgeable in these areas. These esteemed members will bring new ideas, in my opinion, to the subcommittee. They will add greatly to our subcommittee, so welcome.
I'm looking forward to working with both of them. During the 118th Congress, our subcommittee held multiple hearings to examine the need for federal data privacy and security law. These hearings illustrate the need to provide certainty for Americans to know their rights online and for businesses to know their obligations on a consistent basis throughout the country.
It will also help cement America's global leadership and ensure we remain competitive in this evolving landscape. The bipartisan work has culminated with the discussion draft of the American Privacy Rights Act, which is a comprehensive privacy and data security standard. This bipartisan, bicameral bill gives Americans the right to control their personal information, including how and where it's being used, collected, and stored. This legislation establishes a national standard, one national standard long overdue. It preempts the patchwork of state laws, so when consumers and businesses cross state lines, there are consistent rights, protections, and obligations. The discussion draft creates requirements for companies contributing to the data ecosystem while protecting small businesses trying to provide for their customers. The bill also directs strong data security standards that minimize and protect against data being used by bad actors and provides Americans notice of their data being transferred to a foreign adversary like China, but it allows consumers the choice to opt out. Very important.
We're also discussing proposals that require age verification for certain websites and social media companies to streamline terms of service labeling and allow third-party software providers to make social networks safer. There are also two bills that received significant attention in the Senate: Representative Walberg’s Children's and Teens Online Privacy Protection Act, or COPPA 2.0, and my bill, the Kids Online Safety Act, or KOSA. I'm proud to collaborate on these kids' privacy and online safety measures with our subcommittee Vice Chair Representative Walberg, and I thank him for his long-standing leadership in this space. I am also grateful to my fellow Floridian and colleague Representative Castor; on a bipartisan basis, we've worked on these particular bills, and I appreciate her cooperation. I'm looking forward to continuing to work toward the passage of these critical safety measures. We know that Big Tech has failed, ladies and gentlemen, to prioritize the health and safety of our children online, resulting in a significant increase in mental health conditions, suicide, and drug overdose deaths. We've heard stories over and over and over again in our respective districts, and it is just awful. We've got to do something about it. It's time for Big Tech to be held accountable for facilitating this activity and manipulating our kids to keep them addicted to their screens for longer than ever before. I'm glad to have Ava Smithing.
Ava, we had a conversation. She's so impressive. Ava is from the Young People's Alliance, here to share her personal story about the harm she experienced as a young woman related to social media. Ava, again, thank you so much for being here. I'm also looking forward to hearing from Kara Frederick, who can share her experience as a team lead at Facebook. She understands how these companies operate and how to curtail the harms of their products. In closing, I want to thank the witnesses for their testimony in advance, and I want to thank our great chair, Representative Rodgers, for her historic bipartisan proposal. She's done such an outstanding job on this, and I know it's a priority for her, too, a priority for me and all of us here on the committee. It's true, and then of course, the ranking member who's worked with her. We passed a similar bill last year at full committee. We're going to get this done this year. And then, I want to thank my ranking member, Ms. Schakowsky, for her cooperation. All she's done, let's get this done together for the sake of all Americans, especially our children. I'll now recognize the gentlelady from Illinois, Ms. Schakowsky, for five minutes to give her opening statement. You're recognized.
Rep. Jan Schakowsky (D-IL):
Well, thank you so much, Mr. Chairman. Let me just say I think this is a really good day. There may be a lot of differences that we may have across the aisle that show up all too frequently, but this is not one of those days. I am really happy today because we are finally getting back to the bipartisan business of protecting consumers online. So this is really an important day. I know that there are 10 bills that we're going to be talking about. I'm going to focus on the one that I have really spent a lot of time on along with Gus, the chairman and the ranking member of this committee, to talk about how we are going to get back to legislation to protect Americans' data privacy.
This is an opportunity that we have that is long overdue. Americans have been feeling the threats that they have as consumers, as business people, and certainly for our children for far too long. We have a history now of bipartisanship. We were able to pass legislation out of this committee in the last congress, and I want again to thank all of the participants on both sides of the aisle who made this happen. But right now, still consumers find that companies are tracking all of their data on where they go and who they talk to. All of these things are an open book right now because of big tech and data brokers. Once that data is out, then you have data brokers who buy and sell this information. You ask people on the street, do you know what a data broker is? I guarantee you nobody has really heard of that, and they don't know that their data is part of a business and profit-making. Most people also don't know that almost 80% of people around the world are protected by national privacy laws, but Americans are not.
So, it is definitely time to act. Just since 2002, we have seen 12 states now go ahead and move on data privacy. So we need a national law. There is no question. And this bipartisan legislation, the American Privacy Rights Act, that has been introduced now by Chairman Rodgers, and I think lots of co-sponsors will be getting to work on this, brings us to the right place that we need to go. The bill builds on our privacy work over the last several years, and this is now definitely our time to act. Again, back to data brokers. Now we know that under this legislation, they will not be able to help scammers get the information they need to go after consumers. We know that consumers will be able to opt out of algorithms that could jeopardize their freedom to buy houses and their employment or even their health information. I also want to tell you one thing that's very important to me, and that is we want to make sure that we have strong protections for data. What is that?
What is it? Fingerprints. Fingerprints or DNA, which we want to make sure is protected. So we have a good deal of legislation that still needs to be done. And let me just mention that we aren't finished yet. So, this legislation is not complete. It is still a work in progress, but we are making that progress right now. And I know that when we would have witnesses come before us, to a person, when the issue of data came up, Republicans, Democrats, experts, people in the business would talk about the need for us to finally get to having, in the United States of America, data privacy legislation. So there's really been unanimity among all the players that we do this. So let me just say I'm so grateful to be part of this effort. I know that we're going to run into things that we have yet to negotiate, but I am absolutely confident with our chairman, with our ranking member, and all the others that we are going to be able to move forward. And with that, I will yield back.
Rep. Gus Bilirakis (R-FL):
The gentlelady yields back. I look forward to continuing to work with you. It's been a pleasure. Now we'll recognize the chair of the full committee, my good friend Mrs. Rodgers, for five minutes for her opening statement.
Rep. Cathy McMorris Rodgers (R-WA):
Good morning, and thank you, Mr. Chairman. Right now, the average American spends nearly seven hours online a day with two and a half hours of that being spent on social media platforms. The consequences range from increased suicide rates and depression to increased polarization and loss of trust in our institutions. All the while, these companies are collecting nearly every data point imaginable, which they then use to control what we see and when we see it. Many of these tools were created with good intentions to bring us together, but over time, they've strayed from their core mission in the interest of more control over our lives. This isn't the American dream. America was built on individual liberty and freedom, freedom of thought, expression, and speech. Our founders fought for these freedoms because they were tired of living under the crushing weight of tyranny. They were tired of being told how to think and how to live by a few elites who thought that they knew best and who exploited others for their own power and wealth.
Fast forward to today, we find ourselves living in a similar modern form of digital tyranny where a handful of companies and bad actors are exploiting our personal information, monetizing it, and using it to manipulate how we think and act. Many companies are using their control over our data to erode people's agency, their thoughts, their rights, and their identity. It's time for the status quo to change. Today we'll be discussing several pieces of legislation, including the American Privacy Rights Act, to give people the right to control their personal information online and not have it used against them. We're putting people back in control of who they are, what they think, and how they live their lives. And this is especially necessary for our children. As a mom of three school-aged children, big tech platforms are my biggest fear. The algorithms developed by these companies are specifically designed to get kids addicted to their platforms, and they've been used to target children with content that leads to dangerous, life-threatening behaviors.
For me and my husband, as well as millions of parents across the country, this is a battle for our kids' development, their mental health, and ultimately their safety. We must build a better future for our children. They are our future. The American Privacy Rights Act is foundational to protecting our kids online, working together with other important legislation like the Kids Online Safety Act, the Children's and Teens Online Privacy Protection Act, and other bills that are also being discussed today. These solutions will ensure the best protections to date for our children. Today, we find ourselves at a crossroads. We can either continue down the dangerous path we're on, letting companies and bad actors continue to collect massive amounts of data unchecked while they trample on core American values like free expression, free speech, and identity, or we can give people the right to control their information online. Congress has been trying to pass data privacy and security legislation for decades.
We are at a unique moment in history where we finally have an opportunity to imagine an internet that will foster massive economic innovation and growth and truly be a force for good. I'd like to thank Senator Cantwell for working with me on this landmark draft bill. I'd also like to thank Ranking Member Pallone, who has been a trusted partner over the years as we've worked together on privacy. We would not be here today without his tireless efforts and leadership, and I look forward to continuing to work with him to strengthen the privacy protections for Americans. I'd also like to thank Chairman Gus Bilirakis, Ranking Member Jan Schakowsky, and the members of this committee, Republicans and Democrats who are standing together here today, many of whom have been involved in these efforts over several congresses, and I am looking forward to working together on behalf of the American people to get this legislation through Congress and signed into law. I yield back.
Rep. Gus Bilirakis (R-FL):
I appreciate it, Madam Chair. I'm fired up. We have to get this done, so thank you very much. I'm going to recognize the gentleman from New Jersey, the ranking member of the full committee, Mr. Pallone, for five minutes for his opening statement. You're recognized, sir.
Rep. Frank Pallone (D-NJ):
Well, thank you, Mr. Chairman, and I'm fired up too. Obviously, we think this bill is very important, and we're working on it in a bipartisan way with Chair Rodgers, yourself, and Ranking Member Schakowsky; we do need to get this done, so we are taking a significant step forward today with a strong comprehensive data privacy and data security protection bill. For far too long, Americans have been virtually powerless against big tech's unceasing drive to collect, use, and profit from the sale of vast amounts of Americans' personal information. Last Congress, as chair of this committee, I was proud to work with then-Ranking Member Rodgers and subcommittee leaders Schakowsky and Bilirakis to take bold action to protect Americans' personal information. The American Data Privacy and Protection Act was the first bipartisan and bicameral comprehensive data privacy legislation in decades. It was reported out of this committee with a 53 to 2 vote, and that historic legislation included strong federal data privacy and security standards that put people back in control of their personal data, curbed data collection abuses by big tech, reined in the shadowy world of data brokers, and provided important protections to keep kids safe online.
So I'm pleased that the American Privacy Rights Act discussion draft today adopts many of the key pillars of the ADPPA with data minimization rather than notice and consent as its foundation. Notice and consent as the basis for a privacy regime imposes unreasonable burdens on consumers, and it simply does not work. By contrast, data minimization limits the amount of personal information entities collect, process, retain, and transfer to only what is necessary to provide the products and services being requested by the consumer. And that means no more flashlight apps collecting and sharing geolocation information. No more dating apps gathering health-related information to use for targeted marketing. No more wellness apps selling mental health information to data brokers. This discussion draft combines data minimization with provisions that empower consumers to access, correct, delete, and port their personal data, opt out of targeted advertisements, and prohibit data brokers from collecting their personal information.
And there are several key areas where I believe it can be strengthened, starting with the area of children's privacy. I've long said that any comprehensive privacy law must provide heightened privacy protections for children. This new draft recognizes that information about children is sensitive, but it does not provide many of the specific protections for children that can be found in the ADPPA. To start, we should explicitly prohibit targeted advertising to children, who often cannot distinguish between advertising and non-advertising content. We should also require companies to incorporate privacy by design in their practices and to adopt policies, practices, and procedures that take special care to identify, assess, and mitigate privacy risks with respect to children. And we should also consider establishing a youth privacy division at the FTC to ensure that substantial resources are provided to protect children's privacy. We need to ensure that COPPA 2.0, which is one of the bills under consideration today, provides sufficiently robust privacy protections for children.
It provides for data minimization but leaves websites and apps largely free to collect, use, and disclose minors' information after obtaining consent from a teen or the parent of a child. COPPA 2.0 would actually provide children and teens with less robust privacy protection than those provided to adults in the American Privacy Rights Act. We should also explore whether there are additional tools that we can give consumers to control the data in the possession of data brokers. The chair's discussion draft directs the FTC to create a single mechanism that would allow consumers to opt out of future data collection by all data brokers. ADPPA went one step further and directed the creation of a universal deletion mechanism, which would allow consumers to direct all data brokers to delete their information. Without such a provision, consumers who don't want data brokers retaining and selling their data would have to visit hundreds of data broker websites and opt out of each one.
Furthermore, in a digital society, privacy rights are civil rights. The combination of artificial intelligence and personal data can be weaponized to deprive people of the equal opportunity to find housing, look for a job, or receive information about goods and services. As we advance comprehensive privacy legislation that includes provisions on algorithmic accountability and discrimination, we should examine whether the current legislation adequately reflects what we've learned about AI, particularly generative AI, since ADPPA moved through this committee two years ago. So I look forward to hearing from our witnesses and other stakeholders about ways we can strengthen the discussion draft beyond what I've highlighted. I'm optimistic that we'll be able to get comprehensive privacy legislation across the finish line, and I'm committed to working with Chair Rodgers and my colleagues to get it done. I also want to hear more about the other bills on today's agenda. With that, I know I'm over my time. Mr. Chairman, I yield back.
Rep. Gus Bilirakis (R-FL):
Thank you. I thank the gentleman. The gentleman yields back, and now we'll hear from our witnesses. Our first witness is David Brody, managing attorney of the Digital Justice Initiative at the Lawyers' Committee for Civil Rights Under Law. Thanks for being here, sir. You're recognized for five minutes.
David Brody:
Chair Bilirakis, Ranking Member Schakowsky, and members of the subcommittee, thank you for the opportunity to testify today. My name is David Brody, and I'm the managing attorney of the Digital Justice Initiative at the Lawyers' Committee for Civil Rights Under Law. The Lawyers' Committee uses legal advocacy to achieve racial justice, fighting inside and outside the courts to ensure that Black people and other people of color have a voice, opportunity, and power to make the promises of our democracy real. We care about privacy because it ensures that who we are cannot be used against us unfairly. Privacy rights are civil rights. The lack of a federal privacy law enables discrimination and other harms. Data about Black communities and other historically marginalized groups often reflects the history of inequality and segregation in this country. Tech companies collect that data, feed it into algorithms, and use it to make life-altering decisions. Attached to my testimony is an appendix documenting hundreds of examples of how these practices lead to discriminatory harms and unequal access to goods and services.
Consumer data also fuels disinformation campaigns by foreign adversaries that seek to undermine American democracy. That is why we are encouraged by the bipartisan and bicameral American Privacy Rights Act. I would like to thank Chair Rodgers and Senator Cantwell for producing this impressive achievement. The Lawyers' Committee looks forward to working with both chambers to strengthen it. The American Privacy Rights Act will establish building codes for the internet. Strong data protection rules are the infrastructure for American leadership in online commerce. This foundation includes data minimization, civil rights and consumer protections, transparency, data security, individual control, and multi-layered enforcement. The act has several key improvements over past bills. It prohibits forced arbitration of claims involving discrimination. It provides the right to have major decisions made by a human instead of AI. It has stronger protections for health data, and it prohibits dark patterns that undermine individual autonomy. The bar for federal legislation has risen in the last two years as states enact more privacy and civil rights protections.
California continues to strengthen its privacy laws and regulations. Maryland, Maine, and Vermont are advancing comprehensive privacy bills. Based on this committee's work, Washington enacted protections for health data. Federal legislation must be at least as strong as the state laws to justify preemption, but while residents of some states may enjoy data protections, they're the minority. Many state laws, such as Virginia's and its copycats, are inadequate. Other states have no laws at all. Nationwide, most people are being left behind. We cannot afford to wait. The American Privacy Rights Act represents an imperfect but needed bargain to protect everyone's rights. First, the bill would prohibit discriminatory uses of personal data and require companies to test their algorithms for bias. Algorithmic decisions should be based on individual merit, not stereotype shortcuts. Expediency is no excuse for segregation. However, the current civil rights provision contains an exception that could allow advertising that segregates based on protected traits.
This could allow the return of whites-only solicitations. Fixing this provision is easy but important. Second, the bill would require companies to collect and use only as much personal data as is necessary, proportionate, and limited to provide the services consumers expect. The bill also provides transparency and individual rights to access, correct, delete, and port personal data. These protections build consumer trust, help uncover discrimination, and reduce the risk of fraud, theft, and deceptive practices, which disproportionately impact communities of color. However, service providers for government entities need to be covered by the bill as they were in the ADPPA. Third, we applaud the enforcement authority that this bill vests in federal, state, and individual actors. The ability to bring a private lawsuit is particularly important for communities of color that historically could not rely on the government to vindicate their rights. However, we are concerned that this bill has backtracked from the ADPPA by narrowing the private right of action for violations involving sensitive data.
Lastly, the bill gives important new responsibilities to the FTC. The FTC, however, has been underfunded for decades and needs new resources. In addition, the displacement of the Communications Act is vague and overbroad. This could endanger the FCC’s consumer protection authorities and its work to combat illegal robocalls. It is time for Congress to act. The internet is not coded on a blank slate. The future of equal opportunity depends on whether we prevent today's data-driven economy from replicating the mistakes of the past. The promise of the internet and the democratic aspirations imbued in its creation depend on it. Thank you.
Rep. Gus Bilirakis (R-FL):
I thank the gentleman for his testimony, and now I recognize our next witness, Maureen Ohlhausen, co-chair of the 21st Century Privacy Coalition. You're recognized for five minutes.
Maureen Ohlhausen:
Chair Bilirakis, Ranking Member Schakowsky, Chair McMorris Rodgers, Ranking Member Pallone, and members of the subcommittee, thank you for the opportunity to testify at this important hearing on legislative solutions to protect kids online and ensure Americans' data privacy rights. I'm Maureen Ohlhausen, co-chair of the 21st Century Privacy Coalition and a partner at Wilson Sonsini. I also had the honor of serving as an FTC Commissioner and Acting Chairman. I'm testifying today on behalf of the coalition. We commend Chair McMorris Rodgers and Senate Commerce Chair Cantwell for the release of the American Privacy Rights Act discussion draft. The coalition, which has advocated for comprehensive national privacy legislation for a decade, believes that this draft shows potential for a bipartisan path forward on this urgently needed legislation. We look forward to engaging with you as it moves ahead. All of us share a desire for strong consumer privacy protections that apply uniformly throughout the nation, based on the sensitivity and use of data, and which allow consumers to continue to benefit from countless services and technologies.
Consumers' personal information should not be subject to varying protections because of the state they are in or the entity collecting such information. Federal legislation should also provide strong enforcement against the misuse or disclosure of consumer data that could result in harm while also allowing companies to develop innovative products. The draft incorporates a number of foundational elements for privacy legislation. First, it is strong and comprehensive, addressing issues such as transparency, consent and other consumer rights, data security, and the relationship between companies, vendors, and third parties. Second, the draft designates the FTC as the federal enforcer and permits state attorneys general to assist the FTC with enforcement. Third, as a former Acting Chair of the FTC, I particularly appreciate that the draft provides the FTC with several useful enforcement tools to protect consumers from privacy harms, such as civil penalty authority for a first violation, limited rulemaking authority, consumer restitution, and jurisdiction over common carriers.
Such tools should also be accompanied by appropriate guardrails to ensure that the FTC does not exceed its authority. Fourth, the draft provides a national privacy and data security framework that generally preempts state laws and regulations. American consumers and businesses deserve the clarity and certainty of a single federal standard for privacy. Fifth, the discussion draft recognizes that legacy privacy requirements in the Communications Act must be preempted. This would allow a holistic approach to consumer privacy under FTC oversight based on the type of information collected rather than the legacy regulatory history of the entity collecting it. We believe, however, that the draft raises several concerns that warrant further consideration and discussion. First, although the draft would preempt FCC privacy and data security authority, it stops short of preempting the FCC's data breach notification authority. To facilitate a consistent approach to the bill's privacy and data security requirements, the FCC's authority should be eliminated.
Second, the draft appropriately seeks to replace the Communications Act provisions addressing video privacy requirements with equivalent protections enforced by the FTC, but the draft language could unintentionally cause significant disruption to common and beneficial practices in the TV marketplace. Third, the bill should better reflect a risk-based approach based on the nature of the relevant information and its usage. While we appreciate that first-party marketing is included as a permissible purpose, we are concerned that sensitive information is not included in this exception to the bill's data minimization requirement. Given how broadly the discussion draft defines sensitive data, the draft would undermine the ability of communications providers to tailor offerings to existing customers based on how they use our services. Fourth, while the draft seemingly provides broad state preemption, it includes exceptions that may unduly limit its application. Permitting states to adopt privacy-specific laws would be problematic, as would allowing plaintiffs to invoke broad types of claims to circumvent the bill's prohibition on privacy-specific laws. We would be glad to engage with you further on this proposed provision. Fifth, by adopting an overly broad definition of the term substantial privacy harm, the draft would abrogate arbitration agreements while inviting class action lawsuits that would undermine compliance with the legislation. Thank you again for the opportunity to participate in today's hearing, and I look forward to your questions.
Rep. Gus Bilirakis (R-FL):
I thank the gentlelady. Our next witness is Ava Smithing, director of advocacy at the Young People's Alliance. You're recognized for five minutes.
Ava Smithing:
Chair Bilirakis, Ranking Member Schakowsky, Chair McMorris Rodgers, Ranking Member Pallone, and members of the subcommittee. It's an honor to testify before you today.
I hope to strengthen your understanding of the issues we're discussing today with my own personal story and the knowledge and experience of the young people I have the privilege of representing through the Young People's Alliance. My name is Ava Smithing, and I was one of the teenage girls on Instagram with an eating disorder. I say this all the time; I've said it to many members of your staff, but there are no words I could use that would describe in full the dehumanization that comes with valuing your appearance over your health and well-being, or the pain and anger that comes from knowing this was inflicted on me for profit. I first downloaded social media when I was 10 or 11 years old. There was a brief moment when the platforms did as they promised and peacefully connected me to my friends and new ideas.
Then Facebook bought Instagram, and everything changed. We could have never imagined what would happen and what would be ushered onto social media in the coming years, along with algorithmically recommended content and targeted advertising. Soon after this change, between photos of my friends and family appeared advertisements of women with unrealistic bodies; my natural tendency to compare, and therefore my inclination to pay attention to these posts, was taken by Instagram as an invitation for more like them. The company's ability to track engagements, such as the duration of time I looked at a photo, revealed to them what would keep me engaged: my own insecurity. They stored my insecurity as data and linked it to all of my accounts across the internet. They used my data to infer what other types of ads and content I might like, leading me down a pipeline from bikini advertisements to exercise videos to dieting tips and finally to eating disorder content.
I have a very specific memory of one post titled "Ballerina Diet," suggesting that a daily intake of only a black coffee, an orange, and 16 almonds would keep me thin, amid a sea of photos of unrealistically thin women. This ballerina diet was my life raft. The data they collected represents my greatest vulnerability: I wasn't thin enough. I interacted with one picture of one thin girl one time, and that's all I was ever able to see. For 10 years, I was trapped inside this echo chamber, where social media takes a classic American beauty standard and puts it on a loop in front of my face while also showing me in real time the huge number of likes and comments of adoration these posts were getting and reminding me of the limited number mine was getting. Societal validation through thinness, in my specific case, is the carrot.
Instagram, Pinterest, and TikTok are the stick, and I, along with the rest of America's youth, am the donkey. None of this would've happened to me if we had a national data privacy standard that ensured data minimization and gave me the ability to correct these ill-informed inferences. None of this would've happened to me if we had the Kids Online Safety Act, which ensures companies exercise reasonable care to mitigate harm. None of this would've happened to me if I had the option to opt out of algorithms that use my data to target me with harmful content, or better yet, if these algorithms had been defaulted off. I harbor no resentment. I understand the unprecedented nature of social media made it impossible for us to know what was coming and to properly act to prevent these harms, and I'm grateful for the progress we have made thus far, but we desperately need to pass legislation like the Kids Online Safety Act to protect against downstream harms caused by specific design features.
We need to pass COPPA 2.0 to update that old tired privacy law and we need to pass comprehensive data privacy to protect all Americans. Whatever steps we take to protect kids alone will eventually prove fruitless if the democracy they grow up in is too fractured to function. Data privacy will protect users from the harm and polarization caused by social media upstream by limiting the information platforms can collect on them and use to sort them into echo chambers. Data privacy will ensure that future iterations of exploitation of our data are protected against and create a framework for which we can.
Samir Jain:
It would give individuals greater visibility and control by establishing a data broker registry through which a person could submit a request for data brokers to no longer collect data about them. Another bill under consideration at the hearing today, the DELETE Act, would go a step further and establish a centralized mechanism through which individuals could seek the deletion of their information. In addition, the House will soon be voting on the Fourth Amendment Is Not For Sale Act, which would prohibit law enforcement and intelligence agencies from collecting certain information by purchasing it from data brokers instead of going through the legal process that would otherwise be required. We urge Congress to pass this law as an initial but critically important step. While APRA's basic framework is strong, it will need further refinement as it moves through the legislative process. For example, a comprehensive privacy law that establishes baseline protections for everyone is also the right context in which to consider what additional protections are needed for kids.
The ADPPA would've provided explicit protections for children, including a ban on targeted advertising for those under 17 and on transfers of children's data without consent; these types of bipartisan reforms would provide meaningful and important protections for children. Some of the requirements in other proposals, such as the Kids Online Safety Act, while well-intentioned and pursuing an important goal, do raise some concerns. Legislation that restricts access to content because government officials deem it harmful can harm youth and present significant constitutional issues. Further, requirements or strong incentives to adopt age verification systems to identify children often require further data collection from children and adults alike, and thereby can undermine privacy and present their own constitutional concerns. CDT is encouraged by the release of the bipartisan, bicameral draft of APRA and thanks this committee for the work to advance comprehensive privacy legislation. We look forward to working constructively with you to make any necessary changes to APRA and to help move it forward through the legislative process and finally into law. Thank you, and I look forward to your questions.
Rep. Gus Bilirakis (R-FL):
I thank the gentleman. The gentleman yields back. Our final witness is Katherine Kuehn, a member of the board of directors and CISO-in-Residence at the National Technology Security Coalition. You're recognized for five minutes.
Katherine Kuehn:
Thank you. Chairman McMorris Rodgers, Ranking Member Pallone, Chairman Bilirakis, Ranking Member Schakowsky, and members of the committee, thank you for the opportunity to testify today on the American Privacy Rights Act of 2024. My name is Katherine Kuehn, and I'm a member of the National Technology Security Coalition Board of Directors and serve as their CISO-in-Residence. Established in 2016, the National Technology Security Coalition is a nonprofit, nonpartisan organization that serves as the preeminent advocacy voice for the chief information security officer, chief privacy officer, and senior security technology executives. The CISO is the senior technology risk executive who is responsible for the information and data security of an enterprise. We are charged with the protection of the enterprise from information security risk, be it from nation states, cybercriminals, hacktivists, or unknowing employees committing non-malicious violations of the organization's policy. The CISO is on the front line of securing our nation's data, individuals' private information, and our nation's critical infrastructure.
I sit before the committee today with over 25 years of experience leading and advising cybersecurity, technology, and innovative AI (artificial intelligence) strategies and teams to help the public and private sectors achieve more informed risk decisions. I have strived in my career to maintain a comprehensive understanding of all facets of the cybersecurity ecosystem, acting in numerous capacities on both the practitioner and operator side. My career as a risk executive actually started here in Washington. I was an intern 30 years ago for Senator Tom Harkin, and one of my responsibilities was to create his first website. I was given a book, HTML for Dummies, and told to build my first site. I remember when we launched the first page that, as proud of it as we were, I was concerned about the security and wondered whether there would be ways in the future to take the data we provided and the people we highlighted on the site and use it in negative ways.
It was the first time I'd ever had a concern about the bright future the still very new internet had, promising to achieve amazing things for us, while still being concerned about the security around it. My career in cyber was born. All these years later, now as a mother of five, twin 15-year-old daughters, a 14-year-old son, an 11-year-old daughter, and a 2-year-old son, a COVID surprise, I still believe that the future looks bright, but I'm concerned about the digital revolution we have witnessed, what future my children will face when it comes to privacy, and where the internet goes. It gives me pause. As a career risk executive, it's no surprise to me that the complexity of consumer data privacy has brought us here today. In a recent study, it was highlighted that nine out of 10 Americans consider their online privacy to be an important issue, with 85% of global adults wanting to do more to protect their online privacy.
I reflect on a comment made by Vint Cerf, widely accepted as one of the fathers of the internet and my mentor. In 2013, he said that privacy may be an anomaly. I remember disagreeing with the comment then, and now, over 10 years later, with the rapid acceleration of social media, data mining, and targeted influence campaigns, it's clear to me that privacy with respect to our data cannot be an anomaly and must be protected. Today, American consumers and corporations navigate a complex landscape of state-specific privacy laws. These state laws, while pioneering, create a patchwork of disparate regulations that can be confusing for consumers and burdensome for businesses. Seventeen states have enacted privacy laws and regulations, with another 18 states actively pursuing various pieces of legislation. In the absence of a federal privacy law, the average consumer has little to no understanding of the protections their state offers, with one in four Americans being asked to agree to a privacy policy every day and the potential ramifications for their privacy differing from state to state.
In addition, there is a risk that states could compete by offering loose regulations to attract business investment, leading to a race to the bottom in terms of privacy standards. This introduces new areas of risk, especially with the rapid adoption of generative AI in both the consumer and business space. States trying to offer more comprehensive consumer data privacy could end up being penalized for trying to do the right thing. While state-level protections are noble in thought, inaction from the federal government has the potential to hurt American consumers and businesses, and potentially the states themselves. Individual data protections are not created equal. One of the most significant advantages of APRA is the centralization of privacy standards. Under the current system, businesses must adapt to the varying requirements of different state laws, which can be inefficient and costly. This fragmentation not only affects businesses but also leaves gaps in protections for consumers, depending on their state of residence.
With two-thirds of global consumers feeling that tech companies have too much control over their data, a federal standard under APRA would be a big step forward in ensuring that all American consumers receive the same level of privacy protection regardless of where they live. This uniformity helps simplify the legal framework for businesses, particularly small and medium-sized enterprises that might lack the resources to navigate regulatory environments. The National Technology Security Coalition supports the American Privacy Rights Act of 2024 as it marks a significant improvement in the landscape of consumer privacy protections in the United States. As you continue to work on the federal privacy standard, please consider the National Technology Security Coalition a resource moving forward. Thank you for the opportunity to appear before you today.
Rep. Gus Bilirakis (R-FL):
Thank you very much. I appreciate it. Now, I'll recognize myself for five minutes of questioning. Ms. Smithing, thank you for sharing your personal story with us. Each of us here welcomes your perspective and insights into how we can protect our kids online. As you know, I recently introduced the House companion to KOSA along with my prime co-sponsor, Kathy Castor, alongside several of my colleagues. Can you speak to how design obligations could have spared you so much harm? I know that you touched upon this, but I want to give you more time; it's still very important. Please.
Ava Smithing:
Yes, of course. Thank you, sir, for the question. I believe design-specific legislation is incredibly important because while data minimization could have limited the amount of information they had on me, features such as likes, which led me to negatively compare myself to other people, aren't covered by that legislation, and covering them could have spared me a world of hurt. Beauty filters are another feature that is harmful for young women. I know sometimes celebrities will even post pictures with these beauty filters on, save the photo to their phone, and then re-upload it, so there's no tag that the beauty filter is there, which sets an absolutely unrealistic and unfair standard for young girls who look up to these women. Design features like this need to be covered, and KOSA does so.
Rep. Gus Bilirakis (R-FL):
Thank you. I appreciate it. Ms. Frederick, thank you for your testimony today. You've been a tremendous voice in showcasing the harms perpetrated by big tech. We appreciate that. With respect to KOSA and the American Privacy Rights Act, how will these bills curb the power and subsequently give control back to the American people?
Kara Frederick:
Well, first and foremost, as a general theme, they hew toward self-governance, which is how we as Americans operate and how we want to live our lives. We at the Heritage Foundation are big advocates of transparency, especially when it comes to the legitimate harms that big tech companies can visit on consumers and their children specifically. So what it does is provide that transparency, but I think critically, it provides the teeth that I talked about in my opening statement. And this is very important because if you want to get the attention of executives in these big tech companies, you have to have some sort of enforcement mechanism, like the private rights of action that are internal to the American Privacy Rights Act.
That is absolutely critical because you have to let them understand that what they care about the most, their bottom line, will be at stake should they continue to have that gratuitous imbalance between the consumer and these big tech companies themselves. So I think that the accountability mechanism following transparency is absolutely critical to put the big tech companies on notice.
Rep. Gus Bilirakis (R-FL):
Thank you very much. Okay. Comprehensive data privacy has been a major issue over the last 20 years. As you know. I fear our time is running out to get something done. We've got to get it done now. So very quickly down the panel, we'll start over here. Yes or no, do you think this is the best chance we have to get something done on comprehensive data privacy? We'll start from here, please.
David Brody:
Yes.
Rep. Gus Bilirakis (R-FL):
Thank you. Yes, yes, yes, yes, yes. Alright. It's unanimous. We appreciate that. So I know, listen, we have a lot of work to do, but we've got a great start thanks to our chairperson. And let me tell you, the staff has been outstanding. So let's work together, let's work together to get this done, and I'm going to go ahead and yield back. Recognize the ranking member, Ms. Schakowsky for her five minutes of questioning.
Rep. Jan Schakowsky (D-IL):
Thank you so much. To the witnesses, the American Privacy Rights Act finally feels like it's close, and we're finally going to be able to protect, I hope and believe, the data privacy of Americans. But I want to ask you, Mr. Brody, why is it so important that we do this and that we do this now?
David Brody:
We're at an inflection point for the future of the internet. As AI and other advanced tools emerge, we can't afford to replicate the mistakes of the past. Every single day we're seeing new data breaches. Every single day we're seeing new research showing how this tool or this algorithm or this thing is disadvantageous to one group or excluding people from opportunities. There are endless reports of people being denied equal opportunity in housing, employment, education, and credit. We're dealing with people being exposed to stalking and identity theft and fraud, and we are facing threats of disinformation from hostile actors. We can't afford to wait.
Rep. Jan Schakowsky (D-IL):
Thank you. Also, another question for you. The discussion draft talks about protections for data like fingerprints and DNA. So if you could talk to me about why we should especially look at that.
David Brody:
So biometric and genetic information is some of the most sensitive information we have. It's immutable. If it's compromised, you can't change it; you can't get it back under control. And so we have to be extremely protective of it and make sure that it gets treated in the appropriate way. We've also seen that this information is often used in discriminatory ways. So, particularly when we're talking about facial recognition technology repeatedly over and over, it's been shown to be less accurate and discriminatory against women and people of color. And as that technology is increasingly being used in places of public accommodation, like retail stores and sports arenas, it poses a serious threat.
Rep. Jan Schakowsky (D-IL):
Thank you. Then also, we know that scammers can buy suckers lists from data brokers or use data that is out there illegally right now. And I wanted to ask Mr. Jain, how does this draft that we're working on right now address this problem of scammers?
Samir Jain:
Thank you for the question. Well, first of all, I think the data minimization requirements, which restrict both the initial collection of a lot of data and the transfer of that data except for permissible purposes or when necessary to provide the product or service, are going to mean a lot less of this data flowing through the ecosystem. Today, data brokers can essentially purchase any data that they want from anyone because there are very few restrictions. So they can buy data about your online activities, what you read, what you watch. They can buy data about your purchases, and they can put all of this together into profiles to assemble lists like what you're talking about: seniors who are vulnerable to scams, or seniors who have Alzheimer's disease. And then they can turn around and sell those to businesses, to advertisers, who can then use them to target ads in a really harmful way. And so by having data minimization, by having some of the other protections here, like the ability of consumers to say, hey, data brokers, don't collect my information anymore, those kinds of protections will really help rein in some of these really harmful activities from data brokers.
Rep. Jan Schakowsky (D-IL):
The more we can take away opportunities for the scammers, the better we're going to be. And you're right, a lot of seniors, they get asked for things like their social security number or whatever, and if we could cut back through minimization, we could really help reduce the scamming. So thank you very much. And with that, I yield back.
Rep. Gus Bilirakis (R-FL):
Agreed. Thank you very much. The ranking member yields back, and we'll recognize the chairman of the full committee, Mrs. Rodgers, for her five minutes of questioning.
Rep. Cathy McMorris Rodgers (R-WA):
Ms. Smithing, you're an inspiration to me and my daughters, who are 10 and 13, as well as millions of other young women across the country. Thank you for being here, and thank you for sharing your story. I appreciate your advocacy for KOSA, but also for a national data privacy standard in our nation, and you highlighted that we need both; we need a path so that these protections work effectively. In your testimony, you referenced that there's an arsenal of data being used against you. Would you speak to the American Privacy Rights Act, how you believe it would've helped stop this ammunition from being used to target you, and how it is an effective complement to KOSA in protecting kids online?
Ava Smithing:
Yes, thank you, Chairwoman, for this question. I think there are a lot of provisions in the Act that would've helped, specifically the data minimization, which would've prevented them from building this arsenal on me in the first place, especially how, under data minimization, sensitive covered data, which includes data from minors, requires express consent before it can be transferred to third parties. This would've been incredibly helpful and would've kept these poor examples of content from following me across the internet. Also, the ability to opt out of targeted advertising is incredibly important. Targeted advertising is one of the most harmful things we see on the internet, and if we had the option to opt out of it, that would be great. I will note that not all children have the ability to make the best decisions for themselves, and I think it's important that targeted advertising for minors specifically be defaulted off or banned, as is currently proposed in COPPA 2.0. Because they don't always have the full story, and these companies aren't being honest about how they're using their data, it's best that it's opted out for them.
Rep. Cathy McMorris Rodgers (R-WA):
Thank you. Ms. Frederick, welcome back. Heritage is a respected conservative voice, and I know we've had many conversations about why a draft like APRA is reflective of conservative values. Transparency is important to us. I know we also share concerns about government regulating speech, but do you see any reason why big tech shouldn't be subject to algorithm assessments and design evaluations? Wouldn't it be good that we require a company to give thought to the impact on Americans of the decisions that they make?
Kara Frederick:
Precisely. And what I think conservatives in particular have learned in the past few years is that private companies shouldn't be considered sacrosanct. They are just as capable of infringing on Americans' God-given rights as the government, and often do so hand in glove. So that recent history should inform the way that we conservatives look at specific strictures on private corporations in particular. And I will tell you from my experience, these companies say they're transparent, but you, the American public, only see what they want you to see. So there are times when those enforcement mechanisms need to be utilized in order for these tech companies to promote self-governance among American citizens.
Rep. Cathy McMorris Rodgers (R-WA):
Thank you. Ms. Kuehn, I really appreciate you being here. You have tremendous experience, 25 years protecting data and advancing new technologies, and your insights are important. I think you mentioned that the International Association of Privacy Professionals estimates that 80% of the world, 79% to be exact, is covered by some form of national data privacy law. Would you speak to the importance of a uniform federal standard when it comes to American leadership on the global stage, and whether this bill would be burdensome to implement? And can you elaborate on how important the data portability provision is? Many others on the panel are very supportive of that as well.
Katherine Kuehn:
The need for a national standard is critical. From an implementation standpoint, this would actually be a simplification for a lot of American corporations and for international organizations trying to do more business within the United States. One of the issues today is that with so many disparate state laws, it's very difficult to make sure they're maintaining proper privacy regulation between the states. So creating a national standard, one, gives us better parity with our international counterparts. If you think about the EU and look at GDPR, what's happening is very similar from that perspective. Thank you. So, one, it's an advancement forward. Two, from a protection standpoint, it simplifies, as a risk practitioner, what I have to do as far as maintaining privacy standards.
Rep. Cathy McMorris Rodgers (R-WA):
Thank you. Thank you. Commissioner Ohlhausen, would you share with us the members of the 21st Century Privacy Coalition?
Maureen Ohlhausen:
Yes. So the members of the 21st Century Privacy Coalition are AT&T, Comcast, Cox Communications, CTIA, DirecTV, T-Mobile, and USTelecom.
Rep. Cathy McMorris Rodgers (R-WA):
It's interesting to me that you're just now raising data breaches in the context of comprehensive privacy legislation. I'm sure that it had nothing to do with one of those members recently being in the news for a breach of 7 million customers and 65 million former account holders. I yield back.
Rep. Gus Bilirakis (R-FL):
I thank the chair, and I'll recognize the Ranking Member of the full committee, Mr. Pallone, for his five minutes of questioning.
Rep. Frank Pallone (D-NJ):
Thank you, Mr. Chairman. In today's big tech era, powerful social media companies, app developers, ed tech companies, and video game creators treat children's most sensitive data as a valuable commodity to be collected, used, and sold, all to line their pockets. The result is invasive commercial surveillance practices that can have a damaging effect on children's well-being. And as we draft a comprehensive privacy bill, we have to ensure that our nation's youth are provided robust privacy protections stronger than those provided to adults. My questions are for all of you. Mr. Jain, can you expand on your recommendation that this committee add robust privacy protections for children into APRA?
Samir Jain:
Sure. In thinking about this, it's important to consider why it is that children need additional protection. As you noted in your opening statement, one reason is that it's much more difficult for children to distinguish between advertising and editorial content. Children are unable to provide meaningful consent. We've heard from Ms. Smithing such eloquent testimony about the kind of harm that advertising can do to children. And I think with those in mind, we can think about what additional protections are necessary. So, one, let's just stop targeted advertising to kids, because we know that it causes so many different kinds of harm. Let's make sure that we bar transfers of children's information without parental consent. Kids can't give meaningful consent; they don't understand or appreciate what happens when their data goes into data brokers' hands and the ways they may be victimized. So let's say children can't consent to that; you really need parental consent. I think the provision that was added previously about creating a specific youth marketing division within the FTC makes a lot of sense. There's a lot of research going on right now about exactly how social media and other companies cause harm to kids and what steps can be taken to protect them, and I think creating expertise at the FTC on that issue will enable us to make better policy down the road.
Rep. Frank Pallone (D-NJ):
All right. I've got a bunch of questions. So let me go quickly. In your opinion, would a comprehensive privacy bill that provides protections to all Americans better protect children's privacy than kids' specific privacy legislation?
Samir Jain:
I think it is a good context in which to protect kids' privacy because I think once we establish a baseline that protects comprehensive privacy for all, including kids, so for example, the prohibition on dark patterns, which I think is particularly important for kids, then we can think about, okay, what additional protections do we build on top? And in doing that, we can make sure that we're doing that in a way that provides consistency, that we're not inadvertently creating lower standards for kids in some cases, and that we can put it together as an entire package. So I do think that makes a lot of sense.
Rep. Frank Pallone (D-NJ):
Now, I mentioned COPPA 2.0 in my opening, and I think it relies primarily on a notice and consent regime to protect kids' privacy. So let me go to Mr. Brody. Is notice and consent an effective way to protect the privacy of children and teens?
David Brody:
No, it's not. We've known for a long time that notice and choice just doesn't work. How many people in this room have actually read all the privacy policies that you're subjected to? I haven't, and I do this for a living. And particularly if you're talking about parents, there are just too many things to be done. Look, I just want my kid to be able to watch Bluey. I'm not consenting to someone building a dossier about them before they learn to read. It doesn't work. In other areas of consumer protection, we just require products to be safe.
Rep. Frank Pallone (D-NJ):
And let me go back to Mr. Jain. Does the data minimization provision in COPPA 2.0 adequately protect kids' privacy?
Samir Jain:
I don't think it does, because though there is a data minimization provision in COPPA 2.0, it primarily applies to just the collection of information, whereas in APRA, it would apply to the processing, transfer, and other aspects of data use as well. So I think that's important too. I think the standard is actually stronger in APRA, because the standard in COPPA 2.0 would be relationship-based. It allows you to process and collect data based on the relationship, and it says including when it's necessary to provide the service or product that the individual requested. By that language, it makes clear that it's actually broader than just that, whereas APRA would limit it to when it's necessary to provide the product or service or for the list of permissible purposes. So I think that APRA actually has stronger standards, and I think it illustrates the point you were raising earlier, which is why it's helpful to have this all in one place: we don't want to inadvertently protect kids less than we do adults.
Rep. Frank Pallone (D-NJ):
Let me get a data brokers question in here. Mr. Jain, under APRA, how would a consumer delete all of their personal information held by data brokers? And do you think that Congress should provide a centralized deletion mechanism for consumers who want all data brokers to delete their data?
Samir Jain:
Yes. Unlike ADPPA, APRA doesn't at this point contain a centralized deletion mechanism, which means that as an individual, you would literally have to go data broker by data broker to request deletion. That's really difficult, because in many cases you don't even know which data brokers have collected information about you, so you wouldn't know how to do that. Whereas if we create a centralized registry, a one-stop shop basically, where consumers can go and say, hey, I want all data brokers to delete information about me, that puts more control into the hands of consumers. So I do think that would be a welcome addition to APRA.
Rep. Frank Pallone (D-NJ):
Thank you. Thank you, Mr. Chairman.
Rep. Gus Bilirakis (R-FL):
Thank you. The gentleman yields back. I now recognize Dr. Bucshon from the great state of Indiana. I believe you're a co-sponsor of KOSA. Yep. I'll recognize you for five minutes.
Rep. Larry Bucshon (R-IN):
Thank you, Chairman Bilirakis, for calling today's hearing. Today, examples of Hoosiers' data being captured and used without their consent are all too common. Every day that passes, it becomes clearer that families and businesses need clear rules of the road for how their data is used online. The American Privacy Rights Act is a huge step forward in accomplishing that, and I appreciate the work that Chair McMorris Rodgers and her team did to get us to this point, as well as the Ranking Member. As a practicing physician who ran a medical practice for years before coming to Congress, I know that data-driven biomedical research is the lifeblood of clinical research, and privacy legislation must strike the right balance to avoid stifling biomedical research conducted for the benefit of patients. Indiana's privacy law includes a comprehensive research exception that allows clinical trial and other medical research data to be used by companies as necessary to develop new and better treatments for patients in need. It even states there must be a research exception for when the expected benefits of research outweigh the privacy risks. Ms. Kuehn, do you believe that our privacy legislation must be tailored to avoid the unintended consequence of potentially stifling biomedical research conducted for the benefit of patients?
Katherine Kuehn:
No, I do not. What I believe is, as we've talked about the opt-in and being able to have control of your data, I think it will actually help get more information into areas from a health perspective, because it gives more Americans the ability to be active in biomedical research if they choose to be.
Rep. Larry Bucshon (R-IN):
Okay. Fair enough. Anybody else have an opinion on that? No. Ms. Frederick, are there other lessons and policies that states have already enacted, such as differentiating between first- and third-party data for the treatment of customer loyalty programs, that this legislation should incorporate?
Kara Frederick:
I think in terms of third parties, given what we've seen with TikTok and states that have moved toward banning TikTok, obviously you need to layer some sort of comprehensive data privacy on top of that. But when it comes to SDKs and third parties, that is absolutely critical, because they can make end runs. So making sure that third-party data transfer is covered is absolutely critical. And then age verification for social media. Florida's done some version of that in HB 3. That is something that I think is excellent when it comes to social media, not just porn.
Rep. Larry Bucshon (R-IN):
Okay, great. In the 26 years since COPPA was enacted by Congress, the protections provided by that law have failed to keep up with the changes in the way online platforms operate and the way that millions of young Americans interact with the internet. The incentive to keep a user on a platform has led to children being fed content that keeps their attention. Sometimes, this content can even promote suicide, alcohol and tobacco use, eating disorders, and more, as has been described. I had a constituent who did the near-hanging type of thing from online media and died. As a father of four whose kids have grown up through this era, I empathize with parents who struggle to keep up and monitor what their kids are doing with the newest trends or apps. It's hard. This has led to permanent and sometimes deadly results for children.
That's why I'm a proud original co-sponsor of the Kids Online Safety Act, introduced by Chairman Bilirakis, and of COPPA 2.0, introduced by Representative Walberg, which will help protect our children from the dangers that they currently face online. I know the bills still need work, and that's why we have hearings, but I'm glad that they're under consideration today. Ms. Smithing, I'm glad that we're considering legislation that will help protect young Americans from online harms, but I also think that it is important that regulation does not bar children from participating in the online world, since digital literacy is necessary to succeed in the 21st century. What do you think about the trade-offs between protecting young people from harm while still allowing exploration and preparation for their adult lives in the digital world? It's a delicate balance, right?
Ava Smithing:
Yes, Congressman, it certainly is. I believe that if we put the onus on these companies to design their platforms properly, we won't have to bar young children from them, and there'll be a safe place for everyone to be. There won't necessarily be trade-offs if we design these platforms with young people in mind as opposed to profit.
Rep. Larry Bucshon (R-IN):
Okay. And Ms. Frederick, I have 34 seconds. Do you think these pieces of legislation strike that balance?
Kara Frederick:
I do.
Rep. Larry Bucshon (R-IN):
Okay. Fair enough. Yield back.
Rep. Gus Bilirakis (R-FL):
Thank you. I now recognize Representative Castor from the great state of Florida, who is the prime co-sponsor on the Democratic side of KOSA. We've been working on these issues for several years, and I appreciate your patience but also your cooperation. Thank you.
Rep. Kathy Castor (D-FL):
Thank you, Mr. Chairman. I want to thank Chair McMorris Rodgers and express my appreciation for her years of work on this, and congratulate you on the breakthrough in the American Privacy Rights Act, along with Ranking Member Pallone and all of the advocates here today. Americans value their personal privacy, and it's past time for Congress to act. We've got to rebalance the scales, because right now they are weighted too heavily in the tech platforms' favor. They track everything we do online, where we go, what we buy, and then they use that information to manipulate us and exploit us. This is particularly harmful to children and adolescents, whose brains are not fully developed. Kids are lucrative, ripe targets for a wide range of online actors, from child sexual abusers to cyberbullies, drug dealers, and scam artists, and parents and kids need help.
Children ages eight to 12 spend an average of over five hours per day on their screens, while teenagers spend about eight hours every day. Big tech uses every method possible to keep them online and addicted so they can pocket huge profits. The mental health repercussions for our kids are staggering. Almost half of US teens have experienced bullying or harassment online. Between 2010 and 2019, teen depression rates doubled, with teenage girls seeing the sharpest increase. In 2021, almost a third of girls said they had seriously considered attempting suicide. This committee has heard directly from Facebook whistleblower Frances Haugen and others that the companies know their platforms are causing harm, but the kids are just too lucrative for them to change how they do business. So it is long past time for Congress to step in. I want to thank Rep. Walberg and Rep. Bilirakis for helping to lead privacy protections and their age-appropriate, content-neutral design code, but I want to especially thank the parents, the advocates, the mental health professionals, and the pediatricians who have educated Congress. Ms. Smithing, thank you so much. I agree; your testimony is eloquent. You state that it's vitally important that we do both things: that we have privacy protections and that we address the design code. Why are both important? You've got a KOSA bill that's a design code. You've got COPPA 2.0 that updates outdated privacy law. Why do we need to marry these up in a modern bill that can protect kids online?
Ava Smithing:
Yes, ma'am. Thank you for the question. The Kids Online Safety Act will address design features that aren't addressed by data privacy, while data privacy is important because it will limit the amount of data companies have to work with to make their platforms more dangerous and target us in harsher ways. Data privacy provisions alone cannot address the design features that are harmful, such as likes, endless scroll, beauty filters, and other things that keep us on platforms for longer. So both of these, in combination, will successfully solve the problem from two ends: the upstream version, data privacy, and the downstream version, the Kids Online Safety Act.
Rep. Kathy Castor (D-FL):
And you're with a group called the Young People's Alliance. There's another youth-led group, Design It For Us. Why are young people helping to lead the charge here? Your generation has pressed Congress for years to act, and it must be frustrating. Why have you stuck with it?
Ava Smithing:
Well, we stuck with it because it's incredibly important, and we really don't have many other options. I think that we intuitively understand these issues; we grew up on them. The phrase "digital native" plays really well here because it helps people understand that we didn't have to take a class on AI. We didn't have to learn about data privacy. We just knew it inherently, and that's what makes us such valuable activists: we can speak to these things in ways that are to the point and don't require an intellectualization of these problems that makes them hard for people to grasp.
Rep. Kathy Castor (D-FL):
It wasn't that long ago that if you asked members of Congress about this, they really had no clue. So I want to thank you for your advocacy, because I think you can tell here today, this is bipartisan. We intend to act, but there are a lot of barriers in the way, and I wouldn't put it past the big tech platforms that have undue influence here on Capitol Hill to throw up barriers along the way. So now is the time for everyone to press the Congress to act, and you have my commitment that we will. Thank you, and I yield back.
Rep. Gus Bilirakis (R-FL):
Thank you. The gentlelady yields back, and I now recognize the vice chair of the subcommittee, Mr. Walberg, who is also the prime sponsor of COPPA 2.0. I'll recognize you for five minutes for questioning.
Rep. Tim Walberg (R-MI):
Thank you, Mr. Chair, and thanks to the panel for being here; I wish I could have heard you all as opposed to just reading. I just came from asking the president of Columbia University some very pointed questions on antisemitism, which makes this even more important as we discuss this topic: the power that's out there in our systems over our nation's children. As my colleague Representative Castor said, we're facing tremendous problems with mental health crises and the challenges that go with that. Online platforms collect mass amounts of data about children and teens, and that data is then used to employ sophisticated recommendation systems that promote harmful content to young users and keep them glued to their screens, because they know the longer kids are glued to the screens, the more money they make, or the more impact it makes. Online safety begins with privacy, and that's why I have introduced H.R. 7890, the Children and Teens' Online Privacy Protection Act, or COPPA 2.0.
The legislation modernizes and strengthens COPPA. It raises the age of protection from 13 to 16, prohibits companies from collecting information on our most vulnerable, and bans targeted advertising to kids and teens. I also want to mention that this legislation is bipartisan and bicameral; 80 parent, teacher, and privacy organizations support it and understand the concerns that you have expressed today. I want to thank Representative Castor for co-leading the legislation and Chair Rodgers and Chairman Bilirakis for including it in our discussion today. I also want to thank the chair for her work on APRA. I look forward to discussing the need for comprehensive privacy legislation, and to that effect, I ask unanimous consent to enter a letter from R Street on APRA into the record. Thank you, Mr. Chair. Ms. Smithing, thank you for being here and sharing your story. As I said, COPPA 2.0 would prohibit targeted advertising to minors. For adults, targeted ads can be helpful, especially for small businesses trying to reach the right customers. But when it comes to young people, why is the practice particularly harmful?
Ava Smithing:
Right. Thank you for the question, sir. Advertising in and of itself is harmful. Advertising is predicated on insecurity, right? You have to think having yellow teeth is bad to want to buy whitening strips. So the practice itself is not good, but when you add data to it, it becomes incredibly worse. I'll explain with a story that happened to me. If I search on Google how to lose 10 pounds in five days, Google is then going to share this search with my social media platforms, and I'm going to get hit with ads for cleanse juices, gym memberships, and workout clothing. All of these things are not only predatory, because my insecurity is going to make me more likely to buy them, but they also reinforce the negative things I already thought about myself: that I did need to be thin, that I did need to work out. So it's really a double-edged sword, and it hurts us both ways.
Rep. Tim Walberg (R-MI):
Yeah. Well, let me go on. We've seen many different children's privacy provisions through the years. I've introduced related bills in the past. Why is COPPA 2.0 the right direction when it comes to protecting young people's privacy online?
Ava Smithing:
Yes. Well, COPPA 2.0 updates an existing bill. COPPA 1.0 is going to continue to exist whether or not we update it, and if we don't update it, it's going to be a little confusing.
Rep. Tim Walberg (R-MI):
And expanding the opportunities that are out there for the whole system. A simple and eloquent statement, thank you. Ms. Frederick, in your testimony you identify how big tech continues to recruit younger and younger users to their platforms, incentivized by their potential to increase ad revenue. How should this behavior impact our efforts in Congress, and is it important to include specific privacy protections and enforcement mechanisms for younger users?
Kara Frederick:
Absolutely. As has been demonstrated by almost the entirety of the panel right now, these tech companies are hemorrhaging users on some platforms. So, as I said, there's a race to the bottom: they're trying to outdo each other to get younger and younger users addicted to their platforms. Tech companies care about three things: their bottom line, growth, and, number three, avoiding PR fires. So look at those PR fires; I think something Congress can do is expose this and help Americans understand it. Additionally, children's consciences are not properly formed before these companies go at them. As I talked about in my opening statement, with nine to 11-year-olds, there are groups designed to addict and draw in these users. So we have to get in the heads of these big tech executives and these platforms in order to act and prevent some of the most egregious privacy abuses, which Ms. Smithing here has demonstrated by her very presence in front of this body.
Rep. Tim Walberg (R-MI):
Thank you. Thank you. I yield back.
Rep. Gus Bilirakis (R-FL):
Thank you. The gentleman yields back. I now recognize the gentlelady from New York, Ms. Clarke, for her five minutes of questioning.
Rep. Yvette Clarke (D-NY):
Thank you very much, and good morning, Mr. Chairman. I thank the Ranking Member for holding such an important hearing today and for including my bill, the Algorithmic Accountability Act. Let me also thank our esteemed panel of witnesses for joining us today and sharing your views on how to best safeguard America's right to privacy online. Adopting a comprehensive federal data privacy standard is absolutely essential. This committee has been closely examining legislation pertaining to artificial intelligence and data privacy standards is fundamental to any efforts to establish a regulatory framework around the development and deployment of AI. I commend Chair Rogers for putting forth the bipartisan American Privacy Rights Act(APRA) discussion draft. While I don't agree with every provision it contains, I appreciate this meaningful attempt to forge consensus and move forward with significant data privacy legislation. We cannot afford to wait any longer to adopt a federal data privacy standard.
We've already fallen behind much of the rest of the world, particularly our counterparts in Europe. The time to assert American leadership in this space has arrived, and we must move forward together. First, I'd like to, excuse me, thank the chair once again for including my bill, the Algorithmic Accountability Act, in today's hearing and for incorporating algorithmic accountability provisions into APRA. My first question is directed to Mr. Brody, but other witnesses are welcome to respond as well. Mr. Brody, can you tell us why it is so important to include provisions prohibiting algorithmic discrimination and requiring algorithmic accountability into a comprehensive privacy bill?
David Brody:
Sure. Thank you for the question. The provisions in this bill would prohibit the use of personal information to discriminate in goods and services. And as we have seen in recent years, over and over and over again, algorithmic products are being rolled out with inadequate testing, and we find out after the fact that people of color and other marginalized groups are being excluded from opportunities, are being charged higher rates for insurance, are being charged higher rates for loans, and are being discounted from job opportunities and school admissions. So it's incredibly important that we put rules of the road in place to prohibit those types of data uses. The other thing this bill does is require entities to test their algorithmic systems before deployment, and then after deployment, to make sure that they are working as intended and that they are not discriminatory, and to see what disparate impacts are happening. Because the only way we will know whether people are being judged on their individual merit is if someone is actually testing and looking at the design of the system.
Rep. Yvette Clarke (D-NY):
Very well. Does anyone else want to add? Mr. Jain?
Samir Jain:
I would just add I agree with all of that, and I think another feature of both of your bills is greater transparency into these systems. We as policymakers, as regulators, as advocates, as researchers, we need to better understand how they're working, what kinds of harm they may be doing so that we can identify those, we can figure out what's the right way to fix them. And so I think the transparency piece is also another key aspect of this.
Rep. Yvette Clarke (D-NY):
Very well. Thank you. I have some concerns about the exceptions for targeted advertising based on status as a member of a protected class. While I understand there are certain opportunities that are directed specifically towards particular communities, I worry that this carve-out could be used to exclude historically marginalized communities. Mr. Brody, what is your view of the exceptions for civil rights protections, specifically in the case of targeted advertising, is protecting a company's ability to more precisely target certain communities with ads worth risking possible discrimination or civil rights violations?
David Brody:
I don't think it is. I have very serious concerns about this provision. Targeted advertising, and advertising generally online, is one of the main ways to learn of opportunities, and we have seen in recent years many examples where people of color in particular and other groups have been excluded and redlined in advertising. Recently, the Department of Justice sued Facebook for propagating discriminatory housing ads and settled that case. But we've also seen it happen in education and in insurance, and we don't want to go back to an era where opportunities are only directed to specific groups.
Rep. Yvette Clarke (D-NY):
Thank you very much, Mr. Chairman. Mr. Jain, I'm sorry I mispronounced your name earlier. Thank you. I yield back.
Rep. Gus Bilirakis (R-FL):
Thank you. The gentlelady yields back, and I now recognize Mr. Duncan from the great state of South Carolina. You're recognized for your five minutes of questioning.
Rep. Jeff Duncan (R-SC):
Thank you, Mr. Chairman, for holding this important hearing and for your continued work, and Chair Rodgers's important work, on this important but very complicated issue. Legislative bodies all around the country, as well as all around the world, are looking at this same issue and taking action. I want to take a little step back and ask all the witnesses, starting with Ms. Frederick: from whom should we be protecting the data of American citizens? Who's the greatest threat here? Is it Russian hackers, the Chinese Communist Party, social media companies, other big American companies, identity thieves, or predators? So briefly, from your perspective, who is the threat that we as policymakers need to focus on the most to protect our citizens, especially kids and teens? Ms. Frederick?
Kara Frederick:
I think the low-hanging fruit right now is the Chinese Communist Party, and you look at TikTok: they're owned by ByteDance, which is headquartered in Beijing, and they have to adhere to national intelligence laws, which effectively mean no truly private companies exist in China. They have CCP officials sitting on the boards of their main domestic subsidiaries, and they have the potential to propagate information warfare and information operations on a platform that the majority of young Americans use. That's the low-hanging fruit; that's the clear and present danger. And then you take on the big tech companies, which are portals to poison with their products, as, again, we've demonstrated here. So the CCP first, and then you look at the big tech companies.
Rep. Jeff Duncan (R-SC):
Okay, Mr. Jain.
Samir Jain:
I certainly agree that privacy is a national security imperative and that we shouldn't be allowing our adversaries to collect data about Americans and then use it in ways that harm Americans. In addition to that and the social media companies, I would add data brokers to the list. We've talked a lot about the ways in which data brokers collect so much information, compile profiles, and then sell them willy-nilly to anyone, who can then use them to propagate scams. So I think data brokers are another entity against which we need protection.
Rep. Jeff Duncan (R-SC):
Thank you, Ms. Kuehn.
Katherine Kuehn:
So, there are four types of threat actors: you have the hacktivists, you have the nation-states, you have the criminals seeking financial gain, and then you have the terrorists. The reality, though, is that 49% of all breaches actually happen from unintended insider threats. You don't know what to do with your data, something goes wrong, you lose your data, and it's a problem. So almost 50% of all breaches come from unintended consequences. So working on data privacy standards to make sure that we understand and have a better picture of where our data goes is actually super critical, because we are, to some degree, our own worst threat, and we don't understand where our data goes.
Rep. Jeff Duncan (R-SC):
Thank you for that. Ms. Smithing.
Ava Smithing:
Thank you for the question. I'm not a national security expert, so I'll steer clear, but the big tech companies have proven time and time again that they will not be responsible to the people on their platforms unless we regulate them to do so, and I believe that they have proved they are bad actors in this situation, and therefore we should protect our data and the data of our children from big tech companies as soon as we can.
Rep. Jeff Duncan (R-SC):
Thank you for your testimony too, by the way, Ms. Ohlhausen.
Maureen Ohlhausen:
Thank you. Strong privacy protections will provide wide protection against threats from many different vectors. Certainly, national security should be a high-level concern, and keeping information out of the hands of bad actors and scammers, I think, is also an important benefit of this Bill.
Rep. Jeff Duncan (R-SC):
Thank you. Mr. Brody.
David Brody:
Thank you. I think we should focus on big tech companies and data brokers, because they are the ones that control and design their own systems, and so they are the ones that have the greatest ability to avoid the harm. If you design the system in the first place so that it's safe, so that it's secure, so that the incentives of the business model are aligned with the best interests of people, then you can neutralize harms upstream before the data gets into the hands of bad actors.
Rep. Jeff Duncan (R-SC):
Yeah, thank you. Second question for all of you, if we have time. If Congress were to pass a federal privacy law such as this, what single provision would be the most essential factor in that new law being successful? Again, from your unique individual perspectives. I'm going to start with Mr. Brody; if y'all could be brief, we have 45 seconds.
David Brody:
Sure. I think the data minimization structure and the civil rights protections are the most important parts.
Maureen Ohlhausen:
So, data minimization is also balanced with permissible data uses, which can benefit consumers and businesses greatly.
Rep. Jeff Duncan (R-SC):
Thank you for that. Ms. Smithing.
Ava Smithing:
Data minimization and also allowing all American citizens to opt out of algorithms that utilize their data to target them with information and content.
Kara Frederick:
Having an enforcement mechanism.
Samir Jain:
I'm just going to echo data minimization and effective enforcement because without that, really the rights are meaningless.
Katherine Kuehn:
Data minimization and the opt-out rights.
Rep. Jeff Duncan (R-SC):
Yeah. Thank you, Ms. Kuehn. I want to thank you all again for being here. This has been really interesting; we're learning a lot. Mr. Chairman, I yield back.
Rep. Gus Bilirakis (R-FL):
I appreciate it, and I thank the gentleman. Next we have the gentleman from the great state of Florida, who's also a co-sponsor of KOSA. Representative Soto, you're recognized for five minutes of questioning.
Rep. Darren Soto (D-FL):
Thank you so much, Chairman. We know that the internet has become a fundamental part of our way of life over the last nearly 30 years; we use it for information, commerce, communicating with family and friends, telehealth, education, entertainment, you name it. Each transaction produces a data point, and when you aggregate them together, they paint a disturbingly accurate picture of our lives. I appreciate that some states have stepped up, including my home state of Florida, to finally put together privacy laws for the internet, but it remains a patchwork. This is the Energy and Commerce Committee; interstate commerce is literally what we do here, and I could not imagine something more related to interstate commerce than the internet. So we need a national standard and a bill of internet rights.

People would be disturbed to know that their DNA, their calendars, their geolocation, Social Security numbers, and health information can readily be brokered, and there's nothing to stop it. Most Americans would be shocked to learn that they're not protected. So I'm pleased that it looks like we're finally going to act on this with the American Privacy Rights Act. It's also time to protect our kids. When I talk to parents back home in central Florida, they're at a loss. Their kids are being exposed to a house of horrors due to algorithms: violence, sex, and bullying online. We hear it over and over from parents: do something. So I'm pleased to support the Kids Online Safety Act to require online and video gaming companies to prevent exposure to these types of harms, provide parents with the tools to supervise kids' use of platforms, and ban the advertising of age-restricted products. Yesterday I had pediatricians in, and they affirmed what parents have been telling me and what we already know: a lot of these issues are causing a mental health crisis among our youth across the nation. Mr. Brody, you mentioned in your opening testimony that the bill backtracks on causes of action for privacy violations. What did you mean by backtracking?
David Brody:
Sure. Specifically, the sensitive covered data minimization provisions: under the ADPPA, there was a private right of action for collection, processing, retention, and transfer of that information. Here there is only a private right of action for transfer, and so that seems like a very significant change.
Rep Darren Soto (D-FL):
Thank you. Mr. Jain, we recently had an issue with a University of Central Florida student, Alex Bugay. Someone took his identity and used it to make defamatory comments against a Georgia state legislator online, and it wrecked his life. He had nothing to do with the comments, but he was stuck dealing with the fallout: he lost his job, and he nearly got kicked out of the university. Is there anything in APRA right now that would protect folks from having their identities misappropriated to publish defamatory or harmful statements online?
Samir Jain:
I think there are quite a few provisions. First, to the extent that it's your data that's being used to help facilitate identity theft and the provisions that minimize data that protect against data broker practices help. I think the data security provision in here that we haven't talked about today is also an important piece of that by establishing for the first time a national federal standard for data security, including by social media companies so that someone can't, for example, break into someone else's account and post false statements or that if they do so, then there's potential liability both with respect to the individual who engaged in impersonation, but also because there's a private right of action with respect to data breaches and the like. So, I think there are a number of different provisions in the statute that could potentially help with that.
Rep. Darren Soto (D-FL):
That's great to hear, and I'm looking forward to working with the chairman and our ranking member on really tightening this up so that people's identities can't be stolen to say terrible things online and wreck young people's lives. Ms. Smithing, thank you for your powerful testimony. When I spoke to pediatricians yesterday, they talked about all the online body shaming leading to eating disorders every day. How prevalent do you think this is with our girls and kids generally online across the nation?
Ava Smithing:
I don't know a single young woman who has not dealt with this, and I will extend it past young women: young men are struggling with this too. We just don't talk about it as much.
Rep Darren Soto (D-FL):
Thank you so much. These are the types of stories that are finally getting the support we need together to finally get this done. It's been a long time coming, so thank you for your testimony and I yield back.
Rep. Gus Bilirakis (R-FL):
The gentleman yields back. Now we'll recognize Ms. Lesko from the great state of Arizona; you're recognized for your five minutes of questioning. Thank you.
Rep. Debbie Lesko (R-AZ):
First, I want to thank the committee chair, Cathy McMorris Rodgers. She's passionate about this issue, and I have a strong feeling we're going to get this done this year. I also want to thank you, Mr. Chairman, for your work on this issue, and others across the aisle as well. As you may expect, whenever we have a data privacy bill in front of us, there are all kinds of people who come out and say there are problems with it. And so it's hard for me as a member of Congress to know if those are legitimate problems or if somebody is just trying to kill the bill, or whatever the case may be. Just today alone, I had a couple of people tell me some concerns, and I want to see if any of you have any input on them. One of them that I heard today was that the preemption language is not strong enough, and so do any of you have any input? And I'm talking about the draft language of the American Privacy Rights Act; sorry, I didn't clarify that. Do any of you have any input on that?
Samir Jain:
I think what I would say is that preemption is an area where we just know that, in order to get this done, there's going to have to be a compromise. Left to our own devices, we would probably say we should let the federal privacy law set a floor and then let states provide additional protections, but we understand that that isn't going to be the way this comes out if we're going to get this passed. So I think what APRA does is try to come up with the right compromise. It sets preemption, but then it recognizes that there are certain places in which states have particular expertise or a history of working where it makes sense for them to continue. So, for example, in healthcare, we have a federal privacy law called HIPAA, and we still allow states to legislate in that area. And so we've seen examples where co-jurisdiction between state and federal enforcers can work. And so I think APRA, in general, is moving in the right direction in terms of a compromise.
Rep. Debbie Lesko (R-AZ):
Anyone else?
Maureen Ohlhausen:
The coalition in its comments raised a concern. They said that while the general language is very good, some of the exceptions that still allow state law, for example, on tort law or common law, could be a way to sidestep the preemption and undermine the uniform federal standard that Congress is seeking to create.
David Brody:
I would just add it's very difficult to strike a careful balance on preemption because data touches everything, and there are going to be so many different categories of state laws that involve the use of data, even if they're not necessarily what Congress is trying to regulate in this bill. And you want to be very careful not to break the ability of states to regulate fraudulent practices, regulate other types of harms, civil harms, criminal harms that are happening at the state level that really aren't being anticipated here.
Rep. Debbie Lesko (R-AZ):
Okay. The next concern that I heard was that, I believe, the draft of this language exempts, or applies to, companies that sell the data. And so one of the concerns was: okay, we're a parent company, we have subsidiaries, and we don't really sell the data, but we share the data with our subsidiaries. And so now we're going to have to live by the standard even though we don't sell the data outside of our subsidiaries. Could anybody comment on that? Okay, well.
Kara Frederick:
I could address it, or make an attempt. I think the bad sort of outweighs the good if you let that go. In particular, with TikTok, I think it's really important to talk about software development kits, SDKs. These are third-party tools that can be employed by American applications and can send user data to those companies. So you have, as Mr. Jain said, a whole veritable ecosystem of data transfer that's roiling about. If you don't put strictures on that, the companies doing nefarious and noxious things will use those loopholes. So I would say you have to cover those loopholes, plug them, and then take care of some of the SDK and other third-party data transfer issues.
Katherine Kuehn:
Yeah, I think the concern you're talking about here, when you look at the parent company and then the subsidiary, goes into a bigger issue called third- and fourth-party risk. So across the whole ecosystem, from the smaller organizations that are taking a look at the data to the larger organizations like we've talked about with TikTok, it's making sure that there's the right provision across each line for responsible holding of all of the data, regardless of their size or where they live in that ecosystem. That's what I think is concerning.
Rep. Debbie Lesko (R-AZ):
Do you think that this draft language addresses it?
Katherine Kuehn:
I think it addresses it for the most part. The only thing that raised a caution for me was actually the definition of small business and the dollar cap. What I mean by that is, as we've talked about today, AI is a big concern for us, and as AI and generative AI emerge, there are a lot of small organizations that may meet that threshold of small business but are utilizing data in a very interesting way that should be covered here. So I would look at the definition. I liked it for 200,000 users, but I would be concerned about the dollar cap as we think of the new AI companies that are emerging that are going to be looking for data.
Rep. Larry Bucshon (R-IN):
Thank you. The gentle lady's time has expired. Now I recognize the gentle lady from Michigan, Mrs. Dingell.
Rep. Debbie Dingell (D-MI):
Thank you, Mr. Chairman, and thank you to this committee for holding this important hearing today, and to all the witnesses for testifying; your input has been important. Privacy is a fundamental right, and I'm encouraged by this opportunity to make real bipartisan progress on this issue, including the recent developments around the American Privacy Rights Act. There is still some work to be done, and I encourage my colleagues on both sides of the aisle to think about how we can find solutions to further protect children and their data. We've made significant progress on several of the sensitive data categories in data minimization. I know not everyone will get everything that they want, but I am encouraged by what we're examining today. It's reassuring to see the American Privacy Rights Act discussion draft include data minimization provisions. Data minimization is the practice of only collecting, processing, retaining, and transferring data that is necessary, proportionate, and limited to provide or maintain a specific product or service. Mr. Jain, are our children and consumers at risk because of the amount of their data that is currently collected and acquired online?
Samir Jain:
Absolutely. I mean, I think we've talked a lot about how data brokers collect data from so many different sources and then compile them into detailed profiles that are then used to target ads along the line that Ms. Smithing, for example, described. So I think there's no doubt that there are tremendous amounts of dangers and harms that are occurring.
Rep. Debbie Dingell (D-MI):
Today, consumers are overloaded with constant breaches of their privacy and trust. Apps collect and sell users' location data to the highest bidder, and data brokers sell the information collected from apps on users and kids: sensitive data, names, birth dates, email addresses, GPS location history, purchase history, health conditions, and behavioral profiles. Mr. Jain, in your view, does the American Privacy Rights Act discussion draft sufficiently address data minimization?
Samir Jain:
I think this is an example of where the data minimization language really builds on years of work done by this committee and others, and I think the basic standard is a strong one and does a good job. I think one place where we do need to look is in some of the permissible purposes and make sure they are appropriately cabined. I have in mind, for example, the provision that says you can collect and process data to prevent fraud at one level. That makes a lot of sense, but we do know that data brokers, for example, will sometimes say they're collecting lots of data in an attempt to be able to detect fraud or verify identities. And so I want to make sure that doesn't become a loophole through which data brokers can then justify their collection of data. So, I think we need to think carefully about some of the permissible purposes, but I think the overall standard is strong.
Rep. Debbie Dingell (D-MI):
Thank you for that. Something else vital to comprehensive data privacy is privacy by design. That is, companies should make privacy considerations central to the design of their products from the beginning. Forcing companies to consider their actions before proceeding is an important structural protection, and it's crucial that usability also remains central to the design. Ms. Frederick, can you talk about privacy by design and why you think it's important to include it in the American Privacy Rights Act discussion draft?
Kara Frederick:
Absolutely. I mean, this is critical because tech companies' lifeblood is building and shipping products. If you get into the design phase, the building portion of the products, you don't have to retroactively fit legislation on the back end. As you said, it'll be easier. You pass something like this and integrate privacy by design as a requirement for companies, and then we don't have to keep legislating and legislating, because technology always outpaces attempts to govern it. You save yourself, and all of us, a lot of pain in that process, and tech companies can do this. Look at AI; we've talked about AI. There are approaches to machine learning, like federated models of machine learning, where they can get value out of data without having to personally identify individuals or take up their sensitive information. There are ways to do this. They have the best designers and programmers in the world. They're just not doing it at times because they don't want to. We've got to do it: privacy by design.
Rep. Debbie Dingell (D-MI):
Thank you, Ms. Chairman. I don't have enough time for another question, so I'll yield back. Thank you all of you.
Rep. Gus Bilirakis (R-FL):
The gentle lady yields back. Now we'll recognize Mr. Fulcher from the great state of Idaho. You're recognized for five minutes of questioning.
Rep. Russ Fulcher (R-ID):
Great. Thank you, Mr. Chairman, and thank you to the panel for being here and for your input. One of the drawbacks of being late in the questioning is that sometimes your questions get asked already, and mine have, so I'm going to wing it just a little bit here. Ms. Frederick, in general, just from a realistic standpoint, is it realistically possible to balance algorithms that target advertising without ultimately having the misuse of that data? Can you just speak to that?
Kara Frederick:
We have a phrase in the tech policy community, from a former deputy CTO who peddles this, and she's right: if you build it, they will come. And that is exactly what happens with technology. Oftentimes builders will create things for noble purposes, as the chair said, but they will always get corrupted, because of human nature. That is something that we see with the Chinese Communist Party's use of technology, specifically. They have the civil-military fusion concept and dual-use technologies. They make extensive use of those concepts, where you have societal use and you have military use that turns into kinetic actions. So this is absolutely intrinsic to the tech policy community. If you build it, they will come. It's always going to be perverted no matter what.
Rep. Russ Fulcher (R-ID):
Thank you for that. I'm going to shift gears to Ms. Ohlhausen for a moment here with your background with the FTC general question again, but I want to just try to get your feedback on this. Let's talk about service providers for a moment. On this topic of data gathering and the potential misuse of that data, where does liability land for them in your view? What kind of liabilities should be placed on the service providers?
Maureen Ohlhausen:
So I think that the bill definitely addresses this in a positive way, which is about placing responsibility with the party who is most able to protect the data. So when the first party has the data, the responsibility is with them; with the service provider, if they're processing and using the data, the responsibility is with them. I do think the bill also sensibly puts some responsibility on the first party to do some due diligence before handing the data over to the service provider.
Rep. Russ Fulcher (R-ID):
In your view, is the language in the bill, as you understand it, appropriate for that?
Maureen Ohlhausen:
Yes. So we are generally okay with that. The coalition may have a few comments to offer, but…
Rep. Russ Fulcher (R-ID):
Alright, thank you for that. So I'm going to move on to Mr. Jain, because you've been talking about the third-party data brokers a little bit. When it comes to those data brokers, there's often tailoring of marketing messaging and advertisements to different customers through marketing automation; we've talked about that, and you've talked about that today. Where do you see the line on the level of tailoring, given that the American Privacy Rights Act keeps decision ownership with the consumer? Where is that line most appropriately drawn when it comes to the tailoring of these messages?
Samir Jain:
I think with respect to tailoring, including, for example, tailoring advertising, there is a balance to be struck, in the sense that we know that today's system is broken because data brokers are collecting and using so much data, and advertising is being targeted in ways that are really harmful. At the same time, I think there is a role for, for example, contextual advertising, where you tailor advertising based on the content that the user is seeing, or there may be a role for first-party advertising, where, if you've gone to a store and bought a particular kind of sneaker, that store then wants to advertise a similar type of sneaker to you and tailor it in that way. So I think we do want an economically viable advertising system, because we certainly don't want to end up in a situation where all content is behind subscriptions or paywalls because there's no advertising to support it. The real key is figuring out how we have a privacy-protective and economically viable advertising system. I think APRA moves us in that direction. It may need some further refinement, but I think there's a balance that we can draw.
Rep. Russ Fulcher (R-ID):
Thank you, Mr. Chairman. I yield back.
Rep. Gus Bilirakis (R-FL):
The gentleman yields back. Now we'll recognize Ms. Kelly for her five minutes of questioning.
Rep. Robin Kelly (D-IL):
Thank you, Chair Bilirakis and Ranking Member Schakowsky, for holding this morning's hearing, and thanks to our witnesses for your testimony. Privacy rights are civil rights, because in the era of big data, personal information can be weaponized for digital redlining, and the harm is already occurring. Studies have found that mortgage algorithms were 80% more likely to reject Black applicants than white applicants with similar characteristics. Auto insurance algorithms increasingly assessed applicants based on socioeconomic factors and less on their driving behavior. Retailers use facial recognition technology that erroneously accuses customers of shoplifting and falsely flags Black women and people of color at higher rates than other shoppers. Mr. Brody, what safeguards are necessary to ensure a privacy bill does protect our civil rights?
David Brody:
So, first and foremost, we need the anti-discrimination protections that are in this bill that prohibit discriminatory uses of personal data in depriving equal opportunity to goods and services. We need the assessments of algorithms before they are deployed and after they are deployed. We need access, correction, and deletion rights, because those are the tools that allow us to uncover discrimination and fix it when it occurs, and we need really strong enforcement of that. Enforcement has to be three-tiered. We need a strong federal regulator, we need state attorneys general so that we have more cops on the beat, and we need a private right of action because, as we've seen over and over again throughout our history, sometimes individuals are the only ones who can vindicate their own rights. If you look at the history of major Supreme Court decisions on civil rights, over and over again they're brought by individuals; they're not brought by the government.
Rep. Robin Kelly (D-IL):
Thank you. And does collecting, processing, retaining, and storing sensitive data that is not necessary to provide a product or service pose an unacceptable threat to privacy?
David Brody:
It depends on the circumstances. It can. We really need to have very carefully tailored data minimization provisions for that sensitive information to ensure it's only being used for the reasons that consumers expect and is not being transferred to third parties without the proper procedures in place. And we need to be very careful with that information. As we've seen in recent years, there have been circumstances where people searching for healthcare online are putting themselves at risk, and we need to make sure that that information is protected.
Rep. Robin Kelly (D-IL):
Thank you. And Mr. Jain, what are the potential consequences if there is no private right of action against entities collecting, processing, retaining and storing sensitive data that is not necessary to provide a product or service?
Samir Jain:
Well, as we've talked about a lot at this hearing, data minimization, particularly with respect to sensitive data, is really a central feature, in many ways the foundational feature, of this bill. And so I think that means we need strong enforcement around that provision, because if we don't have that, we undermine the foundation. And I agree with Mr. Brody that the private right of action is a piece of the enforcement that we need, both to allow individuals to obtain recovery if they've actually been injured, which is a circumstance in which they can collect damages under this bill, but also to deter, and to encourage companies to actually take privacy-protective measures in the first place, because they know that there's a strong enforcement mechanism on the back end. So I do think it's important, and one change that we will be advocating for in APRA is to make sure that the private right of action applies to minimization, particularly around sensitive data.
Rep. Robin Kelly (D-IL):
Thank you so much, and I yield back.
Rep. Gus Bilirakis (R-FL):
Appreciate it; the gentle lady yields back. Now we'll recognize the gentle lady from Tennessee, Mrs. Harshbarger, for her five minutes of questioning.
Rep. Diana Harshbarger (R-TN):
Thank you, Mr. Chairman, and thank you to the witnesses for being here today. I'm glad we're working to achieve a federal privacy standard, I'm thankful to the chairwoman for her leadership, and I'm especially thankful for the increased protection of children. That being said, I absolutely want to make sure that we don't go too far, and I want to ensure that small businesses can still reach their customer base. Ms. Ohlhausen, one reason why America is the greatest country in the world is that our private sector encourages startups and small businesses to establish themselves and grow. I'm a small business owner, and this is helpful to our constituents and to our economy. The APRA treats companies of different sizes differently. Facebook and Google can comply with almost any law you throw at them because they have a room full of attorneys sitting in a building somewhere. So my question is, how do small businesses' interactions with Facebook and Google change under APRA?
Maureen Ohlhausen:
One of the benefits, I think, of a federal uniform privacy standard is for small businesses. It allows them to design, comply, and create systems around a single standard rather than having to adapt to a changing landscape. So I think that will be a benefit for them. I also think allowing uses of data that are pro-competitive is good for consumers. Some advertising can be problematic, but a lot of advertising really serves a very beneficial, pro-competitive purpose. So I think this bill strikes a good balance there to allow small businesses to compete.
Rep. Diana Harshbarger (R-TN):
Okay, I'll continue along with you, ma'am. In the FTC's '22 through '26 plan, Chair Khan deleted language that stated that the FTC would accomplish its mission without unduly burdening legitimate business activity. In contrast, the APRA includes critical privacy protections for Americans and small businesses. I'm concerned that if the FTC moves forward on a privacy standard without Congress, Americans will have weaker protections, and innovators will be unduly penalized. So my question is, what concerns do you have with the FTC's current plan to go it alone?
Maureen Ohlhausen:
I think, going back to when COPPA was first adopted, it gave the FTC really clear guidance from Congress and some useful tools. The FTC, moving forward in privacy, has done what it could with its general authority, but to get these additional tools and the ability to help protect consumers better, having that clarity from Congress, that authority from Congress, is really key.
Rep. Diana Harshbarger (R-TN):
Yeah, Congress's intent has to be clear when they pursue this. I have a question for anyone on the panel: How would a dual regulatory regime for common carriers reduce innovation? Anyone can answer, but maybe you don't want to.
Maureen Ohlhausen:
I would be happy to address that. The FTC has really developed quite a lot of expertise in privacy and data security and in these areas, and I think it has used it effectively, and this bill would allow it to use it even more effectively. We see enormous convergence among competing services now, to the point of really being a unified product in the consumer's mind. So I think having that FTC oversight and expertise really can benefit a business in that way, and consumers too, because then they know this is the uniform standard, rather than having the legacy regulatory status of whoever's collecting the data give a different set of rules.
Katherine Kuehn:
I think it'll actually help innovation, to be honest with you. When we look at what's coming from a generative AI perspective and, to your point, the emergence of a significant number of startups, having one standard that startups can comply with, that they understand, without having to look at the regulatory landscape state by state, I think is going to be an enabler. And it gives a very strong set of guidance for how we can actually address third- and fourth-party risk, which I think is a huge concern.
David Brody:
If I could, I would just add, regarding common carriers: I agree that the FTC should have strong powers here, but we also don't want to squander the FCC's expertise in telecommunications and the many important things it does that the FTC would not be equipped to do.
Rep. Diana Harshbarger (R-TN):
Okay. Thank you, sir. I think with that, my time is up, and I yield back.
Rep. Gus Bilirakis (R-FL):
The gentle lady yields back, and now we'll recognize Ms. Trahan for her five minutes of questioning.
Rep. Lori Trahan (D-MA):
Thank you, Mr. Chairman. I'm grateful to you and Ranking Member Schakowsky for organizing today's hearing and for your commitment to resuming this committee's efforts to advance comprehensive privacy legislation. I'd also like to thank Chair Rodgers and Ranking Member Pallone for their longstanding commitment and years of work on this issue. I'm proud that today's hearing includes two bipartisan, bicameral pieces of legislation that I introduced to address widespread problems facing users online. The DELETE Act, which I introduced with Congressman Chuck Edwards and Senators Bill Cassidy and Jon Ossoff, would give every American the right to have data brokers delete their data and prohibit future collection. This is a common sense proposal that's been discussed before in this committee because of the national security concerns with the way data brokers harvest and sell some of our most sensitive data to the highest bidder, including our foreign adversaries. I ask for unanimous consent to submit for the record this letter from 20 civil society organizations supporting the bill.
Rep. Gus Bilirakis (R-FL)
Without objections, so ordered.
Rep. Lori Trahan (D-MA):
Thank you. Provisions of the DELETE Act were included in the privacy package that was advanced overwhelmingly out of this committee last Congress. However, I'm concerned that some of the changes to those provisions in the American Privacy Rights Act discussion draft will not fully meet the needs of American users. Mr. Jain, how do the data broker provisions in APRA differ from what is included in the DELETE Act, and do you believe that we should be strengthening that part of the discussion draft?
Samir Jain:
Yes, I do think that we should strengthen it in particular by adding in one of the central features of the DELETE Act, as its name implies, which is the ability to create a centralized mechanism so that consumers can, in one shot, ask all data brokers to delete their data. Because otherwise we have to go from data broker to data broker, which is impossible because most of us don't even know what the data brokers are.
Rep. Lori Trahan (D-MA):
Right. Thank you. I couldn't agree more. Under APRA's current draft, a consumer would have to individually visit 871 data brokers' websites and affirmatively delete their personal data. That's how many have registered in the state of Vermont, and that's just not feasible. My second bipartisan bill featured in today's hearing is the TLDR Act, which would rein in companies that force users to agree to unnecessarily long and complex terms of service in order to use an app or access a website. A 2022 poll found that nine out of every 10 Americans have agreed to a company's terms of service without ever reading it. This is an even bigger issue for companies providing services directly to our children, who are often required to agree to the same contracts before getting online. That's why the TLDR Act takes the important step of requiring standardized short-form terms of service summaries that both parents and young people can understand. Ms. Smithing, how important is it for Congress to maintain the portions of the TLDR Act in the privacy package we're discussing today, particularly with respect to clear and explainable terms of service for users of all ages?
Ava Smithing:
Yes. Thank you for the question. Incredibly important. Earlier, Mr. Brody said that he, as the king of data lawyers, does not even read his privacy policies, and if he can't do it, then I don't think we should be expecting children to do it. It would be incredibly beneficial for kids and help them understand what's actually going on on these platforms. I'll also add that children are tired. They have to consent to hundreds of things before they go on their favorite apps, and this leads to fatigue and to them paying little to no attention to the things they're consenting to. So a policy like this, which expressly says in clear and concise language what is happening, would be greatly beneficial to children.
Rep. Lori Trahan (D-MA):
Thank you so much. Well said. It's essential to privacy and kids' safety online that large data holders are transparent about their business practices and are held accountable by third parties. The best way to do that is to require that qualified researchers are able to study how the decisions made by powerful online platforms comply with the privacy laws that we hope to pass in this committee and impact users. I've been working with Senator Coons on language to empower researchers to take a look under the hood of powerful online companies like Meta and Google in a way that allows them to do their work while protecting user privacy and intellectual property. And I'd also like to submit another letter for the record from the American Psychological Association demonstrating the extent to which researchers' access to data is jeopardized and the urgent need for Congress to act to support this work.
Rep. Gus Bilirakis (R-FL):
Without objection, so ordered.
Rep. Lori Trahan (D-MA):
Thank you. Mr. Jain, how important is it to protect researchers' abilities to access the data they need, and do researchers have that access today?
Samir Jain:
Thank you for the question. I know you've been a real leader on this issue. It's critical for researchers to be able to have access to that data, particularly in the social media context where they have been the ones who have uncovered a lot of the harms and a lot of the negative practices that we've seen. And unfortunately, we're actually moving in the opposite direction. We're seeing company after company withdraw or make less available data that researchers need. So I think it is very critical that we promote and do that. Unfortunately, I think APRA probably needs improvement in that area. I think ADPPA had a specific permissible purpose around public interest research and the ability to collect and process data for that purpose. And I think we should probably add that back in with appropriate privacy protections to make sure that that kind of public research can continue.
Rep. Lori Trahan (D-MA):
Thank you. We have some work ahead of us, but I look forward to advancing a strong, bipartisan, comprehensive privacy package. Thank you, Mr. Chair.
Rep. Gus Bilirakis (R-FL):
I think we're off to a good start. Thank you. The gentlelady yields back, and I now recognize Mr. James, who's the newest member of our full committee. You're recognized for five minutes. I look forward to working with you.
Rep. John James (R-MI):
Thank you, Mr. Chairman. First, I would be remiss if I didn't recognize the grand opportunity I've been given to serve as a freshman on this committee. I want to express my sincere gratitude to Chairwoman Rodgers, Chairman Bilirakis, and my colleagues on Energy and Commerce for having trust in me and giving me this opportunity. In parallel, I would like to recognize the grand opportunity in front of us to pass real, substantive public policy to benefit the American people as it pertains to data privacy and security. This is an issue that I hear about regularly when I'm back home, and it's a testament to the committee's ability to get things done in a bipartisan way that we are working with the Senate Commerce Committee on this issue as well. I'm really glad that my bill, H.R. 6149, the Protecting Kids on Social Media Act, is also under consideration and in front of us here today during this hearing.
The damage that social media is doing to our kids is unconscionable and extremely disturbing. Big tech and these social media platforms are making our young people more depressed and wreaking havoc on mental health, particularly after the COVID-19 pandemic. As a father of three school-aged boys, the warning signs of social media's impact on kids are abundantly clear to me. I've said for years that Facebook, Meta, is the Philip Morris of our time, and now is the time to take action. My bill, the Protecting Kids on Social Media Act, aims to do exactly this. First, it establishes a minimum age of 13 for platform use and prevents big tech, big government, and strangers from usurping consent from parents of teens. Second, it reins in abuse by big tech using algorithms to target minor children. And most importantly, the bill empowers parents. It gives parents a fighting chance to protect their children, which is why we're here in the first place.
I would appreciate the thoughts of my colleagues and our guests here on ways to fine-tune and advance my bill, H.R. 6149, the Protecting Kids on Social Media Act, now and in the future, as part of the kids' online safety package. My time is limited; we have two minutes, 45 seconds. My first question is to Ms. Kuehn here in Washington. As you know, we're constantly searching for the right balance when it comes to the size and scope of government. Could you discuss the potential benefits and drawbacks, particularly in terms of the role of social media companies and online platforms versus the role of government, in keeping kids safe online? I'm a conservative; I believe in limited government, not no government. So, can you help me strike that balance?
Katherine Kuehn:
I think there's going to have to be a public-private partnership in it. We've talked a lot about big data companies today and how they're exploited, but there's also the opportunity for better coverage and better work together to do the right things, especially for children. As a mother, I've seen cases where I personally had my children attacked on social media and had to engage law enforcement because of a school issue. We've had sextortion cases involving friends' children; we've been through it. Every single time, it's a question of there's going to have to be a public-private partnership on this. So the more we can work to create better standards, better understanding, simplified language, and the ability for parents to take a more proactive role, the better. I myself have very tight parameters around our children's social media. Those are the things. And take examples, too, from international best practices, like the no-questions-asked child helpline they have in the UK: if there is some type of bullying or some type of problem online, you can call an anonymous line funded by the UK government to get help. Those are the kinds of things I think we should look at.
Rep. John James (R-MI):
And I think you bring up a very, very good point, Ms. Kuehn. Not every parent, in every socioeconomic status, has the time in the day to look over their kid's shoulder 24 hours a day. Sometimes, people have to work two shifts, night shifts. And putting these policies in place will help put up guardrails so that our kids cannot be exploited any further. Data security impacts everyone in every generation. Michigan's 10th congressional district is home to over a hundred thousand seniors, roughly 18% of my constituents. The failure to create common-sense data privacy protections only makes my constituents more vulnerable to elder abuse. Can you speak to the damage the status quo is causing seniors in Michigan and how this new bill, the American Privacy Rights Act, could improve their,
Katherine Kuehn:
The damage is significant. My mother had her personal identity stolen in a nursing home. My father had his identity and his data stolen when he was suffering from late-stage dementia. And three weeks later, we received an erroneous phone call in which he was told that his grandson had been kidnapped and that he had to put $5,000 into an account; they had hacked into social media and understood just what to say to get him to believe it. It's terrible for seniors. So what I would say is this: the data minimization and consumer control pieces of APRA are great steps forward to make sure that seniors, too, have simplified language, minimal data, and the ability to control what's online as you get older. I think that's critical.
Rep. John James (R-MI):
Mr. Chairman, the federal government has a duty to protect our most vulnerable, and I believe our seniors and our children are among them. Thank you. I yield, sir.
Rep. Gus Bilirakis (R-FL):
Thank you for asking that question about seniors; we appreciate that very much. The gentleman yields back, and now I'll recognize my fellow Florida Gator from the town where I was born, Gainesville, Florida. Ms. Cammack, you're recognized for five minutes.
Rep. Kat Cammack (R-FL):
Thank you Mr. Chairman, and go, Gators, you better get used to this.
Thank you to all our witnesses and to everyone who's with us today in the room; I'll jump right into it. I know that there's a lot to cover in this space, and a lot already has been covered, but for the benefit of folks who are tuning in, I'm going to start with you, Ms. Ohlhausen; if I botch the name, I'm sorry. I think it's important to understand what data is collected, and certainly, Americans, I think, are woefully unaware of how much data is being collected and harvested. So, I want you to describe, if you can, and I know you can, the type of data that is collected by online platforms from consumers during a typical user experience, specifically what is collected, stored, and/or sent to other entities in the following instances. These three scenarios are very common for everyday Americans: first, when an individual reads, types, and sends emails on, say, Gmail; second, when an individual interacts on Facebook, Twitter, or any other social media application; and third, when an individual shops online.
Maureen Ohlhausen:
So...
Rep. Kat Cammack (R-FL):
I know that's a lot.
Maureen Ohlhausen:
Yeah. Yeah. So, based on the information from our coalition members, telecommunications providers and ISPs: to provide these services, they may have to collect a wide variety of data, including the content of communications. But they're not necessarily collecting the content to use for any purpose or for selling. Certainly, as consumers or users interact, they may collect a wide variety of data for payment, payment processing, fraud prevention, and things like that.
Rep. Kat Cammack (R-FL):
And I think what people fail to realize is that everything is being tracked, and we're consenting to this in those ridiculously long terms of service that nobody reads. I think they've even added an option where you can simply press a button, and it takes you right to the very end; you don't even have to scroll anymore. So the amount of data is staggering, and I know everyone in this room knows this. But digging more into the legislation, and I'd like to open this up to our witnesses here today: the current draft legislation expressly names an opt-out mechanism. Now, personally, I am not for opt-out. I like an opt-in because, let's be honest, people are lazy, right? And so to have to go through the process of opting out when you're automatically opted in, I think that might create some confusion. I'd like for folks to weigh in, and I'm going to start with you, Ms. Frederick, if you could weigh in on any thoughts you have on altering that into an opt-in directive rather than an opt-out.
Kara Frederick:
That makes eminent sense in terms of ensuring the most stringent default settings. It would certainly go in that direction, so I think it's a very interesting consideration for the draft's discussion.
Rep. Kat Cammack (R-FL):
Thank you.
Samir Jain:
I agree. It's an interesting consideration, and we know about the power of defaults: where the settings are set initially is usually where they stay, because of inertia, because people don't have enough time. The one thing to think about is whether, in particular settings, opt-out or opt-in is the right mechanism. In other words, there may be certain settings in which it's okay for the default to opt people in, but we still want to give them the choice to get themselves out of a particular system. So I think that's the balance we have to strike.
Rep. Kat Cammack (R-FL):
Thank you.
Katherine Kuehn:
I think the opt-in is interesting, but I think we should look at the lessons learned from some of our international counterparts, who've already put some really robust thinking into GDPR and other privacy laws. So, we should look at the lessons learned of opt-in versus opt-out. There are also technical ramifications: for organizations that are going to have to hold and maintain the technical aspects of data privacy, there are big differences between opt-in and opt-out. So I would look for advice on how, from the standpoint of maintaining a baseline cyber standard, those differences would play out.
Rep. Kat Cammack (R-FL):
And I know I am leaving you guys here; I have 25 seconds, and I have another question I have to direct to you. I'll say this: Do you think it makes sense to give the FTC data security authority over the entire economy but leave the FCC in charge of data breach notification for legacy communications?
Maureen Ohlhausen:
So the coalition, in its testimony, suggested that data breach notification should not be left with the FCC. Other concerns have also been raised about that, and it would make sense to have it unified under one agency.
Rep. Kat Cammack (R-FL):
I had a feeling you might say that, and I have, well, no, I'm over my time, so I apologize, Mr. Chairman. To our two remaining witnesses, if you could submit your answers to the question regarding opt-in opt-out, that would be wonderful. Thank you. With that, I yield.
Rep. Gus Bilirakis (R-FL):
Thank you. The gentlelady yields back. Now I'll recognize Ms. Schrier for her five minutes of questioning.
Rep. Kim Schrier (D-WA):
Thank you, Mr. Chairman, and thank you to our ranking member for holding this really important hearing today. I'm delighted to waive on to this subcommittee. As a pediatrician, I have been looking forward to this hearing. For all five years that I've been in Congress, I have seen in real time with my patients what immersion in screen time and social media has done to their sleep, to their attention spans, to their exposure to dangerous information, and, of course, the rise in mental illness and eating disorders that we're now seeing at increasingly younger ages. And I'm incredibly concerned about the impact that screen time and social media have on our kids and the dangers that exist online that so many parents are aware of. Kids today are exposed to harmful, corrosive content at an early age and can even be exposed to predators and illegal activity without even realizing it.
The Surgeon General made it a priority and has even issued an advisory on social media and youth mental health to warn about the risks that unsafe social media environments pose to our kids. According to this advisory, 95% of all teens aged 13 to 17 use social media, many almost constantly. This is why my 15-and-a-half-year-old does not have access to social media. Nearly 40% of children aged eight to 12 use social media as well. For teens, this translates, as we heard, to eight hours a day. Think about it: eight hours in school, eight hours on social media, and they're supposed to sleep nine hours a night, and that does not add up. And all of this is happening at a critical time for brain remodeling and development for these kids. There are several studies on the brain regions involved specifically in social development, like the amygdala and the prefrontal cortex.
These undergo extensive changes during adolescence, and social media use is affecting how our kids' ability to interact with people develops. And I think it is so important that we understand the full impact, but that we don't wait until we understand everything to take action, because it feels like we are experimenting on this generation, and it's not looking good. To combat the risks of unsafe and addictive online activity, parents should have every tool at their disposal to keep their children safe. And big tech companies, as we've discussed, need to be held accountable. That's why I am so proud to have worked with my colleagues on this committee to introduce the Kids Online Safety Act and Sammy's Law. Both bills provide tools to help protect children and allow parents to identify and report harmful behavior or content that their children are exposed to. Sammy's Law would ensure that third-party apps are able, responsibly and with guardrails, to inform parents when their kids are engaging with dangerous or concerning content, including around suicidality, mental health concerns, substance use, eating disorders, and abuse.
And I know that today, in this hearing room, we have several parents who have tragically lost children due to dangerous and ultimately fatal content online. I just want to take a moment to express my deepest condolences and also to thank them for their advocacy so that no other parent has to endure this. The Kids Online Safety Act will make sure that social media companies are held responsible for ensuring a safe online environment for kids whenever possible. And it would require these covered companies to be transparent about the design features that make these apps so addictive, like autoplay and rewards for levels of engagement. Ms. Smithing, I want to thank you for your testimony and wondered if you could speak to the addictive nature of these design features, which even the parents of these children struggle to pull themselves away from, and how we might expect children to do that.
Ava Smithing:
Yes, thank you, Congresswoman. I think when we're looking at addiction specifically, the most important thing we can do is data minimization, of course, but also limiting the amount of data that can go into recommendation algorithms. Opting out of targeted advertisements is not enough. The cadence at which these posts are delivered is what makes them addictive, not necessarily their content. So we can do everything in our power, but as long as companies can use our personal data, what they know we will respond to and what they know we won't respond to, to use variable reward schedules to deliver us posts, it won't address the addictiveness. So, as you said, an opt-in would be perfect, but allowing people to opt out of these algorithms is very important.
Rep. Kim Schrier (D-WA):
Thank you. I so appreciate your answer and I so appreciate the action that we're taking in this committee. Yield back.
Rep. Gus Bilirakis (R-FL):
I thank the gentlelady, and it's great to get your perspective as a pediatrician; so very important. I feel like we're on the right track. I am not sure if there's anyone else; I don't think so. I think this was a very informative hearing, and the testimony was outstanding. It really was. So we're going to get moving on this, as you said, for all our constituents, but particularly for our children. Thank you. I ask for unanimous consent to insert into the record the documents included on the staff hearing document list. Without objection, so ordered. I remind members that they have 10 business days to submit questions for the record, and I ask the witnesses to respond to the questions promptly. Members should submit their questions by the close of business on May 1st. Without objection, the subcommittee is adjourned.