Transcript: Innovation, Data, and Commerce Subcommittee Hearing on Data Privacy

Justin Hendrix / Mar 2, 2023
Innovation, Data, and Commerce Subcommittee Hearing: “Promoting U.S. Innovation and Individual Liberty through a National Standard for Data Privacy,” March 1, 2023.

The House Energy and Commerce Subcommittee on Innovation, Data, and Commerce hosted a hearing entitled “Promoting U.S. Innovation and Individual Liberty through a National Standard for Data Privacy.”

“Americans need and deserve more transparency over how their information is collected, processed, and transferred,” said Subcommittee Chair Gus Bilirakis (R-FL).

Witnesses included:

  • Alexandra Reeve Givens, President and CEO, Center for Democracy & Technology
  • Graham Mudd, Founder and Chief Product Officer, Anonym
  • Jessica Rich, Of Counsel and Senior Policy Advisor for Consumer Protection, Kelley Drye & Warren, LLP

What follows is a lightly edited transcript.

Rep. Gus Bilirakis (R-FL):

The Subcommittee on Innovation, Data, and Commerce will come to order. The Chair recognizes himself for five minutes for an opening statement. Good morning again. I appreciate y'all being here. We got an early jumpstart on the day to accommodate our friends across the aisle, who have an issues conference later this afternoon, so I'm confident we'll make the most of our time this morning. We made great strides last Congress, as you know, with the leadership of this committee, demonstrating that we can come together in a bipartisan fashion for the American people. I look forward to continuing and completing that important work this Congress. Earlier this week, the House passed H.R. 538, the Informing Consumers About Smart Devices Act, from Representatives Curtis and Moulton, with broad bipartisan support. I wanna recognize Chair Cantwell and Ranking Member Cruz in the Senate for sponsoring the Senate companion bill, which I take as a strong sign that the Senate cares about Americans' privacy.

I hope I'm right, <laugh>. I thank these members for working on legislation that complements this committee's broader privacy goals and provides greater transparency to Americans about the ability for devices to secretly record them. This is just one of many examples of why congressional action on broader comprehensive privacy and data security is desperately needed and why we are holding this hearing today, the second in a series of three. With that, I want to express my gratitude to our panelists for being here. We appreciate you very much, not only for bearing with us with the early start time, but also for sharing your expertise. Today, each of you brings important insights that will help our committee advance comprehensive privacy and data security legislation this Congress. Americans need and deserve more transparency over how their information is collected, processed, and transferred. In the past several years, our constituents have likely noticed the internet becoming more personalized for them, whether they're seeing more targeted advertisements showing items they've recently viewed on another website or experiencing content on social media that matches what they've interacted with elsewhere.

Sometimes it's scary stuff. To some, these practices may be viewed as more convenient for their shopping or useful for how they digest information, but others may find these practices invasive and unsolicited. So let's give Americans the right to choose if they want this or not. Why not? Mr. Mudd, thank you again for being here to walk us through how legislation can work for businesses operating in the digital ecosystem, and to share your expertise about how we can both protect innovation in our economy and still give Americans freedom to choose what to do with their personal and sensitive data. I know we can get this done, and I appreciate you being here, sir. We also need to ensure legislation works for everyone and doesn't adversely impact our constituents or impede the basic liberties that every American deserves. Ms. Givens, I want to thank you for your expertise on these matters as well as for your support over the last year in advancing comprehensive legislation.

Thank you so much. Lastly, we need to ensure a responsible government approach to enforcing clear rules for businesses to comply. Companies, especially small startups, shouldn't be subject to random or punitive letters in the mail notifying them that certain practices could be unfair or deceptive. It is essential that the FTC enforce the laws that we as a Congress enact and specifically authorize, but not go rogue beyond the rules of the road we provide. This type of regulatory certainty is needed for businesses to comply. They must comply, but again, it's gotta be fair. Ms. Rich, thank you for testifying today. Again, you have great insights regarding the role of the FTC in enforcing laws, but doing so in a way that doesn't unduly burden legitimate business activity. I look forward to continuing to work with you on achieving the right balance for the FTC to enforce a national privacy and data security law that protects Americans of all ages, while at the same time ensuring that businesses that follow the rules aren't subject to government overreach and frivolous litigation. The committee appreciates your deep institutional knowledge, and thank you so much. Again, thanks to our panel for being here, and I look forward to your testimony. The Chair now recognizes Subcommittee Ranking Member Ms. Schakowsky for five minutes for an opening statement. Good morning.

Rep. Jan Schakowsky (D-IL):

Good morning everyone. Thank you so much, Mr. Chairman. I really want to begin by saying how proud I am of the work that this subcommittee has done over the years, particularly in the last session of Congress, in a bipartisan way. And I am really looking forward, as you said in your opening remarks, to doing this together, to going forward. We were almost there. We were able to pass the American Data Privacy and Protection Act in an almost unanimous way, again working together. We heard the cry of the vast majority of Americans who are really tired of feeling helpless online. We heard from stakeholders from every corner of government, civil society, and industry at six different roundtables that we had. But absent any action by the Congress, big tech is collecting ever more information about us, our personal information, our intimate data, and these companies know our habits.

They know our finances, where we are, where we live, where we're going. And when you browse the web or wear a smartwatch, a tech company is tracking you. So they use this data to manipulate us, to addict us, and to keep us on their platforms so that they can provide even more ads to us. Or they sell the data to the highest bidder, so that companies whose names you don't even know can build a profile about you. Harmful targeting of advertising on social media has exacerbated the mental health problems that we face, particularly among our young people. Our adolescents, our kids, our teenagers are the most vulnerable.

We have to make sure that we are protecting them. All this is in the name of profit. It is time. It is time, and the time has really passed, I think, for us to do a data privacy law. And I really, really look forward to working together. Our past effort, I think, once again provides the guidelines for how we can move together, and I absolutely look forward to building on the momentous gains that we have made. And so I think it's time for us to roll up our sleeves and, in a bipartisan way, get to work. The United States is far behind, and we need to catch up with states that are beginning to introduce their own privacy laws, many different ones from around the country, and to give consumers what they want. And with that, I yield back.

Rep. Gus Bilirakis (R-FL):

I thank the Ranking Member. The Chair now recognizes the Chair of the full committee, Ms. Rodgers, for five minutes for her opening statement.

Rep. Cathy McMorris Rodgers (R-WA):

Good morning. Thank you to the witnesses for being here this morning. Really appreciate this panel. Your testimony is essential as we keep the momentum going, as Ms. Schakowsky was just mentioning, for strong data privacy and security protections for all Americans. This subcommittee's first hearing this year focused on data privacy and security to ensure America's global competitive edge against China. Today's second hearing in our series will consider what a strong national data privacy standard will mean in our everyday lives to rein in big tech, protect kids online, and put people in charge of their data. These discussions build on the bipartisan, bicameral ADPPA, which moved through this committee last year with a vote of 53 to 2. That was the first time this committee reached such a milestone, and no other committee has come close on a national privacy and data security standard with the bipartisan support necessary to clear the House and make the Senate take notice.

This is a new Congress with new considerations, so we must continue to improve on the legislation from the last Congress and build consensus among stakeholders. Bringing together experience in business, civil society, and government is the three-legged stool that will support our efforts in developing bipartisan comprehensive privacy and data security legislation. We must continue our work so individuals can exercise their rights, businesses can continue to innovate, and the government's role is clearly defined. Today turns that conversation inward, so we are preserving the engine of innovation while ensuring that we aren't just dollar signs for data brokers and big tech. They are harvesting people's data, selling or sharing it without their knowledge, and not keeping it secure. We need a national data privacy standard that changes the status quo regarding people's data. Right now, there are no robust protections. Americans have no say over whether and where their personal data is sold and shared.

They have no guaranteed way to access, delete, or correct their data, and they have no ability to stop the unchecked collection of their sensitive personal information. This isn't acceptable. Data brokers' and big tech's days of operating in the dark should be over. People should trust their data is being protected. We're at an inflection point to ensure our personal information is responsibly collected, so artificial intelligence is developed with our values. We need to ensure that the metaverse doesn't become the next frontier of exploitation for our kids. That requires a broad, comprehensive bill that will address all Americans' data and put even stronger guardrails around our kids. That's why the American Data Privacy and Protection Act included the strongest internet protections for children of any legislation last Congress. And those protections did not stop with kids: ADPPA gave everyone data protections, no matter where they live and no matter their age. We will continue to build on ADPPA this Congress and get these strong protections for our kids and all Americans signed into law. I wanna thank Ranking Member Pallone, the Ranking Member of this subcommittee, Jan Schakowsky, the Chairman of this subcommittee, Gus Bilirakis, and colleagues on this committee across the aisle for working together on this legislation. We have a shared goal here, and we're gonna continue this work and we're gonna get it done in this Congress. I look forward to today's hearing and to our privacy series continuing on March 23rd, when the TikTok CEO is before this committee. Thank you, and I yield back.

Rep. Gus Bilirakis (R-FL):

Thank you. I wanted to thank the Chair. And again, as you said, we got it close to the finish line; we gotta get it across the finish line this time. But we did our job last Congress, under your leadership, Madam Chair, and the leadership of the Ranking Member, so we can make a good bill even better. So we appreciate that very much. And with that, the Chair recognizes the Ranking Member of the full committee, my friend Mr. Pallone, for his five minutes.

Rep. Frank Pallone (D-NJ):

Thank you, Chairman Bilirakis. Last Congress, when I chaired the committee, I was proud to work with then-Ranking Member, now Chair, Rodgers and the other subcommittee leaders on the American Data Privacy and Protection Act. That was the first bipartisan and bicameral comprehensive data privacy legislation in decades, and it was a historic achievement, with a 53 to 2 vote out of committee. In this subcommittee's first hearing of this Congress, I was pleased, but not surprised, to hear Chair Rodgers reaffirm her commitment to advancing this bill. Simply put, as we will hear from today's witnesses, we need comprehensive federal data privacy legislation, and we need it urgently. Today, many of our essential consumer products, especially those offered by the largest tech companies, require consumers, including children and teens, to trade their personal data for services. And this is not a real choice.

People can't thrive in our digital economy without access to websites, mobile applications, email services, and other forms of online communication. Members of both parties talk a lot about holding big tech accountable, and I firmly believe that the way to do that is by adopting a strong national privacy standard that limits the excesses of big tech and makes the digital world safer. The testimony we will hear today will illustrate the fact that the lack of a national privacy standard doesn't just hurt consumers. It also hurts small and emerging businesses by favoring big providers at the expense of new competitors. Providing certainty to all consumers, businesses, and markets about fair and appropriate data collection and use is crucial for continued American innovation. We simply cannot go another Congress without passing comprehensive privacy legislation. Our legislation last Congress included input from many of you on this subcommittee and countless other stakeholders.

It directly confronts, and reaches important compromises on, the sticking points that derailed earlier congressional efforts. The American Data Privacy and Protection Act will put people back in control of their personal data, stop data collection abuses by big tech, provide important protections for kids, rein in the shadowy world of data brokers, and establish strong federal data security standards. The legislation achieves all this by starting with a fundamental shift in how data is collected, used, and transferred. It rejects the coercive notice and consent system that has failed to protect Americans' data privacy and security. Instead, the bill adopts a data minimization obligation. It requires companies to limit the personal information they collect. They will only be able to collect what is reasonably necessary and proportionate to providing the services that consumers are requesting. At this subcommittee's first hearing this year, we heard testimony that data minimization protects consumer privacy and is critical for cybersecurity and national security.

And that's exactly what our bill did. And again, the American Data Privacy and Protection Act also protects kids from big tech. It bans targeted advertising to children under 17, and covered entities will not be able to transfer covered data belonging to children without consent. To help enforce these protections for kids, the bill establishes a Youth Privacy and Marketing Division at the Federal Trade Commission. Our legislation also shines a light on the shadowy world of data brokers that profit from buying and selling our personal data. These companies don't interact with consumers directly, but they do collect and sell massive amounts of consumer data, including sensitive personal data like health information and precise geolocation data that identifies a consumer's location within 1,850 feet. We must stop these data brokers from collecting, using, and selling consumers' data without their knowledge or permission. The American Data Privacy and Protection Act will require data brokers to register with the FTC and will provide consumers with a single mechanism to opt out of data collection by all registered brokers.

Now, while Congress has stalled on privacy for years, the rest of the world has not, ceding American leadership on technological regulation. The European Union has passed comprehensive privacy laws, and this bill would immediately reset the global landscape. So I want to thank the witnesses for being here today to shed even more light on the need for a national privacy standard. I want to thank Chairwoman Rodgers, Ranking Member Schakowsky, Chairman Bilirakis, and the members of this subcommittee for their really tireless efforts and their unwavering commitment to moving a comprehensive data privacy bill across the finish line this Congress. I know that we can do it, so thank you again, and I yield back to the Chairman.

Rep. Gus Bilirakis (R-FL):

I thank the Ranking Member. We've now concluded member opening statements. The Chair would like to remind members that, pursuant to the committee rules, all members' opening statements will be made part of the record. We'd like to again thank all of our witnesses for being here earlier than normal to testify before the committee. Today's witnesses will have five minutes to provide oral testimony, which will be followed by a round of questions from members. The witness panel for today's hearing includes Mr. Graham Mudd, who's the Founder and Chief Product Officer of Anonym. I asked him yesterday if he was related to the late Roger Mudd, who was a great journalist, and he said yes, distantly. That's cool. If you don't ask, you don't get the answer. And then Ms. Alexandra Reeve Givens, who is the President and CEO of the Center for Democracy and Technology, and Ms. Jessica Rich, Of Counsel and Senior Policy Advisor for Consumer Protection at Kelley Drye & Warren, LLP. So, Mr. Mudd, you're recognized for five minutes. We appreciate you being here again, sir.

Graham Mudd:

Chairman Bilirakis, Ranking Member Schakowsky, Chair Rodgers, Ranking Member Pallone, and distinguished members of this committee, thank you for the opportunity to testify at this important hearing. My name is Graham Mudd, and I am Co-Founder and Chief Product Officer of Anonym, a privacy technology company. I want to begin by thanking you for pushing forward ADPPA. I'm looking forward to the passage of strong federal privacy legislation along with strong enforcement authority. We're here to talk about creating a more privacy-safe internet for Americans. The collection, sharing, and use of data for advertising is at the heart of the digital privacy challenge facing our country and the world. We started Anonym because we believe the notion that you can't have both privacy and an efficient digital advertising ecosystem is a false dichotomy. While we're focused on building technologies that support privacy, we're also convinced that strong federal privacy legislation is necessary if we wanna make progress on this issue.

We've been part of the development of internet advertising since the early days. We spent more than 10 years helping to develop Meta's data-driven advertising business. Over the years, consumer data has become an increasingly powerful asset. The companies we worked for and competed with adopted increasingly aggressive approaches in how they used data to improve their advertising products. To be frank, we helped develop these methods. But in the past few years, we and many others have become increasingly uncomfortable with the privacy implications of the practices we helped pioneer. And so we started Anonym with a simple goal: to provide technically guaranteed privacy protections to consumers while enabling effective digital advertising. Today, digital advertising is supported by the wholesale and unregulated sharing of individual-level data between advertisers and the companies that run ads for them. The mechanics are fairly complex, so I'll just use a recent personal example.

My wife and I are doing a few renovations at our home, so I've been spending a lot of time on home improvement sites like Home Depot. Not surprisingly, I see ads for products I've researched, and some I haven't but might find interesting. Most of you, and most Americans, are familiar with this experience. Sometimes it's useful; oftentimes it's a bit unsettling. So how did this come to be? Well, the majority of companies who run digital ads, including the Home Depot, have added tracking software from the dozens of ad platforms that they do business with. These trackers are from ad tech companies most of you have never heard of, in addition to large tech companies like Google and Meta, Pinterest, et cetera. Now, these trackers collect information about my browsing and buying at sites like Home Depot, and they share that data with ad platforms.
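
[Editor's note: for readers unfamiliar with the mechanics Mr. Mudd describes here, the sketch below illustrates, in simplified TypeScript, how a third-party tracker embedded on a retailer's page can report browsing events to an ad platform under a long-lived cross-site identifier. The endpoint, cookie name, and field layout are invented for illustration and do not depict any particular vendor's system.]

    // Illustrative only: a simplified third-party tracker of the kind
    // described above. A retailer embeds this script; it reports each page
    // view to an ad platform, keyed to a long-lived identifier that follows
    // the user across every site carrying the same tracker.

    const AD_PLATFORM_ENDPOINT = "https://tracker.adplatform.example/events"; // hypothetical

    interface TrackingEvent {
      userId: string; // long-lived ID stored in a third-party cookie
      site: string;   // the retailer's domain
      page: string;   // which product or category page was viewed
      event: "page_view" | "add_to_cart" | "purchase";
      timestamp: number;
    }

    // Reuse (or mint) the cross-site identifier; this persistence is what
    // makes profile-building across millions of sites possible.
    function getOrCreateUserId(): string {
      const match = document.cookie.match(/tracker_uid=([^;]+)/);
      if (match) return match[1];
      const id = crypto.randomUUID();
      document.cookie = `tracker_uid=${id}; max-age=31536000; SameSite=None; Secure`;
      return id;
    }

    // Beam the event to the ad platform, where it can be joined with
    // events from other sites into a behavioral profile.
    function report(event: TrackingEvent): void {
      navigator.sendBeacon(AD_PLATFORM_ENDPOINT, JSON.stringify(event));
    }

    report({
      userId: getOrCreateUserId(),
      site: location.hostname,
      page: location.pathname,
      event: "page_view",
      timestamp: Date.now(),
    });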

This data allows platforms to effectively target ads to me, and it allows advertisers like Home Depot to measure how well those ads work so they can spend their ad dollars efficiently. But at scale, this approach allows ad platforms to build tremendously rich profiles of people's browsing and buying behavior across millions of websites. Now, does the average American expect and appreciate that their internet behavior on millions of sites is being beamed to dozens of advertising companies so they can build a profile on them? Of course they don't. We call this the profiling problem, and we believe the profiling problem is at the heart of the privacy challenge we should all be focused on. The solution to this challenge, we believe, requires two ingredients. First, strong federal privacy legislation: legislation that ensures that Americans' data is collected and, importantly, shared only in ways they'd reasonably expect, or with their explicit consent; legislation that increases protection for children beyond COPPA; legislation that unifies the current protections that exist at the state level to provide protections for all Americans.

Legislation that provides for strong and clear enforcement authority. And we believe that enlightened legislation like ADPPA has all of these components. The second critical ingredient is technology. After all, technology got us into this problem, so it stands to reason it can help get us out of it. Privacy-enhancing technologies are used in many other industries, in financial services, in pharmaceuticals, and in government, to extract value from data without compromising the privacy of individuals. A number of companies, ours included, are working to apply these technologies to make digital advertising more private by default. These technologies can, in effect, reduce the cost of improving privacy. So while technology can help, ultimately we've gotta be clear-eyed about the incentives at play. We'd all love for ad platforms and publishers to proactively adopt more privacy-preserving technologies, but doing so alone means putting oneself at a massive competitive disadvantage. A strong regulatory backstop is critical in addressing this incentive problem. With regulation in place, I'm confident that we and others will find innovative ways to leverage privacy-enhancing technologies to support business growth while guaranteeing the privacy of all Americans.
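
[Editor's note: the sketch below illustrates one privacy-enhancing technique of the general kind Mr. Mudd references: releasing ad conversion measurements only as a noisy aggregate, in the style of differential privacy, rather than as individual-level records. The function names and parameters are invented for illustration; this is not a description of Anonym's actual methods.]

    // Illustrative only: instead of handing the ad platform each user's
    // conversion record, release a single aggregate count with calibrated
    // random noise (a Laplace mechanism), so no individual's purchase can
    // be inferred from the reported total.

    interface Conversion {
      campaignId: string;
      purchased: boolean;
    }

    // Draw from a Laplace(0, scale) distribution by inverse transform sampling.
    function laplaceNoise(scale: number): number {
      const u = Math.random() - 0.5; // uniform on (-0.5, 0.5)
      return -scale * Math.sign(u) * Math.log(1 - 2 * Math.abs(u));
    }

    // If each user contributes at most one conversion, the count has
    // sensitivity 1, so noise of scale 1/epsilon gives the released
    // total epsilon-differential privacy.
    function privateConversionCount(
      conversions: Conversion[],
      campaignId: string,
      epsilon: number,
    ): number {
      const trueCount = conversions.filter(
        (c) => c.campaignId === campaignId && c.purchased,
      ).length;
      return Math.round(trueCount + laplaceNoise(1 / epsilon));
    }

    // The advertiser learns roughly how well the campaign performed,
    // but not whether any particular person bought anything.
    const sample: Conversion[] = [
      { campaignId: "spring-sale", purchased: true },
      { campaignId: "spring-sale", purchased: false },
      { campaignId: "spring-sale", purchased: true },
    ];
    console.log(privateConversionCount(sample, "spring-sale", 1.0));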

Rep. Gus Bilirakis (R-FL):

Thank you. Thank you, Mr. Mudd. Appreciate it very much. Ms. Givens, you're recognized for five minutes.

Alexandra Reeve Givens:

Thank you, Mr. Chair, and thank you, committee members, for the opportunity to testify on the importance of data privacy and the urgent need for Congress to pass a meaningful federal privacy law to protect consumers, create certainty for businesses, and restore trust in the online ecosystem that is so essential to our economy and our society. I'm Alexandra Reeve Givens, and I have the privilege of leading the Center for Democracy and Technology, a non-profit, non-partisan organization that defends civil rights, civil liberties, and democratic values in the digital age. For over two decades, CDT has advocated for Congress to adopt strong privacy protections, and we're grateful for the work of this committee and its jurisdictional counterparts in raising public understanding of privacy harms. By our count, this is the 31st hearing in the US Congress on consumer privacy in just the past five years. These substantive hearings have built a rigorous and detailed record of the overwhelming need for a comprehensive federal privacy law.

We commend the committee's focus on this issue early this session, because it's long past time for Congress to act. Looking for information on your device can feel very private, but with every click and scroll, companies collect information about your activities, typically using, sharing, or selling that information to make inferences about you or so you can be targeted with ads. A visit to a single webpage can involve hundreds or even thousands of cookies or beacons tracking your activities on that site. Websites you've visited and search queries you have entered can be collected and shared. In addition to your cell phone provider knowing your general whereabouts, apps on your phone can track and may share your location with anyone willing to pay a price, revealing where you live and work, where you socialize, what doctors you visit, and where you pray. Consumers also share an incredible amount of personal and private information with different apps and online services, whether it be details about our physical health, our sleep cycles, our mental health, our social messages, and family photographs.

In addition to direct collection by companies, all of that data can now be shared with third parties such as data brokers, which are companies that aggregate information about users and market it, primarily for targeting ads. The huge variety and scale of data points gathered by data brokers allows precise inferences to be drawn about individual users. A 2013 report by the Senate Commerce Committee detailed how data brokers assign profiles to people, including categories like "Suffering Seniors," "Rural and Barely Making It," and "Ethnic Second-City Strugglers." A report published by researchers at Duke University just last month revealed that data brokers were selling mental health information, including, for example, a list titled "Consumers with Clinical Depression in the United States." This committee published a report on privacy concerns raised by data brokers as early as 2006, but these practices haven't been reined in. When consumers learn about companies' data practices, they're offended. But the issue is about more than just offensive stereotyping or privacy leakage.

It can lead to social, psychological, and economic harm. It might not seem all that important if a person is targeted with particular clothing ads, but it matters when predatory lenders can hyper-target an audience that is vulnerable to payday loans and exploitative interest rates, as has happened with veterans and families navigating medical crises. It matters when scammers can target their ads to seniors who are more likely to fall for schemes hawking low-cost medical devices. It matters when inferences about people are used to unfairly target ads for jobs, housing, or credit, the gateways to economic and social opportunity. My written testimony details how loose data practices can also raise national security harms. The lack of a comprehensive federal privacy law is leaving consumers open to exploitation and abuse. Under current law, Americans' main privacy protections rely on a theory of notice and consent, under which companies can set their own privacy rules and collect whatever data they like, provided they disclose it to their customers in their lengthy terms of service.

Any modern user of technology knows why this notice and consent model is broken. Even if a consumer could feasibly read and understand these labyrinthine privacy policies, they often have no real choice but to consent. Many online services are such an important part of everyday life that quitting is effectively impossible. We have to move on from this broken regime of notice and consent to one that establishes baseline safeguards for consumer information, clear rules of the road for businesses, and meaningful enforcement of the law. This must include specific protections for sensitive information and protections for civil rights. The bipartisan American Data Privacy and Protection Act is the place to start. Last year, this committee did admirable work forging a bipartisan compromise that offers strong protections for consumers while also accommodating business realities. To be clear, CDT and other consumer groups wish the bill offered stronger protections in places. This is not our perfect bill, but this committee put in the work to achieve meaningful compromise. Respectfully, we urge you to build on that momentum by taking up the bill without delay. I thank the committee again for your leadership, and I look forward to answering your questions.

Rep. Gus Bilirakis (R-FL):

Thank you so very much. I appreciate it. Ms. Rich, you're recognized for five minutes.

Jessica Rich:

Thank you, Chairman Bilirakis and Ranking Member Schakowsky and the rest of the members of the committee. I'm Jessica Rich, Of Counsel and Senior Policy Advisor for Consumer Protection at Kelley Drye & Warren, LLP. I'm pleased to be here today testifying on the need for federal privacy legislation. I really wanna thank this committee for its bipartisan leadership on this important issue over the course of years. I also wanna make clear that my remarks today are my own, based largely on my years of government service. As background, I worked for over 26 years at the Federal Trade Commission, the last four as Director of the Bureau of Consumer Protection. Much of my FTC career was devoted to data privacy and security. I was the first manager of the FTC's privacy program and continued to lead its expansion as I rose through the ranks at the agency. In my various roles, I developed or oversaw enforcement against hundreds of companies that failed to protect consumers' personal information, rulemakings to implement privacy laws such as the Children's Online Privacy Protection Act, and dozens of FTC workshops and reports on emerging issues.

During my time there, I also wrote or oversaw multiple recommendations to Congress seeking stronger legal authority and remedies for privacy and security. The years have come and gone, with multiple hearings and privacy bills, and as we all know, there's still no federal privacy law over two decades later. Today, the need for a federal privacy standard has never been greater, and there's no substitute for congressional action here. Federal privacy legislation is simply the best way to create a consistent set of rules for consumers and businesses, to fill in the many gaps in our privacy patchwork, to enlist multiple enforcers in policing the marketplace, and to provide much-needed credibility abroad. Although I could expand on every single one of those points, I'm gonna focus on a related issue, which is why the FTC needs a federal privacy law. As much as the FTC has been able to do with the tools it has, it needs more authority from Congress to be a truly effective privacy enforcer.

In fact, under current law, the FTC's legal authority is limited, whether it's pursuing case-by-case enforcement under the FTC Act or attempting to develop a privacy regulation. I'll explain why briefly here, but I refer you to my written remarks for more details. First, because there's no comprehensive federal privacy law, the FTC has had to bring most of its privacy enforcement under Section 5 of the FTC Act, a general-purpose consumer protection law enacted long before the internet existed or was even thought about. Section 5 prohibits unfair or deceptive practices, and each of these standards has a three-part legal test. Sometimes the legal tests simply don't work for privacy because they weren't written with privacy in mind. For example, to prove unfairness, the FTC must show that a practice causes or is likely to cause substantial consumer injury, which can be very difficult in privacy, where injury can be very subjective and there's a range of different types of harms.

In addition, Section 5 doesn't establish clear standards for companies to follow before a problem occurs. It's mostly reactive, allowing the FTC to challenge data practices afterwards. Finally, the FTC Act doesn't authorize civil penalties for first-time violations, and it doesn't even cover nonprofit entities or companies engaged in common carrier activities. Now, the FTC is attempting to plug at least some of these holes by developing a privacy regulation. And in theory, an FTC privacy regulation could set forth practices that companies must follow, do this, don't do that, and also pave the way for civil penalties. But this approach faces even more obstacles than case-by-case enforcement, and it will use up the FTC's limited resources too. That's because, without specific direction from Congress to develop a privacy rule, the FTC must rely on its rulemaking authority under the FTC Act, which is also called Magnuson-Moss rulemaking. The Magnuson-Moss process, which we all have a nickname for, is extremely cumbersome and time-consuming as compared with the usual rulemaking process under the Administrative Procedure Act. For example, Magnuson-Moss requires the FTC to prove that each practice it seeks to regulate is not only unfair or deceptive, but prevalent. Magnuson-Moss also includes an extra round of public comments, public hearings, and a more rigorous standard for judicial review. Rules developed under this process have typically taken years to complete. And with all the controversies surrounding privacy, we can also expect legal challenges here. There's simply no substitute for federal privacy legislation. Congress can write a law that says do this, don't do that. It can plug the holes in the FTC Act as well as in the US privacy patchwork that we all know. And, overall, only Congress can resolve the thorniest issues here and put them to rest: preemption and the private right of action. Thank you very much. I look forward to your questions.

Rep. Gus Bilirakis (R-FL):

Thank you very much. I appreciate it. I thank all the witnesses for their testimony today. Excellent testimony, by the way. We'll now move into the question-and-answer portion of the hearing. I'll begin the questioning and recognize myself for five minutes. Thank you again to the panel. We made clear that the American people deserve to have more control over their data, and we're hard at work to pass comprehensive privacy and data security legislation to do just that. But we're also committed to this effort because businesses, especially small and medium-sized businesses, need certainty. They should not live in fear of spending their time and resources on legal compliance to survive in the digital economy. Unfortunately, the opposite is occurring: the growing state patchwork is unsustainable for the American economy, and California is still adding more layers to the regulation. Ms. Rich, you referenced the FTC's current privacy rulemaking in your testimony. I want to highlight that their rulemaking would not preempt state laws, meaning more regulatory uncertainty. How would adding another layer to the current patchwork lead to negative economic impact and disrupt the ability of small and medium-sized businesses to operate?

Jessica Rich:

I agree that that would be problematic, especially since the FTC can't work through the difficult issues related to preemption that this committee and Congress can.

Rep. Gus Bilirakis (R-FL):

Thank you very much. Mr. Mudd, would you like to comment on this, please?

Graham Mudd:

Sure. You know, I think it is absolutely the case, Congressman, that a patchwork of state legislation really does hurt smaller businesses, and particularly smaller publishers, more than it does larger ones. Larger tech companies have armies of engineers that can adjust their technologies state by state, jurisdiction by jurisdiction. That's just not possible for smaller publishers and companies.

Rep. Gus Bilirakis (R-FL):

Thank you. Ms. Rich, protecting all Americans from unfair and deceptive acts is no small undertaking. As you may know, the ADPPA included a section for FTC-approved compliance mechanisms for small businesses that may have difficulty complying with the law. I know safe harbors have also helped the FTC in its ability to enforce laws. Can you speak more on that and explain why safe harbors would be helpful to the FTC in legislation such as the ADPPA?

Jessica Rich:

Thank you. If done right, safe harbors or compliance programs can increase compliance overall while also providing the certainty and the flexibility that certain businesses, especially small and medium-sized businesses, need. The idea is that an independent organization can create a compliance program that meets or exceeds the standards in the law, and the FTC approves it using a rigorous process. Then companies that need this kind of structure and guidance and help can join the program and be evaluated and certified for compliance, and thus comply with the law. If the requirements are rigorous, which they are in the ADPPA, it expands compliance while also providing certainty for the companies that join these programs.

Rep. Gus Bilirakis (R-FL):

Very good. Thank you very much. I'll yield back, and now we'll recognize the Ranking Member of the subcommittee for her five minutes of questions. Thank you.

Rep. Jan Schakowsky (D-IL):

First of all, let me just say how cheered I am by the consensus that we have. You know, we've got a practitioner, we've got a not-for-profit, we've got government, and we've got, it seems, Republicans and Democrats. So let's move forward. So, the question, let me start with Ms. Givens. It seems to me that the current notice and consent privacy regime really doesn't work very well for consumers. So is there a better approach, and how would you describe it?

Alexandra Reeve Givens:

Thank you for the question. And you're absolutely right that the current model of notice and consent is broken, and I think any person that uses the internet or a device today knows that, right? We are forced to click through long terms of service that many people do not stop and take the time to read. And even if we could take the time to read them, consumers don't feel like they have a choice. Often we need to be able to access a service to communicate with friends or family, for example. So instead, what we need is the model pursued in the ADPPA, which is strong baseline protections for consumers' data that don't rely on somebody clicking through on whatever a company has chosen to disclose in its terms of service, but instead provide baseline protections and rules of the road. These include things like data minimization protections, so the assumption that companies can only collect, process, and share data in the course of delivering the service that the user expects, as well as heightened protections for sensitive categories of data, which include anything from precise location information to health data to other biometric information. Those are the types of rules that we need to give consumers confidence again in the online ecosystem and also to help businesses know how to govern their practices.

Rep. Jan Schakowsky (D-IL):

Thank you. So, Mr. Mudd, I also wanna talk to you about the burden that I think is really on consumers themselves right now under the notice and consent regime. How does this play out in the ad tech world? I mean, do you know anybody who reads all of the... I once brought in the pages and pages of the terms of service and all of that. So I just wondered how you'd comment on what we need to do better, and I don't wanna see more burdens saying the consumer has to do more to protect themselves.

Graham Mudd:

Sorry, I couldn't agree more. I do believe that the current approach is wholly insufficient in protecting consumers. And I think your assumption that the vast majority of people do not read privacy policies or terms of service, of course, is correct. And therefore, consumers do not understand how the data that they emit is being used, transferred, collected, et cetera.

Rep. Jan Schakowsky (D-IL):

So what do you do to help the consumers? How is your business different?

Graham Mudd:

As Ms. Givens pointed out, I think the whole point of the technologies that we and others are developing is to raise the baseline, to not allow the kinds of data sharing that have taken place in the past, as opposed to asking consumers, putting the work on them, to make decisions that they're not well informed to make and certainly don't have the time or inclination to focus on. And so it is all about privacy by default, data minimization, moving the bar up instead of putting the work on the consumer.
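
[Editor's note: a minimal sketch of what the "privacy by default, data minimization" posture Mr. Mudd describes can look like in code: each service declares the fields reasonably necessary to deliver it, and anything else a client submits is dropped before storage. The services and field names are hypothetical.]

    // Illustrative only: a collection gate that applies data minimization
    // by default. Fields not necessary for the requested service are
    // discarded, never stored, with no action required from the consumer.

    const NECESSARY_FIELDS: Record<string, Set<string>> = {
      checkout: new Set(["name", "shippingAddress", "paymentToken"]),
      newsletter: new Set(["email"]),
    };

    function minimize(
      service: string,
      submitted: Record<string, unknown>,
    ): Record<string, unknown> {
      const allowed = NECESSARY_FIELDS[service] ?? new Set<string>();
      return Object.fromEntries(
        Object.entries(submitted).filter(([field]) => allowed.has(field)),
      );
    }

    // A newsletter signup keeps only the email, even if the form also
    // captured location or browsing context along the way.
    console.log(
      minimize("newsletter", {
        email: "reader@example.com",
        preciseLocation: "40.7128,-74.0060", // dropped by default
        pagesViewed: ["/sale", "/tools"], // dropped by default
      }),
    ); // -> { email: "reader@example.com" }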

Rep. Jan Schakowsky (D-IL):

Yeah. And let me ask you, Ms. Rich: what you're talking about is that we have the tools, or we can have the tools, through the Federal Trade Commission. How important, then, do you think is the role of the FTC as a regulator?

Jessica Rich:

Oh, the FTC as the regulator here is critical. They've been doing this work for 25 years. They have enormous sophistication about the issues. They have the will to protect consumers, and they just need better tools, stronger legal tools to protect consumers across the marketplace.

Rep. Jan Schakowsky (D-IL):

Well, you know, I do have time, but I wanna say that the witnesses that we have today, I think, can really be helpful to us as we move forward to make sure that the law that we did pass can be improved, can be made better, so that during this Congress we can get across the line that I think we have really come close to right now. And, you know, the United States of America really owes it, I think, to our consumers. We are just too far behind. We owe it to our children, we owe it to our families, and we owe it to legitimate businesses to make sure that we move forward. So let me just say thank you very much, and I yield back.

Rep. Gus Bilirakis (R-FL):

The lady yields back. And I'd like to say to the Ranking Member: you're right, we're too far behind. Too far behind, that's the bottom line. Okay, now I'll recognize the Vice Chairman of the subcommittee, Mr. Walberg, for his five minutes of questions.

Rep. Tim Walberg (R-MI):

Thank you, Mr. Chair. And I would, sir, concur that we're far behind, but we've taken an early start here, and that's a good thing, as we've already talked about the patchwork of competing laws that are out there at the state level. And now we're working on something I think we can come together on; we've shown that. One area I think we can all agree on is the need to address children's privacy. Republicans in the House are committed to putting parents back in the driver's seat, and even grandparents back in the driver's seat, being a little personal there. That includes providing more tools to protect kids online. Kids' privacy has long been a priority for me. In past Congresses, I introduced bipartisan legislation that would update COPPA for our increasingly digital world, and the ADPPA included additional protections for those under the age of 17. Ms. Givens, how should children's privacy protections be addressed differently than those for adults in a comprehensive privacy law?

Alexandra Reeve Givens:

Thank you for the question, and thank you for your leadership on this issue to protect children across the country. The ADPPA includes some important protections for children and is specific in calling them out. One is the additional division created at the FTC to focus on this issue, but additionally, there are protections, for example, prohibiting the targeting of ads to children and teenagers under the age of 17, and also express limits on the sharing of their information without express opt-in consent. This matters because our children are being targeted online, and unfortunately, COPPA right now is not up to the job in addressing abusive data practices. But what's critically important is that we can't just focus on the privacy needs of kids. We need to do this in a comprehensive way that protects all consumers. And the reason we need to do that is, when you only focus on protecting the interests of children, you actually create new obligations, for example, to test for people's ages, that can sometimes undermine people's privacy. So what we need to do is take the approach that's followed in ADPPA today, which is to lift up privacy protections for all consumers and then charge those additional resources with protecting kids in additional ways, to make sure that we really are living up to what our children need online.

Rep. Tim Walberg (R-MI):

Okay. What's good for one can be good for all in a great way as we do it comprehensively. Ms. Rich, COPPA currently includes an actual knowledge standard for information collected on those under the age of 13. The law was passed in 2000, and the FTC last made rule changes in 2013. How has the landscape changed since then? And is an actual knowledge standard still appropriate?

Jessica Rich:

You went right to the heart of the issues, didn't you?

Rep. Tim Walberg (R-MI):

Sometimes I do it right. Yeah.

Jessica Rich:

Yeah. The FTC has not updated COPPA, and a lot of people wonder why. In public remarks, there was some suggestion that they're waiting to see if Congress updates the law so that, you know, they don't have to do it twice. But I can't fully understand why they're not using the tools that they have. COPPA is very outdated. Information collection has just exploded even since 2013, and it was pretty considerable then. And there are all sorts of new practices in the marketplace. And we really do need special protections for kids and teens, as is included in the ADPPA.

Rep. Tim Walberg (R-MI):

Okay. Ms. Givens, your testimony referenced a report by Duke University, which was interesting, which revealed that data brokers were selling mental health information to advertisers. This included whether someone has depression, insomnia, Alzheimer's disease, or other medical conditions. I read the same report and am extremely concerned. HIPAA was created to protect our medical information, but with the explosion of health apps, that data is no longer just held by your doctor's office. What gaps are there in protecting medical privacy, and how do we fill 'em?

Alexandra Reeve Givens:

Thank you for the question, because this is an urgent problem. You're right that HIPAA does protect data, but only when it's held by a covered entity, which doesn't include any of the commercial apps or services that users interact with every day, sometimes sharing really important mental health insights, if you're using an app to do, you know, journaling. And in addition, there are inferences that companies can make about you based on your behavior, from which they might be inferring some of the medical conditions that we just described, which is why we have to ...

Rep. Tim Walberg (R-MI):

True or untrue.

Alexandra Reeve Givens:

Yeah. Yeah. So, which is why we have to have a comprehensive privacy law to fill those gaps for all of the non-HIPAA-covered entities that are still making inferences and deductions about people's mental health status, as well as other medical conditions.

Rep. Tim Walberg (R-MI):

Well, thank you. My time has expired. Mr. Mudd, I have another question, the best question you could ever have, but I'll submit it for the record.

Rep. Gus Bilirakis (R-FL):

Sounds good. <laugh> Thank you very much. I appreciate that. <laugh> And now we'll recognize the gentleman from Florida, Mr. Soto, for his five minutes. Florida is very well represented on both sides of the aisle in committee. It's just a coincidence, right?

Rep. Darren Soto (D-FL):

Thank you, Chairman, for recognizing me. You're making Florida proud. You know, it was nothing short of a mild miracle last term when we saw both parties come together to pass the American Data Privacy and Protection Act out of the committee. When you look at some of its key sections, like the sensitive covered data section, it reads really like an internet privacy bill of rights: information that everyday Americans would think would already be protected is still subject to the risk of being distributed and used in commerce, like people's Social Security numbers and health information, financial account information, debit card information, biometric information, genetic information, your precise geolocation at this very moment, your private communications like voicemails, emails, and text messages, and email account logins and passwords, information identifying people's different social behaviors, as well as calendar information, address book information, and so many other things that we would all shudder to know could be sold and used for profit to help target people in a really intimate way that violates our notions of privacy in the nation.

You know, Florida does not have internet privacy laws; even though we have a privacy amendment in our constitution, legislation has failed a couple times over enforcement disputes. So our state of 22 million Floridians is left vulnerable by not having these rights, which is why it's time for us to step up to create a national standard. Not to mention that I can't think of anything more related to interstate commerce than the internet. So it's a really important time for us. Ms. Givens, it'd be great to get your opinion on this list of basic data covered, on these basic rights that we have. And do you think there should be any others added?

Alexandra Reeve Givens:

In my opinion, the ADPPA did an excellent job capturing many of the major categories of sensitive data. You listed many of them, and to the point that you made, these are things that Americans already expect to be protected. They are horrified when they find out that they are not protected, and they want baseline safeguards in place to make sure that they can trust the services they consult online. While the list is strong and good now, there needs to be ongoing flexibility to add to it in the future, because we know that the marketplace will continue to innovate. We cannot foresee what data uses may arise in the next 5, 10, 15, 20 years, which is how long, of course, this law would likely be in place governing user behavior. And so one of the important innovations in the bill is to leave some room for the FTC to fill in the gaps where needed and be responsive to emerging cases, which the FTC can do based on rulemaking procedures and stakeholder consultation as new norms evolve. And that, I think, is the right approach that the bill takes today.

Rep. Darren Soto (D-FL):

So do you believe that kind of flexibility is already included, that there's already enough in the ADPPA for the FTC to recognize these new types of information?

Alexandra Reeve Givens:

I do think so. I think the covered list that we have now, the fact that it includes both the data itself and inferences that may reveal that information, coupled with the ability to fill in gaps in the future, is a really important combination.

Rep. Darren Soto (D-FL):

Thank you. And Ms. Rich, we know how important enforcement is. We saw in Florida that it was the key sticking point that kept our state from actually having a new law, and I'm very concerned that we don't end up having a toothless tiger here. The bill we passed out of the committee last year had a role for the FTC, a private right of action, and a role for state attorneys general. How critical is it to have all three of these mechanisms in place? And can you give us any guidance on that?

Jessica Rich:

Well, having been part of this debate for over 20 years, I would say whatever it takes for you guys to agree on a law is what I support. But I do think that some level of consistency is important, which is why I do support some level of preemption and some limits on private litigation, especially since private litigation sometimes benefits attorneys more than consumers. But I actually think the model in the ADPPA is very good, because it empowers the FTC, and it empowers not just all the state attorneys general but other officers in the states that might have a role in privacy. And so, given that the state attorneys general have been very active in privacy, I think this new tool would empower them even more. And we'd have a lot of cops on the beat.

Rep. Darren Soto (D-FL):

Well, thanks for that opinion. You know, many tech companies are running circles around government enforcement right now, so it's very important, in my opinion, to have balance between the FTC and state attorneys general and having some private right of action. Thank you for your opinions, and I yield back.

Rep. Gus Bilirakis (R-FL):

Thank you, sir. Appreciate it very much. Now I'll recognize the gentleman from South Carolina, Mr. Duncan, for his five minutes.

Rep. Jeff Duncan (R-SC):

Thank you, Mr. Chairman. And this has been an informative hearing. I'm an energy guy, so this isn't in my wheelhouse, but it is educating me on the issue. So, just one real quick question. Ms. Rich, we've heard a lot from downtown over the preemption clause in the ADPPA, namely that it doesn't go far enough, especially with respect to overly restrictive provisions coming outta California, while the previous Speaker of the House disagreed with that sentiment. I understand there are concerns over certain carve-outs that are not otherwise addressed in the bill. Could you speak to that?

Jessica Rich:

Well, as I said, I do think that there is some level of preemption that should be in the bill so we get as much consistency as possible. I would also note that, by the measure of many consumer advocacy groups who are reading all of these bills and laws very carefully, the ADPPA is stronger than existing state laws for the most part, with maybe one provision here and there that's stronger. But, you know, in an effort to compromise, this committee carved out certain things, including the California private right of action. So, as I said, whatever it takes. But I do believe the ADPPA is the strongest law, the strongest bill, we've seen anywhere on privacy.

Rep. Jeff Duncan (R-SC):

Well, thank you for that. I think we need some sort of uniformity, so folks know how to comply instead of facing a lot of different requirements, the ADPPA being an example of that. Mr. Chairman, legislative hearings, and hearings like this, are very informative. I appreciate you doing that. I don't have another question. I yield back.

Rep. Gus Bilirakis (R-FL):

I appreciate that very much. Thank you. And now we'll recognize the Ranking Member of the full committee, Mr. Pallone, for his five minutes.

Rep. Frank Pallone (D-NJ):

Thank you, Mr. Chairman. I'm concerned about data brokers collecting and selling massive and detailed amounts of information about consumers who've never interacted with these data brokers. So let me start with Mr. Mudd. In your written testimony, you point out that the scale of data collection and transfer using online mechanics is difficult to comprehend. Based on your experience working in advertising technology, could you tell us what types of information data brokers have about consumers, how they collect that information, and what they do with it? In about a minute or so.

Graham Mudd:

<laugh> I'll do my best. <laugh> So in terms of the types of data that are collected by data brokers, it's again difficult to be comprehensive here. Certainly basic demographics: your age, your gender, your household composition, and so forth. But certainly also well beyond that: your profession, the makeup of your household, the age of your children, the location, of course, of your household, oftentimes also your workplace, oftentimes even your real-time location, your income, and other financial statistics about you. And then, of course, your behaviors, both on the web, through, you know, pixels, cookies, and so forth, as well as off the web in the real world. You know, retailers oftentimes will sell data about your shopping behavior to data brokers, who will then resell that data onward to others.

And then, as others have pointed out, health conditions are oftentimes also gathered and inferred as well. Now, where do they get this information? Well, as of right now, there is very little constraint on how they can go about gathering it. And so they, of course, gather it from everywhere they possibly can. That means public databases, that means the websites that we interact with, and so forth. It means, as I mentioned, real-world retailers. And then there are even, you know, specialty location companies that try to understand where you are in the physical world and share that data with brokers.
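
To make the testimony above concrete, here is a minimal sketch of the kind of multi-source consumer profile Mr. Mudd describes. The field names, source labels, and records are all hypothetical, invented purely for illustration; no real broker's schema is implied.

```python
# Hypothetical sketch of a broker-style consumer profile assembled from
# several sources. All field names, sources, and values are invented.
from dataclasses import dataclass, field

@dataclass
class ConsumerProfile:
    profile_id: str
    demographics: dict = field(default_factory=dict)  # age, gender, household
    finances: dict = field(default_factory=dict)      # income, credit tier
    locations: list = field(default_factory=list)     # home, work, real-time pings
    behaviors: list = field(default_factory=list)     # web pixels, retail purchases
    inferences: list = field(default_factory=list)    # e.g., inferred conditions

    def merge(self, source: str, records: dict) -> None:
        """Fold one source's records into the profile, tagging provenance."""
        for category, value in records.items():
            bucket = getattr(self, category)
            if isinstance(bucket, list):
                bucket.append({"source": source, "value": value})
            else:
                bucket[source] = value

profile = ConsumerProfile(profile_id="p-123")
profile.merge("public_records", {"demographics": {"age_band": "35-44"}})
profile.merge("retail_partner", {"behaviors": "bought running shoes"})
profile.merge("location_sdk", {"locations": "lat/lon ping at 08:14"})
print(profile)
```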

Rep. Frank Pallone (D-NJ):

Oh, thank you. Well, let me have some follow-up on that with Ms. Givens. Ms. Givens, are consumers aware of these data brokers, and do consumers have any practical options to tell data brokers to stop collecting or to delete their information? And does the American Data Privacy and Protection Act that we, you know, passed outta committee last Congress, does that take the right approach on data brokers?

Alexandra Reeve Givens:

So consumers are largely unaware of data broker practices, and I think they would be extremely hard-pressed to name any. Part of the problem is that data brokers operate in an opaque layer of the digital ecosystem and don't have to interact directly with consumers, which means they don't need to earn consumer trust. Some data brokers allow users to opt out, and some states are beginning to require that they make this option available, but it's incredibly hard to exercise. First of all, you need to know who the data brokers are, and there are thousands of them. So even knowing where to go to opt out is a challenge. Second, even if one is able to go through that interface, and often it involves many steps, you have to keep going back to do it again and again, because the settings might change and they might collect new data over time. So the ADPPA has some really important provisions on this. One is that data brokers need to disclose who they are and need to register with the FTC, so there's a one-stop shop for users to go and see who the data brokers are. Data brokers also need to comply if you opt out of sharing your information with them. And they need to have a centralized mechanism that allows opting out across the entire data broker ecosystem. That's hugely important for consumers to actually be able to understand and exercise their rights.
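
The "one-stop shop" Ms. Givens describes can be pictured as a registry that fans a single opt-out request out to every registered broker. This is only a toy sketch of the concept; the registry contents and function names are invented, and ADPPA specifies the policy, not any implementation.

```python
# Toy sketch of a centralized opt-out: one request fans out to every
# broker in a (hypothetical) FTC-style registry. All names are invented.
registry = {
    "broker-a": {"contact": "privacy@broker-a.example"},
    "broker-b": {"contact": "privacy@broker-b.example"},
    # ...in practice, thousands of registered brokers
}

def opt_out_everywhere(consumer_id: str) -> list[str]:
    """File one opt-out request per registered broker; return receipts."""
    receipts = []
    for broker in registry:
        # A real system would send an authenticated request to the broker;
        # here we only record that the request was filed.
        receipts.append(f"opt-out filed with {broker} for {consumer_id}")
    return receipts

for receipt in opt_out_everywhere("consumer-42"):
    print(receipt)
```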

Rep. Frank Pallone (D-NJ):

Alright, thank you so much. I got a little over a minute left. Let me ask Ms. Rich: can you tell me about some of the most egregious practices you saw by data brokers in your time at the FTC?

Jessica Rich:

Well, this is gonna seem kind of old-fashioned, since companies can do so much more with data even than when I left the FTC in 2017. But selling data to con artists with reason to know it could be used for fraud; we had a bunch of cases like that. Failing to secure sensitive data, leading to massive breaches; many of the breach cases, especially in the early days, involved data brokers who had amassed all this sensitive information, and led, you know, to identity theft when that data was breached. Failing to vet buyers, leading to significant access to sensitive information by, you know, anyone that could pay, and again, identity theft. This is what we saw all the time. And again, the marketplace is so much more sophisticated now that I'm sure there are all sorts of other things that we could list that are even worse.

Rep. Frank Pallone (D-NJ):

Thank you very much. Thank you, Mr. Chairman.

Rep. Gus Bilirakis (R-FL):

Thank you very much, I appreciate it. Next we'll recognize the Ranking Member, excuse me, the Chair of the full committee. She's wearing her E&C colors today. That's right. So we appreciate all your great work, and I recognize you for five minutes. Thank you.

Rep. Cathy McMorris Rodgers (R-WA):

Thank you. Thank you very much, Mr. Chairman. And again, thank you to the panel for being here. I wanted to start with an issue that we've been focusing on and debating over the last few years: targeted advertising. Mr. Mudd, there's a particular line in your testimony which I think really hit the mark. You said: over the years, a tension began to emerge; the development of the rich consumer profiles that were so powerful in improving products of all kinds came at the cost of individuals' privacy. This trade-off is why we're here today. And I believe that you're right on that assessment, even if the online advertising industry doesn't want to admit its reliance on personal information and freely following Americans as they browse the internet. So the question is, do they really need personally identifiable information in order to facilitate e-commerce? And you're suggesting that there's a middle ground here. So I just wanted to ask, would your privacy-enhancing technologies, also called PETs, permit innovation in the digital advertising ecosystem to continue? And how can PETs be used to help small businesses advertise to their customers without customers feeling that the businesses know too much about them?

Graham Mudd:

Thank you, Chair Rodgers. Yes, we do believe that this is a reasonable middle ground, and that it would protect against the flow and sharing of personally identifiable information from one company directly to another, which leads, as I mentioned, to the development of these very rich profiles we've talked a lot about today. And the way that happens is, you know, reasonably straightforward. What we need to do is ensure that only the aggregated, anonymized insights that are required to understand how ads work and to improve their relevance are shared, not the individual-level data, which is not required. To give an example in another industry: pharmaceutical trials need to bring data together from, you know, the drug companies as well as the practicing physicians, but they don't wanna share individual-level data. They can use these exact same privacy-enhancing technologies to understand whether the drug worked or didn't work, without the user-level data. That's not important for the use case.
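
The principle Mr. Mudd sketches, sharing only aggregate outcomes rather than row-level user data, can be illustrated with a toy aggregation step. The cohort threshold and noise scale below are arbitrary illustrative choices, not Anonym's actual parameters or any standard.

```python
# Toy illustration of a privacy-enhancing aggregation step: only noised,
# cohort-thresholded totals leave the protected environment, never
# individual records. Threshold and noise scale are arbitrary choices.
import random

def aggregate_conversions(events: list[dict], min_cohort: int = 50) -> dict:
    """Count conversions per campaign, suppress small cohorts, add noise."""
    counts: dict[str, int] = {}
    for e in events:  # each event: {"campaign": str, "converted": bool}
        if e["converted"]:
            counts[e["campaign"]] = counts.get(e["campaign"], 0) + 1
    report = {}
    for campaign, n in counts.items():
        if n < min_cohort:
            continue  # too few users; releasing the count could identify them
        report[campaign] = max(0, n + round(random.gauss(0, 5)))  # mask totals
    return report  # aggregates only: advertisers learn "what worked," not "who"
```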

Rep. Cathy McMorris Rodgers (R-WA):

Thank you. Thank you. Ms. Rich, you mentioned in your statement the importance of creating a regulatory climate that's conducive for businesses to be able to comply. And, you know, a lot's been said about the negotiation that took place on the ADPPA. We included a right to cure in the private right of action to ensure businesses are able to comply with the law. And this is important, so businesses are not buried underneath piles of demand letters seeking payments without the opportunity to cure an alleged violation, because I don't think any of us wanna be there. So, would you speak to the benefits of a right to cure for businesses who face an alleged violation?

Jessica Rich:

Well, when I was at the FTC, I wouldn't have supported a right to cure, because it does give people a second bite at the apple to, you know, violate the law. But when it comes to a private right of action and the concerns about the effect on companies that aren't the largest companies and can't afford all this litigation, I think the right to cure is a very reasonable response to make sure that, instead of the private right of action leading to a lot of litigation, companies have a chance to get it right, one chance, and then comply and have the protections in place for consumers. And I would note that other privacy regimes include rights to cure both for private rights of action and for government enforcement.

Rep. Cathy McMorris Rodgers (R-WA):

Thank you. Ms. Givens, I know that we may not see eye to eye on every aspect of ADPPA, but I do want to thank you for your support of our work. A private right of action is a tough nut to crack in the scope of a bill like ADPPA. You highlight some of the boundaries of where the FTC can enforce harms. In your testimony, you state that the FTC's unfairness statement, which courts still cite in their opinions, says that emotional impact and other subjective types of harm will not ordinarily make a practice unfair, but might do so in extreme cases when tangible injury can be shown. So I am sympathetic to why there are strong desires to include a private right of action in such an instance when big tech may harm someone, especially a child. However, I also wanna make sure that it's not abused by the plaintiff attorneys, who would rather laws be so stringent that businesses are more likely to be out of compliance, in order to sue. Would you be willing to work with us to ensure that there are parameters for how the private right of action operates for businesses, especially businesses of different sizes?

Alexandra Reeve Givens:

Yes, Madam Chair, of course, we're always happy to work with this committee. A private right of action really is essential, because the FTC and state AGs alone won't be able to keep up with the pace of commercial activity, and consumers deserve the right to be able to vindicate their rights when state enforcement isn't stepping in. But respectfully, the ADPPA actually already puts in a lot of protections to shield small businesses and others from this risk of litigation, sometimes over the objection of consumer advocates in the negotiations. But the committee did an awful lot of work to get there. Just to give a couple of examples: the private right of action only applies to some portions of the law and cannot be used against small businesses. In addition to that, there are limits on the damages that can be pursued. So right now, the private right of action can only be used for compensatory damages and injunctive relief, not for statutory damages, which removes a lot of the incentives for more speculative litigation.

In addition, you already mentioned the right to cure. There's also an obligation for any plaintiffs, before they file suit or even send a demand letter, to give notice to the FTC and to state attorneys general, in case either the FTC or the state AGs want to bring the enforcement action instead. And there's a 60-day waiting period for that to happen as well. When you couple that with restrictions that courts have already placed on standing, making it hard for consumer groups to sue and for class actions to be filed, there are a lot of protections that I think address the concerns that you have raised, coupled finally with the reporting obligation in the bill for the FTC to assess the impact on small businesses. So again, what we see here is a hard-fought compromise, but it's one that helps make sure consumers can vindicate their rights in some circumstances while mitigating the risks of abuse against small businesses or extraneous litigation.

Rep. Cathy McMorris Rodgers (R-WA):

Thank you. Just to clarify: that was a quote I was asking you about from Ms. Rich's testimony, but I appreciate you addressing and answering my question.

Alexandra Reeve Givens:

Thank you. I'll take credit for her testimony any time.

Rep. Cathy McMorris Rodgers (R-WA):

<Laugh>. Thank you, Mr. Chairman. I yield back.

Rep. Gus Bilirakis (R-FL):

I thank the Chair. Appreciate it very much. Next we have Ms. Trahan. I recognize you for your five minutes of questioning.

Rep. Lori Trahan (D-MA):

Great. Thank you, Mr. Chairman, for organizing today's important hearing. You know, like many of my colleagues on the dais, I'm disappointed that we failed to pass the American Data Privacy and Protection Act in the full House last Congress. And I urge my colleagues, particularly those who are new to the committee, to continue working in a bipartisan way to pass a comprehensive privacy law that meets the needs of the families we represent, Mr. Chairman. The federal laws that govern our privacy today, in March of 2023, are the same ones that were in place when we had a hearing on holding big tech accountable a year ago, in March 2022. They're the same laws that were on the books when the CEOs of Google, Meta, and Twitter testified before this same committee a year prior to that, in March 2021.

In fact, they're the same laws that for decades have permitted companies to harvest our sensitive data, things like medical symptoms that we look up on a search engine, or our location, which paints a picture of where we work, where we send our kids to school, and where we pray, and to sell that data to third parties or use it in ways that are contrary to what any of us would reasonably expect. Many of us have been sounding the alarm about this for a while. In the past two years, I've sent inquiries to phone and messaging apps asking about the misuse and sale of messaging metadata to data brokers, about the sale of geolocation data, and to online gaming companies about their treatment of data collected on our teens. These companies can and should be doing better, but without comprehensive privacy legislation like ADPPA, they won't act. And it doesn't stop there.

One type of product where I wanna highlight the desperate need for an update is education technology. According to a 2021 study from the Center for Democracy & Technology, 85% of teachers and 74% of parents believe ed tech is very important to students' education, and more teachers are becoming aware of the need to thoughtfully consider students' privacy. However, a majority of parents still have concerns about student privacy, and a significant number of teachers still have not had training on privacy policies and procedures. So, Ms. Givens, with the Family Educational Rights and Privacy Act, or FERPA, having passed nearly a half century ago, back in 1974, and still being the law of the land when it comes to student data, can you describe to what extent companies that offer ed tech software are or are not covered by FERPA?

Alexandra Reeve Givens:

Thank you for the question, and for citing our report. We spend a lot of time with educators and teachers in the classroom, as well as students and their families, and so we see firsthand a level of concern about how kids' data is being used in this environment. To answer your question, FERPA applies to personal information from education records that are maintained by covered entities. That basically means public K-12 schools, plus colleges and universities that accept federal student aid. When ed tech software vendors work with those covered entities, they have to comply with FERPA. But really importantly, FERPA falls short in all of the other ways in which ed tech vendors might be engaging with and receiving information about students. So first, it doesn't contemplate harms that might result from other types of information, like when the vendor interacts directly with the student and gathers that type of record.

Second, FERPA doesn't address any of the civil rights issues that can stem from algorithmic harms, as we're seeing increasing use of AI systems deployed in educational settings. And third, FERPA's enforcement mechanisms fall directly on schools and not on the vendors, and the punishments are draconian: you lose your federal funding. We need the burden for privacy compliance to sit not just with the schools, which are so overwhelmed, but with vendors in this space as well. And so complementing FERPA with strong comprehensive privacy protections for those commercial uses of this technology is really important as well.

Rep. Lori Trahan (D-MA):

Thank you. You know, in some cases, ed tech software, as you mentioned, is not offered through business-to-school contracts. Instead, it may be a free online game or an educational app, and the data collected while on these sites or apps can later be used to target ads or sold to third parties, particularly data on our students who are 13 and older. So the idea of consent gets murky, as you mentioned, when we are talking about a student or their parents deciding between participating in class while being tracked versus not participating at all. Can you speak to how the duty of loyalty and data minimization in ADPPA would be applied to these types of sites and apps?

Alexandra Reeve Givens:

You're exactly right. So FERPA only applies to vendors when they're processing education records, which doesn't include any of the many other ways that students are interacting with technology today. I think about the experience with my own children: they download apps without going through those official channels. They're sharing a lot of information, and they're doing it to be able to have an educational experience. Again, this shows why notice and consent is broken as a model, because there isn't really a question of consent; you want to be able to access these platforms. And sadly, COPPA is falling short here too. Although, of course, it does offer some protections to services targeting children under the age of 13, that too essentially rests on a notice-and-consent regime that is really hard to operationalize in practice. So that's why we need the broader comprehensive privacy protection: to regulate those additional uses and create baseline protections for students.

Rep. Lori Trahan (D-MA):

Thank you. Thank you so much for your testimony, and I yield back.

Rep. Gus Bilirakis (R-FL):

Thank you very much. Next we'll recognize Dr. Dunn from the great state of Florida.

Rep. Neal Dunn (R-FL):

Thank you very much, Mr. Chairman. I appreciate the opportunity to discuss the importance of advancing a bipartisan national privacy and data security bill. For years, the FTC and the industry have been calling on Congress to enact a uniform data privacy bill, and it's high time we did that. A national standard will provide all Americans certainty that their data is protected while providing clear rules of the road for businesses to follow. But I know that this topic is incredibly complex and has to be carefully crafted to make sure that we protect Americans without stifling our innovation and our industry. Fortunately, the Chairman and the Ranking Member have assembled a stellar panel of witnesses with outstanding qualifications in exactly this very difficult area. So we're counting on the three of you to make this happen. No pressure. During my time on the China Task Force last year, it became clear that the Chinese Communist Party poses a huge threat to the free world in all these digital areas. They cheerfully sabotage freedom and democracy everywhere they go, and this mentality permeates all of their corporations, including those that operate in America. Ms. Givens, the Center for Democracy and Technology promotes civil liberties and democratic values in the digital age. To help provide context and clarity for our committee, can you briefly summarize the difference in civil liberties and fundamental values in the digital arena between the CCP's authoritarian system and our own?

Alexandra Reeve Givens:

So, I will admit that I am not a China expert, and I know that this committee had an important hearing last month that dived even deeper into these issues. But I will tell you why we fight for privacy legislation as a question of American democracy. And the reason is, when consumers are trying to access information, when they're trying to communicate with their loved ones, when they're trying to find and share information and express themselves, they deserve a right to not be tracked and surveilled with every step, click, and scroll that they take. People often talk about the right to privacy being the gating item that protects all of our other fundamental rights: our right to expression, our right to access information, our right to associate with other people. And I deeply believe that we need those baseline protections for people to be able to exercise their democratic rights. And that's what makes this bill so important.

Rep. Neal Dunn (R-FL):

I thank you for that. In your testimony, you highlight the ways that data brokers assign profiles to people based on information they compile from multiple sources. I'm concerned by the way the CCP could take advantage of this system to build highly individualized profiles on Americans in general. What would you say the current threat assessment is of the CCP accessing American citizens' data?

Alexandra Reeve Givens:

Well, the problem with the current digital ecosystem today is that consumers have no idea where that data's going. And it could be accessed by anybody: third parties, foreign intermediaries, foreign...

Rep. Neal Dunn (R-FL):

So this is threat level orange or higher, huh? <laugh>

Alexandra Reeve Givens:

And we need controls on that. And the way to do it is by minimizing the amount of data that companies have and putting restrictions on what data can be shared so that people actually can have confidence that when they share something, it's not being accessed by those unknown third parties.

Rep. Neal Dunn (R-FL):

Let me ask you another. So the ADPPA would require companies to notify individuals whether their data is collected by, processed in, transferred to, stored in, or in any way accessible to China, in addition to a few other concerning countries. Is this an adequate protection, or should we be fencing this data into America alone? I mean, how would we control data once it's outside our borders, whether it's in China or in a great ally like Canada? How does that happen?

Alexandra Reeve Givens:

Right. So I think the idea of fencing data is incredibly problematic. It's hard to operationalize, and it raises much bigger questions. Here, the regime that you talked about in ADPPA provides notice about when data is being transferred to some particular named countries. But more important than that, in my opinion, are the data minimization provisions in the bill, which say that, for everyone, let's be careful about how much data is collected in the first place, and then let's impose restrictions on how that data is shared. And that's the way to help rein in this unfettered sharing of and access to information by any type of unknown party, including foreign entities.

Rep. Neal Dunn (R-FL):

Excellent. Excellent. Mr. Mudd, would you like to comment on the potential benefits of greater transparency about data collection for individuals? And are there any challenges this presents to businesses in terms of complying with more transparency requirements?

Graham Mudd:

I think transparency is an important element of the solution, but by no means is it sufficient. I think it is important for consumers to certainly understand and have access to the data that is collected about them, to control it, and so forth. But as we've talked about at length here, it is really important to raise the baseline instead of just putting the work and the burden on consumers to understand the data collected and how it might be used.

Rep. Neal Dunn (R-FL):

Thank you very much for your answers. Mr. Chairman, thank you very much. And Ranking Member, thank you very much for this meeting.

Rep. Gus Bilirakis (R-FL):

My pleasure. Thank you very much. Great questions. We'll now recognize Ms. Kelly from the state of Illinois. You're recognized, ma'am, for five minutes.

Rep. Robin Kelly (D-IL):

Thank you, Mr. Chair and Ranking Member Schakowsky, for holding this hearing this morning. I'm encouraged by my colleagues on both sides of the aisle who agree that we must continue working on a national standard for data privacy for American consumers. Although I'd hoped we'd get something in this space done last Congress, as we know, it's never too late to discuss such an important topic impacting all of our constituents' lives. As we all know, almost everyone uses a smartphone, tablet, or laptop to complete mundane daily tasks, such as ordering food, shopping online, or simply searching the web for entertainment. So I'm especially interested in how data practices, which include companies sharing or selling consumer information, can be used to harm Americans. Ms. Givens, you addressed this very concern at the top of your witness testimony. Can you explain some of the specific harms resulting from companies and data brokers using, sharing, or selling consumer information?

Alexandra Reeve Givens:

Absolutely, and thank you for the question. As I outlined in my testimony, there are examples of how data brokers gather all of these different pieces of information across the web to create very detailed profiles on people and to lump them into categories, which are used for targeting of ads and other types of inference-based behavior. When we look at what some of those categories are, you can instantly see what the nefarious harm might be: "ethnic second-city dwellers," you know, "struggling seniors." This is offensive, but it also shows why those ads might be targeted to particular vulnerable populations. And that's the type of consumer harm that we need to be careful about and really need to try to rein in. The other part is when ads are being targeted to people based on protected characteristics; that can be race, gender, religious identity, many other factors, or approximations of those factors. And that's another instance where we're seeing live instances of economic and social harm.

Rep. Robin Kelly (D-IL):

Also, I'm the founder of the Tech Accountability Caucus, and I wanna dig into this issue around data purpose and use limitations. I'm interested in making it easy for consumers to understand when their personal information is being collected, how it's used, and when and for what purpose it's shared. So, Ms. Givens, toward that end, in addition to requiring data minimization, do you think it would be beneficial to consumers for a federal privacy framework to include a provision directing the creation of a list of standardized privacy categories and symbols aimed at providing simple, clear indications to consumers about how their data's being treated?

Alexandra Reeve Givens:

So we need baseline rules about how data can be used, but there also, of course, need to be elements that clarify notices to consumers. We don't wanna rely on notice alone, cuz consumers can't keep up. But we do want consumers to better understand what practices are and, when there are moments to exercise their rights to agree to particular instances of data sharing, to be able to do that in an educated and efficient way. There's language in ADPPA now about what are called short-form notices; that's the term of art in the bill. But I do think that real guidance there about what that looks like, some standardized way of talking about this, perhaps the use of symbols to help people understand particular practices, could go a long way in boosting consumer education and therefore having consumers feel more empowered.

Rep. Robin Kelly (D-IL):

Thank you. Lastly, as a Black woman and a member of the Congressional Black Caucus, I'm deeply concerned with the prevalence of discriminatory digital marketing and advertising. We know companies use different data points to discriminate against consumers and cause real harm. Ms. Givens, we'll give you a break. Mr. Mudd and Ms. Rich, if you could answer this question: are there certain use limitations, for example, that could curb discrimination and help protect civil rights, especially as it relates to protecting communities of color?

Jessica Rich:

Absolutely. And the ADPPA, as you well know, includes anti-discrimination provisions that are remarkably powerful given where we've been in this debate, as well as assessment and auditing provisions to create greater transparency and accountability. And I would note that the FTC has stated that it can reach discrimination, but many of those provisions, like the accountability assessments, executive accountability, the restrictions on targeted advertising, the data broker registry, all these things we've been talking about, would be very hard for the FTC to reach. Congress needs to do it.

Rep. Robin Kelly (D-IL):

Thank you.

Graham Mudd:

I would agree wholeheartedly. I would say that it's absolutely reasonable and critical for sensitive data, race, gender, sexual orientation, and so forth, to be treated very differently from, you know, other types of behavioral data, not just in its use, but also in its collection and sharing.

Rep. Robin Kelly (D-IL):

I dunno if there's anything you wanna quickly throw in; I'm running outta time.

Alexandra Reeve Givens:

No, I'll let my colleagues do the talking. Thank you.

Rep. Robin Kelly (D-IL):

Thank you so much. And I yield back.

Rep. Gus Bilirakis (R-FL):

Thank you very much. Next we'll recognize the gentlelady from Arizona, my good friend, Ms. Lesko.

Rep. Debbie Lesko (R-AZ):

Thank you, Mr. Chairman, and thank you to all the witnesses for being here today. In-home connectivity has become a major selling point for homeowners, and voice-controlled personal assistants such as Apple's Siri, Amazon's Alexa, and Google's Google Assistant have been designed to serve as the control center for our homes. In Google's case, it allows consumers to use their voice to control third-party smart home devices around their home. This functionality requires specific data sharing between the connected device and the Google Assistant device to carry out a simple command. Google is making a change in June of this year to how these integrations work, which will significantly expand the breadth and frequency of data sharing and increase the rate at which data is collected and transferred to Google for their analysis. My question to Mr. Mudd: what changes, if any, should be made to the American Data Privacy and Protection Act, passed out of the committee last Congress, to put consumers in control of data shared through their smart home systems?

Graham Mudd:

Thank you, Representative. I cannot profess to be an expert in smart home data collection, but I will say that the collection of data not just online but offline certainly must be in scope for this legislation. And I'm happy to get back to you with some suggestions, if helpful, on how the legislation might be improved. Also happy to defer to my fellow panelists here.

Rep. Debbie Lesko (R-AZ):

And Ms. Rich, do you have any thoughts on that? Also, Ms. Rich, should certain types of smart home data be subject to higher standards of privacy controls and sharing limits? For example, data about a door lock or a security system?

Jessica Rich:

I'd have to review the long list of sensitive information detailed in the ADPPA to see if it already captures that. But certainly, when there are sensitive categories of information that might be captured by internet-of-things technology, those should have special levels of protection. Certainly kids' information also should have special levels of protection.

Rep. Debbie Lesko (R-AZ):

Thank you. For Ms. Givens: how do we strike the right balance between protecting consumers' data while not creating loopholes for criminals? We've had law enforcement express some concerns about the legislation.

Alexandra Reeve Givens:

Yeah. So, to speak to that, the bill as it stands today does not limit law enforcement's ability to pursue its investigations or to access information from companies. I understand there have been some concerns raised that, by reining in the sheer extreme volume of data that data brokers are able to gather, it might impede law enforcement's ability to do kind of one-stop shopping and go to those data brokers as a resource for their investigations. I would say on that point, we have to reach a balance here, and when we look at the unfettered collection and the additional harms being perpetrated by data brokers, I think that's an important balance to strike; we need to weigh those harms. The bill also includes some really important provisions that already consider law enforcement concerns. So, for example, users' rights to delete their information or to opt out of data brokers holding information on them are limited when that might impact a law enforcement investigation. So the committee has already given careful consideration to this. I think they have landed in the right place, and it cannot be that we allow the unfettered, widespread sharing of data just because of this law enforcement concern, when law enforcement can still access the vital records that it needs from the first-party holders of that information, for example, credit card companies, et cetera.

Rep. Debbie Lesko (R-AZ):

Thank you. Mr. Mudd, do you believe it's possible to protect personal data while also allowing businesses, especially small businesses, to efficiently advertise digitally? I mean, a number of businesses have been worried that they won't be able to advertise.

Graham Mudd:

Thank you for the question. Yes, I do. I won't pretend that there is zero cost to business from moving to a more private approach to digital advertising, but I do believe that it will not be, and should not be, a catastrophic change, and that the tradeoff is well worth it. The technologies that we and others are developing, as I mentioned earlier, are employed in many other industries; they have found ways to complete what they need to do using privacy-enhancing technologies. And I think, with legislation in place, we can apply the innovation that has currently been focused on extracting data from as many places as possible to instead using data in as private a way as possible. And so my general answer to your question is yes, I do believe it is very possible for businesses to thrive with this legislation in place.

Rep. Debbie Lesko (R-AZ):

Thank you. My time's expired; I yield back.

Rep. Gus Bilirakis (R-FL):

I thank the gentlelady and now recognize the gentlelady from Michigan, Ms. Dingell.

Rep. Debbie Dingell (D-MI):

Thank you, Chairman Bilirakis and Ranking Member Schakowsky, for holding this important hearing today, and all of you for testifying. I'm hoping that this is gonna be the Congress we get this done, because the subject is so important. I look forward to this discussion as a continuation of this committee's very strong bipartisan work to enact comprehensive data privacy legislation. We've got, I think, total agreement that self-regulation is not sufficient, and that it has created a multi-billion dollar industry through the transfer and sale of consumer data, mostly without the consent or knowledge of the consumer. We wanna empower the consumer to be the ultimate arbiter of their data, while allowing companies to perform any action the consumer should reasonably expect from the use of a platform, device, or other technology. Any legislation that this committee supports must protect personally identifiable information, including geolocation and sensitive health data; provide, as everybody's talked about today, additional protections for minors and teenagers, who to this day do not have robust protections online; minimize the data captured to what's necessary to perform operations; and promote innovation.

This topic is important and has significant ramifications for public health and safety, our economy, national security, and competitiveness. So your being here and our work really matters. I'm gonna focus on data, how much is being collected, and people not realizing it. Mr. Mudd, in your testimony, you mentioned the significant amount of data companies collect to develop profiles of users, which, I will respectfully say again, nobody has any idea how much is being collected on them. On average, how many pixels would you estimate these companies deploy to collect data on one individual? And are there categories of data captured that the user may not have explicitly consented to sharing while using a platform or device, or being tracked?

Graham Mudd:

Thank you for the question. In terms of, you know, estimates of the prevalence of pixels and data collection, there are many out there, but frankly, the scale of the use of these is so large that it is actually quite difficult to study them comprehensively. I would estimate that there are well over 300 to 400, if not into the thousands of, companies that are actually deploying these pixel technologies to collect data. Now, for the average consumer, as you visit any given website, you're likely to encounter several of these. For a given retailer, my estimate would be somewhere on the order of five to 15 different pixels that are sharing data with various ad platforms. So, you know, you multiply that by the number of websites that you visit over the course of a week or month, and the ability to collect a very rich profile is certainly there.
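
Mr. Mudd's per-site estimate turns into a striking monthly figure with simple arithmetic. Only the five-to-15 pixel range comes from his testimony; the browsing rate below is an assumption added purely for illustration.

```python
# Back-of-envelope arithmetic on Mr. Mudd's estimate. The pixels-per-site
# range is his; the sites-per-month figure is an assumed browsing rate.
pixels_per_site_low, pixels_per_site_high = 5, 15  # per-retailer estimate
sites_per_month = 100                              # assumed, for illustration

low = pixels_per_site_low * sites_per_month
high = pixels_per_site_high * sites_per_month
print(f"{low:,} to {high:,} pixel observations per person per month")
# -> "500 to 1,500 pixel observations per person per month"
```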

Rep. Debbie Dingell (D-MI):

Thank you. And by the way, and subject to misinterpretation, I always tell the committee: I do a lot of my own research before hearings, and I was researching opioids, and within two hours I was getting opioid addiction treatment ads. I've only got a minute and 40, so I'm gonna ask for a yes or no from everybody on the panel. Do you believe that, absent a national data privacy law, tech companies and others are incentivized to maximize their collection of data to participate in the digital economy and data marketplace? Yes or no, Mr. Mudd?

Graham Mudd:

Yes.

Alexandra Reeve Givens:

Yes.

Jessica Rich:

Yes.

Rep. Debbie Dingell (D-MI):

Thank you. As we have seen with events like the Cambridge Analytica scandal, data breaches present a very real threat to consumers and companies participating in the data economy. To the panel, yes or no again: without a national data privacy law, can companies be expected to enact stringent standards to ensure that consumers' data is secure?

Graham Mudd:

No.

Jessica Rich:

No.

Alexandra Reeve Givens:

No.

Rep. Debbie Dingell (D-MI):

I get no from all three. Last question to the panel, again yes or no: do you believe that, without a national privacy law, the amount of data that these companies acquire presents a risk to consumers and children using the platforms or devices?

Graham Mudd:

Yes.

Jessica Rich:

Yes.

Alexandra Reeve Givens:

Yes.

Rep. Debbie Dingell (D-MI):

Thanks again to all of you for being here today. Robust data protections in this space will provide safety and security for consumers, children, survivors of domestic violence, which I care about a lot, and protected classes, while offering businesses and industry the expectations, regulations, and tools necessary to operate, innovate, and, most important, mitigate risk from dangerous data breaches. Thank you to all of you for your work, and I yield back, Mr. Chairman. Thank you. Ten seconds.

Rep. Gus Bilirakis (R-FL):

Appreciate it. Thank you very much. We appreciate that. It all counts. Next we'll have Representative Pence from the state of Indiana. You're recognized for five minutes, sir.

Rep. Greg Pence (R-IN):

Thank you, Chairman. Thank you for holding this meeting. You know, we thank you, the witnesses, for being here today. You're hearing the same thing from everybody, because we all feel the same way, our constituents all feel the same way, and I can tell you feel the same way about data privacy. And Ms. Givens, when you said that we've had 21 hearings in five years, I took all of my comments and I threw 'em out, because I thought, well, here we go again. It's almost like deja vu, doing the same thing over and over and over. And Mr. Mudd, in your testimony, as with Ms. Dingell, you were making this point again, and I'm gonna quote: "the scale of data collection and transfer using these mechanisms is difficult to comprehend," how big, how much data you're collecting from me.

I walked in this morning and I have a letter from Privacy for America. I don't know anything about 'em, really. But here they say that consumers' incomes have been enhanced to the tune of $30,000 because of all this data collection. And I think that's great. So my question gets to the money. I've been a businessman all my life, and if data collectors, and they're sitting in this room, are providing me $30,000 in services, how much are they making to give me that much value? And since it's incomprehensible, the amount of data being taken from me, and I'm gonna ask each of you this, can I be compensated for this incomprehensible amount of data that is being taken from me? Start with you, Mr. Mudd.

Graham Mudd:

Sure. Thank you for the question. Whether users can be directly compensated or not is certainly an interesting question, one that's been posed many times in the past and should be further explored. I will say that the notion that the only way businesses can leverage digital advertising effectively is through this incomprehensible collection of data is absolutely false, that there are other ways through this problem that do not sacrifice the privacy of individuals, and that those technologies, as I've mentioned earlier, are employed elsewhere in a proven fashion.

Rep. Greg Pence (R-IN):

You know, if I can go off on that: I've done a lot of digital advertising in business, okay? And we've thrown out the baby with the bathwater when it comes to mail, radio, and TV. And I'm not doing an advertisement for the other mediums or the other venues, okay? But I have found that digital advertising for a small business is not very effective. But back to the same question: can I make money off my data that everybody's taking from me?

Alexandra Reeve Givens:

Like my colleague, I'll say it's an interesting question, but I don't think it gets to the heart of how we protect consumers going forward, because discussions about monetization and compensating users don't actually get them the protections they want for their data.

Rep. Greg Pence (R-IN):

Okay, I can see where you're going with that, Ms. Givens, but if 21 committee hearings in five years isn't moving the ball forward, and in a sense there's almost a sense of delay, we keep talking about the same thing, why wouldn't finding a way to monetize, for me to get paid for my information, why wouldn't that maybe change the trajectory?

Alexandra Reeve Givens:

Well, it's my job to be an optimist, and I think this committee has made progress as a result of those 21 hearings, and we're close. But what I think needs to be addressed here is that, really, in the advertising world, we have market failure right now. The only incentive is a race to the bottom, to hyper-target as much as you can. And the digital advertising companies that offer the most specific profiles on people, they're the ones that win the race. And there's no incentive for them to innovate into privacy-protecting ways of delivering ads that matter.

Rep. Greg Pence (R-IN):

So, I would...

Alexandra Reeve Givens:

That's the innovation we wanna encourage.

Rep. Greg Pence (R-IN):

Thank you. But you know, if I get the opt-in because I'll get paid for it, maybe that'll change their behavior too. Okay. And then finally, the last witness.

Jessica Rich:

You're asking me the same question?

Rep. Greg Pence (R-IN):

Well, yes, ma'am. Yes, Ms. Rich.

Jessica Rich:

Well, one of the problems with that idea of an even exchange is that it hasn't worked in terms of notice and choice, where consumers supposedly have to individually negotiate with each company.

Rep. Greg Pence (R-IN):

I was in the banking industry, and we had truth in lending, where you had to make it real simple what you were agreeing to when you clicked yes. But with that, I've run out of time. Mr. Chair, I yield back.

Rep. Gus Bilirakis (R-FL):

Appreciate it very much. Now we'll have the gentlelady, my good friend from the Tampa Bay area, Representative Castor. We'll recognize her for her five minutes.

Rep. Kathy Castor (D-FL):

Well, thank you, Mr. Chairman and my good friend, and thanks to the Ranking Member and to the witnesses. Thank you. You've been very strong and have provided very clear expert advice to the committee. We need it. This is really our kickstart to our privacy effort this session, and it's heartening to understand that it is a priority across the aisle here. I was very proud to contribute to the committee's efforts in the last Congress for the American Data Privacy and Protection Act, particularly the provisions relating to children's online safety, because ADPPA included elements of my Kids' Privacy Act, such as the targeted advertising ban, age-appropriate design provisions, enhanced limitations on sharing children's personal information with third parties, special protections for personally identifiable information about children, a dedicated youth privacy and marketing division at the FTC, and increased oversight of COPPA safe harbors.

I really urge my colleagues to act with urgency here. The harms to kids online are now very clear, and we really shouldn't take too much longer to act. We need to do this for all Americans, but I think there is a special threat to children's online privacy and safety. The Children's Online Privacy Protection Act, COPPA, is wholly outdated; it's been many years since the Congress has acted. Can you all take a look at what has happened since the adoption of COPPA and give us some examples of what you see as a growing online harm to children and all Americans?

Jessica Rich:

Well, for one thing, COPPA is limited to children under 13. And as this committee and work done in other committees have shown, there are a lot of harms to people who are, you know, under 16 or 17, and you could go higher too, all the things we've seen with social media. So if this committee and the public are seeking greater protections for teens, COPPA doesn't do it. COPPA also is very basic, and the FTC, even in the 2013 rule review, which was the last one, did somersaults to try to get at the platforms and to try to protect information that wasn't listed in the original statute, like location data and IP addresses. It absolutely needs to be updated to reflect what has happened since COPPA was passed in 1998.

Rep. Kathy Castor (D-FL):

And you highlighted the fact that the FTC hasn't been using some of its tools. Now, in response to language I authored in the FY22 omnibus, the FTC published a report providing details about its work on COPPA. In that report, the FTC stated that the commission dedicates approximately nine to 11 staff and has opened 80 investigations of potential COPPA violations in the past five years. That's woefully inadequate, and even the FTC says as much in the conclusion to that report; they stated that with more resources, however, the FTC could do more. And we need them to do more. Do you think the FTC should have more resources and authority to protect kids online?

Jessica Rich:

Absolutely. It is shocking how few resources the FTC has for privacy. It's a fairly large proportion of the consumer protection mission, but it's about 50 dedicated people on privacy, which, if you compare that to other countries that are much smaller and the kind of staff they have to police privacy, is just woefully inadequate. The FTC absolutely needs more resources, but it also needs more authority, because the authority is thin.

Rep. Kathy Castor (D-FL):

Ms. Givens.

Alexandra Reeve Givens:

The one point I'd add on the FTC resources is that all of the research shows that it's an excellent investment of taxpayer dollars. The Congressional Budget Office has shown that for every dollar invested in the FTC, taxpayers get $3 in return because of the enforcement power that it would add to the agency. So I think it's incredibly important when we think not only about protecting consumers, but good governance as well.

Rep. Kathy Castor (D-FL):

My time's running out, and I wanna be sure that I take time to thank Ranking Member Schakowsky for being a leader on giving the FTC more resources to protect consumers, and I hope we'll continue that this Congress. And I wanna thank Chair McMorris Rodgers; I heard her call at the beginning of this hearing loud and clear, and I appreciate her outreach to my office. We're gonna continue working to make ADPPA strong for all consumers, especially our kids. Thanks, I yield back.

Rep. Gus Bilirakis (R-FL):

I thank the gentlelady. Now I'll recognize the gentleman from the state of Georgia, Mr. Allen. You're recognized for five minutes, sir.

Rep. Rick Allen (R-GA):

Thank you, Mr. Chairman and Ranking Member, for holding this hearing on the need for a national privacy standard. I think today we're getting closer than ever to enacting some type of nationwide privacy and data security framework, which will give businesses the certainty they need to innovate while providing Americans more control over their data. I appreciate the hard work done by Chair Rodgers and Ranking Member Pallone last Congress to get to this point, and I look forward to engaging in getting this done in this Congress. Mr. Mudd, as a former employee of Meta and now as a Chief Product Officer, you've seen both sides of the advertising ecosystem. Kinda help us understand exactly how they make the money that they make using our information.

Graham Mudd:

Certainly. So the collection of data I described earlier, you know, it's based on your behavior on websites, and oftentimes it's then shared with the ad platforms that any given advertiser is using to find their customers. Now, how do those ad platforms make money using that data? I think that was your question. Well, effectively, the better the ads work, that is, the more effective they are in identifying specific individuals who are likely to be customers of any given company, the more those ad platforms can charge for those ads. And so their incentive, of course, is to improve the relevance of the advertising. Nothing wrong with that incentive in and of itself; it's the means by which they do it that we've talked about that is oftentimes very problematic. And so that incentive chain, gather more data to become more relevant in order to charge higher prices for ads, is really the heart of the vicious cycle that we're faced with today.

Rep. Rick Allen (R-GA):

Yeah. And as I see it, there's certain information that obviously I just want maybe me and my family to know about me <laugh>, and they like to get that information. Does it bring the highest price?

Graham Mudd:

It's a good question. Yeah, you know, the value of data is certainly variable based on who the advertiser is, right? Location data is very important to an offline retailer who wants to find customers that are near their outlet, whereas healthcare data is very valuable to different types of advertisers.

Rep. Rick Allen (R-GA):

So what role do the data brokers play in this?

Graham Mudd:

Data brokers oftentimes "enrich," as the term of art goes, the profiles of ad platforms that might not be able to collect that information themselves. So, to give an example, maybe a newspaper site doesn't have any real insight into, you know, your financial history and so forth, but they'd love to be able to sell advertising to credit card companies. Mm-hmm <affirmative>. And so they go to the data broker, buy that data about your financial situation, and therefore can sell more effective advertising to a credit card company.
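
The "enrichment" flow Mr. Mudd walks through amounts to a join on a shared user key: the publisher's thin profile picks up the broker's attributes. Every record and field below is invented for illustration.

```python
# Invented illustration of broker "enrichment": a publisher's sparse
# profile is joined with a broker record on a shared user key.
publisher_profile = {"user": "u-9", "pages_read": ["business", "sports"]}
broker_record = {"user": "u-9", "income_band": "high", "card_holder": True}

def enrich(profile: dict, broker: dict) -> dict:
    """Merge broker attributes into the publisher's profile by user key."""
    if profile["user"] != broker["user"]:
        return profile  # no match, nothing to enrich
    merged = dict(profile)
    merged.update({k: v for k, v in broker.items() if k != "user"})
    return merged

print(enrich(publisher_profile, broker_record))
# The newspaper can now sell a "high-income cardholder" audience to a
# credit card advertiser, the flow described in the answer above.
```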

Rep. Rick Allen (R-GA):

And any idea how much of this information is being held and used by data brokers?

Graham Mudd:

I don't know if I could find a way to quantify that for you. All I can say, as I said earlier, is that it's incomprehensible to any ordinary citizen, and the scale is quite massive.

Rep. Rick Allen (R-GA):

Well, you know, my state has been ranked as the number one state to do business for the last 10 years. And in my younger days, I started a company, a small business, and I was a small business owner until I was elected to Congress. And I know a lot of small companies can unintentionally bear the brunt of regulations if protections aren't carefully crafted. How can companies like yours enable small businesses without sacrificing the privacy of consumers?

Graham Mudd:

Thank you for the question. I think there are a number of ways. First of all, I would say that we want to encourage competition in the digital advertising ecosystem, and to do that, what we need to do is level the playing field so that smaller publishers and ad platforms can compete with the largest ones more effectively. By enacting legislation like ADPPA, we take a meaningful step toward making the digital advertising ecosystem, I believe, more competitive, which will serve small businesses by providing them more options for promoting their business and competing with larger businesses.

Rep. Rick Allen (R-GA):

And Ms. Rich, I've got 21 seconds, but why is it essential that any data privacy law protect all Americans regardless of age?

Jessica Rich:

Because all Americans, regardless of age, need privacy protections and haven't had them. And by the way, it hasn't been 21 hearings; it's probably been several hundred since I've been participating in this debate. In addition to the kids' provisions, though, the ADPPA would provide not only targeted protections for kids of the kind that we've already talked about, but even the general provisions, data minimization, data security, privacy by design, would also protect kids, which is why we need to do it all together.

Rep. Rick Allen (R-GA):

And thank you, I yield back.

Rep. Gus Bilirakis (R-FL):

I appreciate it. Now I'll recognize Ms. Clarke from the state of New York for her five minutes of questioning.

Rep. Yvette Clarke (D-NY):

Thank you. I thank you very much, Mr. Chairman, and I thank our Ranking Member Schakowsky for holding this very important hearing. I also want to thank our witnesses for testifying here today; you've really enriched the debate and conversation in this space. I was encouraged that major pieces of my bill, the Algorithmic Accountability Act, were included in the ADPPA that this committee marked up last year. I hope to continue working with members of this committee to ensure any national data privacy standard requires algorithmic transparency and risk mitigation. AI systems are often trained on data sets that replicate human biases,

and thus bias is built into the technology itself. I'm concerned that, without proper transparency and explicit steps to mitigate against bias, the use of artificial intelligence in critical decisions could erode essential civil rights protections in the digital realm. Discrimination, whether done by a person or an algorithm, cannot and must not be tolerated. Ms. Givens, in your testimony, you highlighted how AI and automated decision making are already used in a wide range of decisions like employment, lending, and tenant screening. Could you elaborate on why requiring transparency around algorithmic use, along with algorithmic impact assessments, is a critical part of comprehensive consumer data privacy legislation?

Alexandra Reeve Givens:

I can, and I have to start by thanking you for your incredible leadership on these issues over the past few years, really shining a light on these concerns and how we can move forward to address them. Tools that use algorithmic decision making are increasingly being used in ways that significantly impact people's lives. To give just one example, in the employment context, we're seeing the increasing use of vendor-created tools to screen resumes, to conduct video-based interviews and analyze those interviews, and to have people play online games. And the way in which these are AI-driven is that those tests are automatically looking for traits that match the traits of existing people in the company, which is an automatic recipe for perpetuating existing systems of discrimination. It also raises questions about fitness for purpose in the first place: are you actually measuring things that really are indicative of someone's likelihood to succeed on the job?

And there's been important research done in the field to show that often these tools actually are not fit for purpose. One of the most notorious examples analyzed an AI resume tool where the factors weighted in favor of a candidate were whether their name was Jared and whether they'd played lacrosse in either high school or college. The reason this matters is not just for the employees who are being screened out, but also for the businesses that are relying on these tools based on commitments from vendors that they have been screened, that they're appropriately designed, and that they're bias-free. How can businesses actually trust that these tools are doing what the vendors say they do and that they're complying with existing law? So transparency really matters. And what ADPPA does, by lifting many of the important provisions from your legislation on this issue, is say we need to have companies, number one, disclose how these tools work and what data they are based on, doing it in a way that protects trade secrets and doesn't overwhelm, but analyzes this. And companies need to show they've gone through a rigorous internal process of detecting potential bias and assessing fitness for purpose. The reason that matters is that we need to inculcate a company culture of asking those questions before we put these tools out into the world. And that's what ADPPA will help do.
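
One concrete check that the kind of impact assessment Ms. Givens describes might include is the EEOC's long-standing "four-fifths" rule of thumb for comparing selection rates across groups. The sketch below is a simplified illustration; ADPPA does not prescribe this or any particular method, and the outcome data is invented.

```python
# Simplified "four-fifths rule" screen: flag a hiring tool if one group's
# selection rate falls below 80% of another's. Outcome data is invented.
def selection_rate(outcomes: list[bool]) -> float:
    """Share of candidates who advanced (True) in a group."""
    return sum(outcomes) / len(outcomes)

def four_fifths_check(group_a: list[bool], group_b: list[bool]) -> bool:
    """Return True if the rates pass the 80% rule-of-thumb screen."""
    ra, rb = selection_rate(group_a), selection_rate(group_b)
    return min(ra, rb) / max(ra, rb) >= 0.8

# Hypothetical screening outcomes (True = advanced to interview)
group_a = [True] * 40 + [False] * 60  # 40% selection rate
group_b = [True] * 25 + [False] * 75  # 25% selection rate
print(four_fifths_check(group_a, group_b))  # False: 0.625 < 0.8, so flag it
```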

Rep. Yvette Clarke (D-NY):

And well, thank you. As a member of the Committee on Homeland Security, I'm particularly concerned with what can happen when companies collecting our sensitive information are not adequately protecting that information. Ms. Givens, if companies are largely free to collect, possess, and transfer user data that is not necessary to provide a specific product or service, does that increase the risk or consequence of a data breach?

Alexandra Reeve Givens:

It does, absolutely, because without those purpose limitations or minimization requirements, you get the unfettered sharing of additional information solely for the purpose of helping to target ads, and that's what leads to these massive data sets that can be so vulnerable to abuse.
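
In engineering terms, the purpose limitation Ms. Givens references is commonly enforced by tagging stored fields with the purposes they may serve and rejecting reads for any other purpose. A minimal sketch, with the field names and purposes invented for illustration:

```python
# Minimal sketch of purpose-limited access to stored user data.
# Field names and permitted purposes are invented for illustration.

ALLOWED_PURPOSES = {
    "email":    {"account_service"},                      # never for ad targeting
    "zip_code": {"account_service", "fraud_prevention"},
}

class PurposeViolation(Exception):
    """Raised when a field is requested for a purpose it was not collected for."""

def read_field(record: dict, field: str, purpose: str):
    """Return a field's value only if the declared purpose is permitted."""
    if purpose not in ALLOWED_PURPOSES.get(field, set()):
        raise PurposeViolation(f"{field!r} may not be used for {purpose!r}")
    return record[field]

user = {"email": "pat@example.com", "zip_code": "33601"}
print(read_field(user, "zip_code", "fraud_prevention"))   # permitted
# read_field(user, "email", "ad_targeting")               # raises PurposeViolation
```

Data that cannot be read for ad targeting never flows into the large, breach-prone data sets described above.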

Rep. Yvette Clarke (D-NY):

Thank you. Ms. Rich, how would a comprehensive national policy framework increase the FTC's ability to protect the American public from data breaches?

Jessica Rich:

Oh, there are so many ways. First of all, there are the data security provisions in the bill that would require data security to protect the data, but many of the core provisions would serve the same function, like data minimization. So many data breaches happen to data that's just sitting there and shouldn't be. And the same with protections for sensitive information: if people can prevent their sensitive information from being over-collected and stored, it's less likely to be breached. So there are so many ways in which this helps the FTC, in addition to giving it civil penalty authority, which it does not have for first-time violators, to increase deterrence.

Rep. Yvette Clarke (D-NY):

Mr. Chairman. I thank you and I yield back.

Rep. Gus Bilirakis (R-FL):

Thanks so very much. Okay. Now, what we'll do is, we have votes on the floor, so we'll recess and come back 10 minutes after the final vote, because I have several members on the Republican side who haven't had the opportunity to ask questions, whether sitting on the committee or having waived on. And I really think we need to give 'em an opportunity. I appreciate the witnesses for their patience. Thanks so much. So we'll go ahead and, without objection, we'll recess.

The meeting will come to order. I want to thank y'all. I thank the witnesses; y'all were outstanding, by the way. Okay. And the consensus is, it is really good for us to know, I mean, we knew it anyway, that we passed a good bill last session and we can improve upon it. And you are contributing factors, no question. So why don't I recognize my good friend from East Tennessee, Ms. Harshbarger, who is our, what is it? Yeah, the youngest pharmacist, youngest on the committee.

Rep. Diana Harshbarger (R-TN):

Yeah. Buddy can still claim oldest.

Rep. Gus Bilirakis (R-FL):

Recognized for five minutes.

Rep. Diana Harshbarger (R-TN):

Thank you, sir. Thank you, Mr. Chairman. Thank you to the witnesses. My first question is for Mr. Mudd. One of the reasons America has such a robust economy is that startups are able to establish themselves in the marketplace. What safeguards should authors of federal privacy legislation build in specifically for the protection of these small and medium-sized businesses?

Graham Mudd:

Thank you for the question. You know, I'm no expert, but I do believe there are some provisions that safeguard smaller businesses in their use of data, and I think those are very much appropriate. I would say it's important to level the playing field between smaller companies that are trying to establish a publishing presence and an advertising business online and the very large ones. And I think this bill goes a long way toward leveling the playing field and raising the bar across the board, so that it's not just the largest tech companies, which have such an incumbent advantage, that have access to the data that is so powerful in an advertising business.

Rep. Diana Harshbarger (R-TN):

Okay. Very good. That kind of leads me into my next question. You know, I'm thrilled we're working to draft a privacy framework that appropriately balances data privacy for our constituents while also helping businesses receive more clarity about the rules of the road, black and white clarity about what is and isn't permissible. That's especially important for these small and medium-sized businesses that don't have dedicated compliance departments or the resources to survive endless lawsuits from predatory attorneys. And I can say that because I've been a small business owner of independent pharmacies for over 30 years, and believe me, I know that it's incredibly difficult to navigate the privacy rules, the data rules, the healthcare rules from state to state. And then you have to outsource compliance to somebody who knows the rules, and therefore you run the risk of having that data compromised and used. You know, healthcare fraud is very prevalent, as you mentioned. My question is, what can be done to ensure there's clarity for all these businesses to thrive under a federal privacy framework? And I will open that up to all three of you, whoever wants to go first.

Graham Mudd:

I'll mention one thing and then I'll pass it to my colleagues here, and that is just that I think a state-by-state patchwork is particularly onerous to small businesses who are trying to comply with different rules, as you mentioned, and that even just establishing a single benchmark and compliance program would go a long way toward supporting small business and innovation.

Rep. Diana Harshbarger (R-TN):

Okay.

Alexandra Reeve Givens:

In addition, the ADPPA has a number of protections to help small businesses. Some provisions don't apply to small businesses, or apply with a lesser burden, for example access rights for users to access and understand the data that is held about them, and the private right of action as well, so small businesses are shielded from that. In addition, there are provisions like making sure the FTC provides a business resource center to help businesses actually comply and understand their obligations, which is really important, as well as some of the other provisions to address these concerns about excessive or runaway litigation. There are limits on the damages that can be sought, and there's the notice-and-cure opportunity that the Chair was talking about. So there are other measures to make sure that consumers can vindicate their rights, which is so important and what Congress is focused on, while also making sure that businesses aren't overly burdened.

Rep. Diana Harshbarger (R-TN):

Very good.

Jessica Rich:

I would add one thing to what my splendid co-panelists have just said, which is what we discussed with Chair Bilirakis: compliance programs. If you have a rigorous compliance program that small and medium businesses can join to get some certainty and help in their compliance, it benefits everybody.

Rep. Diana Harshbarger (R-TN):

Exactly. I keyed in on that point when you were talking previously. I guess another question, and this will be to each one of you too: can you think of an example of an unintended consequence of federal privacy legislation? And I say that because, you know, if you poll this across the country, across businesses, across citizens, they want one policy so it's easy to navigate. But would there be any unintended consequences for businesses or individuals?

Alexandra Reeve Givens:

So I will chime in that states have a really important and legitimate role in protecting their citizens, and we need to make sure that at the same time as fighting for consistency for businesses in how they think about consumer privacy, we're not infringing on states' rights to protect their citizens and the values that they care about. I'm thinking about things like consumer protection laws of general applicability and civil rights laws. A number of states have been really important first movers on things like child exploitation online, anti-spam laws, and data breach notification laws, which are in place in all states around the country. And people have gotten used to those, they've been on the books for a long time, and states have played a really important role. So we need to strike a balance here of creating that certainty for businesses but still allowing the states to perform that traditional function they have had of protecting their citizens. And that's the balance the ADPPA is trying to strike.

Rep. Diana Harshbarger (R-TN):

Anybody else?

Jessica Rich:

I'll just quickly add research: we need to incentivize research using data, and the ADPPA does that while also having protections, so that's very important.

Rep. Diana Harshbarger (R-TN):

Yes, ma'am. I agree. Well, with that, go ahead.

Graham Mudd:

Sorry. The only thing I would add to that is I do think a potential unintended consequence of a privacy law is to constrain, unnecessarily at least, the use of data for very productive reasons by small businesses. And so I think technology, again, can really help to bridge that gap. But it would be all the more helpful if, to the degree possible, the law is very clear about what does constitute reasonable privacy and what doesn't, so that technology companies know how to navigate the…

Rep. Diana Harshbarger (R-TN):

Solution. Absolutely. That goes back to clarity. Thank you. And with that, I yield back, sir.

Rep. Gus Bilirakis (R-FL):

The gentlelady yields back. No one on the Democratic side, so we'll recognize the gentlelady who represents Gator Nation, Ms. Cammack, who's a great friend of mine.

Rep. Kat Cammack (R-FL):

Thank you, Mr. Chairman. And yes, we do represent the Gator Nation, home to one of the best damn football teams in all of the nation, as well as a wonderful research institution. So Mr. Chairman, thank you for your support, not just of the Gator Nation, but of this issue. I think it's critically important that we address this issue, and I feel like we've hit on all of the topics, really, in some way or another. So I do want to give you all the opportunity to narrow in on something that hasn't been addressed here yet today. But before I do, while you're thinking of that, I would like to ask you all: we're seeing federal agencies collect data from various companies and then use that data in ways that may or may not be, I'm not gonna say ethical, but there's a bit of a gray area in how that data is being used. What are some of the national security implications of that data collection and then the subsequent breaches that we've seen? I'll start with you, Ms. Rich, and then we'll go down the line.

Jessica Rich:

Well, there are so many ways this has international implications; that's a big piece of this. For one thing, US companies are having serious problems in Europe, because Europe believes that the US doesn't have strong enough laws. It affects trade, it affects companies' ability to process European data in the US because they're not allowed to transfer it, which creates a lot of inefficiencies. So that's a serious trade and credibility issue we have as we deal with issues of hacking and surveillance from other countries: not only is the data exposed, but we have very little credibility as we deal with those other countries. And, you know, it comes up in the TikTok situation, people talking about banning TikTok. Well, I think we'd have more credibility talking about that if we had a privacy law of our own. And then there are the disproportionate costs on US companies of complying with multiple laws.

Rep. Kat Cammack (R-FL):

And I appreciate that. I wanted to make sure I give Ms. Givens and Mr. Mudd an opportunity as well, as quickly as you can.

Alexandra Reeve Givens:

So I agree with my colleague, and I do think that the biggest risk in all of this is the unfettered collection, storage, and sharing of data, which creates cybersecurity risks and national security risks. And so that's why we have to pursue a framework that minimizes the amount of data that companies are collecting and storing and puts limits on how they can share that information.

Rep. Kat Cammack (R-FL):

Mr. Mudd.

Graham Mudd:

Very much agree with that. You know, in the tech world, we call that surface area, right? The more data is out there, the bigger the risk. Data is infinitely replicable and can be stored forever. And so to the degree that we're able to limit it, through technology, through regulation, we reduce the risk, national security and otherwise.

Rep. Kat Cammack (R-FL):

Do you think that we should require, or have a way to incentivize, that data servers be housed here in the United States as part of the national security framework, when we're talking about housing our data? Ms. Givens?

Alexandra Reeve Givens:

I think that gets you into risky territory really quickly. And part of the reason is we need the global flow of information around the world. It's how the global internet functions, it's how we're able to communicate and do business with other nations. The way in which the US has been a leader in innovation around the world is through that free flow of information. So instead of just throwing up firewalls, what we need is strong data protections across the board that make sure that everyone's following the same rules, as opposed to having to impose these really hard-to-enforce walls of data localization.

Rep. Kat Cammack (R-FL):

Something that I haven't heard yet today is the emergence of AI, ChatGPT, how AI is going to essentially revolutionize data collection models. What are some of the implications of using AI with some of these algorithms and these platforms? We can go down the line, and then I'll open it up to you in the one minute and seven seconds that I have left. Ms. Rich.

Jessica Rich:

One implication is that this is an area where technology has become so sophisticated that the FTC's basic laws can't get at it the way they were able to get at issues earlier.

Rep. Kat Cammack (R-FL):

Thank you.

Alexandra Reeve Givens:

Congress is gonna be grappling with this issue for a long time. As AI transforms our society, one of the first things we need to do is just get a handle on which companies are using these tools and make sure that they're going through a responsible process when they're deciding how to design them and how to deploy them, through impact assessments. That's one of the provisions in the ADPPA and why it matters.

Rep. Kat Cammack (R-FL):

Excellent. Thank you. Mr. Mudd

Graham Mudd:

I would make two points. First, regulating the use is really important, as Ms. Rich mentioned, and flexibility to adapt to further use cases along the way is really important. The second is that I think explicit bias detection can play a really meaningful role in this. And then the last would just be around transparency, right? Understanding when AI is in use, so consumers have some understanding of the end result.

Rep. Kat Cammack (R-FL):

Excellent. My time has expired, so I'll have to yield back. So, sorry.

Rep. Gus Bilirakis (R-FL):

Thank you, gentlelady.

Rep. Kat Cammack (R-FL):

Maybe Mr. Obernolte, you'll get a few moments. Thank you. I yield back.

Rep. Gus Bilirakis (R-FL):

Very good. Now we'll recognize, there's no Democrat, they're all getting ready for their issues conference, so we're gonna go with Representative Armstrong, the Vice Chair of the full committee, from the great state of North Dakota. You're recognized, sir, for five minutes.

Rep. Kelly Armstrong (R-ND):

Thank you, Mr. Chairman. Before we had to recess, Ms. Givens, you had an interaction with Congresswoman Lesko, and you talked about the balance with law enforcement and primary source versus secondary source data. I think it's important to point out the biggest difference, at least in most cases, between secondary source data and primary source data: one requires a warrant, one doesn't. And I personally think the privacy part of this is a feature, not a bug. Are you familiar with the September 26th, 2022 letter to the House from various law enforcement associations expressing concern with potential data privacy legislation? There's a quote that it has major negative consequences that would make it harder to investigate criminal activity.

Alexandra Reeve Givens:

Not with the specifics of that letter, but I am familiar with the general set of issues raised.

Rep. Kelly Armstrong (R-ND):

Well, the letter continues, and it says it will likely complicate the private sector's ability to continue its ongoing efforts to cooperate and voluntarily share certain information with law enforcement. Essentially, the letter addresses the warrantless purchase of consumer data from brokers to generate investigatory leads. Are you familiar with the Center for Democracy and Technology report from December of 2021?

Alexandra Reeve Givens:

I am. My team and colleagues wrote it.

Rep. Kelly Armstrong (R-ND):

<Laugh> "Legal Loopholes and Data for Dollars: How Law Enforcement and Intelligence Agencies Are Buying Your Data from Brokers." Would you mind briefly summarizing the general conclusion of that document? I'll let you weigh in too, Mr. Mudd, cuz you've been pretty fired up about some of this stuff.

Alexandra Reeve Givens:

<Laugh> So yes, one of the major concerns about data brokers is that they aggregate these vast amounts of information, and in addition to selling it to target ads, it does become a target for law enforcement. Law enforcement is able to buy that data on the open market like any other person and, in doing so, circumvent their Fourth Amendment obligations. So CDT and other civil society organizations have been vocal in raising the constitutional concerns that that raises and the protections for people's freedoms and civil liberties. We're not saying that law enforcement work shouldn't happen, of course it should, but it needs to be subject to reasonable oversight within the constraints of the Constitution and the law.

Rep. Kelly Armstrong (R-ND):

Well, and I wanna be perfectly clear, law enforcement should use every tool. Good law enforcement officers are gonna use every single tool that exists for them to solve crimes, to do all of those things. It's our job to set the guardrails on this. And it is the federal government's job to set the guardrails on this, because it actually implicates what I think is maybe the most existential conversation of the 21st century: what the actual right to privacy means as we continue to move forward. The report cites DOJ's use of commercially aggregated data for prosecutions related to January 6th. Grand jury information states that location data history for thousands of devices that were present inside the Capitol was essentially obtained from several sources. I was at the Capitol that day. I was performing a constitutionally and statutorily mandated function. Do you think DOJ had access to my data?

Alexandra Reeve Givens:

Sir, I wouldn't wanna speculate on a particular fact pattern, but I didn't…

Rep. Kelly Armstrong (R-ND):

I think they did. So <laugh> Ms. Givens, hundreds of journalists were at the Capitol that day. They were performing activities expressly protected by the First Amendment. And you wouldn't wanna speculate, but that information existed as well, and I'm more than willing to guess that locator systems and toll records were collected from around the beltway. There's lots of other commercially available data that was probably accessed. Mr. Mudd, are you familiar with the majority in US v. Carpenter? I'm not. Okay, Ms. Rich, are you? Yes. Yes. So the time-stamped data referred to there, cell site location information, provides an intimate window into a person's life, revealing familial, political, professional, religious, and sexual associations. And I think this particular court, and even the previous iteration of the court, has been willing to reexamine what privacy looks like in the face of the government in the digital age.

And we've talked a lot about data collection and data brokers, and I'll just be more blunt really quickly: data brokers say this will put them out of business, which means not only will law enforcement not have access to this, but other people won't. But I think we don't spend enough time talking about, I mean, data collection is just the first part. Representative Cammack just talked about AI. The ability to analyze that data in real time is advancing at an incredibly rapid rate, which makes it very much different than having a drug dog search a box at a post office. And how do we deal with this and continue to protect constitutionally protected activity when the federal government, when all law enforcement, has this end run around the Fourth Amendment by being able to purchase this data on the civilian market?

Jessica Rich:

That's exactly right. If our rights exist for a certain reason, you shouldn't be able to just go to another company and get the same data and not abide by those rights. It's an end run. That's exactly what it is.

Rep. Kelly Armstrong (R-ND):

With that, I will yield back and I apologize to Mr. Mudd. I kind of wasn't totally honest with him.

Rep. Gus Bilirakis (R-FL):

Thank you. The gentleman yields back, and there's no one on the Democratic side, so I'll recognize Representative Fulcher for his five minutes from the great state of Idaho, right? Idaho.

Rep. Russ Fulcher (R-ID):

Idaho, yes. Thank you, Mr. Chairman.

Rep. Gus Bilirakis (R-FL):

Idaho. Thank you.

Rep. Russ Fulcher (R-ID):

And to our panelists, thank you for your participation today and for your testimonies. As I have said in previous sessions, please understand that some of us have dueling responsibilities, so the fact that we're not here the entire time doesn't mean we don't care what you have to say; to the contrary, very much so. And thank you for your written testimony as well. I wanna focus on two things, and I'm gonna ask Ms. Rich to start this, please. Transparency in algorithms, what those relay, what those do, is something I'm very interested in. I think that there needs to be some exposure of that and some increased transparency. My question for you: is it possible to have transparency without exposing secrets necessary to operate a business?

Jessica Rich:

Are you referring to…

Rep. Russ Fulcher (R-ID):

The algorithms… Yes.

Jessica Rich:

The assessments? Yes. Yes. So…

Rep. Russ Fulcher (R-ID):

And expand on that if you would, please.

Jessica Rich:

So the FTC can seek a lot of information right now that would be used to create those assessments. The assessments, though, create even more transparency, so that an agency can look to see if laws are being violated or adhered to. The FTC already has a model for this, because in all of its data security orders, and in its investigations too, it gets very, very sensitive information, and even audits in those orders, which it evaluates. So it can do this. And there are procedures for protecting trade secrets and keeping the information in those reports confidential; there are extensive confidentiality procedures.

Rep. Russ Fulcher (R-ID):

If I could ask you to take that same analysis and direct it toward the end user of that information. You're talking about the FTC; what about transparency for the user?

Jessica Rich:

I think that's more difficult, because I'm not sure that consumers are really gonna understand all the details disclosed about algorithms. The information that the FTC gets is probably pretty complex, and they need technologists to help them evaluate it and figure out whether discrimination is going on. So while there could be some mechanisms for explaining the algorithms to consumers, I'm not sure we would wanna give it to them. I don't think it would mean much, and it would be very difficult then to…

Rep. Russ Fulcher (R-ID):

My personal concern is the use of the data: who owns that data, how that data gets used. Mr. Mudd has talked about that; I've caught a piece of that, and I'm gonna do a follow-up question with him. But this is a big, complicated issue. And so one follow-up to you, if I may: is the FTC the best entity to be the regulator of that, the monitor of that?

Jessica Rich:

Yes, it is the most experienced, but it would need to get help. It needs more technologists, it needs more resources, if it's going to be evaluating…

Rep. Russ Fulcher (R-ID):

Statutory changes as well? Excuse me, statutory changes as well?

Jessica Rich:

It totally needs statutory changes. It needs the ADPPA or something like it.

Alexandra Reeve Givens:

Mr. Fulcher, if I may, there are provisions in the ADPPA that protect trade secrets. And so the vision is that those impact assessments are performed by companies and submitted to the FTC, which can then look further into them if it wants to. Disclosure to the public is optional for a company, and there's specific language in there to protect trade secrets while still making sure companies are going through that assessment process and really being honest and thoughtful about it.

Rep. Russ Fulcher (R-ID):

In your opinion, is that language sufficient?

Alexandra Reeve Givens:

Yes, I think it's well drafted.

Rep. Russ Fulcher (R-ID):

Right. Thank you. Mr. Mudd, shifting gears a little bit, I wanna talk about GDPR; maybe I'm not shifting gears all that much. My perception of what's transpired in Europe over that is that it's been helpful to large companies, not so helpful to small companies. First of all, is that your perception as well? And secondly, we've only got about 50 seconds left: what is the primary component, or components, that need to be different when we embark on that path?

Graham Mudd:

Sure. My perception matches yours. That is, it has likely been easier for larger businesses to adjust to that regulation and to comply with it, and therefore probably more difficult for smaller businesses. In terms of what we ought to learn from GDPR and potentially do differently, I think one of the challenges with GDPR, putting my sort of consumer hat on, is that it really does put a lot of work on the consumer to read and understand many, many consent dialogues. And instead, I would hope that we can find a way to raise the bar, as we've talked about, instead of asking consumers to navigate very difficult choices, in some cases no choice at all if they wanna access content.

Rep. Russ Fulcher (R-ID):

So the answer is get rid of the complexity.

Graham Mudd:

I think reduce complexity, but more importantly, focus on data minimization and technologies that support that, as opposed to asking consumers to say yes or no to questions they have very little understanding of.
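
Mr. Mudd doesn't name specific technologies here, but one widely used family for privacy-preserving measurement is differential privacy: releasing noisy aggregates instead of user-level records. A minimal sketch using the Laplace mechanism, where the epsilon value and the conversion counts are illustrative assumptions:

```python
# Minimal sketch of privacy-preserving aggregate reporting using the
# Laplace mechanism from differential privacy. The epsilon value and
# conversion counts are illustrative assumptions.
import numpy as np

def noisy_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Add Laplace noise scaled to sensitivity/epsilon before releasing a count."""
    return true_count + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

# A hypothetical ad campaign: the advertiser learns roughly how many
# conversions occurred without ever receiving an individual user's record.
true_conversions = 1240
print(round(noisy_count(true_conversions, epsilon=1.0)))
```

A smaller epsilon adds more noise and yields stronger privacy; that tradeoff is exactly the kind of parameter a clear legal standard would let engineers calibrate.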

Rep. Russ Fulcher (R-ID):

Okay. Thank you, Mr. Chairman. I yield back.

Rep. Gus Bilirakis (R-FL):

Thank you very much. Appreciate that. Now we'll recognize, he waived onto the committee, appreciate it, one of the hardest workers in Congress. I'm not even gonna mention the pharmacy thing; it's been overblown. So yeah, I'm not gonna do that. But in any case, I'll yield five minutes to my friend from the state of Georgia.

Rep. Earl "Buddy" Carter (R-GA):

Thank you, Mr. Chairman, and thank you for having this hearing, and thank y'all for being here. As was indicated, I'm a pharmacist. I'm not an IT technician, but I will tell you I'm a consumer and I'm concerned, and, you know, I've experienced it myself. I've got a Toyota Tundra with 478,000 miles on it, and I'm gonna get to 500, I'm sure. But you know, I had a cover on the back and it dry rotted, and I needed to get a replacement for it, and I just searched for it. And then all of a sudden I started getting all these ads for this, and I thought, how in the world? So it's real. This is real, and that's why I wanted to waive on, because this is so vitally important to us.

And it's just fascinating to me, because I know we need to do something, but I wanna do the right thing. I don't wanna suppress freedom of speech, I don't wanna suppress innovation. But when you don't do something, you're doing something. And if we don't do something, then we're gonna be in a mess. I'll begin with this: according to the Information Technology and Innovation Foundation, over the next 10 years it's estimated that the growing patchwork of state privacy laws we are experiencing will cost over a trillion dollars, with at least $200 billion hitting small businesses. And I know about small businesses, as I ran one for 32 years, so I do have some expertise there. But Mr. Chairman, I'd like to ask unanimous consent to include this report from the ITIF in the record.

Rep. Gus Bilirakis (R-FL):

Without objection, so ordered.

Rep. Earl "Buddy" Carter (R-GA):

Again, okay. Ms. Rich, I wanna start with you and get right to it. Mr. Bilirakis hit on this earlier in the hearing, but the FTC rulemaking won't be preempting the five states with enacted laws, nor any succeeding legislation. So I agree with you: we, this committee, have the responsibility to pass a national standard, and that's gonna be extremely important. But I wanna dig into the FTC rulemaking a little bit more. I understand there's a difference between the FTC's APA rulemaking authority and their Mag-Moss rulemaking authority. I've heard there may not be legal authority for the FTC to do their own privacy rule under Mag-Moss, but also that their authority is pretty clear cut. Can you put on your FTC expertise hat right now and give us your thoughts on whether they have legal standing to promulgate this rule?

Jessica Rich:

Yes, and thanks for the question. So the FTC Act explicitly authorizes the FTC to develop rules under this so-called Mag-Moss process to halt and remedy unfair or deceptive practices. It even tells the FTC what process to use, and the FTC has used Mag-Moss to develop other rules. So I think the FTC is mostly on pretty solid ground, generally, doing rulemaking using this tool. The problem is it's very cumbersome and limited, so that, given the breadth and significance of the privacy issues here, the FTC can't do so much of what's in the law that you guys have been writing and…

Rep. Earl "Buddy" Carter (R-GA):

Absolutely, absolutely, even more reason why we need to pass a national standard. Yes. Mr. Mudd, I wanna go to you. I'm concerned about freedom of speech, and I'm concerned about stymying innovation as well; I don't wanna do that. The internet's one of the greatest inventions of our lifetime. I get it and I understand that. But at the same time, as I said earlier, if we don't do something, you're doing something. So we gotta address this, and it's incumbent upon us here in Congress. That's our responsibility. We're responsible people. I know that some people would disagree with that, but I don't; I think we are. We need to do something. But just let me ask you: do you think that overly restrictive rules that would stymie innovation and the data-driven economy harm America's competitiveness with respect to our global competitors?

Graham Mudd:

I think there's potential, but I don't believe the ADPPA will have that effect. I believe that what technology companies, big and small, need is clarity, and the idea of trying to adjust to multiple jurisdictions across the country is extremely taxing, probably a bigger tax on innovation than clarity across the board would be. The second point I would make is that there are technologies, again, that are deployed in many other verticals that allow you to process data in privacy-compliant ways. And if this legislation takes effect, innovation will shift toward using those technologies. I think that's a really good thing for consumers and a really good way for this country to lead on innovation in this space.

Rep. Earl "Buddy" Carter (R-GA):

And, well, you know, look, all of that put together, the fact that I don't want to suppress freedom of speech, I don't wanna suppress innovation, I don't want us to get behind our global competitors, that's why this is a heavy lift. We need y'all's help. Mr. Chairman, I'm outta time and I'll yield back. Thank y'all again for being here.

Rep. Gus Bilirakis (R-FL):

I appreciate it. The gentleman yields back, and now we'll recognize, certainly last but certainly not least, he's got a lot of experience in this area, Representative Obernolte from the great state of California.

Rep. Jay Obernolte (R-CA):

Thank you, Mr. Chairman.

Rep. Gus Bilirakis (R-FL):

Five minutes. Thanks.

Rep. Jay Obernolte (R-CA):

Thank you, sir. Ms. Rich, you said something at the end of your testimony that really resonated with me. You said that one of the primary reasons Congress needs to act to establish data privacy standards at the federal level is that the FTC is unable, through rulemaking, to resolve the primary controversies of data privacy, those being preemption and also the private right of action. I couldn't agree with you more. So I know that Mr. Duncan asked you about preemption, and you said some level of preemption is necessary, but I wanted to tunnel down on that. Should we completely preempt away from the states in this space? Or should we allow the states to create standards that might be more stringent than those created at the federal level?

Jessica Rich:

Well, as I've said, I think that some level of preemption is necessary for consistency. I also think we're beyond total preemption, because clearly there are compromises that need to be made, and I'm in awe of this committee's work for making some of those hard cuts, or at least attempting to. So I think the ADPPA strikes a good balance of partially preempting to create as much consistency as possible, while, first of all, allowing all the state AGs and other state agencies to enforce, which is incredibly important, and then leaving certain things in place. There's a third issue that's really controversial too, that I think Congress needs to resolve, which is how much discretion the FTC should have through its own rulemaking. If the FTC does the rulemaking that it's doing now, it has total discretion. But this body has tried to make decisions about when the FTC should be able to do rulemaking and when Congress's decisions should be the law of the land.

Rep. Jay Obernolte (R-CA):

Well, talking about this issue of preemption, I'm going to partially agree with you. I'm of the opinion that we need to totally federally preempt it. And the reason I feel that way is that one of the primary justifications for preempting at all, I think, is to avoid creating this patchwork quilt of 50 different state regulations, which, as has been pointed out in the testimony, is very destructive to entrepreneurialism and very difficult for small businesses to deal with. And unfortunately, if we only partially preempt, we leave that problem out there, because small companies, you know, two guys in a garage in Cupertino, are still gonna have to navigate this space. And, by the way, before this, I served in the California legislature, and I was one of the leads in the drafting and passage of the California Consumer Privacy Act.

So I have a vested interest in saying, no, no, don't touch my baby. But you know, I really firmly believe that this is something we need to preempt. If we're gonna do it, we need to do it all the way. Mr. Mudd, we have been talking about the private right of action. And let me ask you, cause I know opinions have varied in the testimony here: who do you think should be responsible for enforcing whatever privacy protections we put in place? Should it be the FTC? Should it be state attorneys general, the federal Attorney General? Should there be a private right of action? What do you think about enforcement?

Graham Mudd:

Representative, I apologize. It is not an area of my expertise, and I'd be reluctant to offer an opinion on that.

Rep. Jay Obernolte (R-CA):

Okay. Well, I'll go back to Ms. Rich then. I know she has an opinion on this subject.

Jessica Rich:

Again, I support as much consistency as possible. Even when I was at the FTC and then at Consumer Reports, I had worries about the private right of action and some of the incentives there. I do think that between the FTC and all of the state attorneys general and all of the other agencies within the states, there are a lot of enforcers on the beat, plus the FTC really needs more resources. But we are a little bit beyond barring the entire private right of action. Again, a lot of this is a political decision that you all need to make, and compromises have been made, and my hat is off to you guys for being able to do so.

Rep. Jay Obernolte (R-CA):

Well, and I think that we are all interested in getting this across the finish line, but you know, with the bill that we had last year, we didn't quite get there. And so we're trying to figure out how to tweak it to get it the rest of the way to passage, which I think is a goal we all support. I myself, though, have some very serious concerns about the private right of action, and one need look no further than other domains where we have implemented it to find the truth of what you said in your testimony earlier, which is that sometimes, in fact quite often, it benefits attorneys more than plaintiffs. Obviously we have the Americans with Disabilities Act; I'm sure we're all familiar with those abuses. But in California, we've had the Private Attorneys General Act for the last few years, which creates a private right of action for the enforcement of California labor laws.

And every single person who represents any piece of California can testify to the number of abusive lawsuits that have been brought, clearly without the intention of actually forcing compliance with the labor law, but only through a profit motive on the part of a law firm. So that's why I think this is really difficult to navigate. And I really think that we have sufficient authority through the FTC and the state AGs to be able to enforce this. But I see my time has expired. I wanna thank you all for your testimony, and hopefully we'll be able to get this across the finish line this time around. I yield back, Mr. Chairman.

Rep. Gus Bilirakis (R-FL):

Thank you. And the gentleman yields back. Seeing that there are no further members wishing to be recognized, I'd like to thank all the witnesses for being here. Thank you so much for your patience. Y'all did an amazing job, you really did. Very informative. So you guys don't have to stay; I've got some business to take care of, but thank you so very much. Pursuant to the committee rules, I remind members that they have 10 business days to submit questions for the record, and I ask the witnesses to respond to the questions promptly. If you'd kindly respond, we'd appreciate that. Members should submit their questions by the close of business on March 15th. So, let's see, I've got some documents that need to be entered into the record. Pursuant to the committee rules, I ask unanimous consent to enter the following documents into the record.

A letter from the Institute of Electrical and Electronics Engineers USA, a letter from the Insights Association, a letter from Privacy for America, a letter from TechNet, a letter from the Health Innovation Alliance, a letter from the Credit Union National Association, a letter from Engine, a letter from the Confidentiality Coalition, a letter from the National Association of Federally-Insured Credit Unions, a letter from Mr. Brandon Pugh of the R Street Institute, a letter from the National Multifamily Housing Council and the National Apartment Association, a letter from the Main Street Privacy Coalition, a letter from the Electronic Transactions Association, a letter from BSA, The Software Alliance, a letter from Commissioner Peter A. Feldman of the Consumer Product Safety Commission, a letter from ACT, The App Association, and the Connected Health Initiative, and a letter from the US Chamber of Commerce.

A report from the Information Technology and Innovation Foundation entitled "The Looming Cost of a Patchwork of State Privacy Laws," a letter from the National Association of Manufacturers, a letter from the Leadership Conference on Civil and Human Rights, a letter from law enforcement stakeholders, and finally, a letter from the Fraternal Order of Police and the International Association of Chiefs of Police. Without objection, so ordered. So thank you very much, folks, even in the audience, for attending this meeting. I want to thank the Ranking Member and, of course, the Ranking Member on the full committee and the Chairperson, Cathy McMorris Rodgers. And without objection, this subcommittee is adjourned. We appreciate all of y'all. Thank you.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
