An Exit Interview with a Hill Staffer

Justin Hendrix / Feb 26, 2023

Audio of this conversation is available via your favorite podcast service.

The past few years have seen a number of high profile hearings on Capitol Hill, with Representatives expressing concern and even outrage at tech CEOs, often for their failures simply to enforce their own policies. And, there have been high profile investigations by certain committees, including the investigation of competition in digital markets in the House Judiciary Committee and its Subcommittee on Antitrust, Commercial and Administrative Law. But when it comes to passing laws, Congress has made little progress in the domain of tech policy.

An academic and a tech policy expert, today’s guest played an active role in the investigations and legislative proposals led by Democrats over the last few years. Anna Lenhart served as a staffer on the House Judiciary Committee Antitrust Subcommittee under Chairman David Cicilline (D-RI), where she supported tech oversight and investigations. And, she was senior technology policy advisor to Representative Lori Trahan (D-MA), who serves on the Energy and Commerce Committee. I caught up with Anna for a kind of exit interview, as she recently left Congress to return to academia and a handful of projects focused on some of the issues she cared most about in her time on the Hill.

Below is a lightly edited transcript of the discussion.

Anna Lenhart:

My name is Anna Lenhart. My present affiliation, I'm just a scholar of sorts. We can get into that more. I was a congressional Hill staffer, policy aide to two incredible members of Congress. I worked for Congressman Cicilline in the second half of the 116th Congress, on the antitrust investigation into Apple, Amazon, Google, and Facebook. That was an investigation led in the House Judiciary Committee by now FTC chair Lina Khan and some other amazing lawyers. I did that for a year. And then in the 117th Congress, I worked for Congresswoman Trahan, who is a member of the Consumer Protection Subcommittee on Energy and Commerce. That's the subcommittee that does a lot of work on online safety, privacy, and a range of other consumer protection issues.

Justin Hendrix:

So we're going to learn a little bit from your experience in Congress. We're going to learn a little bit about what happened perhaps behind the scenes in those committees and in what are really some of the most kind of important moments in, I suppose, tech policy in Congress over the last few years, particularly the antitrust effort and also the more recent effort around looking at transparency and accountability for social media platforms. But how did you get to Congress? Just quick run through of your background. You're not the sort of typical Hill staffer.

Anna Lenhart:

Absolutely not. Yeah. So I had a really windy career. I started actually in alternative energy, and then most of my 20s, I was a salesforce.com developer for a range of nonprofits. I also sat on a lot of nonprofit boards. And in 2018, 2019, I found myself working at IBM as a data scientist in the federal government sector. So working on automated decision systems at HUD, at Social Security Administration, USAID.

This was also the same time that a lot of AI ethics work was coming out, right? This idea of algorithmic impact assessments, explainability, discrimination in algorithms. I was kind of looking at that work and figuring how do we apply some of this. And some of it was coming out of IBM research, so that was nice. And I had approached our leadership and they had said, "Yeah, let's try to figure this out, right? It's time." And through that work, it became very clear that we need some laws. We need some laws, we need some frameworks, we need some leadership from the government.

So there's this program called TechCongress, and they place mid-career technology fellows onto the Hill in various offices. So I got selected. It was an incredible opportunity. And when I showed up with the fellowship, I really thought I was going to go work on automated decision systems and algorithmic impact assessments and data rights, but I quickly got introduced to Congressman Cicilline's antitrust committee team.

As I was starting to talk to them and really think about the work they were doing, I came to realize that a lot of the anti-competitive conduct they were looking at was happening at the hands of automated decision systems, right? So Amazon placing first party products in the buy box, or setting Alexa's default shopping commands to Amazon eCommerce, or Google's Ad Exchange running real-time bidding on ad space that Google owns, right? Apple places their apps first in the App Store, right? This is a type of discrimination against new entrants, against startups, right?

So it was a little bit of a shift in thinking for me, but I ended up really taking that opportunity and I knew I wanted to dive in here, and I got really sucked in. So I spent a lot of time looking at cloud markets, which are fascinating because they're a key input into AI and into IoT and into smart homes. So I was looking at that whole entire market space. And then also looking at browsers and other data-centric pieces of the investigation. So things like... We know the monopolies are able to collect and view a lot of data from their competitors that are using their platforms, and then can create competitive first party products. So really diving into some of that documentation.

But yeah, I mean, it was a crazy year. It was the pandemic. I just remember sort of sitting in my pajamas late into the night, going through all of these documents that we had collected.

And then the fellowship ended right around January 6th, which I know we might talk a little bit more about, and I knew I had to stay. Was able to work for this relatively new member of Congress, Congresswoman Trahan from Massachusetts, who had an ad tech background, but up until that point, hadn't been on the committee of jurisdiction, Energy and Commerce, so she didn't have, really, a portfolio. So it was this incredible opportunity to come in and build a new portfolio from scratch. And it ended up being a perfect fit for me. I have an entrepreneurial background, so it was just absolutely an incredible two years.

Justin Hendrix:

And I want to talk a little bit about some of the legislation that she put forward. We'll get into that a bit. But just kind of going back to the work on antitrust, listen, from the outside, I would imagine there must be some disappointment to see multiple bills come out of that process, and really not a great deal to show for it, ultimately. I don't know, how do you think about that? Maybe we can talk about that specifically. But also, I guess, how do you think about accomplishments, unfinished business and disappointment, having worked in tech policy in Congress?

Anna Lenhart:

Yeah. Look, I'm not known as an optimist. It's definitely not what I'm known for. But I'd be lying if I didn't say that coming into the 117th Congress, the Democrats had the House, they had the Senate, we had just come off this bipartisan investigation. And what I would argue, of course in a biased opinion, was a historic hearing with the four tech CEOs. There was a great amount of energy here.

So yeah, I'd be lying if I told you that I'm not devastated, right? Especially because the last few weeks of 2022, I was looking around at all of my colleagues, including my colleagues in the Trahan office who got some incredible wins in health policy, well deserved, and colleagues in other offices. And I'm just sitting there like, "How did we not get these bills across?" So I'd be lying if I didn't say I did get my hopes up and they were a little bit smashed.

But look, I don't have any unfinished business. I think the tech policy community put it all on the table, right? The coalitions we built these last few years were incredible and historic. The text we wrote and got marked up was incredibly historic. And not just the antitrust bills, right? ADPPA, the coalition built around KOSA. I mean, just an incredible, incredible movement building.

And also, we shifted the narrative, and I don't think we're going back on that, right? I mean, I think when I came to the Hill, there was still this narrative around members of Congress are stupid and they don't understand tech. That was still very alive. There was still a lot of, "But tech is creating so much good for society and so many free products." I think we're starting to move away from that. And I think there's been this space of people wanting democracy back, right? Wanting elected officials to take some of their power back from these monopolists.

So huge wins in that regard, and I don't have any, quote, unquote, "unfinished business." I think we put it all out on the table, and now the question is, "What happened? Why didn't they get across the line, and what do we do now?"

Justin Hendrix:

So I will grant you the language is of course now there, a lot of the investigative work is now substantiated. There's plenty of insight, there's lots of evidence, et cetera. And it is true that perhaps the politics will change at some point in the future. But let me press you just a little bit on this. Why is it that even on basic stuff like privacy, Congress can't move ahead? You mentioned ADPPA, the American Data Privacy and Protection Act, which passed out of committee with near unanimous support, and yet seems to have run into, in this case, California Democrats.

Anna Lenhart:

Yeah. Let's talk about this. Big picture. So tech policy is a really interesting policy area because it is not a kitchen table issue. What I mean by that is when you are a member of Congress, your job is to respond to the needs and the concerns of your constituents. That means you're getting phone calls. You're getting phone calls about climate change, about healthcare, about the economy. You're going to roundtables and task force meetings in your district, and you're getting asked questions about inflation and the war in Ukraine.

And underneath all of that is tech policy. Monopolies lead to price gouging. Monopolies lead to harm to workers and reduced wages. The spread of COVID disinformation and climate disinformation is because of our unhealthy environment, online environment, information ecosystems, right?

I mean, one of my favorite things to share is that the Digital Services Oversight and Safety Act, which is a bill I worked on in Congresswoman Trahan's office, has 17 environmental orgs endorsing it, ranging from Greenpeace to Union of Concerned Scientists. Why is that? Right? That is because the recent climate change policy reports show that the ability for Exxon and Chevron and Shell to target ads and just the nature of many of these platforms to amplify sort of extreme views on climate change is making it hard to pass laws in that space.

So I'm really lucky in that I worked for members of Congress that understood that tech policy was at the base and the foundation of these, quote, unquote, "kitchen table issues," but that doesn't make it any easier when you're trying to prioritize that work. And not all members are as effective at communicating that, and the public at large hasn't quite grasped that. So that's just an underlying challenge that tech policy has.

The other interesting piece about tech policy compared to again, say, climate change, is that it's really hard to address with money alone. So what do I mean by that? We were able to do a historic climate bill this year because we were able to use the budget reconciliation process, which only required 50 senators to pass a set of tax incentives that encouraged clean energy and sustainable practices.

For tech policy, we don't really have an equivalent of that. Yes, we can give more money to the FTC, to NIST, to NTIA to do more standards and frameworks, to do more enforcement of existing policies. And in fact, we did do that. And many of your listeners probably know that the FTC is hiring technologists now. That is in part because we were able to get them a funding boost. But it's certainly not as historic as what we're able to do in the Inflation Reduction Act for climate change. In tech policy world, we need new regulations, we need new rules and laws, and that requires 60 senators.

Now, so that's the first just other challenge that's thrown in the mix. Then on top of that, tech policy is bipartisan, actually, which is nice. So if you and I were to go to the Hill right now and talk to a handful of Republicans and a handful of Democrats and say, "Hey, do we want comprehensive privacy?" They would all say yes.

Where it gets challenging is that when you start to get into the details of that, it becomes very hard to hold a coalition together. And then there's this just... I don't even know how to explain it, but to get a policy through, especially one that's going to require 60 senators, you need a little bit of a bunch of things to line up.

So we already mentioned you need to have the text, you need to have the thoughtful, passionate members of Congress in the right spot. You need to have leadership aligned. You need to have the executive branch aligned. And then, and I wish this wasn't true, but it is, you need a catalyzing event. And in tech policy, we just don't get as many catalyzing events, at least not directly. The few we've had, we just haven't had everything line up, right?

So if you look at Cambridge Analytica, it was a catalyzing event for sure, but we didn't have the privacy language. It hadn't been written. A lot of bills got written after Cambridge Analytica. But we really didn't have any great text floating around that had already been discussed and socialized.

And then fast forward to the Facebook papers, and same thing, we just didn't have everything lined up, right? We had a few transparency bills floating around. So Congresswoman Trahan, for example, had the Social Media DATA Act, but PATA hadn't been written, DSOSA hadn't been written, the Kids Online Safety Act hadn't been written. PATA and DSOSA are comprehensive transparency bills, for those who aren't familiar. So they hadn't been written, they hadn't been marked up, they weren't there, they weren't ready.

And then we didn't have that ingredient of the executive branch coming in and saying, "Great, we had this huge catalyzing event. This is the policy we need you to do," right? For the last two, three years, we've had the executive branch specifically... I'm putting antitrust aside for a moment. Specifically for privacy and online safety.

We had a lot of really interesting speeches, States of the Union where we said, "Ban targeted advertising for kids," or, "We need comprehensive privacy. We need to protect healthcare data." We had these interesting pieces, but there was no "Congress, pass this bill that looks like this," and you need both of the chambers to get together and do that.

Justin Hendrix:

Well, let's just talk about this for a second because the White House, of course, has hosted multiple roundtables, did release its principles for tech policy reform. But you're right, I mean, there wasn't sort of a push around a particular bill. Even in this most recent State of the Union, the president brought up child online safety, but there wasn't a particular call to pass, for instance, the Kids Online Safety Act. Does that kind of suggest that even at the White House, there's sort of a, I don't know, just not certainty about exactly whether these bills are the right bills or how exactly to move forward?

Anna Lenhart:

I mean, look, I can't say for sure. All I can say is that when we look at the incredible legislation that did get done these last two years, right? So the Infrastructure law, IRA, CHIPS, there was this coalescence of leadership in the executive branch with the leadership of both parties and both chambers, for the ones that were bipartisan. I didn't see that happening from where I was sitting on tech policy issues.

Now, look, there's a lot of speculation, as you know, several of the leaders in both chambers and both parties have ties to Big Tech. I don't know what role that played. I really don't. It's possible it played a role. Or it's possible that we just didn't have the consensus that we needed. So then my question is like, "What would it have taken to get that?" And I don't know.

Justin Hendrix:

Let me ask you about that question about tech influence in particular. Could you feel it on the Hill? Are you kind of aware of lobbying efforts when you're drafting this work? Are you getting phone calls? Are you running into people on the steps that seem to magically appear?

Anna Lenhart:

Yeah, so I'm really lucky in that I worked for two members of Congress who certainly were not checking in with lobbyists at the companies about their proposals. So I got to stay pretty sheltered personally from it as I was working.

Now, when I came to lift my head up and think about, "Okay, how are we going to get something passed?" I mean, I think you hear a lot about it. So I heard about a lot of my colleagues taking meetings with lobbyists, especially in the Senate. You heard about various members taking meetings with CEOs. But you don't really know what's being discussed in them. You don't really know if there's a lot of money changing hands. Certainly, while you're on the Hill, that's not being discussed. Fundraising happens very outside of the Hill.

But yeah, I mean, especially as bills start to move, the lobbyists show up and they start asking for changes or exaggerating maybe certain provisions of the bill and saying, "This is really, really going to harm X, Y, Z." And look, aides on the Hill, we're working on so many issues that if an interest group of any type, company or otherwise, shows up with an argument, we don't always have time to go do the research to see how legitimate that argument is. So I mean, I certainly did on tech policy issues, but not everyone has the capacity to do that for every single issue. So I do think that it plays a role for sure.

Justin Hendrix:

Let's talk a little bit about the global context. While we're working on these things in the U.S. and getting a lot of the right ideas in place again and getting a lot of the right language into proposed bills, at least, or what seems like the right language.

Europeans are full steam ahead. Digital Markets Act, Digital Services Act, moving ahead with AI regulation. Is that sort of something that you're aware of? Are you watching the Europeans out of the corner of your eye? Are you aware of the fact that they seem to be sort of leapfrogging the U.S.?

Anna Lenhart:

For sure. Absolutely. I was watching them very closely both... In a couple different regards. So starting with antitrust, when we were working on our investigation, Europe was working on a few investigations as well, and it also just recently finished some really interesting investigations. So there was a lot of work to pull from them, and I've been really impressed with Europe's investigations in antitrust. I do also like several of the provisions that are being discussed for the Digital Markets Act.

Something that's so gut-wrenching when I think about antitrust in the EU is that they're just so limited in what they can do. Because at the end of the day, the United States let Google buy DoubleClick, let Google buy YouTube, let Facebook buy Instagram, let Facebook buy WhatsApp, right? When it comes to structural separation in the spaces where that is needed, we are the ones that have to do this.

So it's kind of hard because you're just watching them do really interesting stuff, but it's just so hard when you know underlying, there's the structural issues that just have to change, right? These companies are just too big and probably need to be split up in a strategic way. So that's antitrust. Definitely learning a lot from their work, but it's hard and gut-wrenching to watch sometimes.

Privacy, certainly, right? I think we're certainly learning a lot. GDPR has definitely changed the internet, I think, for everyone. I would say many of the provisions in ADPPA, specifically the emphasis on data minimization and data loyalty, this idea of you collect data, you use it in a way that the consumer expects you to and not anything else, those are definitely inspired by conversations around the GDPR. And I think that we'll continue to learn in that regard.

And then online safety. This is the one I obviously want to speak about for a bit. So I think we have a very unique window of opportunity right now, and I will tell anyone who will listen about this. So apologies to anyone who's already heard this rant from me.

Over the next few years, everything in the DSA is going to go into force. So that means many of the transparency provisions that advocates in the U.S. would like to see are going to start being implemented. And every time that one of these disclosure requirements or transparency requirements goes into effect, there's an opportunity for us... And I'm talking big picture us. So all the advocacy orgs, members of Congress, agencies, academics, et cetera, to really push on the companies that operate in the U.S. and Europe to give similar U.S.-centric disclosures, right? For example, the VLOP counts, the very large online platform counts, were just due recently. I want those counts for the U.S., right?

And I know what I'm saying sounds like, "Anna, you seriously are just going to ask these companies to voluntarily do stuff?" I get it, I get it. I want regulation too. But I really want to emphasize that the existence of the DSA makes this different. This is not like us asking them to do things without one of our major trading partners also asking for it.

The reason the DSA shifts the conversation is... First, it takes away a couple of arguments. So three years on the Hill, I've heard a lot of talking points from companies. A couple of their favorites are, "What you're proposing is way too complicated. It'll never work. We don't have that data, we don't have that system." They love to say this stuff. Well, we've taken that argument away because if you're following the DSA provisions, then you've figured it out.

So you take away that argument. They also like to say, "Oh, it's going to take so many resources. Smaller companies won't be able to invest those resources." Well, you have this kind of tiered size model in Europe and they've already invested the resources. All you're asking for now is a marginal few extra resources to do it in the U.S. So you've taken away a lot of the arguments they love to use.

Additionally, the DSA shifts the incentives around some of this stuff. So just like companies don't want to see a bunch of different privacy laws, they also really don't want a bunch of different transparency laws. And they're looking at California, right? California just did a privacy law last year. They're talking about doing a few more. So if you're a company, you're already complying with DSA. There are going to be a few who will be strategic and want to get out in front of that and start doing these disclosures so that they can push for similar looking disclosures and make their life a little bit easier.

And then the last thing I want to say is voluntary standards and sort of voluntary compliance, we've seen this work in other product safety contexts before, where what will happen is a subset of companies will do the voluntary best practice, and a subset won't. So what happens is one, two, three years down the line, the companies who are doing this, who are investing in research or access programs, in disclosures, in user counts, they start to show up to Congress like, "Hey, we're doing this, but our competitors aren't. Knock this off, right? This is really annoying." And now what you've done is you've broken up a coalition of industry groups.

So that was a very long-winded answer, but I think we have a very unique one to two year opportunity to really, really build off the DSA, I think, if we're strategic about it.

Justin Hendrix:

Let me try to meet what you say is your innate pessimism with maybe some more pessimism. So maybe another argument you could make about sort of state of things at the moment, perhaps drawing from the composure of certain tech CEOs recently, the way they're comporting themselves, and the reporting that we've seen in places like the New York Times about some of the larger tech platforms almost standing down a little bit on addressing some of the harms of things like disinformation. Do you think that corporate leaders have, on some level, kind of internalized that Congress isn't going to do anything in this term, maybe next year? Who knows what happens in 2024 and beyond. And that to some extent, we've had all the hearings and that's all been done and there were no consequences. So what was it all for, essentially?

Anna Lenhart:

Yeah. Yeah, so I'll just share this memory I have. When we were working on the antitrust investigation, I have this very distinct, vivid memory from the hearing in July 2020. There was this moment where all the CEOs are up on the Zoom screens, and Jeff Bezos is sitting back in his chair, very relaxed. He hadn't really been asked very many questions, and he's just eating grapes very nonchalantly. Just so chill, not worried at all. And there was just this attitude radiating off of him, like, "These guys can't touch me. I can buy them off." I mean, AWS hosts most of the government websites. Like, "What are they going to do to me?" Right? I don't know if that's actually what was going through his mind. Just me as the staffer, this is just what I'm seeing and what I'm thinking.

And I just remember being so angry, and it was just this very distinct moment where I was like, "This is why monopolists are dangerous," right? Because they can just wave off regulators. So this whole Twitter situation, I can't even... People ask me what I think about online safety and what it's going to mean, and I just can't even answer that question because I am just so angry that we let this billionaire buy this platform and that we let this billionaire exist in this way, right? Billionaires are dangerous to democracy. I'm not the first to say this. Justice Brandeis has this quote, "We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can't have both." That is just so resonant right now.

Look, we need tax reform, we need to enforce our antitrust laws. We just really have to get to the bottom of this, which is that we cannot have people who can get this powerful. So anyway, it's just this interesting reminder for those of us who care about online safety and all these other issues, that they're all very connected.

Justin Hendrix:

While we're talking about risks to democracy, you mentioned January 6th, and that you were in and around the halls of Congress around that date. Can you talk a little bit about that experience, the extent to which that changed your time working in the House?

Anna Lenhart:

Absolutely. Yeah, so it was a really interesting day. I remember it, as most people, very vividly. It was towards the end of my fellowship with Congressman Cicilline. So I was wrapping things up. I was working remotely that day, but I live on 16th Street.

So I remember waking up that morning and first off, just so excited, right? I mean, the advocates in Georgia, amazing, right? Just so exciting to see the results from the Georgia Senate races. I remember putting peaches in my smoothie. I was just so excited. But I could hear, because the people who were there for the rally were in town, I could hear them on 16th Street going down to the White House. And I remember thinking, "All right, whatever. They're here. They're here to share their voice. It's D.C., this happens a lot. But we won the Senate, and we're going to get some stuff done."

And obviously, as each hour passed in the day, it got very scary. And there were these moments... We just didn't know what was going on, right? So I was remote. A lot of my colleagues were remote. But a lot of my colleagues were on the Hill, and a lot of them were in spaces where you couldn't message them. You didn't know if they were okay. So there were just these hours where we were waiting to try to make sure that everyone was in a safe space and was okay. So it was obviously just a very scary time.

And I think for me, having just spent a year basically going through these documents and really understanding the ad tech market and the information ecosystem and the incentives and the practices of some of these companies and just the role that the information ecosystem obviously played in this violent attack, I just was filled with this deep knowing that I was going to need to stay. And I didn't know exactly how I was going to stay. My fellowship had ended. That's when I started reaching out to Congresswoman Trahan's office and got really lucky. But yeah, it was just an incredibly moving day.

And then what I'll say is when I eventually did start with Congresswoman Trahan, which was about a month later, I showed up to the congressional building, the Capitol, and it had just this big fence surrounding it and there were members of the National Guard all around. And in my early 20s, I did a lot of work in Sub-Saharan Africa, and I remember walking up to the Capitol and being like... It felt like I was walking into a government building in Namibia or Uganda, Rwanda, and it was just very striking, sort of like, "Wow, this is the people's House, and they closed up the people's House," and it was incredibly infuriating, right? I remember just... It was infuriating, but also, I just was so driven to want to try to address this, right? So yeah, it definitely changed. The House... It took a long time for it to really open back up again.

But look, I'm really proud of many of the members of Congress for despite things being closed down, really using phones, using Zoom, using all of these platforms they could to continue to hear from their constituents, to continue to engage stakeholders even while the actual physical building was closed.

Justin Hendrix:

What do you make of another big investigation, the House Select Committee's look into the role of social media? Did you pay close attention to that output? Were you exposed to that investigation at all while it was happening?

Anna Lenhart:

Yeah, a little bit. I mean, certainly was not on the team. I was more of a cheerleader from a different committee. But look, I think they did an incredible job recording this moment in history and documenting the kind of leader Donald J. Trump was.

In terms of their report that was related to online platforms, I didn't get a chance, full disclosure, to read it all. I did what a lot of policy analysts do, and I skimmed the recommendations, which I thought were great and very much in line with a lot of Congresswoman Trahan's work. So I was excited to see that.

But look, unfortunately, the Select Committee wasn't truly, truly bipartisan, right? History will tell the story of Cheney and Kinzinger, and it will also tell the story of the cowardice of their colleagues, frankly. But because that was not a truly bipartisan investigation, I don't think we had an opportunity to do a comprehensive bipartisan online safety, information ecosystem, transparency type of bill that I think a lot of us would've liked to do. So look, I think they did the best they could with the situation that they had.

Justin Hendrix:

And in some ways, January 6th could have been one of those catalyzing moments for tech reform, but it turned out not to be.

Anna Lenhart:

Definitely. Yeah.

Justin Hendrix:

So let me ask you just a little bit more about the process with the congresswoman. You got the Digital Services Oversight and Safety Act to the point of consideration. Were there Republicans who were fans of it or willing to work with you on it? Do you think that type of bill stands a chance at any point in the future?

Anna Lenhart:

Yeah, good questions. I think it's worth noting, and this is true not only of DSOSA, but also Social Media Data Act and a few of Congresswoman Trahan's other kind of investigations and proposals that she put out, that a lot of them were really sort of at the front of shifting the way we think about online safety.

So when I first started working in Congress, there were dozens of Section 230 bills. I think we might be up over a hundred now. I mean, there are so many. That was really the approach people were taking on both sides of the aisle, right? Section 230, that was the Communications Decency Act. It was written in the '90s. It needs to be fixed.

So a lot of what myself and another colleague kind of working on transparency issues were doing was trying to say, "Listen, liability for content is tricky for a wide range of reasons. Let's start really viewing content moderation as a set of processes and systems," right? It's a series of algorithms, it's a series of people, it's a series of design choices. Let's start looking at it that way and really shifting the incentive so that companies want to do those processes better.

And when I say better, I don't mean take down more stuff or leave more stuff up. I mean write value-driven policies and do your best to implement them well, and think about the risks to society and think about how you're going to mitigate them.

So that's really the approach we were coming in, with both trying to encourage better researcher access to social media data so that you have sort of this independent check and independent understanding of how the platforms work and maybe are causing harms. Things like risk assessments and mitigation reporting, which is a big part of DSOSA.

A lot of these were really, really new ideas, so I certainly was not expecting them to catch on immediately and pass into law. I do think the Overton window on many of them has moved substantially, as has the number of groups that are now involved in these issues. I mean, when I first started, I remember reaching out to public interest groups to get help on the Social Media Data Act, and many had just not even thought about whether researchers should get access to social media data and how to balance that with privacy and First Amendment issues and Fourth Amendment issues, and I could go on and on. And me approaching them with this idea was the first time they were even thinking about it.

So long way of answering your question, there's just so much socializing we need to do. Do I think Republicans can get on board with this? I do, actually. I think a lot of Republicans have been speaking about wanting more transparency into these platforms, wanting to understand the decisions they're making. I don't think they have totally socialized exactly what they want their approach to be to that transparency. So I think right now it's just a matter of how do you get people at the same table so they can actually figure out how are we going to do this.

I think the big question right now is who should oversee the disclosures. That's a big challenge, and this is not a secret: the Republicans are not very excited about a big government budget that funds either a new agency or a new bureau at the FTC. So then it's like, "Okay, well, if you can't build some kind of capacity into the government, how else do you get these disclosures? Who looks over them? Who makes sure that they're protecting people's privacy?" So I think that's the big set of questions right now. And we've shifted the conversation so much, so now I'm hoping that we can have one.

Justin Hendrix:

It almost feels like even listening to some of the argument around the Twitter files and the Hunter Biden laptop and the rest of the stuff out of Jim Jordan's strange committee hearings over the last few weeks, that if you kind of take away the rhetoric and you take away some of the details, that some of the calls for transparency and accountability do sound very similar to the types of calls that folks are making. But can you see that happening? Can you see them coming together or even recognizing that maybe they're after the same thing, possibly for different reasons, but that there could be some way to come together around that?

Anna Lenhart:

Justin, I've really, really tried, and I'll keep really, really trying. I think it is really hard, particularly now in the House with just all the conversations about the budget, right? And putting more resources into the government and spending, that just almost in some ways makes it a little bit of a non-starter. Because there's just really no way to do a transparency and disclosure regime without some entity in the government that's going to sort of do those rules and look at those disclosures and determine who a qualified researcher is, right? You just have to have some capacity. So I think that's played a role, unfortunately.

And then I think there's also just... Especially in the House, there's a little bit of a competition between the two parties to be like, "We are the party that is fixing Big Tech," right? So I think unfortunately, that's created a little bit of an incentive to be like, I don't know, "We want to do this. We want to figure this out on our own." Which to what I was saying earlier, it's just not possible with tech policies because you have to get 60 senators. So hopefully there'll be a shift and people will come together on that.

Justin Hendrix:

I have a friend who has this theory that everyone likes to say that they're strong on tech issues, that they want to fix Big Tech, they want to kind of take it to Big Tech, but that at the end of the day, everyone also wants to sort of take money from Big Tech and they want to fundraise on problems with Big Tech. And so there's a kind of disincentive in Congress to actually do anything about any tech issues for that reason. Because then not only would you perhaps lose support of the tech firms themselves, but you might lose a talking point for raising money from the grassroots.

Anna Lenhart:

Yeah. I mean, it's certainly an interesting theory. I mean, I think the other thing that's really interesting too is that most of the bills are not actually, quote, unquote, "Big Tech" bills.

I mean, this was really interesting. ADPPA covers any entity that collects data, including nonprofits, by the way. And there were tiers based on how much data you collected and the size of the company: the bigger the tier, the more data rights you had to comply with. But still, it covered everyone, as it should. I mean, a comprehensive privacy bill should. But it was really funny to hear people talk about it as if it was a bill that was going after Big Tech, when it was really just going after any entity that was collecting data.

And then similarly with a lot of the online safety bills, they actually cover a lot of platforms that have user-generated content, right? Everything from travel websites that have comments and rankings to Wikipedia, which gets thrown out a lot, right? There's a lot of user-generated content out there. And even cloud service providers make content moderation decisions, right? So that's been really interesting too: with the exception of the NHS bills, which had very clear cutoffs for just the big companies, most of the bills actually cover the whole entire industry. So it is funny to see so many members kind of lean into just the big companies.

Justin Hendrix:

So what are you doing now or what's next? Can we expect to see you back on Capitol Hill at some point in the future, or certainly in Washington, D.C.?

Anna Lenhart:

Well, yeah, I'm definitely in Washington, D.C. Still very much around. So I'm actually a full-time scholar of sorts. I've been working on my PhD at the University of Maryland for the last four years. A lot of my work centers around public engagement and tech policy. So those issues I was talking about before, in terms of tech policy being underneath kitchen table issues and being very expert-driven at the moment. We do a lot of outreach to experts, but we don't always hear from the constituents and the users and the public at large. So a lot of my work centers around that.

Our lab created a game called Contenter in which people learn about how Section 230 works and then can engage in sort of policy discussion around platform accountability. We also did a study on privacy-conscious smart home users, really trying to understand the way people are protecting their data in their kind of advanced smart homes, and how we can put in place privacy policies that encourage that behavior.

And then I've also been looking a lot at how stakeholders influence privacy policy both in the U.S. and Europe. So to get into some of your earlier questions: why does this work in Europe? Are there things we can learn about how that policy is influenced and move that over to the States?

So working on the PhD. And then starting March 1st, I'll be a policy fellow at GW's, George Washington University's, Institute for Data, Democracy & Politics, working under Rebekah Tromble. So very excited for this. That role will be multifaceted, but will very much include translating and bridging scholarship around online safety with policies at the federal, state, and international levels, hoping to just be a resource for people working on tech transparency and data rights, and making sure that they have all the information they need to do informed policy.

And then the last project that I would love to talk about is a continuation of an issue I worked on during the last NDAA, the National Defense Authorization Act. Over the last few years, working on researcher access to social media data, it's become very clear to me that a country-by-country approach to researcher access is going to pose a lot of challenges, mostly because the internet doesn't have a ton of borders. For example, I have a photo of me on Instagram with my German exchange student. Whose data rights are implicated in that? What API stream does that end up in?

I really think it would be smart if we were able to have researchers from the U.S. and some of our partner nations be able to study online platforms across a kind of broader jurisdiction of countries. And fortunately, the Carnegie Endowment for International Peace has been working on a project called IRIE, the Institute for Research on the Information Environment. It's a really cool project. They're building out this shared infrastructure for researching the information ecosystem.

But Trahan and Cicilline wrote an amendment, Section 5860 of the House-passed NDAA, if you want to look at it, that directed the State Department to start to initiate multilateral agreements to harmonize privacy laws and data and research ethics laws across nations that might want to share this type of research infrastructure. So that's going to be something that I'll be continuing to research and write about and advocate for, because I think there are huge opportunities to take that approach.

Justin Hendrix:

Well, very exciting and something that perhaps may not get the same type of attention as a congressional bill, but in the long run could potentially have a huge impact if there is in fact a kind of coalition of democracies that are able to synchronize on these issues.

Anna Lenhart:

Agree, agree.

Justin Hendrix:

Well, and you're working with Rebekah Tromble, who's been a friend to Tech Policy Press and is at the center of those conversations both here and in Europe. So that should be great work ahead. So I wish you luck and I hope you'll come back and tell us more about your work at another time.

Anna Lenhart:

I would love to. And I also just would love to take the opportunity to say to you and all of your guests in the community that there are tech staffers on the Hill who do listen every Sunday. We do really care about the conversations that are happening here, and they've been really, really thoughtful and I've been really grateful. So thank you.

Justin Hendrix:

Thank you.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
