A Conversation with Meredith Whittaker, President of Signal

Justin Hendrix / Jun 18, 2023

Audio of this conversation is available via your favorite podcast service.

Earlier this month, I traveled to RightsCon, the big gathering of individuals and organizations concerned with human rights and technology organized by Access Now. The sprawling event had hundreds of sessions on a wide range of themes, but one topic discussed across multiple tracks was the importance of encrypted communications, especially to groups such as political dissidents and journalists.

A key panel at RightsCon featured Signal President Meredith Whittaker, who spoke out about policies proposed in legislatures around the world that threaten the promise of end-to-end encryption to preserve the privacy of messages sent between individuals and groups. Leaders of encrypted apps have pulled together of late to speak out against the proposed UK Online Safety Bill, signing letters and appearing at events. Shortly after RightsCon, I connected with Meredith to learn more about Signal’s posture against such legislation, why she sees encrypted communications as so crucial to freedom and human rights, and how the company thinks about safety and its role in the broader digital ecosystem.

What follows is a lightly edited transcript.

Justin Hendrix:

Meredith, I want to just start by putting Signal in context, because I think all of my listeners know what Signal is; I would suspect that more than half of them probably have the Signal app on their phone. But many folks probably don't know terribly much about the organization itself. Give me the basics. How many users, how many employees, how is it funded?

Meredith Whittaker:

Well, we don't give clear user numbers, but a short hack to deduce them, for those of you into that kind of sleuthing, is to go to the Play Store, where you can see that, just for Android, it's been downloaded over a hundred million times. So what I can say is we have many tens of millions of users across the globe, and that's wonderful, because that makes us the most widely used truly private communications app in the world.

But that's not the only special thing about Signal. Signal is also a nonprofit, which really matters in the tech industry, an industry where the dominant business model is monetizing surveillance in some form or another. It's really important to have a structure where you are not tempted, or even pushed by investors, or private equity companies, or anyone else, to prioritize profit, which often means prioritizing surveillance over privacy. So the nonprofit structure is actually really important to ensuring that we can keep our eyes on the mission, and keep focused on what we are here to do, which is provide a truly private means of communication in a digital ecosystem that has been hollowed out by surveillance, or more accurately built around surveillance, over the past several decades.

Justin Hendrix:

In that fight, of course, Signal plays an outsized role, you have an outsized voice. But again, just for contextualization's sake, I want my listeners to understand the scale of the organization and the scale of the budget. From the 990, I can see that the organization is pulling in roughly, what, 14, 15 million a year in donations, essentially. Roughly how many employees?

Meredith Whittaker:

So we have a little over 40, give or take, which is incredibly small for an organization developing and delivering a high-availability app that needs to meet the norms and expectations of messaging; otherwise, those who want to use it for ideological reasons can't use it, because their friends who don't understand how it works and just want to message won't use it either. So we're a very small team, developing across multiple clients. We're about as lean as you could be for an organization developing something as expensive and labor-intensive as a messaging app that works instantly, always, everywhere.

In terms of funding, we are donor funded, with a large runway that was provided by Brian Acton. But what I think it's important to stress is that it costs tens of millions of dollars a year to develop and operate Signal. That's hosting cost, that's registration cost, that's labor cost, and it is a forever expense that goes up, not quite one to one, but in proportion to the number of users we have. So the goal for Signal is really to be small-donor funded, and that's one of my key focuses. We want to be funded by small donations from some percentage of those tens of millions of people who rely on Signal, because we want to be accountable to those people, and because we want a model that is as safe and robust as possible. We don't want to rely on the generosity of one or another institution or person, because of course those institutions or people have the right at any moment to decide they want to do something else with their money, and we don't want to be in that position.

Justin Hendrix:

I should disclose that I am one of those small donors. I have the little star icon next to my account, I suppose, which indicates that I part with about $5 a month to the Signal Foundation. So I'm doing my best, I suppose, and I of course use the app religiously. I want to go into the philosophy behind Signal a bit, and I don't want to spend too much time on it because I know that my listeners can go and read quite a lot of material on this. But I was struck by Anna Wiener's profile of Moxie Marlinspike, one of the founders of Signal, in The New Yorker back in 2020, which talked about the idea that Signal has this halo of subversion. It certainly has a cool, almost, I suppose... What's the word I'm looking for?

Meredith Whittaker:

Iconic.

Justin Hendrix:

Maybe a punk rock kind of sensibility to the Signal brand. But is there a conceptual framework or an evolving philosophy of Signal that goes beyond, "We think privacy is good"?

Meredith Whittaker:

Well, I think Signal is cool because it's cool, it's real because it's real, right? There's a way that contrasts really shockingly with the PR-inflected comms that are meant to be as safe for everyone as possible, the excuses and framings around new products and features that come out of the dominant tech companies, which are clearly calibrated to please advertisers or investors but are narrated as good for humanity. It is very different when you are simply able to be real about what you're doing, when you're actually able to focus on one vision, and that is: privacy is good. But we can talk about why privacy is good. Power asymmetry thrives on information asymmetry. Surveillance is always a tool of the powerful to classify people into good or bad, to control people based on various interests, and it can go beyond that to devastating effect.

We can't look at a mass atrocity in the last hundreds of years without recognizing that it required the categorization, enumeration, and monitoring of populations to accomplish. We are now in an age, over the past handful of decades, where mass surveillance on a scale and at a granularity that we've never seen before is commonplace, simply as a product of our participating in regular social and economic life. We can't get a job without having a cell phone; we're flagged by some algorithms as a potential suspect if we don't have a social media trail. Our lives are narrated and surveilled to an extent we haven't seen before. So what Signal is doing is saying privacy is good, but it's also saying, "And here is a way to preserve the norm of privacy that was part of human communications for literally millions of years before..." We can mark certain dates, but let's choose the mid '90s, when the Clinton administration put down a neoliberal policy agenda in which privatization and commercialization of networked computation were core aspirations, and handed the keys to private industry to self-regulate on privacy. The surveillance advertising business model was developed, and now we have a real set of problems that at their core involve the power asymmetries of surveillance, to which Signal is simply working to provide a real alternative. But providing a real alternative is incredibly cool. I would say it is indeed punk rock.

Justin Hendrix:

I want to get into the thing you just mentioned, which is putting things in a bit of historical context. When we happened to see each other last week at RightsCon in Costa Rica, we talked very briefly about this moment that we're in, where encryption and privacy appear to be really under threat across the globe, even in democracies that claim to protect privacy and to prize free expression. I opened The New York Times on my phone this morning and saw Julia Angwin's guest essay, "One of the Last Bastions of Digital Privacy Is Under Threat." She appears also to be framing this moment as a kind of turning point. Does it feel that way to you? Can you tell which way it's going to go at the moment?

Meredith Whittaker:

It feels critically important. And I don't believe in inevitability, I believe in invitations to push for better futures. I have felt critical moments before. Certainly 2015, 2016, with the showdown between Apple and the FBI, was a critical moment, and it could have gone otherwise had there not been concerted, organized pushback from experts who understood the stakes. I think we are in another one of those moments. I think there is an array of forces troublingly aligned against the right to privacy, or the right to communicate digitally outside of the mass surveillance of governments and corporations. And to be honest with you, one of the reasons I decided to take this job was that I saw this moment on the horizon. This has been building for a while, and I didn't think there was much more important I could be doing than everything I could to ensure that Signal was safe, and that we could meet this moment and continue to provide people a real private communications option.

Justin Hendrix:

Your voice has been prominent in the last few weeks. You've had op-eds, you've had appearances, you've been attending events like RightsCon. When you survey the landscape at the moment, what are the policy environments where you're most concerned? It seems like the UK clearly is the place where you've put a lot of your effort lately, but it's not just the UK.

Meredith Whittaker:

The UK is the closest to potentially ratifying such a law, so we have certainly been focused on the UK. And of course, as you know, Justin, as somebody who's also worked in this space for a long time, one law passes and it's copy, paste. The power of precedent is huge here, and the UK would be the first to implement a law that allows one of its government agencies to mandate client-side scanning, which is effectively mass surveillance: it would scan every message you send, before you send it, against an opaque database, judging whether it's acceptable speech or not. Whether that judgment was correct, whether there was a malformed AI system in the background making spurious predictions or determinations, would be hard to know. But those are really the stakes. And if that passes in the UK and becomes precedent, it becomes easy for any other country to point and say, "Well, the UK did it, and the UK certainly is a democracy, so how can you say that this is a move toward authoritarianism?" And then you are effectively moving in a direction where you're eliminating the possibility of private digital communications. Extraordinarily stark.

Justin Hendrix:

When you look to the EU, of course, we've seen this leaked document recently published in Wired that seemed to expose the different member states' perspectives on encryption. Spain is perhaps going in a particularly worrying direction politically, certainly on one end of the spectrum, proposing essentially outlawing end-to-end encryption within its borders. Spain is also, I think, set to take over the presidency of the EU later this year. What's your perspective on what will happen in the EU? Are you able to tell, or is it a little more up in the air?

Meredith Whittaker:

Yeah, it feels more up in the air, and you also have a number of countries that are not aligned with eliminating or undermining privacy, many of the northern states, so there's a clearer divide there. But I do want to caveat this: I'm lucky to work with a coalition of experts. I work with folks at EDRi, I work with Open Rights Group in the UK, and I talk to a number of other people who are much closer to the ground in the day-to-day political discussions than I am. That's just not something that, as a small organization based primarily in the US, with a lot of other duties, I'm able to do. So that's the type of question I would actually ask them, because I would prioritize their expertise and, really, their instinct for how the nuts and bolts of politics get done.

But my impression is the UK is closer, and we need to watch that carefully; that's where we have been applying pressure, based on the advice of a lot of the folks we work with. The EU is farther away, but nonetheless extremely concerning. And to be clear, for everyone: our position on both is the same. We simply will not undermine the privacy promises we make to the people who rely on us, and we will never participate in a regime that would force us to adulterate or weaken our encryption.

Justin Hendrix:

You mentioned at RightsCon that, if in fact the UK were to introduce some requirement for client-side scanning or otherwise interfere in the functioning of encrypted messaging apps, you might have to revert to offering the service the same way you might in a country where it is in fact banned, perhaps via a proxy, like in Iran?

Meredith Whittaker:

Yeah, we would do anything we could to give the people in the UK access to the right to private communications. We're pretty agnostic there. We recognize that in moments like the uprisings in Iran, or were the UK government to move forward with mandating mass surveillance and undermining the right to privacy, that decision doesn't necessarily represent the people in those countries, and those people still have the right to privacy. So in the case of Iran, we worked with our community, who set up proxy servers that helped people who had Signal installed in Iran bypass the block, and we would certainly explore that and other options in the UK. Our commitment is to the people in the UK, and our commitment is to ensure they can exercise their right to privacy.

Justin Hendrix:

I suppose there is at least some defense, or some good thing, in the reality that the US is essentially unable to pass much of anything when it comes to technology, yet encryption seems to be perpetually a target here. Is there a backup plan if in fact the US goes in the wrong direction?

Meredith Whittaker:

Well, I mean, that's certainly something that we discuss. I'm not going to lay out the blueprints for the opposition, but I think there is a backup plan. Also, real talk on this: people access us through the app stores. We'll do what we can, but the centralized nature of the technological ecosystem that we exist within does mean that we really don't want it to get to that point.

Justin Hendrix:

I think that's one of the things about encrypted messaging when you really look at it: it's the pivotal, almost totemic issue when it comes to digital privacy. If in fact it's impossible to send a private message from one individual to the next, we've moved into a different phase altogether on this planet with regard to the relationship between technology firms, governments, and individuals in society. I struggle, and I know you have no appetite for doomerism, but I still struggle slightly with how, practically, we get to, say, the year 2100 with the ability to send that private message still intact. And I'm interested in it not from a doomerist point of view, but more practically. Clearly, defending encryption on the messaging apps we have at the moment is important, but what else do you think is key beyond that? You've talked a lot about AI recently, and about the threat that AI essentially poses, and why encryption is important in that context as well.

Meredith Whittaker:

Yeah, absolutely. And when we're talking about 2100, I don't know how we get to 2100 if we keep spending the kind of water and energy resources on data centers that these systems require. I flew in from Costa Rica to a New York that was still shrouded in smoke from Canadian wildfires that general expert consensus read as exacerbated by climate change. We're hitting that log-function inflection point where it's very, very difficult to predict the future based on past data. The X-risk I am worried about is climate. We're looking at, by 2050, an estimate I read the other day, and I'm not a climate scientist, but an estimate by a climate scientist, that about 50% of the world's population will be displaced. So we're looking at significant upheaval. We're also looking at a structure of our world, and incentives propelling many of its powerful institutions, that are not commensurate with ensuring that we handle the impending crisis in a way that gives everyone as much of an ability to have a thriving and dignified life as possible.

So, not to skirt around the core question, but I think there's a lot coming up that means that, for me, if we are going to be able to create a livable world, if we are going to be able to enact the necessary and socially beneficial significant transformations of the way that our governments and our businesses and institutions work such that we're able to absorb these shocks as gracefully as possible, then we will need to be able to communicate privately. It's an absolute necessity because those transformations will require a transformation of how power works, and it will require those who are most at risk of these harms, those who are in the often disinvested regions that contributed the least to the climate crisis, to be calling the shots on how this crisis is dealt with, how resources are distributed, how we are changing our consumption patterns to respect a changing planet that humans had a hand in changing.

So I think there is a world where we can't organize, where we can't say anything to each other without being surveilled by those who may have an interest in curtailing dissent, and where it feels dangerous even within our own psyche to explore dissident ideas or experimental notions, or to question orthodoxy. So we absolutely need the ability to communicate privately and that ability needs to be translated to the digital world because so much of our lives has effectively been moved, consensually or not, into the digital world.

So then how do we ensure that that right remains? And I think we ensure that by continuing to fight, I don't think the will to mass surveillance is ever going away because of course the will to mass surveillance is a will to power. Information asymmetry is a core tool of the powerful that is used to exact that power over those less powerful than them. And we've seen that from the beginning, we have state statistics, we can look at the Hollerith machines, and the enumeration of populations, and the good and the abjectly awful that has come out of that. We can look at all the history here, but it's pretty clear. So I don't have a formula for how we always win, but I think we need to recognize that this is a fight we're never going to be given the luxury of getting tired of.

Justin Hendrix:

So I'm all in on that gospel, and I see the world, I suppose, very similarly to you; that's my bias, I suppose. But there's probably somebody listening to this saying, "Yes, all for political dissent, et cetera, and yet lots of harm is taking place that I'm also concerned about. Concerned about child sexual abuse, and child sexual abuse material spreading on these apps, concerned about terrorism, concerned about scams and fraud, concerned about groups that get so big that conspiracy theories can spread."

Meredith Whittaker:

I think, one, we have to recognize those problems aren't online problems, and this conversation gets abstracted a lot. We use prefixes like "online" and "digital": online terrorism, online child abuse. But of course terrorism is committed by real people in the world who form the will to harm through complex social dynamics in a historical context, and again, that's not my field. Child abuse is committed by real people; real children are suffering and actually need help. This is not happening in an abstract realm. Now, I also want to name that there is a very big difference between a social media platform with broadcast capabilities that provides a content feed, which we know, from early Usenet groups on, always effectively needs a moderator or it devolves into some kind of gnarly mass where no one wants to be, and a communication service that enables people to talk to each other one on one, or in chosen groups.

So I think we cannot abstract these problems into online and then do the age-old trick that tech loves of shaping or framing a complex social problem as a technical problem, and then assuming it is going to be solved by tech. If we're going to be real about this, we can look at child abuse. The numbers are fairly clear. The majority of child abuse happens in the family. When it doesn't happen in the family, it is perpetrated most likely by somebody who has been entrusted as an authority figure to care for children, so teachers, pastors, youth group leaders, whatever it is. So that framing of the problem already shifts the solution space. That is not an online problem.

We're looking at a problem that looks a lot more like a need for trusted intervention and support. In fact, evidence points to methods that actually help these real children who are suffering and need help, and they look a lot more like early intervention. It looks like education, so children know the boundaries of right and wrong and have the language to express it. It looks like social services and financial support that can help women and children get out of abusive situations, because domestic abuse of a partner, generally a woman, is very highly correlated with child abuse, and it is not infrequent that the reason abuse persists is financial: it's hard to leave when maybe they don't have a job, don't have a place to go. So providing those services is incredibly important.

And then when I look at the UK, where child abuse is the rationale, or the banner under which proposals for mass surveillance are being justified, it's very hard to take those proposals seriously when you recognize that in the last 10 years they've cut funding for early intervention by 50%. So we have evidence-based approaches to handling some of these tragic and horribly upsetting problems, but what we often see is a rush to frame them as technical problems and to offer technical solutions. In the case of client-side scanning, the mass surveillance approach, there are a number of firms set to get big contracts if these mandates come into play, and they offer those as solving a problem that most people don't want to look in the eye. People don't want to look in the eye a culture that, to pick one difficult example, looked the other way at the Catholic Church's sexual abuse scandal for years and years, even though it was broadly known. They don't want to look at a culture where there's one person being charged in Jeffrey Epstein's child rape ring, even though there are hundreds of powerful people clearly implicated. There is something there we can talk about, we can look at, but I think oftentimes abstracting this into "online" and then offering a cheap technical solution is actually a way to avoid looking at the much more difficult reality of the nature of the issue we're dealing with.

Justin Hendrix:

Is there a world where the thing we have to worry about most with regard to threats to privacy is folks who, I don't know, just don't think there's anything to hide in their own personal messages, and so are simply not concerned with surveillance? Is that perhaps a bigger problem than we make it out to be in these discussions?

Meredith Whittaker:

Yeah, we certainly hear that view, and it's often expressed by people who are fairly powerful and fairly privileged. But let's be real: you don't have to do anything wrong, you just have to do something that somebody empowered doesn't like or has criminalized. You have to be the wrong type of person. You have to be classified as the wrong type of person. Right now, what does wrong look like? Well, in Uganda, wrong looks like being a gay person. You can get killed. There's a law that was passed, pushed by US-based evangelicals, that makes being gay punishable by death. In Florida, what does being wrong look like? It looks like wanting to learn history, checking out the wrong book. You have bills proposing to criminalize librarians. In other states, what does wrong look like? It looks like wanting access to healthcare as a pregnant person, and that's not just abortion care. We have people who are unable to access any healthcare because of fears of liability around harming a fetus.

So let's be real, everyone may have something to hide if there's somebody who would hurt them if they knew that. This is about power, this is about asymmetry, and this is about the fundamental ability to communicate safely and privately outside of the gaze of those who could harm you based on what you have to say.

Justin Hendrix:

The tone around data privacy, and fears around the erosion of basic rights in the US, have changed immensely since last summer and the overturning of Roe v. Wade. Are you able to attribute any particular growth in the Signal app's downloads to that moment?

Meredith Whittaker:

We see steady growth, but we don't collect analytics. Not only do we protect message contents with the Signal Protocol, we have developed novel cryptographic techniques to protect metadata, so we don't know who you are, we don't know who you're talking to, we don't know who's in your contact list or your groups, and we don't collect the kind of analytics and telemetry data that most other apps do. So we have very little information that can correlate a given event with a spike in usage. Of course, in Iran or Ukraine we'll see growth and we'll make an educated guess, but our privacy commitments go so deep that they blinker us from some of the information that would enable us to make more informed determinations.

Justin Hendrix:

Apart from the idea of trying to minimize, or limit, or in many cases not collect data at all, how do you make decisions about the safety of certain features or settings? One thing I'm aware of, for instance, is that Signal recently increased the group size limit.

Meredith Whittaker:

It's 1,000 now.

Justin Hendrix:

Up to 1,000, which is much smaller, as I understand it, than perhaps some other messaging apps. Maybe on the far end of the spectrum, you've got things like Telegram, which I think will allow hundreds of thousands of users into a group. How does safety factor into those types of decisions?

Meredith Whittaker:

We talk about it a lot, and we think about this a lot on the product side of things, and happily we have a really, really good Chief Product Officer who's thought about these things for years. We read, we talk to experts. But in some sense, we're balancing things. We hear from organizers, "We would love a group that is 100,000 big, so we can just text out updates to our network," and we get those requests daily. And then we recognize that we don't want to become a social media platform or a broadcast media service. So how do we balance that? What number makes sense? Well, let's look at what alternatives are doing. Let's look at the requests, and the justifications for them. Okay, 1,000 is large enough to include a small network of folks, maybe your regional librarians' network across the US or something. But certainly 1,000 likes is not a viral tweet, so it's not something that is going to go viral, and we don't have forwarding features, so we don't have the capacity for that metastatic growth.

And so that's just an example of some of the thinking, but it's a lot of listening, a lot of discussion, and a lot of, thankfully, relying on the expertise of folks who've been in this industry or have hit some of these issues. Of course, Brian was a co-founder of WhatsApp, so it's helpful that he has gone through this, with a very different product, but has nonetheless thought through some of these things during that decision-making process. There's no perfect formula for it, and always, we make the decision and then we continue listening, we continue looking at the ecosystem, we continue looking at what's actually happening, insofar as we have those insights.

Justin Hendrix:

I suppose the last question I have for you is about Signal's relationship to those other messaging apps, such as WhatsApp. You were just on a stage last week with Will Cathcart, the head of WhatsApp. Recently, the developers of encrypted messaging apps have publicly signed letters together and otherwise lobbied against certain policies they're concerned about. What do you think of as Signal's role at that table? Are you a kind of, I don't know, blast shield for these larger tech firms, which may have to bend to the interests of governments in a way that Signal perhaps can afford not to?

Meredith Whittaker:

We're just not going to bend. We'll go broke before we do, whether we can afford it or not. Either way, we have one thing we do, and we're committed to doing that: providing a truly private digital communications app. So, blast shield, I'm trying to think about that metaphor. Are we taking the heat for them? I don't necessarily see it that way. WhatsApp and these larger organizations also have resources we don't. We're never going to have a huge policy team. We're never going to have a network of offices across the globe, at least not in any future I foresee, and we may or may not have the resources to fight some of the court battles. So I don't know that we take the heat, but what we can do, again, is be real where they often can't.

We don't have investors pressuring us. I don't have a nervous general counsel who's constantly concerned that one misstep might make Meta look bad. WhatsApp is in a weird position, because they have done a huge amount to protect privacy by integrating the Signal Protocol to protect the contents of WhatsApp messages, or most WhatsApp messages, since WhatsApp for Business isn't protected, but they're also part of a massive surveillance company. And I know, because I've worked at companies like that, where they have to skirt around certain messaging, or probably have to review a one-sentence tweet about 30,000 times with many, many different offices. We don't have to do that, because, really, we know what we are, our analysis is very clear, and we're just going to go out there and say it. So we're often critical of WhatsApp. I think some of their privacy claims are overstated, and I would love for them to adopt the technologies we develop to protect metadata as well. But where we are aligned, say on the UK Online Safety Bill's encryption provisions or what have you, we're also happy to work with them. Our focus is ensuring that truly private communication continues to be possible.

Justin Hendrix:

My last question to you as President of the Signal Foundation: you've got, let's say, a roadmap three, five years in front of you. What do you hope to accomplish in that timeframe? When we look back on the first years of Meredith Whittaker's term at Signal, what will we be able to say?

Meredith Whittaker:

Well, I am really focused on that sustainability model. Can we show that tech can be done differently, that we can produce a high-availability app, and do it in a way that does not rely on monetizing surveillance, and is accountable to the people who use us, not a board of directors or whatever shadowy investment firms? That is really important, and I think it's actually key to sustaining privacy, or digital privacy. I'm also very interested in continuing to grow the usage and accessibility of Signal, both in terms of ensuring that people of all abilities can use Signal, and in terms of being attentive to the various and diverse ways that people use messaging and their requirements for it: getting a much better sense of how Signal performs in low-bandwidth environments, getting a sense of what kinds of features folks in different regions prefer, and really thinking about, "We're not going to be able to build many, many different versions, but how do we build something that isn't completely missing a must-have feature in a region where people might use that feature to communicate?"

And I think an example of that is Stories. We got some blowback, mainly from tech folks in the US, when we launched Stories, but Stories is one of those features. It's huge in South Asia, it's huge in Brazil, it's a tool for day-to-day communication, without which it's very difficult to say, "Hey, let's switch to Signal." So really thinking about how we make the app as performant and robust as possible, so that when there is a collective need for privacy, or there's an event, or, say, WhatsApp changes its terms of service again to further integrate its data with Facebook, or people just become increasingly sensitive to the harms of the surveillance business model, whatever it is, when they go to pick up Signal, it's there and doing what they need, not something we still have to promise might work in six months. So that we're there and available, and anyone anywhere can pick up Signal and use it easily to contact anyone else.

Justin Hendrix:

So there's both a defensive strategy, I suppose, around the policy environment and some of the existential concerns there, and it sounds like there's offense as well, in terms of further development.

Meredith Whittaker:

Absolutely. And of course, in five years, we've beaten back these anti-privacy bills, and we have a bit of green pasture to focus on the fundamentals. Amen.

Justin Hendrix:

Well, I hope I'll talk to you again before those five years are up, and we'll hear a little bit about that progress and how things are going. Meredith, thank you.

Meredith Whittaker:

So nice to talk to you, Justin. So glad we finally made this happen, and thank you.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Inno...
