Resisting the Tech Coup: A Conversation with Marietje Schaake

Justin Hendrix / Sep 22, 2024

Audio of this conversation is available via your favorite podcast service.

Marietje Schaake is the author of The Tech Coup: How to Save Democracy from Silicon Valley. Dr. Alondra Nelson, a Professor at the Institute for Advanced Study, who served as deputy assistant to President Joe Biden and Acting Director of the White House Office of Science and Technology Policy (OSTP), calls Schaake “a twenty-first century Tocqueville” who “looks at Silicon Valley and its impact on democratic society with an outsider’s gimlet eye.” Nobel prize winner Maria Ressa says Schaake's new book “exposes the unchecked, corrosive power that is undermining democracy, human rights, and our global order.” And author and activist Cory Doctorow says the book offers “A thorough and necessary explanation of the parade of policy failures that enshittified the internet—and a sound prescription for its disenshittification.” I spoke to Schaake just before the book's publication on September 24, 2024.

A transcript of this episode is forthcoming.

Justin Hendrix:

Marietje, thank you so much for joining me today, and looking forward to talking about this book. You bookend this with stories from Iran. At the start of the book, you talk about an interaction you have with a young man named Ali in a cafe and you end with the tale of Mahsa Amini. Why did you choose to frame a book about tech and democracy through the lens of the experiences of these people in Iran?

Marietje Schaake:

Well, in The Tech Coup, I also take the reader along in experiences that I've had in thinking about technology when I was a member of the European Parliament, when I realized very clearly what the intersection was between technology and human rights questions and how my own journey really evolved with regard to high hopes about what technology would do for democracy, the unheard voices, their right to speak and assemble.

The peak of that hope was very clear in the Green Movement, the uprisings after the presidential election in Iran in 2009, which was also the summer that I was first elected. So, I just found a lot of important moments for my own understanding of the role of technology that both amplified unheard voices but also repressed them and hacked their devices and spied on them.

And then fast-forward to what we are seeing these days and last summer is really the fact that technology has not solved all these repressive regimes vis-a-vis these young populations. And in fact, more repressive technology is available at a lower price with more sophistication, also from countries like China. And so, I just thought it was a helpful anecdote, but also a very human story to tell about how technology is actually used for repression if we do not deliberately make rules to make it a force for advancing human rights and defending and strengthening democracy and the rule of law.

Justin Hendrix:

This book takes the view that we have not done that, and that in fact, something very much the opposite is taking place. Just explain for the listener the title, The Tech Coup. What is this coup you're talking about?

Marietje Schaake:

The coup refers to the power grab or the erosion of power on the side of democratic governments and leaders vis-a-vis tech companies or really an ecosystem of tech companies, bigger ones, smaller ones, the ones we all know well, social media companies, retail platforms, but also the ones we really don't know, those that offer spyware, those that develop data centers or microchips. What I would love the reader to come away with is a deeper understanding about where all this power grab is happening and how in a variety of ways it is harmful for democracy.

I think people are now well aware of the discussion about disinformation, for example, and a lack of trust in the electoral process, particularly in the United States. That's a hot topic, but there are other areas where, for example, when we look at infrastructure, I do not see as much of a discussion about who gets to decide where legitimacy to make fundamental decisions about national security or the future of use of natural resources or landscapes, where that legitimacy to decide for companies comes from, and in turn, where accountability mechanisms come from. In my opinion, those are lacking and there's not enough appreciation for what that means for the whole question of power and agency and accountability in the digital realm.

Justin Hendrix:

It's apparent that you have crossed the world talking about these issues, but you've also physically encountered a lot of the key actors from the venture capitalists through the regulators. You tell a story of being on Richard Branson's Necker Island. So, I have to ask you first to describe what it was like there, but also maybe as a window into what it's like to occasionally encounter this extraordinary wealth and the mindset, the imagination of tech executives.

Marietje Schaake:

Necker Island is everything you imagine when you close your eyes and think of a tropical island, white beaches, beautiful blue ocean, sun, a wonderful beautiful home of Richard Branson that he opened up to host what was then the second Blockchain Summit. And it's interesting how quickly blockchain came and went as a hype. Sometimes I think about it now as we hear so much about AI that not that long ago, everybody had the same kind of expectations of revolution that would be brought about by blockchain. And here we are never really talking about it anymore or even laughing about the promise that some people thought it would bring.

But this gathering saw a lot of the blockchain evangelists, if you want to call it that, the true believers, the early investors, the early entrepreneurs into this technology, and particularly the cryptocurrencies as well, talking about how blockchain would democratize and really empower people who were disempowered, who would cut away intermediaries in a number of transactions, whether financial or administrative and how that would benefit everybody.

And illustrated in that story, but also in others that you'll find in The Tech Coup is this notion that with several waves of technological breakthroughs or new services becoming popular, the promise was just so incredibly optimistic and the idea was, we don't need governments, we don't need rules. In fact, if those rules are made and if those guardrails are put in place, all the benefits of this technology will suffer and those who put in place barriers like the EU will actually lag behind and it will all be a waste of opportunity of all the beautiful things that this new technological wave will bring.

And over and over again, we have to conclude, looking back, that most of those promises did not materialize and that no deliberate guardrails or regulations or obligations or accountability mechanisms were put in place. And as a result, a lot of the harms have exponentially impacted people around the world. And democratic governments have lost their grip and have lost agency really, a way to understand and govern technologies, in a way that is now a systemic problem.

Justin Hendrix:

So, I want to dig into that a little bit. The US in particular, we've got an election coming up. We've got a choice before us about candidates. I seem to remember talking to you early in the days of the Biden administration and you being a little bit ambivalent about whether there would be an enormous difference, for instance between how a Trump administration would do tech policy versus a Biden administration. So, I'm interested in whether you think there's a more stark choice ahead of us or how you're thinking about going forward.

One thing about this book, I think you nailed it: US tech policy in many ways is about national security. You also point out that despite the catalytic event that we appeared to have around the Cambridge Analytica scandal in 2018, not much has happened here. But what do you make of the situation in the US right now?

Marietje Schaake:

Well, we could have a whole different podcast about how worried I am about the elections and politics in the United States where so many people don't trust the institutions and the political process around elections anymore. I think a lot of that can be attributed to disinformation. But let me pick up on your question about the difference between Biden and Trump and Kamala Harris and Trump as candidates for the next presidential election.

It's true that my expectations were not very high about what Joe Biden could achieve as president, not for lack of his ambitions because I think more than any president that came before him, he had a stronger vision about curbing the out-sized power of big tech. He wrote about it in The Wall Street Journal saying there needs to be more privacy protection, protection of minors. But the political reality in the US looking at the Congress is that the polarization and the partisanship is so significant that the expectations of what can be changed are low. That's an unfortunate reality.

Having said that, I think especially for the non-American listeners, it is often underestimated what courts can do in the United States, what agency the FTC and other federal agencies have to interpret the rules and run with them, but also what state legislatures can do. And on those fronts, so courts, federal agencies, and state-based rules and laws, I do think there are interesting developments in the United States that are often overlooked outside of the US, because there's still no federal data protection law in the US and no comprehensive AI legislation being adopted. But that doesn't mean nothing is happening. So, I'm hopeful that even the United States can correct some of the wrongs of taking a hands-off approach for too long, and by both parties, it should be noted.

Justin Hendrix:

I was thinking yesterday about the AI executive order, which I think a lot of folks will hold out as the Biden administration's perhaps biggest achievement when it comes to tech policy and thinking about your assertion that US, when it thinks about technology regulation, it's really all about national security. That's the kind of through-line that's both motivating but also perhaps demotivating certain behaviors. Do you think folks outside of the US understand the extent to which the national security imperative is shared across both parties and dominates our approach to things?

Marietje Schaake:

I'm not sure it's understood in the tech realm specifically. I think people see the enormous defense spending in the United States, see a tendency to engage in geopolitics from a security and defense point of view, but I'm not sure people appreciate the extent to which national security motivates tech investments and thinking about regulation. But also the extent to which, for example, the concerns about the rise of China bring together these very divided Democrats and Republicans, as one of the few topics that does. And politically speaking and practically, that means that there is simply opportunity to get things done on that subject where there isn't with other arguments or other focus, like data protection or civil liberties in tech, for example.

So, that context is really important and acknowledging that context where national security is just a very potent argument in US politics, I do think that the executive order on AI takes a much broader scope but also was forced to stay in the lines of existing laws. And so, that was the maximum that the Biden administration could do given the lack of political will or majorities in Congress.

Another executive order that I believe is really quite significant in terms of how bold it is, is the one on spyware, which came very late compared to where we are and were in Europe in thinking about the harms of spyware, but it was much more far-reaching than what the EU has done. And so, there are sparks of optimism I think that we can discern in these executive orders, but also other steps that are taken in the US: that, one, concerns are rising to the surface, particularly around AI.

I've hardly ever seen a tech topic where concerns, risks, harms were so prominently discussed from the very beginning. But also steps, albeit limited by the political reality, that hopefully will ensure there are more checks and balances, that there is more transparency, more oversight, all to strengthen the public interest vis-a-vis this power grab or tech coup by tech companies.

Justin Hendrix:

When I step back from this book, I think of you as approving of the European approach, but you don't think that things are perfect in Europe, or that the kinds of measures we regard here as extraordinary, regulation that would never be possible in the United States, are necessarily going to have the effect that you're looking for.

Marietje Schaake:

Well, thank you. I think both are true. I think it's interesting that you think I'm very friendly towards the EU and I'm a hundred percent certain that there will be EU officials and European-based readers that think I'm being too harsh on EU measures. So, we'll have to see where the jury lands there.

It's been remarkable to see the movement that has been made in the EU over the past years starting with the General Data Protection Regulation, but then including the Digital Services Act, the Digital Markets Act and the AI Act and there's more regulation in the pipeline. These are all important steps, steps in the right direction, but there are a couple of concerns that I have.

One, it's hard to judge laws that have not been implemented or enforced. So, we have to be hopeful about what they will result in, but we cannot be naive about how hard it is to enforce them. And also, I have a concern about the fact that they're quite patchy. There are different elements being addressed by different laws, and we must hope that they gel together and that there are no cracks between them that can be exploited by companies, which I think is a question we'll have to watch for. But also, very few of these measures really tackle, directly, the power question, and I think that is something that's really necessary.

What we often see is a hope that maybe by increasing liability, the power of these companies will be reduced, or that through antitrust, for example, a century-old form of regulation in both the US and the EU, competition law, where someone like Lina Khan also has a clear vision of how, through stronger enforcement and reinterpretation of antitrust rules, privacy protection and democracy can be preserved. But it's really a sort of indirect pathway towards this very important goal, if you ask me, of preserving the rule of law and democracy.

And so, even if I think that EU regulators and others have made important steps in the right direction, there is not one place where I believe this power question has been tackled adequately. And that is actually the key point of the book, which is that there has been an erosion of power on the part of democratic leaders, those that are legitimate, that are accountable. And I would wish for more direct addressing of that sort of governance question rather than hoping that ripple effects down the road will also improve democratic governance.

Justin Hendrix:

It's almost like you can have a comprehensive regulation, but maybe not a comprehensive effect.

Marietje Schaake:

That's right.

Justin Hendrix:

I'm wondering about the timeframe. We've got the GDPR of course in place for some years. We've got the EU AI Act, and a brand-new DSA and DMA only recently coming into full force. I don't know, if you could judge it, how long do you think it is until we know whether Europe has truly turned things around?

Marietje Schaake:

I would say five to 10 years, but we will see in real time cases that will be brought, or practices that will be changed by tech companies, that actually create a live observatory between what is happening in the EU system and what is happening in the US system. So, to really be able to compare those two in real time at some point, as internet users or as representatives of civil society or journalists, that will be interesting on a case-by-case basis. But zooming out and really looking at the net effect, I think we need a couple of years more because the implementation, as you said, is underway for some of these big laws, and then you need some time to see how it's going.

There will certainly be lessons learned and in that sense the EU will also pay a bit of a price for sticking out its neck. There's often talk about the first mover advantage and hope that there'll be some kind of gravity towards EU rules coming from the fact that they're the first and also on a large scale to present these. But of course the challenges will also be faced by those who stick out their necks.

Justin Hendrix:

I can almost see some rocks ahead with things like the DSA in particular. Some of the bits and pieces of it that are going to be very difficult to implement, at least from an American perspective, seem very strange, bits and pieces like trusted flaggers and out-of-court dispute resolution over content moderation decisions. I could see these things being Achilles' heels in this regulation going forward.

Marietje Schaake:

Yes, I agree. There's a lot that has to prove itself in practice, and that's a challenge with any rule that's being made. There's thinking, there's legal constructions being made, there's political reality that forms around a law, and then it has to work in real life, and resources are needed. I know that the EU is now standing up its AI Office, hiring people. And that's a bit of a rocky road, how to bring in people to the EU institutions who have the right skills, who can be paid enough to be attracted to working on these kinds of laws. To do it quickly, to do it credibly, authoritatively, it's a work in progress.

I think the EU is doing the right thing by sticking out its neck. I recall well that when the work on the AI Act started, people said, "Oh, it's way too early. You're going to throw away the baby with the bathwater. This is too far ahead of the curve." And then when the law was finally concluded, it was a just-in-time kind of sentiment that AI was developing so quickly that the laws were lagging behind. And so, you could just see from that one example how it's incredibly difficult to meet the moment of regulating emerging technologies, anticipating where they may go, how impactful they may be and what would be needed.

I think ultimately, and I describe this in The Tech Coup as well, what's going to be needed is a much more principle-based form of regulating where the responsibilities, the obligations on the part of those who develop and deploy new technologies are clear, but that there's not a nitty-gritty kind of description of what technology, in what way, because we know that there will be new waves of emerging technologies, we just don't know exactly which ones they will be. Yet we know that whichever new technology comes next, it will have to respect non-discrimination principles, antitrust rules, some transparency rules and so on.

And so, I see a pathway towards solutions of giving more discretion towards the bodies that have to apply oversight and that have to implement the laws rather than having everything spelled out in the law, which inherently makes them outdated the minute they get implemented and the next iteration of technology has already emerged before us.

Justin Hendrix:

Speaking of jurisdictions that like to delve into the detail, you talk about India and its importance as one of these big four blocks, as it were, that are creating what I guess Anu Bradford might think of as digital empires. I just want to drill into India a bit, how you think of it and its role in helping the world to muddle through these questions.

Marietje Schaake:

Well, India is a significant jurisdiction, which will most likely see enormous developments still ahead, including in the tech sector. And India has a unique vision on how to regulate tech with its digital public infrastructure, which is not flawless. There's a lot of criticism, but you cannot accuse India of not having thought about this.

I think India is often overlooked when we talk about the big geopolitical game of tech dominance or tech regulation. And India is particularly interesting because on the part of the EU and the US, there is a clear anticipation and hope that India will be part of their team, the democratic forces team on the world stage, also offering a counterweight to China, which obviously has a very pronounced model of governing tech, but mostly as an instrument of the state, the Communist Party, and an instrument of control and censorship and so on.

And so, this hope is pretty clear, but it's not quite clear where India will land in the larger competition between democracies and authoritarian states when it comes to governing technology. And so, I think India is an incredibly important country to watch where there are both very inspiring stories but also very worrying ones.

Justin Hendrix:

There are so many parts of this book that we will not get to in this conversation. You get into, for instance, the role of tech in war, how to think about even how legislatures come to understand technology, which I think is a subject we could spend an entire hour talking about. You recommend that legislatures have to do more to build up their tech expertise.

But I want to ask you about a subject we come back to on this podcast again and again, which is about technological imagination, how it is that lawmakers, but also I suppose democratic citizens come to understand what's possible in the future. You talk a lot about language and the power of language, and the extent to which tech firms spend so much money framing the opportunity and framing technology as the answer to the future. I don't know, how have you come to think about that? How have you come to think about the extent to which there's any room in our minds for alternative futures?

Marietje Schaake:

Well, I worry about the power of lobbying, and it goes far beyond spending money on lobbying in capitals. It's also through sponsoring academia, sponsoring think tanks, that companies really shape the way we think about technology in the public debate. I think what would help is for lawmakers, policymakers to appreciate that technology is not a sector or a subject separate from anything else. It's a layer onto or a part of everything else. It's part of healthcare, education, national security, economic affairs, transportation. Really every regulator, every legislator, every policymaker has to have an understanding of how technology impacts their specific responsibilities.

And so, what I recommend in The Tech Coup is that parliaments like Congress or the European Parliament or national parliaments designate a technological expert hub, similar to a legal service or something that most parliaments actually have, to understand technology, but also to understand it in the context of the laws that they want to make. Because it requires that very imagination that you talk about, or awareness of what the technology is capable of today and maybe tomorrow, to make the right laws.

If you do not believe or know that spyware can be put onto a device without the user needing to click on an infected link and that it can access all your photos, turn on your microphone, turn on your camera, then you're not going to actually be able to appreciate the harms that can come from that, especially when this technology falls into the wrong hands. And you cannot prioritize regulating spyware or banning spyware as a lawmaker.

And so, it is critical that people understand how it works, but that they have access to information from independent experts and that quest for what the future might look like is not fed to lawmakers by lobbyists who are very happy to say, "The future is bright. We're going to solve climate change, cancer, and we're going to make life easier for everyone." That's what we've seen for too long and there need to be critical independent voices informing policymakers.

Justin Hendrix:

Another thing I found myself responding to late in this book is this assertion you make that if the guys building AI are wrong, or it spins off all sorts of the various harms that folks are talking about now, that ultimately it's citizens who pick up the tab, that tech firms, operating at such scale, to some extent have little liability. They have nothing to lose, in a way. I was thinking about this with regard to a conversation I was having about election workers in the United States, and of course we've all seen reports about all the harassment, violent threats that they're facing.

And to some extent you can get into the actual costs of that in dollars and cents. Election administrators that are having to reinforce their doors and buy safety equipment, or train their staff on how to deal with possible violence, or hire PR firms to counter misinformation, or all the number of hours of overtime they probably have to work to answer calls from a public that is convinced they're trying to steal votes. I don't know, I think there's a real cost there. I found myself thinking about that in a micro way. You describe it in a more macro way, that often it is citizens who are left holding the bag when it comes to tech harms.

Marietje Schaake:

I'm afraid that's true. And of course the harms to the people you were describing, the election workers, the volunteers, lie also in the fact that they may be discouraged, and that as a result, the whole electoral process is less robust because fewer people are willing to stick out their necks. To incur those risks of being smeared or threatened or harassed, or their families being harassed, or simply being at risk of those kinds of problems, discourages them from being the backbone of that process, which indeed requires a lot of volunteering and also support for the process, which thankfully has been resilient even if tens of millions of Americans don't believe so.

And so, the cost can be seen in many ways. I'm thinking about job displacement as a result of AI, the need to re-skill people, maybe put them on social benefits. That's not a price that AI companies are going to pay for. And therefore, the kind of risk or gamble that happens in society with the rollout of these new, impactful technologies is really quite significant and I think decisions to do so should be more deliberate.

I'm not saying it should never happen, but it should be much more of a societal debate based on as much information to inform that debate and it should be weighed against the public interest, the public harms, and that does not happen sufficiently. There's really a lot of agency power on the part of tech companies to decide, "Oh, here, we have this new thing. We're going to roll it out in society, and whatever happens next will be after us and will not be our problem because our shareholders will benefit. The rest is not our problem."

Justin Hendrix:

I want to ask you how far you think this coup might go. The word coup has a meaning and an association and connotation. We tend to think of it as associated with violence. We tend to think of it as associated with uprising. You're clearly concerned that, essentially, democratically-elected leaders are losing their agency to these tech enterprises. How far might that go if we don't get to the kinds of legal regulatory safeguards and, I suppose, the updates to democratic models of governance along the way? How far might the coup go?

Marietje Schaake:

Well, in a lot of areas we can already see a negative downward spiral between a growing dependency of public authorities on tech companies really being locked in, and as a result, losing knowledge and insight into how some vital functions in society work. As a result, it's easier for companies to then pitch their products to solve that lack of understanding, so to basically keep suggesting that governments are too stupid to understand the technology so that they must buy the next solution off the market.

And so, if you extrapolate that to the next, I don't know, five to 10 years where there's even less understanding, even less talent being attracted, even fewer rules being implemented, we can already see that, on the one hand, the leaders of tech companies are manifesting themselves openly and politically. I think that's a very stark development in the current US presidential campaign. This capital, this power is directly used to influence politics. And it's not unimaginable that in the next electoral cycle for the presidency in the United States, tech leaders will run.

So, the impact on who gets to decide what the democratic process looks like, which forces are pushing to get things their way, I think is well underway, and it can lead to all kinds of excesses. But even if we don't focus on the excesses, like it being normal that cybersecurity companies are engaging in offensive operations without a mandate, which I think is quite excessive, even if we think about the more mundane and gradual erosion of power on the part of elected and accountable leaders, then that is problematic.

Democracy has been in steady decline worldwide for the past two decades. Democracy is under pressure both globally and domestically. We really do not need tech companies to privatize a lot of power. In fact, we need every force imaginable to make democracy more robust, to renew and update to the current reality some of the mechanisms and laws that we work with. But the erosion or the attacks really on the rule of law that stem from the growing power of big and small, visible and invisible tech companies really concerns me.

Justin Hendrix:

You use that phrase, who gets to decide? It always strikes me when talking about tech policy issues that fundamentally, that's the question that's at stake, it feels like to me.

Marietje Schaake:

I fully agree with you. And it's harder to decide if you don't know what you're deciding about, whether that is the next wave of technologies that's already being cooked up, or simply to get an appreciation of how things work under the hood of some of these companies, the data that's collected, the algorithmic settings, the risk assessments that these companies make themselves.

As far as I'm concerned, in a free, rule-of-law-based society, it is the people who decide, and there's always a trade-off between the different priorities that people have. That is actually what a democracy is supposed to look like. But that should be a process that happens transparently and legitimately, and right now, more and more of those decisions about our lives in the digital realm are not made by public officials and not made in the public interest. I think that ultimately really hurts democracy as such, and we already see harms that are real for people.

Justin Hendrix:

This book's called The Tech Coup: How to Save Democracy from Silicon Valley. It's from Princeton University Press and published on September 24th in the United States for my listeners here. I think they can find it already in Europe?

Marietje Schaake:

The Dutch version is out, so that's very exciting. Although the original was written in English, so it was a reverse process of getting it translated back to my native language. But since the number of Dutch speakers in the world is modest, I'm excited that there will be an English version for many more people to read. And I hope that it will be insightful and that they will feel optimistic that something can be done to make things better.

Justin Hendrix:

For prescriptions of opportunities to realize that optimism, check out this book. Marietje, thank you so much for joining me.

Marietje Schaake:

Thank you.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
