Marietje Schaake on the threat the global spyware industry poses to democracy

Justin Hendrix / Jul 27, 2021

Last week, Marietje Schaake, International Policy Director at Stanford University’s Cyber Policy Center, president of the CyberPeace Institute, and a former member of the European Parliament, penned an op-ed in the Washington Post with David Kaye, who teaches law at the University of California, Irvine School of Law and previously served as the U.N. special rapporteur on freedom of expression, about NSO Group, Pegasus, and the threat global spyware poses to democracy. I caught up with Marietje for the Tech Policy Press podcast; you can subscribe here.

Justin Hendrix:

You have been following the global spyware industry for some time, and this week's news adds, for us, another dot in a story arc or a narrative that's been going on for a while.

Marietje Schaake:

Yeah, it is. I'm happy with every minute of attention and spotlight that this spyware industry and all the harms that it's causing is getting. But I also kind of shook my head in disbelief that it's actually been 10 years that I've been working on the surveillance industry, all the harms that they're causing to human rights, to press freedom. And so yeah, it's a really double feeling this week for me. Bittersweet, in some ways.

Justin Hendrix:

What does this industry now look like, across the globe? How would you describe it?

Marietje Schaake:

I would describe it as very cynical and as much more sophisticated than 10 years ago, much more powerful than 10 years ago, and I think misunderstood and invisible to the majority of people. And that's where I think the Pegasus Project has made a difference, because once the president of France is on a list of persons of interest, it does make it a different political matter. That is unfortunately different from the situation of human rights defenders in authoritarian regimes, who are facing this kind of technology almost on their own, without enough support from democratic leaders and democratic governments against the actual harms that these technologies are causing.

And what I find so ironic is how NSO Group, but also the many other spyware surveillance companies out there, are often in democracies. They come from democratic societies, while they directly oppose the stated foreign policy and human rights policies of those countries. So take Israel. It says, "We're a democracy. We believe in human rights. In fact, we're the only democracy in a very troubled neighborhood, and we think that the press should be free and that there should be respect for human rights." While the Israeli authorities have actually allowed this industry to blossom. Israel is very strong when it comes to spyware. Actually, the Citizen Lab revealed another company that has exploited Microsoft to stealthily spy on people.

And so it doesn't add up, to on the one hand, call for the respect for human rights, respect for democracy, and on the other hand, turn a blind eye to this industry that is attacking democracy and human rights for business.

Justin Hendrix:

Now, wouldn't the folks who are involved in companies like NSO- which by the way, did boast advisors who had had roles, for instance, in the Department of Homeland Security- wouldn't they say they're on the good side, that they're defending democracies from the bad guys?

Marietje Schaake:

Oh, I'm sure that's what they're saying, but you do have to ask yourself how technology from NSO Group gets shipped to Saudi Arabia, presumably with a lot of consulting and explanation of how the systems work. While Saudi Arabia is well-known for mislabeling people as terrorists, as criminals, for that matter, in order to justify their repression of very basic human rights. And so I think what is said by NSO Group's officials and others representing spyware and cyber surveillance companies doesn't count as much as what these companies do.

And the whole idea of, "Oh, our trust in our clients has been breached if they misuse this technology to actually go after others than the worst terrorist suspects in this world," it's a kind of hypocritical argument that is completely non-credible. I mean, if you are playing in this league, you have a multi-billion dollar company. You are dealing with intelligence services all over the world. You cannot say, "Well, we shook hands. They pinky-swore that they were going to use this only to go after terrorists. And oh my gosh, we just learned from Amnesty International that that may not be true." This is nonsense.

Justin Hendrix:

So across the globe, we're seeing the rise of a variety of tools and technologies similar to Pegasus, which allow people to essentially turn any phone into a listening device. I'm looking also at the report this week that Clearview AI, the facial recognition company that has attracted a lot of criticism, of course, for its practices in relationship to law enforcement, has just raised a ton of money. What should governments do to hold the line against the encroachment of these surveillance technologies into our societies and into our law enforcement and other mechanisms?

Marietje Schaake:

Well, I think it's really important that democratic governments appreciate and understand that companies like Clearview AI (I'm very glad you mentioned them), like NSO Group and other surveillance tech companies, are essentially competing with the most sensitive roles of the state. So I think you can consider NSO Group a privatized intelligence service, or at least its technologies intelligence-grade. Clearview AI is competing with law enforcement's identification of individuals. And I hope the democratic governments think, "These are not the kinds of functions we want out-of-control companies, meaning companies without the proper oversight, checks and balances, to be running, because these core functions are indeed very sensitive."

And so I hope they feel incentivized to draw a line and to ask themselves: if we claim to have, and have to defend, a constitutional right to privacy, a fundamental right to privacy, then our obligations to protect those rights are directly challenged by these companies. There is no other way to look at it. You cannot have commercial facial recognition systems based on scraped images from the internet, sold to law enforcement, while saying, "Well, there's a right to privacy." The same with mass surveillance systems, and the same with the stealthy intrusion and exfiltration of information from people's devices. It is not compatible with fundamental rights, and that's a problem. Democracies should defend these fundamental rights, should defend the rule of law and notions of a presumption of innocence, not the targeting and intimidation of victims of this spyware.

So it's very urgent that democracies, hopefully together, draw a line in the sand against this industry.

Justin Hendrix:

You talked about a couple of specific things that governments can do. One is export control, so quite literally, putting some controls on how these technologies can be sent across borders. What are some of the other things that governments could do that would help to rein in these types of technologies?

Marietje Schaake:

Well, I think there's a lot of hypocrisy on the part of democratic governments, too. A lot of the systems that we're talking about are procured, in the first place, to actually be used by democratic police, law enforcement, intelligence services, and so on. So they should take ownership of preventing the proliferation. So I think there should be maybe better contracts that avoid the use of the same innovations and technologies towards other clients. If you really want this to be restricted and focused, you should have better know your customer, better due diligence, better oversight mechanisms to avoid the proliferation. And the proliferation of these systems is a real problem that has already been recognized.

I remember James Clapper saying that lawful intercept capabilities have been found in, I think, 25 cases where there was only one known and legitimate contract. So it's really important to rein it in, to have better oversight, but also, as you mentioned, export controls, which is something I worked on when I was still a member of the European Parliament. Export controls are always just one piece of the puzzle, a way to put checks and balances in place and to ensure that scrutiny of these technologies' potential harms to human rights is included, for example, and not just national security considerations.

But the Pegasus Project has shown that in a country like Hungary, the government was the importer. And so in a case where these technologies are brought into the EU, export controls are not going to help. And I think there is really a moment now, where the legitimacy of the use of these systems at all should be discussed. And certainly, if they're going to be used, what kind of redress mechanisms are there? What kind of oversight is there? Because abuse is really lurking below the surface, and we've seen that now, with example after example.

Justin Hendrix:

So one of the things that you say in this piece with David Kaye is that these companies need to subject themselves to more outside scrutiny. You talk about the United Nations guiding principles on business and human rights, which is a global standard. What would be appropriate independent scrutiny? What type of scrutiny would be necessary to put the brakes on such a fragmented and growing and secretive industry? What type of scrutiny could governments possibly do in this space?

Marietje Schaake:

Well, scrutiny of the use is one half of the answer, but scrutiny of the technologies themselves is the other half. And if I look at the Netherlands, the country that I know best, one of the main newspapers here has submitted a freedom of information request. So this is the Dutch government, not exactly Saudi Arabia. Much better human rights track record, thankfully. But nevertheless, the Dutch government refuses to answer which types of spyware it uses, so it's also not denying that it's using NSO Group. And I think that in that non-answer, you find part of the clue as to why this is an industry that is so difficult to rein in. There have also been rumors, which I heard a lot when I was working on the regulation, that intelligence services are actually benefiting from these technologies. So nobody, I think, would be surprised if NSO Group was actually working with Israeli intelligence to pass on information gleaned through the sale of these commercial spyware systems, which clearly provide access to hugely sensitive information.

And so scrutiny of those relationships, oversight over intelligence services and the kinds of technologies that they're using, preventing the proliferation, and having norms around how this is used, if it is used at all, are all very, very urgent, I think. This week we heard huge concern expressed by the US, the EU, and NATO about China's cyber menacing, using technology to attack, at the very core, Microsoft's exchange servers. There is shared concern about China, I would say, across the political spectrum in the United States. But has anyone wondered what would happen if Chinese spyware were used to go after journalists of The New York Times or The Wall Street Journal or Fox News or CNN? If they report unfavorably about China, what would the United States do in an instance like that? What laws does it have to fall back on?

So the idea that democracies are benefiting from this industry, benefiting from a hands-off regulatory approach, will be tested the minute that the tables turn. I hope it doesn't happen, but perhaps our own societies will be flooded with these technologies, and we will feel like others may benefit from the use of these technologies in a way that really hinders our core freedoms and values.

So it is a moment to look at what this industry has grown into, who benefits, and what we actually know. And the fact that we know so little should be reason for concern, as such.

Justin Hendrix:

The other thing I find myself thinking about with regard to this NSO story, is on some level, we're all surveillance artists now. We're all installing devices on our homes, on our front stoops, that are tracking our neighbors. And we're recording one another in various surreptitious ways. Is the horse out of the barn, as we might say, on this? I read a story this week about the idea that some are embracing in the United States, which I must say terrifies me: the idea of putting body cams on teachers. Where is this all sort of headed, in a broader sense?

Marietje Schaake:

Well, it is very frightening that, in a way, Big Brother is us, right? But we are very much enabled by an industry that, for example, makes it more palatable to have facial recognition to unlock your iPhone. Whereas it's a different context if mass facial recognition systems are deployed in public spaces. And so it's important to also differentiate between the ways these technologies are used.

I was thinking about the same question in the context of the terrible murder that just happened, of the crime journalist in the Netherlands. And there were people who were on the scene where he was shot, and they proceeded to post the clips that they had filmed onto Instagram and YouTube and Twitter and Facebook. And people were furious. Whereas at the same time, the police were asking for these recordings in the interest of evidence-gathering. And one can say, "Well, that doesn't add up," but on the other hand, it is a different context. When the police use these images to find a suspect, it is different from just being a sort of disaster tourist, being there and having material to share with friends on social media.

So context matters, but we have to understand that once the technology is developed, it will more easily creep into different missions. And I miss that discussion in our own societies. For a long time, the idea has been, "Well, there can be abuse of technology. If it goes to authoritarian regimes, then we have to fear what could go wrong." But actually, the use of certain technologies, like Clearview AI, as such, is an abuse of rights. And I think that that's what we have to think about. The same with NSO Group. What is the legitimate use of spyware? I really want to have a more lively discussion about that. And if there's a legitimate use, what constitutes that legitimacy? Which checks and balances? What mandate? What oversight? How do we oversee the users, so that they don't abuse the technology somehow?

And there's too little transparency, too little independent oversight over this whole sector and the whole intelligence corner and the use of technology there.

Justin Hendrix:

Are there ideas about secrecy and surveillance and their relationship to democracy that guide your thinking on this, philosophically? Are there things that are in your mind about this relationship?

Marietje Schaake:

Well, we speak today, 10 years after the horrendous terrorist attack on Utøya in Norway. And I just reviewed, for my own moment of remembrance, the speech that Jens Stoltenberg, then the prime minister of Norway, gave. Currently the Secretary General of NATO. His words were so profound. That country had been hurt where it really is felt most. Young people on a summer camp killed by a far-right inspired terrorist. And he said, "The answer to this attack will be more freedom, more democracy, and more humanity." Whereas instead, what we often see is in response to an incident ... Obviously, September 11th, a terrible, terrible loss of life because of a terrorist attack, comes to mind. But there, it was really the start of systematic restrictions in civil liberties. Again, in the name of fighting terrorism, almost any forsaking of rights protections was enabled in a free society.

And so I always go back to the question of, what is it that we seek to defend? Of course, there are risks out there. We dread and hate terrorists, crime, mafia-esque practices. And of course, they need to be fought. But at some point, you have to wonder: What is proportionate, and how do you make sure that, in some ways, the medicine doesn't become more toxic than the disease? And that is the kind of question I always try to ask myself. After all, it is our freedoms that we seek to defend, and are we doing that in the right way? And are we not misguided by the promises of what technology could bring to keep us safer, while overlooking new risks, unintended consequences, benefits for companies that are absolutely not in the interest of the public and of society and of our rights protections and civil liberties?

And it is remarkable how democratic decline, globally, has gone hand-in-hand with technological disruption, globally. And I think it should be a wake-up call in that sense, that we really have to find ways to govern technology in democracy's image, and not sort of trust that the democracy will come with the technology, without any safeguards.

Justin Hendrix:

Thank you very much.

Marietje Schaake:

You're very welcome.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...