Maria Ressa: On Disinformation & Democracy

Justin Hendrix / Apr 18, 2021

Maria Ressa is co-founder, CEO and President of Rappler, the top digital-only news site that leads the fight for press freedom in the Philippines. She has endured constant political harassment and arrests by the Duterte government, and still has to fight to stay free. Rappler’s battle for truth and democracy is the subject of the 2020 Sundance Film Festival documentary, A Thousand Cuts. Maria has a 35-year career in journalism. For her work on disinformation, Maria was named Time Magazine’s 2018 Person of the Year, was among its 100 Most Influential People of 2019, and has also been named one of the publication’s Most Influential Women of the Century.

I caught up with Maria the morning after she had herself conducted an interview with former US Secretary of State and presidential candidate Hillary Clinton. We talked about that- and from there got into a range of issues, including her perspective on how we will need to come together to build new collaborations and institutions to deal with the advent of social media, just as we did for another technology that changed geopolitics- nuclear weapons.

Subscribe to the Tech Policy Press podcast via your favorite service here.

Justin Hendrix:

You were telling me in the moment before this interview started that you've had a whiplash couple of days in terms of your feelings about the world. Why is that?

Maria Ressa:

Well, this morning was a tough morning. Late last night I had a great conversation with Hillary Clinton. Of course, we spoke about disinformation, the impact on her. If you're talking female targets, that's the female target. Within minutes of ending that, I had to then monitor a presidential address by Duterte. The gap between them just shows you what is wrong with the world.

And it's everything combined- how did this happen? What's the role of technology? And then that's coupled with just late last night, or yesterday, the publication of Sophie Zhang's Facebook whistleblower account in the Guardian. These are things that I've known since 2016, because we brought the data to Facebook- how it has allowed leaders to manipulate people with impunity.

So again, the same theme, and then all this is against the backdrop of the latest in the Philippines. We have just been released from the severest lockdown, meaning we’re right back to where we were a year ago. Yesterday was the first day when it's been slightly eased, but this alphabet soup of quarantine names really doesn't hide the fact that our healthcare system has collapsed. So many people I know are infected, have tested positive, and they cannot get into a hospital. And you get the absolute impunity of having a government official just walk into a hospital room while people are dying in their cars waiting to get in.

Okay, well that's really a depressing morning, isn't it? So, that's the context. I'm a bit melancholy this morning I guess.

Justin Hendrix:

Let's focus a little bit on the Hillary Clinton discussion. You were able to talk with her about the topics you've become a standard bearer on- disinformation, democracy, terrorism. Did you find her to be optimistic at the moment, given her career dealing with conspiracy theories, hate, and harassment?

Maria Ressa:

I think optimism grows out of action. This is an empowered woman, very clearly, and also it’s clear that she's lived with this before. There are real power struggles over information- I mean, propaganda has been around since way before technology arrived to become a behavior modification system. That's the problem.

So this is a woman who's known about this, has felt it, has been subjected to it, has learned how to deal with it. And I think what was fascinating was seeing her grapple with the shift- how she was first in denial herself, and then tried to tell people, but no one listened to her. This is in 2016. And then as now, I mean, frankly, it's the impunity, isn't it? In 2016 we came under attack because we called out the impunity of two powerful forces: Duterte and his drug war, and Zuckerberg's Facebook. And that still continues in 2021.

We asked her about Biden and where things are headed. She was quite optimistic. She believes we must do something about Facebook, YouTube, about social media platforms. And in that, she said some of the harshest words I've ever heard her say. One of the last questions she answered was a request for her advice, because obviously I'm dealing with the same things. In the Philippines our data shows us that women are attacked at least 10 times more than men, and she was aware of all of that.

And she was very honest. This isn't easy- there's no easy answer. She went through how she got through it. And I think what I liked is it goes to these fundamental values. There are positive things about social media and what it has done, but of course, we feel the negative impact. And we've seen this in more than 80 countries around the world, these cheap armies rolling back democracy, attacking people like me.

I think it's shown that nations' cultures- the way we used to divide the world, and ironically the very fissures of each of our societies that are being attacked by information operations, identity politics- that in some ways what the social media platforms have proven is that humanity has actually a lot more in common than we have differences. Because the very same platform that connects 2.8 billion accounts globally is manipulating all of us the same way. I mean, that's a really weird way of finding the upside, but we've always known that people- and this is the part that real leaders touch- that our dreams are the same.

Maria Ressa, co-founder, executive editor and CEO of Rappler

Justin Hendrix:

You mentioned Julia Carrie Wong's piece in the Guardian, which builds on reporting from Buzzfeed. It tells the story of Sophie Zhang and the memo that she had prepared before her acrimonious departure from Facebook. But there are a lot of new details in this, things we hadn't heard, about false accounts and governments using the Facebook platform to manipulate politics. It occurred to me that many people like you must have felt somewhat vindicated to read the words of Facebook executives essentially dismissing concerns about this in countries that aren't America or aren't Western European countries. Did you learn anything new in it?

Maria Ressa:

Sophie's journey is very similar to mine- except we were on the outside. We knew Facebook better than Facebook did in the Philippines and I gave that data and continue to give that data to Facebook. We see it. I feel it. Again, glass half empty or half full, right? When you're the target, you see the tactics change. We do both natural language processing as well as network analysis on Facebook in the Philippines. So as fact-checking partners of Facebook, we fact check and then we look at what network spread it- kind of like terrorism research. We look at these as recidivist networks and we flag them repeatedly.

So a lot of the things that were in the article are not new to me- we've been asking for systemic solutions. I understand after 2016 to 2021, after this many years, I understand exactly why it's not moving and it's part of the reason I joined the Real Facebook Oversight Board. But at the same time, I also realize the role Facebook plays and I haven't completely given up on them, despite their decisions and their obfuscations.

Look, I came from big organizations. When I worked with CNN, I set up the Manila Bureau. I set up the Jakarta Bureau. I understood the gaps when you're starting in southeast Asia. There was a two-week gap between what happens in Manila or what happens in Jakarta, versus getting to headquarters, because they have their own thing. So I put in a lot of buffer for corporate and for the information flow. So I was patient with Facebook from 2016 to 2018. But I also sounded the alarm in 2018 after the Cambridge Analytica scandal, when Facebook shut down the API- which was ironic, because they did two things. They shut down the API, and at the same time, they also said, "Let's get rid of the news." They didn't say it like this, but they said, "Since it's about family and friends, we'll just keep it to family and friends." And news groups around the world in 2018 saw a significant drop in traffic, which meant their actions actually enabled disinformation networks. They enabled information operations, because where does it spread? Through family and friends. And where are the facts? The signal they muted- the news organizations.

So at that point in time, some groups said that they lost as much as 60% of their traffic. We didn't lose that much, but we felt it in the Philippines. It's the wrong tactic. I have said this privately and publicly. Facebook increasingly now is going to have to choose, and they will have to do it in the public sphere and there may now be criminal liability, right? Because it's very clear what is happening. They're going to have to choose between profit or public good. That's ironic. Lots of companies have been forced to do that in the past, right?

Lobbying can hold back the tide, you put a finger in the dam, sure. But at some point it comes back. And the same way that when January 6th happened in the United States, it wasn't a surprise. I mean, we had genocide in Myanmar. We had what's happening in the Philippines. We had Sri Lanka. That wasn't a surprise. And I didn't like seeing it happen, but the problems of Silicon Valley have come home to roost. And frankly, these are American companies. Americans should hold them accountable.

Justin Hendrix:

One of the reasons I was so concerned about the post-election period in the United States was because Mark Zuckerberg said we should be concerned about it. And I figured he had an enormous amount of data on the situation and probably knew more about it than most. And so when he came out warning about post-election violence or civic unrest in the early fall of 2020, that struck me as a big deal, that he would give us that warning.

Maria Ressa:

Well, so I think they now know exactly what it is, and this is political will. This is called leadership. I mean, frankly, it's called accountability. There are so many of us who work with them. I'm still a partner, because Facebook is the world's largest delivery platform for news. And the design of the platform prioritizes the spread of lies laced with anger and hate over facts. So you can really say that the platform that delivers the news today is biased against facts and is biased against journalists. How crazy is that? I mean, is this a surprise that we are where we are today?

Justin Hendrix:

So let me ask you a little bit, there's been a conversation in the United States around polarization, the question of causality in this argument, about whether it's tech and media that's driving divisiveness or whether the problem is fundamentally from someplace else. What do you think about that question? I mean, putting aside the idea that some of the divisiveness, of course, is occurring on social media, we're seeing democracies stumble all across the world. Where do you think that primarily comes from? What's primarily driving that?

Maria Ressa:

Technology. The technology. So in Ambon in 1999, Muslim-Christian violence, a little spark... leaders struggle to hold back conflict so these fracture lines of society do not combust. In Ambon, one news report that was actually accurate- this was an Indonesian newspaper- just reported that a grenade was thrown into a mosque, and that triggered Muslim-Christian violence, because of our nature.

So that, to me, is the perfect illustration of what you're talking about. Yes, it is within our nature. Humanity as a whole, we will deal with conflict, and wars have started since forever, but what hasn't been there has been the constant machine that nudges your worst self. And when these kinds of exponential bottom-up nudges are seeding some of these influence operations or seeding meta-narratives of actual lies- a perfect example is election fraud. The actual lie that came out of President Trump's mouth was seeded a year earlier, in August 2019, picked up by Steve Bannon on YouTube, seeded in closed Facebook groups before Tucker Carlson comes out with it in August 2020. QAnon drops it and then President Trump comes top down.

It is this convergence of human nature, yes, and technology that makes facts debatable, that makes you doubt facts over time. We are like Pavlov's dogs. We are being taught responses to things. Playing with- E.O. Wilson said this- our paleolithic emotions. That's all new. We've never had something like that. I've seen how fragile society is, I've seen civil society try to marry this, and yeah, Yael Eisenstat will tell you this. She's done this work in Kenya. It's not that hard to bring warring groups together to the table. Although I guess if you look at the Middle East, it keeps recurring. But we have real leaders trying to heal the division. What happens when in the middle of crucial talks, someone throws a match into the discussion? That's what happens in every conversation on social media today, because it is by design.

So I've seen in the Philippines, a country that doesn't have a CNN versus Fox, we're a country where the facts were not debatable in 2016. And all of us started pretty much in the center, we don't have polarization in news groups, we agree on the facts. But with the election of Duterte, with a very similar style to Trump- us against them. Then add in the decisions made by tech platforms.

So the first decision is to recommend friends of friends- algorithms do that to keep you on the site. So when that happened after Duterte was elected in the Philippines, here we are in the center, but he really, truly did ‘us against them’, ‘if you're not with us, you're against us’. So his supporters, to grow their networks- the pro-Duterte folks moved further right. The anti-Duterte folks moved further left. And that's connected to a drug war, because the meta-narrative being seeded then is: it's okay to kill. Now it’s 2021. Each year, that division, because of the algorithm of friends of friends, just keeps getting wider. Instead of pulling groups together, you rip them apart. That's the first tech decision, and that is all social media platforms- that's not just Facebook.

The second one is content- what they call personalization. We don't use this on Rappler, precisely because I don't believe in it. It doesn't make sense, because personalization leads you to your own reality. We're divided by the kind of content you are fed, the groups you're asked to join. Look at YouTube- the algorithm means that if you click on a 9/11 conspiracy theory, you're brought down the rabbit hole. This is now all documented.

But on Facebook, in 2018, news was pushed aside for groups, family and friends. I think the idea here is that, well, we don't want to see it, so you can just keep your disinformation in your small little groups. But they're not small. So what happens in these groups is that when you click on a group, you are then brought down the rabbit hole, and radicalization is no longer in public. So you divide and you radicalize. That is what the delivery platform of news does today.

As a journalist, that's shocking to me, because as a reporter, you have to have humility- the standards and ethics manuals of the news groups I've been part of really show you that you never know, you don't know what's right. From whatever precipice or whatever crisis you're reporting on, you don't know what the solution is and you're not the actor, so you want to arm people with the information so they have the context for the world they're living in. But the solution is theirs. These social media platforms are actually driving reality. That's scary. It's such arrogance.

Justin Hendrix:

Maria, so let me ask you this. Mark Zuckerberg essentially rejected some of these ideas in front of Congress a couple of weeks ago. He said that there's scant evidence that social media drives polarization, and subsequently his VP of Public Policy pointed to some specific studies and the idea that polarization, in the specific definition they were referring to, was increasing before social media even existed. So that's number one. Number two, Mark Zuckerberg denied the idea that their business model is about engagement, or keeping people online- that that's what it's all about. If he was here, what would you say to him in response to those things?

Maria Ressa:

There's a great book by Sinan Aral of MIT, and there's a 750-page book by Shoshana Zuboff. Sinan Aral's book is called The Hype Machine. Shoshana Zuboff's book is called The Age of Surveillance Capitalism. Look, it's hard to go up against such a powerful man who's never been elected, but the reality is, I see its impact. I live it. I've seen it here. Whatever they want to claim, you can point to the shifts globally. I've covered this. And what I've seen in the cycle- so I've been a journalist for 35 years, and in south Asia and southeast Asia, what we've seen is, starting 2014- I think there's a dovetail of global events and a kind of zeitgeist. And then the match that set the kindling on fire was social media. If you look at The Hype Machine, Sinan will show you that the disinformation, the information operations- really, the experiments- began in 2012. They really erupted in 2014 in Ukraine. The success in Ukraine, that was pretty incredible. There's a significant impact of those influence operations- that the world doubts whether Crimea was an annexation versus the people in Crimea asking Russia to come in.

The fact is, well, it drove policy- the fact that there were two alternate realities. And then, because of that success, it moved to the dominoes falling in 2016. I would say the Cambridge Analytica whistleblowers, Chris Wylie and Brittany Kaiser, also said that SCL, the parent company of Cambridge Analytica, was working in the Philippines- and Cambridge Analytica itself was working in the Philippines. Alexander Nix had visited the Philippines in November of 2015. We have him in pictures with Duterte's team. So for the sixth year in a row, Filipinos spend the most time on social media. Chris Wylie calls us a petri dish- they tested these tactics of mass manipulation here, and then when they worked, they ported them over to the West. I've said that so many times. I'm so tired of it. But we say it again.

Justin Hendrix:

And they try it there because they can get away with it.

Maria Ressa:

They can get away with it. Yeah. But look at how quickly they reacted in the US- for Mark Zuckerberg, I think Facebook also had a panic button. But the question you really have is: look how powerful they are, for one. And the second is, why didn't they call it earlier, before violence erupted? The other part that I see is this: we have coronavirus in the real world. Think about the information ecosystem, because that's what I see- the information cascades. I used to study how the virulent ideology of terrorism spreads, first in the physical world, and then in the virtual world. I wrote a book called From Bin Laden to Facebook right before ISIS. In 2011, there were Filipinos calling for global jihad in Arabic on YouTube, and they were- by 2012 or 2013- using YouTube to negotiate for ransom. So we saw it here. Anyway, just as you have a virus- coronavirus in the real world- think about the information ecosystem as having its own virus. It's a virus of lies. It's very contagious. And it's seeded. And when you get infected, you become impervious to facts.

Fast forward to your January 6: the people who went there believed in what they were doing. And they're not going to go away. The division in America, the division in all of our societies that are being pounded open on social media. Online violence doesn't stay online. It erupts in the real world and we have far more evidence of this now than any denial you can have.

I started by saying that we had trends in human nature in 2014, simultaneous with trends on social media, including the influence operations. Russia was doing that; now China's doing it. Facebook took down information operations against me in September 2020, last year. So they were targeting me, but these were Chinese influence operations. They were also setting up fake accounts in the US using AI-generated photos. And they're not separate from the domestic players. Americans tend to think that there's domestic, there's international- just like terrorism. Homegrown groups are hijacked by global groups. That's what we've seen with Al-Qaeda. What did Al-Qaeda do? They took the homegrown networks and gave them a global purpose against the West. The person who told me that was Benazir Bhutto. And our evidence showed it time after time.

So anyway, in 2014 what I saw in my part of the world was this nostalgia for a strongman ruler, because the world had just gotten so much more complex. People couldn't keep up with the change, and I saw this even in my family when gay marriage arrived all of a sudden, and you had Caitlyn Jenner. I saw folks struggling with: what does this mean? Older folks in particular. And what does that mean about my own value system? So when it got too complicated, there was this search for a strongman. I saw this in Indonesia, in the Philippines, in India- looking for someone else who can make these decisions. “It was so much better before.” So who comes up in India? Modi is elected. Shocking, because in 2004 he was dealing with human rights violations. In Indonesia, the son-in-law of former President Suharto almost wins. Shocking, because I was there in Indonesia. I covered that transition, the end of nearly 32 years of Suharto's rule.

But there's this yearning that the world is just too complex, that yearning, that emotion goes right back to the design of social media. E.O. Wilson said this, right? And he's not a social media person, what he does do is he studies ants and emergent behavior. And what he said is that the biggest crisis is this convergence of our paleolithic emotions, our medieval institutions, and god-like technology. The evidence is there. The denial is putting your finger in the dam so the business model can keep going.

Justin Hendrix:

So you're kind of taking us to a place of maybe thinking about historical context and the bigger lens on this. How long does it take for us to figure out to live with the internet? What does this trajectory look like? How long does it take for us to choke down this technology and does democracy survive along the way? Is this a decades-long thing that we're going to contend with?

Maria Ressa:

I hope not, because if that's the case, I really could go to jail. I mean, I hope it's not a decades thing, but look- I think the first thing is that all of us, journalists included, need to understand the tech. We don't, in general. We still think the world is what it used to be, that power... there are massive shifts in power globally. The global institutions are eroding, because there are both the social media influence operations and the geopolitical power play. It is happening. And again, even in that, the Philippines is in... there's a war for it, the proxy war between the US and China. We become yet another flashpoint in this.

I just got this Four Freedoms Award and the granddaughter of Franklin Delano Roosevelt and Eleanor Roosevelt reminded me that FDR gave this speech to Congress on January 6th 1941 about the four freedoms- and how it is global, how universal it is, and how fragile it is. January 6th this year, we know. 80 years later, the four freedoms are so fragile. Despite the Biden win, right? Because again, the social media platforms came together, your groups came together just enough to prevent another four years, but the people who have been infected are there. The flashpoints are there.

Let me go back to how long it will take. We have to act like we did post-World War II, after the atom bomb exploded, the twin atom bombs. Because I think at that point, the people who created the atom bomb, the United States, Japan, the world was in shock because of the detonation and the impact of what humanity did to itself. And we came together, the world came together, because of that.

This destruction on this massive scale has happened on social media. People have died globally. And I guess the solution to this is for all of us to realize we don't have the structures and somehow move into this emergency mode to try to prevent this from happening again. So yes, globally coming together, coming back to the point where we created a universal declaration of human rights. Eleanor Roosevelt was a driver in that, right? And Bretton Woods, all of these things are now outdated.

And it's ironic that the very groups that have power- i.e. governments, powerful organizations- do not know how to deal with this asymmetrical power play that has just washed them away like a tidal wave. Because that's the truth. That's the reality we live with. So now, what are they going to do? The European Union is a little bit ahead, because it came out with a Democracy Action Plan, and it has drafts of a Digital Services Act and a Digital Markets Act. The UK has its Online Harms Bill. The United States, which actually could do us all a favor and apply the laws it has in the real world to the virtual world instead of allowing these loopholes- the United States is still looking at Section 230. But the impact is documented.

I mean, I asked this of Hillary last night: "Has anything been done about the way she's been attacked? The influence or the information operations that have been launched against her?" If that had been done in the real world, what would have happened? Right?

Justin Hendrix:

So to get to the future, what we have to do is to create those institutions and to have those summits. I know Joe Biden wants to have a Summit of Democracies and there's some talk about potentially having a focus on tech at that. There are other things that I'm seeing getting built, the EU is building a set of academic centers to study disinformation which is promising.

And there has been an enormous amount of funding into these problems in institutions, in centers, and university programs in the United States that I think will produce the kind of intellectual foment that may lead to some of those initiatives you're talking about. But it's going to take a while. I guess my fear is that, again, it takes longer than some of us have.

Maria Ressa:

Yes. I was just going to say the warning, of course, is that you are seeing a lot of that research is also funded by the very same platforms. And the researchers themselves are being set off against each other. Those are rabbit holes. And again, I feel like I watch the US closely, but even this debate, it's not that debatable. The impact is already clear, it's documented, and we are going to have to move on to find the action point. But as long as you're bickering amongst yourselves, the people like us- because we don't have a seat at the table- I mean, you allow it to happen.

Justin Hendrix:

We have to do more, don't we? Yeah.

Maria Ressa:

Yes, you do, Justin. I worry about the deflection because that takes away from action. All the social media platforms will say, "You can't solve a problem until you define it." They already know the problem. As long as they're deflecting, that to me means danger, because they know the problem. Define it, own it, solve it.

Justin Hendrix:

Even these recent post facto justifications from Nick Clegg or Mark Zuckerberg, they refer to outside research which relies on the slivers of data that some researchers are able to get. They don't let us see the internal research that former employees say is what drove them to be concerned and to leave the company. I don't know. There's quite a lot to be worried about there.

There was recently this HBO documentary about QAnon, I don't know if you've seen it there yet or it's made its way there, but half of it seemed to be filmed in Manila with Jim and Ron Watkins.

Maria Ressa:

Oh, yeah. Of course. Yes, we have a link to everything. So, I can send you some of the stuff, but go back to Hotmail, when they were trying to deal with email spam- both the bad guys and the good guys- there were cottage industries in the Philippines, right? Internet fraud, for example. If you look at the security risks in this- because I went back all the time to the Hotmail era- I don't know if you remember Megaupload, for example. The founder of Megaupload was married to a Filipina. She went with him to New Zealand. The QAnon link is there, and I think Fredrick Brennan left after someone filed charges of cyber libel. So we have always had some- I really hate this, but it's not just the internet. Look at 9/11. That was a lot of what I did with CNN after the 9/11 attacks. Because when I watched the planes crash into the buildings, I was on a treadmill in Jakarta- it was evening our time- and the planes crashing into the buildings was a memory for me, because in 1994 I had read interrogation documents of probably the guy who would have been the first pilot recruited by Al-Qaeda. It wasn't called Al-Qaeda then. He's in a Supermax prison in Colorado.

And so my career has actually been looking at the testing ground. We have been a testing ground. In 1994, the mastermind of the 9/11 attacks, his name's Khalid Sheikh Mohammed, lived in the Philippines. His nephew was Ramzi Yousef, the guy who did the World Trade Center attacks in 1993. The shoe bombing plot that came out of London in 2001 was tested here in 1994 on a Philippine Airlines flight. A liquid bomb was smuggled on a flight to Tokyo. One guy was killed during that time period. I had all these intelligence documents- why did they come to the Philippines? Because we had the same security systems in the airport as the United States. And so if they could get through them here, they knew they could do that in the States. Why did we do that? Because we're a former colony and because a lot of our systems are imported from the United States.

So let's go to the internet fraud and the homegrown industries. They just keep shifting. With Hotmail, when they were able to actually control spam, then it moved to something else. It's a cottage industry. Well, you saw in QAnon our link there. I don't know whether I'm painting with too broad a brush, but I can give you papers on this where we show you how these kinds of cottage industries evolved over time to now become fake news industries- i.e. you have little offices in Manila, in Cebu, where you have rows of cellphones, each with its own unique number, that are creating Facebook accounts. So this is a whole black market system.

Justin Hendrix:

I wanted to ask you about it for precisely that reason, because you can really see that- Jim and Ron Watkins are clearly scammers making money off the worst bits of the internet, pornography-

Maria Ressa:

Pornography. And in our government, the Duterte administration actually encouraged online gambling at some point- POGOs is what they're called. So yes, it's the weakness of the system. What is okay in one country is not okay in another.

Justin Hendrix:

Right. And they seemed to merge from there into this other thing, this other grift, politics, which turns out to be a pretty good grift too. It's kind of hard to tell whether they believe any of this stuff, but maybe they do. On some level, does it matter? I don't know.

Maria Ressa:

I don't think they need to believe in it. And that's exactly what we're seeing. The most recent maps we've made of the virtual world in the Philippines still show asymmetrical warfare. And look, when you have online violence and hate and then real-world violence and fear, these things combine. That's the other part that's shocking to me when Facebook says these things, they don't live in our environment. It changes you, right? And they contribute to that massive shift and the emergent behavior that comes out of it.

Justin Hendrix:

That was the final defense that Nick Clegg, and ultimately Mark Zuckerberg, made- the Silicon Valley idea of the user as this rational agent who's perfectly in charge of their faculties, making intelligent decisions, and not affected by the behavior of the network. It's a tango, it's a dance. Facebook works for me.

Maria Ressa:

That's laughable. I've seen it in the real world. Facebook doesn't work for you. If it did, then you wouldn't see all the data that you're seeing now about how we're being manipulated. It is a behavior modification system and we are Pavlov's dogs and that shouldn't be legal, right? You can make an argument for anything, but as long as the social media platforms are in denial, they cannot be part of the solution. And the sooner they get out of denial, the faster globally we can find a solution, because I do think that they're part of the future. They're too big, right?

If they fail, we wind up failing as well and like I said, this virus of lies impacts real people, changes their world view.

Justin Hendrix:

I want to thank you for everything you are doing and thank you for this constant energy that I feel emerging from you on these issues. So I'm very grateful for you, for all your activism.

Want more? Ressa also gave a keynote address at the Skoll Foundation last week.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Inno...
