Robert Gorwa Tackles the Politics of Platform Regulation
Justin Hendrix / Dec 1, 2024

Audio of this conversation is available via your favorite podcast service.
Robert Gorwa is the author of a new book titled The Politics of Platform Regulation: How Governments Shape Online Content Moderation, published by Oxford University Press. (The book is available open access; download a free copy here.) It is an analysis of how and why governments around the world engage in platform regulation. The lessons he draws from case studies of key regulatory developments in Europe, the United States, New Zealand, and Australia help explain the adoption of different regulatory strategies by these governments and the underlying politics that shape their approach.
What follows is a lightly edited transcript of the discussion.
Robert Gorwa:
I'm Robert Gorwa. I work at the Berlin Social Science Center. It's a publicly funded research institute in Berlin, Germany. And yeah, I'm the author of the new book, The Politics of Platform Regulation: How Governments Shape Online Content Moderation, which is out open access with Oxford University Press.
Justin Hendrix:
Robert, you write that this is a book exploring how governments around the world are seeking to regulate the policies and practices on harmful content and governance being developed by the technology companies that operate platform services. This is a dense but slim volume that's available open access, so I'd encourage the listener to go right to the internet, Google the title and download a PDF. But Robert, even before we get started, how did you come to this subject? What is your area of research, and how did this project get started?
Robert Gorwa:
Yeah, so I've been researching content moderation and platform governance issues for close to 10 years now. Eight, nine, ten years; time has been blurring in the platform era, the platform scandal, kind of post-techlash era. And I have a slightly unusual academic background, I think, in this space as someone who did first graduate work in media and communication, internet studies, platform studies, and then also in political science. So my PhD was in international relations, where I was really thinking about transnational regulation and global governance. And most of my work on platform governance, on content moderation, has been really trying to translate between these two communities, trying to bring methods from empirical policy research, political science, as well as conceptual insights from global regulation scholarship in other fields into the literature on content moderation and platform governance. And I think, yeah, that translation effort is really at the heart of this book. It started as a PhD. It may be a slim volume, but it was a real labor of love. I think it took me more than seven years from start to finish. So yeah, it's on one hand, I guess, my very nerdy, in-the-weeds love letter to how platform regulation happens, how it's developed, fueled by a lot of empirical material: I did a lot of interviews, I did a lot of FOIA requests. And it's also, I guess, my most ambitious conceptual effort to navigate how, where and why platform regulation happens.
Justin Hendrix:
So you've really waded through the stew of politics and circumstances in which this development of platform governance policies happens. But I think it's interesting that you point out what this book is not about. You make clear this book is about applying this conceptual framework that you've developed about how regulations come along, but what's it not about?
Robert Gorwa:
Yeah, that's a great question. I think even a seven-year project has a lot of limitations, and I had to make a lot of hard choices when I was pursuing this project just to make sure that it was doable. And I think most importantly, this is actually not really a book about the implementation and enforcement of platform regulation. So it's really about what happens before the law. Or maybe, as we'll discuss, I'm not a legal scholar, so I'm not just interested in formal laws; I'm interested in the whole morass of different strategies and tools, what I call convincing and collaborating, that government actors of different types use when they interact with platform companies, trying to get them to change what they're up to, how they design their services, how they design their policies. So it's really about understanding how governments and different government actors do that. The book really is also about the importance of sub-state actors.
So not just treating certain countries as an individual unit, but breaking down that unit. It's about thinking about the politics that happen in policy development, less so implementation and enforcement. So I think that's really, I guess, what it's not about. It's also not necessarily about how to make platform regulation better. And I think there's a lot of really good, powerful work that has been done by people in civil society, by academics in a bunch of different fields. Many of them have been on the podcast recently. And I think this of course is an incredibly important area for us to continue thinking about, especially as more and more governments are getting more ambitious and complicated, making costly and complex institutional interventions in the platform economy. Of course we want to continue thinking about how to make those efforts better, how to improve what platforms actually do, which of course has huge impacts on the lives of people around the world, billions of users. We want to continue pressuring platforms to invest more resources in this important work, to take this stuff more seriously. But that is actually also not at the heart of the book. So the heart of the book is really about the politics that shape the interactions between governments and platforms around platform governance standards and rules, content moderation standards and rules.
Justin Hendrix:
You write that, “the crux of my argument is that the decision of a government to intervene and regulate a platform company in a certain way, the strategy that it will choose to do so and the likelihood of its success are all influenced by two main factors, which I call political will and the power to intervene.” So you're talking about this kind of demand and supply model for technology policy development, how does this model work?
Robert Gorwa:
So thanks so much for that question. And there's a lot of dense conceptual material in here, and maybe we'll start there before we get into some of the case studies and some of the interesting developments that we've been seeing in recent months. What is old is new again in platform regulation, it seems, and there's a lot that has been happening. But maybe just before I get to the intricacies of this conceptual model, like you say, what I call a kind of demand and supply model for explaining how platform regulation occurs, let me just tee it up by getting back to this question of where, when and how platform regulation occurs. On one hand, Tech Policy Press listeners might think that it's a pretty obvious question. There are many policy issues with the contemporary platform economy: hate speech, disinformation, intimate image abuse, child sexual abuse material. There is lots of bad stuff happening.
And basically what we are seeing, at least from one point of view, might be a lot of regulatory efforts, a lot of increasingly complicated regulatory efforts around the world trying to curb that bad stuff. So it might be that simple; that's basically what's happening. But I think actually if you zoom out a little bit and look from a global comparative perspective, as I try to do, platform regulation is still an emerging and really quite rare field of regulatory intervention. And let me just say here that what I mean by platform regulation is what we hinted at earlier, which is politics that are trying to shape how companies with different business models, so this book is mainly about social media platforms, but I also talk about marketplaces and other platforms with different business models, mainly platforms that deal with user-generated content, how they design their services and how they develop policies around user behavior or user content.
So if that's platform regulation, and again, it's not an entirely new thing in that it's linked to older issues in internet law, in intermediary liability. What I think is new is that we have more specific rules that are trying to shape the practices of content moderation, the specific practices of companies on harmful content, not just copyright but more and more issue areas. So this is of course something that is emerging rapidly, and the conversation in most places, in the academic literature too, is really focusing on the EU, on the US, maybe on the UK and Australia. And it's not really a global conversation. And when you zoom out, you actually see, as I think I saw when I started this project, that it's still a patchwork. It's not happening everywhere, and it's not entirely clear why action is happening in some places and not others. So why are governments not all acting, given that many of these platform services are global, that they span jurisdictions?
Of course, some platform issues, some platform harms, if we use that language, are going to have specific local manifestations, but many of the affordances of these services are global. So why isn't every type of government, every type of regulator in every jurisdiction trying to intervene? So just to give you a tangible example of this, take Canada, which is a country that is near and dear to my heart; it's where I grew up for some years. They've been debating online harms rules that are quite similar to things that were proposed in the UK. There have been multiple draft bills, and yet they're still not in force. And you contrast that with the UK, for example. I first moved there for grad school, I started this project in its early phases in 2016, and a few months after I moved to the UK, there was a green paper on online harms that was put out by DCMS, the Department for Digital, Culture, Media and Sport.
Over the years, this morphed into a white paper, then a draft online harms bill, then eventually got rebranded as the Online Safety Act. It eventually went into force in, I think, 2023. And in the meantime there were 175 changes of the Conservative government leadership and the prime minister. So I think it's not self-evident why certain bills, certain types of platform regulation, get over the finish line and why not, and why it happens in certain countries and at certain moments. So that's the where and when. I use this language of a regulatory episode, so I'm interested in understanding from a political science perspective where and when these bills happen. And just very quickly, I think the reason this matters is because we have so many conversations about platform power and the ability of governments to regulate them or not. We hear, of course, about the power of lobbying and how these companies are some of the biggest and most influential companies in human history. We hear about the lack of transparency in the platform ecosystem, the complexity of the digital domain, how cross-jurisdictional they are. If we import the famous, I think Clintonian, phrase from the nineties, regulating platforms is like nailing Jell-O to the wall. Are they too big and too powerful? And yet still we're seeing all sorts of different bills, which I would call contested platform regulation. We'll get into this, but also all sorts of other stuff under the iceberg: co-regulation, informal regulation, pressure. And basically I'm trying to understand those dynamics.
Justin Hendrix:
Applying this conceptual framework, the first case you choose to look at is Germany's Network Enforcement Act, the NetzDG. You look at the Christchurch Call, which is a really interesting one, and I think probably discussed in a lot of ways as a very interesting conceptual model for how to address platform harms. And then you look at the US and years' worth of efforts by state legislators to try to get around federal gridlock and pass their own rules. And of course we spent an enormous amount of time on this podcast looking at the wreckage of those efforts, including some trips to the Supreme Court. But let's start with Germany. How does Germany hold up as a case study? As you wade into the German situation, what does it tell you?
Robert Gorwa:
I think these kinds of core questions are really helpful for getting into the nitty-gritty of the argument. And at the core, the argument is that the development of platform regulation has a lot of politics, and a lot of it is domestic politics, right? Of course we know about this inherently, I think. We care about elections, we care about new governments, we care that there are new officials at the European Commission, but it hasn't really, I think, been a core part of the academic conversation so far. So Germany is really interesting. The NetzDG is still talked about as probably the first global example of a harmful-content-oriented platform regulation bill. It has a lot of things that ended up getting into the Digital Services Act in the EU: mandatory transparency reporting, some kind of trusted flagger elements, some basic risk assessments, if I remember right.
So it's something that really set the stage for the DSA and was a bit of a global vibe shift. And yet I think it was something that at the time was being misunderstood, in my view, by a lot of audiences outside of the EU. And this is maybe where I'll get into this kind of conceptual thing that you mentioned in your last question, which is basically that I think the German example really shows that government actors need to have what I call political will to change the rules. And they also need to have what I call the power to intervene. And we can roughly think of this as the demand for change and the power to supply this change. So in the question of Germany, it really, in my view, was about elections and it was about policy entrepreneurship. So it was about a specific set of individuals, in particular the Minister of Justice at the time, Heiko Maas, in a very difficult, politicized moment, where also the governing kind of grand coalition between the Christian Democrats and the Social Democrats was worried that it was going to be losing seats to the rising Alternative for Germany, or AfD, far-right party.
And they wanted to do something to signal that they were really serious on the issue of online hate speech. And basically, and again, we can get more technical on the kind of framework that I develop, but basically it was a case where there was a huge amount of demand for policymakers to change the rules. And this demand wasn't really suppressed by other actors. So the industry lobbying, we can get into this, either didn't really happen or it didn't work. Interestingly, the same goes for other major international partners of Germany, such as the US, which, later on in the development of the NetzDG, would've been under the Trump administration. So maybe that explains partially the lack of a clear, firm stance, but we don't really see, or at least I didn't really find, heavy pressure from the US, for example, on Germany to not regulate the companies in this way.
And then they also had the ability to supply this regulation through their institutions. So most of the book I think really lives on the supply side. It's all about the institutional factors that basically condition the ability of different policy makers at different levels to challenge platforms in different ways. So it's about legislative systems, about veto points. And the thing that I think is the most interesting takeaway of that chapter is the really weird and kind of interesting behind the scenes politicking that was happening between the European Commission and the German government on the NetzDG.
Justin Hendrix:
You point out that a lot of this is negotiation essentially over what terms Europe would deem acceptable for this German law.
Robert Gorwa:
So I don't know to what extent this is a controversial take. There might be some people out there that would disagree with me on this, but in my interviews and also in my ongoing conversations with experts, I found pretty much unanimous agreement that the NetzDG, due to some very esoteric legal things relating to European law and the e-commerce directive, which is kind of like the CDA 230 for the European Union and was passed in the late nineties... There are really major legal issues between the NetzDG and the e-commerce directive. And I found almost unanimous, I would say, support or understanding expressed by legal experts, including people at the European Commission, that the NetzDG, if it went through a legal challenge, would be struck down as unconstitutional or in violation of the e-commerce directive, in particular the so-called country of origin principle, which is trying to basically prevent the exact type of regulatory fragmentation in the EU that something like the NetzDG creates.
So the EU as an economic project, what it long has tried to do is harmonize rules for international businesses rather than fragment them. They don't like the idea that Germany and France and Austria and Denmark and all these different countries could have their own rules on international businesses of any sort. And of course the platform sphere, the digital services sphere, is no exception. And actually, as I discovered, and I'm not an EU politics scholar, I'm not a super deep expert on European integration, even though I've been nerding out a lot over the research on this book and have been getting more into it due to recent developments with the DSA and the AI Act and things, the EU has a mechanism for negotiation between member states, the Commission, and countries that are proposing new rules, which basically should allow rules that would fragment the single market to get negotiated down or prevented from being put into place.
This is just an example. I also talk later in the book about the really interesting case of Australia, which, in polar opposite to, for example, the Canadian example I mentioned at the beginning, managed to pass a new law in, I think, under a week before the end of parliament. So they turned this around really fast. Something like that would be impossible in the EU, because you can't just pass laws willy-nilly, even if you have a majority in your legislature. You need to notify them to the Commission, which triggers what's called a standstill period, where different member states and the Commission can provide comments. And then, interestingly, there's a mechanism called the detailed opinion, where the Commission can basically say, hey, you need to go back to the drawing board, you need to give us more explanations around what you're doing with this law and how it may or may not violate some of our concerns or existing European legal frameworks.
And that extends the standstill even further. And because of the way that the NetzDG was notified to the Commission, which was extremely last minute, so it was notified on the last possible day it could have been for it to still happen before the election, basically it left them with very little wiggle room. So it set off a bit of a game of chicken, if you will, with the Commission saying, okay, are we going to basically use this detailed opinion to de facto veto the NetzDG, or are we going to not do something and try to negotiate this behind the scenes? So that's what the chapter is about. But I guess the core, I think, interesting argument there is that it seemed, at least to me, in that case that the most interesting compromises were about the bill, about the NetzDG, which we may have forgotten by now, but it was super controversial when it was first proposed.
It had some pretty strict things like takedown requirements, we can get into it, but there were worries, for example, that it was going to apply to too many platform services, ones that had end-to-end encryption, that it was going to have automated filtering, all these things. All of that got negotiated down not by industry lobbying, but by the Commission, basically, in this kind of informal backroom way. And I just think that is interesting. So I think that story, if it tells us anything, it tells us about how veto points, institutional structures and power politics really shape platform regulation. And unfortunately, at least in this case, more than these broader normative principles of human rights, of good governance, these things that we might necessarily want to advocate for when we're thinking about content moderation.
Justin Hendrix:
Your next example is across the world in New Zealand. You mentioned this kind of policy entrepreneurship after the murders, the massacre in Christchurch that took 51 lives. This was a catalytic moment, an urgent moment, and you had a Prime Minister who set out to take advantage of it and make change. The politics of this strike me as very different from what we've seen in most cases around the world, and the outcome is very different.
Robert Gorwa:
So this chapter to me was maybe the most interesting to research, because, as you hinted at earlier, I think it's a bit underexplored in the literature. There's really not a lot of standalone articles either about the Christchurch Call or about the Australian Abhorrent Violent Material Act, or AVM Act, which also was a result of the Christchurch shooting. And to my view as a policy studies person, I was like, wow, this is a horrific yet very interesting policy experiment, because you have this shock event and then you have two very closely socially, politically, culturally interlinked jurisdictions that are responding to the same event, basically with the same policy goal, trying to contest the amount of resources, the types of rules that platforms have around violent extremism, the levels of policies and resources that they're devoting to this, in two very different ways.
So one, in Australia, through a very kind of hard, contested approach, where they're trying to tie the hands of companies by saying you comply or else we'll fine you. Versus, on the other hand, in New Zealand, the kind of international orchestration of this very interesting co-regulatory approach: bringing in a lot of international partners, bringing in platform companies as more voluntary participants, and creating this Christchurch Call institution, which has had, I think, a really big and important and still probably under-discussed long-run effect by re-institutionalizing and reconfiguring this thing called the Global Internet Forum to Counter Terrorism, or GIFCT, which operates this hashing and fingerprinting database for violent extremist content that is used by virtually all of the platforms. And what I do in this book, and I guess probably listeners are getting a little bit of a flavor now of the stuff that I'm doing here, is look really at a micro level, using freedom of information requests to get internal policy deliberations with officials in both governments in the days after the shooting, to try to basically understand what was on the agenda, what their options were, what they did, and hopefully why.
What I think is interesting is that both countries did a bit of a U-turn, so both of them started with potentially different options on the table than what was pursued eventually. So in Australia, something really interesting happened, where, as far as I can gather, companies were basically told that, okay, we're going to do a co-regulatory approach akin to what the EU did with the code of conduct on illegal online hate speech that was coordinated by DG Justice in like 2017, 2018. This is going to be non-binding. You're going to come up with a crisis protocol, you're going to give us more information around how you deal with live shootings and these types of events. And then we'll have this kind of more informal measure through which we collaborate on these things together. And yet the opposite ends up happening, where basically they U-turn and at the last minute they decide, no, we think, for electoral reasons, we have time to try to do a bill, and we'll try to pass it through parliament really fast before companies can respond.
In New Zealand, the opposite happened, where, as far as I can gather, they were initially open to a really hard response, but then, as I write in the chapter, a lot of, I think, policy entrepreneurs, people in civil society and industry, were able to kind of frame the conversation a little bit differently and steer it towards a more collaborative approach. It's a really interesting comparative study as to some of the demand-side motivations that shape why different government actors, different agencies, want to do certain things when they're dealing with issues like online hate speech or violent extremism.
Justin Hendrix:
One of the things that comes to the fore in this discussion of the GIFCT is the role of civil society. In particular, I know that GIFCT has a sort of civil society organization that's around it. Early on there was a lot of concern that civil society wasn't being heard. There was this sense of a sort of barreling forward under the political will of the New Zealand Prime Minister and the French President. What did you learn about the role of civil society in all this?
Robert Gorwa:
Yeah, this is really interesting, and I will caveat by saying that this should maybe have been added also to my list of actors that I don't really focus on, or things I don't necessarily do in this book. The project isn't at its core trying to understand the relationship between civil society and platforms and governments on issues of platform regulation. And I really think we need actually more work that's exploring this. There are some people out there that have been doing great stuff, but I think it's still a relatively underexplored thing. But with that caveat, and also with the caveat that my focus probably lends itself to a focus on what governments are doing and these kind of institutional power politics, I think it doesn't really look great for the impact of civil society in certain cases to be able to influence the agenda and influence the implementation.
But that said, there are caveats on this. I think GIFCT is really interesting because it went from a completely kind of informal organization set up by industry, actually, if I have this right, due to another kind of informal co-regulatory pressure project from the European Commission, under the auspices of something called the EU Internet Forum, that has brought together European law enforcement officials with platform companies from 2015 onwards. So this organization, this database of basically banned content that should be hashed and fingerprinted and checked at the point of upload on every platform when you and I try to upload basically any type of content. These systems have become really pervasive. It's an interesting case study as to how an organization like that, which is quite informal, becomes a bit more institutionalized and gets this weird hybrid structure, because you have the injection of government influence. So, if I get this right, and I don't know exactly about the GIFCT, right?
There's this relation between the GIFCT and the Christchurch Call, which also has its own advisory network, and there are governments involved as well as civil society organizations involved. Yeah, like you mentioned, there has been a lot of concern expressed by civil society about their ability to meaningfully participate in these spaces and to meaningfully set the agenda. And in a way, I think maybe that's something that actually is borne out by the focus of the book, where, if at the end of the day these types of initiatives are really mechanisms for government actors to communicate their interests and to change what platforms are doing, even if in incremental ways, is there a huge amount of room for civil society in that? I don't know. But maybe just to add and to caveat all that, where I do think there is more of a role for civil society is more in the kind of demand-side stage.
So rather than in the kind of implementation of these kinds of agreements and the eventual rollout of these types of regulatory frameworks. I think you could say actually a lot of the stuff that we're seeing in the US in terms of state-led lawsuits, or state-led bills, for example, to challenge CDA 230, to create new responsibilities for platforms, or I guess tie their hands away from some of these responsibilities when it comes to content governance issues, can also be interpreted as actually a win for this weird civil society coalition that comes more from the right and brings together, for example, libertarian groups, free speech absolutist groups, child safety groups. So in a way, I think that is maybe something that not everyone saw coming. When we think of civil society in this space, of digital civil society, we really think of organizations like Article 19, organizations that have been in this space for a long time, and we're not necessarily thinking about these weird, hyper-issue-specific, hyper-political interest groups. And again, maybe that's the particular policy issues we've been thinking about speaking. I know that people who, for example, have been working on content moderation issues relating to online sex trafficking, relating to sex work, they know that these issues are always political and the role of civil society is always really political. But I think, yeah, the politics of the last few years and the developments of the last few years have really shown how that's the case for a lot of other issue areas of platform regulation as well.
Justin Hendrix:
Let's briefly touch on the US and the states. You get into the context of how various states have tried to take on bills that would address content moderation questions. You talk about the techlash, you talk about our regulatory ecosystem here. You talk about how the judiciary generally has become a kind of locus for disputes around these issues. And then you get to Florida and Texas as well as California. There are many shots on goal being taken on platform regulation. It seems like states like California have more of a kind of perpetual ambition to address these questions from one point of view, whereas efforts in Florida and Texas come from a very different point of view. The court has answered; maybe things will change from here. But I don't know, how do you think about the US states in this period?
Robert Gorwa:
I think this is a really interesting and tricky one, and the role of the US judiciary is really interesting. And again, I'm not an expert on this; I get into it a little bit: the way that the courts have been packed with Trump appointees, and the courts themselves are becoming more political and are becoming political actors intervening in this space. But I think, in a way, the way that I've been thinking about the US, again, and forgive me here for zooming out again to this kind of conceptual framework, is that this is maybe the best case for applying this demand and supply model: the political will to change and the power to supply change. And what I see in the US is very clearly, at least in my view, an example of a place, a political context, where there is huge demand from all sorts of different actors to change the regulatory status quo, to do various things that would affect how platforms moderate.
Of course, there are issues where sometimes that demand goes orthogonally to itself and is coming from different corners, and some people want more stringent rules and some others want weaker rules. But at the end of the day, it's a space where you have huge demand from huge segments of the policymaking apparatus for change, and real limits on the power to supply that change due to institutions. And I think a lot of this comes down to, as you mentioned, regulatory gridlock, the weird intricacies of the Senate filibuster that make it so hard to do policy. And yet, again, I think this is not necessarily a completely US-exceptional thing. For example, in India, there was a lot of conversation about the potential supermajority that the BJP might get in the latest elections, and they didn't get it. So this has really tied their hands also, because it means that the opposition can block them in the other house of Parliament, and it means that they have to use kind of workaround tactics like these money bills, which are very similar to the kind of budget reconciliation procedure through which US policymaking really happens.
At least in my view, even like you say, the politics on the ground are constantly shifting, and it's an impossible, very Sisyphean task to try to always keep up with the latest developments. I'm struggling so hard with just the absolute deluge of state bills on a huge range of issues, from kids' safety to now AI, that all might impact what platforms are doing. And yet on the other hand, I think from a top-level point of view, it makes a lot of sense that it's happening through the states, because that's a real lever for policy contestation to happen. You have these trifectas that I write about, right? Unlike federally, you have many state legislatures, actually the majority of state legislatures, which are controlled completely, with both houses and the governorship in the hands of one party.
And basically they can pass bills more or less as they will. Of course, the question is to what extent those are actually going to take effect, that is, to what extent they will be overruled in the courts. And that's a whole ‘nother conversation, but that's at least my reading on the US. So it's a very weird and very different space from the questions that initially motivated me to write this book, which are more about non-emergence. So why, for example, and we can pick a hypothetical country on any continent, why are you not regulating? And I think there are demand-side things that might factor in. It could be as simple as this is just not on the agenda for policymakers. It's not salient, or maybe it's salient, but it's not in their interests because of the economic benefits they perceive to having the platforms in their countries.
They're worried about regulatory fallback. It could be that the interests are being suppressed by lobbying from different actors. For governments, famously, the US, for example, lobbied incredibly deeply against EU chemicals regulations in the mid-2000s, going down to the consulate level, lobbying in individual EU cities. So this is the apparatus of the state, the home state of platform firms, that we haven't really seen wielded as a lobbying actor, but I think will probably happen more often as these things become politicized. There's also lobbying from companies. Or there might be demand, and this is what I think we're seeing in the US, but not supply, and that's the issue of regulatory capacity.
Justin Hendrix:
You also spend a bit of time talking about other contexts. You talk about Brazil, India, China, which I find interesting. When you think about China in particular, how does it look different through this lens of supply and demand when you're dealing with a country that's not democratic?
Robert Gorwa:
So this of course is the kind of million-dollar question for political scientists too, which is this issue of regime type and the structural factors that undergird this whole analysis and are always difficult to factor in. And I should also say that I'm not an expert on China or on India or on Brazil, which are the three countries that I explore in some depth in this global majority chapter at the end of the book. I also unfortunately didn't conduct my own primary research in any of those countries, but I did my best to consult with people who are experts on them and to pull in all of the secondary literature I could find. And I think, actually, and I would love to hear from readers whether or not they find this convincing, but I think the model still holds, and that's because the political will and power to intervene concepts are fluid enough that they can hold for different regime types, different political situations, with of course unique wrinkles in each context.
So China, with the caveat that I, again, am not a China expert. I've been reading Ya-Wen Lei. Angela Zhang has this new book, High Wire, that I've been reading with interest on this. These are the people who have really been in the weeds on this. But this to me seems like a fairly clear combination of both very strong political will, the demand to regulate platforms, and then also the capacity, the power to intervene, the supply to actually do this, right? So China is really interesting in that you have these kind of neo-mercantilist ties between firms and the government, and the government has constantly, at least in my reading, for probably at least two decades, been convincing, collaborating with, and contesting these companies across multiple policies and multiple issues. So it is trying very actively, with a huge range of strategies, to shape what platform companies are doing, at least domestically. The question which I think is maybe the most interesting one is what happens internationally, and what happens when you have these internationally operating subsidiaries of the Chinese companies that maybe share some leadership, but also are basically trying to play ball with the platform regulation environment that is set up in other jurisdictions. It's a really interesting case about the extent and impact of regulatory capacity and of state power, and the way that states are interacting with platforms.
Justin Hendrix:
This book leaves us with a lot of questions. One of them is, will the future outcome be fairer, more accountable, more transparent systems of platform governance with better due process and enhanced rights for users? Or will the outcome be territorialization and fragmentation, as firms are pulled to splinter their services and rules to comply with a vastly increased regulatory burden across jurisdictions? Where are you on those questions right now? What do you think is going to happen over the next couple of years? I sort of think we're in for this inevitable contestation across the world, probably a lot more situations like we just saw in Brazil, the battle between Musk and Judge Alexandre de Moraes, maybe situations like we're seeing unfold in France around Telegram and Pavel Durov.
Robert Gorwa:
100%. I think I started our conversation off by saying that, relative to other areas of technology policy and other areas of corporate regulation, this is still a fairly inchoate and new area of regulatory intervention. And I think it's very clear that it's just getting, like you said, a lot more complicated, a lot busier, and we're getting more and more contestation from different corners, from different levels of government. Brazil is an example of, hey, okay, it's not the executive, it's this kind of interesting, very Latin American judicial body with autonomous regulatory authority getting involved. In France, we're seeing law enforcement getting involved and deploying some of these strategies themselves. So we're seeing more and more politics, basically at all levels, involving a huge range of political stakeholders. And I think that is just going to continue to unfold and get more complicated.
The other thing which I think is happening already, and is probably going to be the big story if I had to pick one for the coming years, is that we're seeing more and more interlinkages between these perhaps fairly niche issues of content governance and content moderation and broader questions of industrial policy, of geo-economic contestation, of geopolitics. And I think this is going to get more and more interesting. So we talked about the NetzDG, and we talked about the fact that back in 2016, 2017, the US government didn't try, for example, to say, oh, you are regulating our big companies, right? The NetzDG, like the DSA, had differentiations by firm size, and it hit basically exclusively American companies. And they could have said, hey, you're slapping us with some kind of protectionist measure, we're going to retaliate with a tariff. And as far as I know, this has only really happened once in the space of platform policy writ large, which is when France was talking about a digital tax and Trump threatened retaliatory tariffs on cheese and these kinds of things. But I think more of these interlinkages are to come, right? The next DSA, the next major content regulation and platform regulation bills that we're going to see in more and more jurisdictions, I think are going to get swept up in these broader political questions. And I think then they're just going to get more complicated and more interesting for us observers to try to understand.
Justin Hendrix:
We'll see if it all works out better for the interests of democracy and free expression, all those things we hold dear. Robert Gorwa is the author of The Politics of Platform Regulation, available for free download in the Oxford Studies in Digital Politics series. Go and look for it now. You can find a link in the show notes. And thank you so much for speaking to me.
Robert Gorwa:
The recent interviews you've been doing on a plethora of great books have been fantastic, and it's a real honor to be able to talk to you about my little addition to the, I'm sure, rapidly growing pile of books on everyone's desks these days. So yeah, thanks for having me.