
Assessing Systemic Risk Under the Digital Services Act

Gabby Miller, Justin Hendrix / Oct 6, 2024

Audio of this conversation is available via your favorite podcast service.

One of the most significant concepts in Europe’s Digital Services Act is that of “systemic risk,” which relates to the spread of illegal content, or content that might have foreseeable negative effects on the exercise of fundamental rights or on civic discourse, electoral processes, public security, and so forth. The DSA requires companies to carry out risk assessments to detail whether they are adequately addressing such risks on their platforms. What exactly amounts to a systemic risk, and how exactly to go about assessing it, is still up in the air in these early days of the DSA’s implementation.

In today’s episode, Tech Policy Press Staff Writer Gabby Miller speaks with three experts involved in conversations to try and get to best practices:

  • Jason Pielemeier, Executive Director of the Global Network Initiative;
  • David Sullivan, Executive Director of the Digital Trust & Safety Partnership; and
  • Chantal Joris, Senior Legal Officer at Article 19

What follows is a lightly edited transcript.

Gabby Miller:

My name is Gabby Miller. I'm a staff writer at Tech Policy Press and this week I'm taking over the podcast. So we have a panel of folks on today to discuss Europe's Digital Services Act or the DSA and the systemic risk assessments that very large online platforms and search engines are required to conduct.

David Sullivan:

Hi, I'm David Sullivan and I'm the executive director at the Digital Trust and Safety Partnership where we work with companies providing a wide range of digital products and services on best practices for trust and safety.

Jason Pielemeier:

I'm Jason Pielemeier. I'm the executive director of the Global Network Initiative, or GNI, which is a multi-stakeholder organization that brings together academics, civil society organizations, investors, and tech companies from around the world to work on issues related to freedom of expression and privacy in the tech sector.

Chantal Joris:

My name is Chantal Joris. I'm a senior legal officer with the international freedom of expression organization Article 19, where I focus mainly on platform regulation in the EU and beyond, and also on freedom of expression in armed conflicts.

Gabby Miller:

There was a DSA stakeholder engagement forum that took place in Brussels this June that everyone here either organized or attended. So could you all start off by telling us first what the DSA is and what led you to put the forum together?

Jason Pielemeier:

The Digital Services Act, for anyone who hasn't been tracking it, is a landmark regulation that was established by the European Union a few years ago and covers a really wide range of different types of digital platforms, services, and intermediaries. The DSA falls into a broader category that at GNI we have referred to as content regulations. These are laws that have been emerging with increasing frequency around the world that attempt to address different challenges, risks, and opportunities related to content online. The Digital Services Act is probably the most sophisticated and most well-known example of content regulation. At the moment, it is not only applying across the entire European Union but is being cited in discussions around other regulatory efforts in other parts of the world. And it is one of the first to start to come into effect. Crucially, the DSA's obligations are tiered based on the size of the particular service, and the obligations for the very large online platforms and very large online search engines kicked in a couple of years ago.

And we are starting to see not only more detailed regulatory guidance in the form of so-called Delegated Acts coming out of the European Commission, but also implementation by those very large online platforms and search engines, which I'll refer to here as VLOPs as a shorthand. GNI has been tracking this process since the legislative debates began on it several years ago, and we've been working with our friends at the Digital Trust and Safety Partnership to try to unpack and understand what this will mean in practice. I'll hand it over to David to talk a little bit more about how we've been doing that.

David Sullivan:

For the very large online platforms and search engines, a large part of the highest tier of requirements in the Digital Services Act, certainly not the only part but a substantial part of it, is a requirement for these services to conduct systemic risk assessments, mitigate those risks, and be audited for how they are implementing the act as a whole, including those risk assessments and mitigations. These risk assessments are an annual process at minimum; the services have to do them annually, but they also have to do them if they're deploying some new functionality that has a so-called critical impact, and we can talk a little bit more about that later. The first round of these systemic risk assessments, for the first designated platforms and services, was due in August of 2023. And so in advance of that, we at the Digital Trust and Safety Partnership and GNI had been thinking about how we could try to catalyze some multi-stakeholder discussion of what these systemic risks are, what these risk assessments should look like, and so forth.

And at the time, that was a challenge partly because no one had ever done a systemic risk assessment before. It wasn't clear exactly what a systemic risk was. And this was a very obviously sensitive process for companies who are now being regulated by the European Commission. And so we held a virtual event back in May, I think it was, of 2023. And we were able to have some discussion, particularly where academics and civil society organizations were able to provide some input on what kinds of risks they were thinking about and how they thought companies should approach them. But most of the companies that participated in that call weren't really able to talk much about the risk assessments they were doing. So civil society could inform the risk assessments, but civil society wasn't really informed about what the risk assessments looked like. And so now going into 2024, we thought there was an opportunity to do something more substantial, in person, bringing together people to talk about these risk assessments.

The companies had done a first round of risk assessments and were working on a second round, but still without any of the documentation about this process being public. So it was still a sensitive topic, but one where we thought that bringing folks together to talk about specific types of risks and how one goes about assessing and mitigating those risks was a worthwhile endeavor. And we spent several months trying to think about what would be the best way to bring folks together to encourage that kind of conversation and help level the playing field, so that civil society organizations and academics can learn from what companies are doing and companies can learn from the perspectives of those organizations.

Gabby Miller:

And just as a baseline, are any of these systemic risk assessments going to be made public at any point?

David Sullivan:

So I believe it is at the end of November that the companies are obligated to produce a public report on their first round of risk assessments, mitigations, and the audit that was done of those. So we are already a little bit behind in a cycle because companies will be reporting about risk assessments and mitigations from 2023 after having done a new round of risk assessments for 2024. But that's when we'll start to see more in the public about what all this looks like.

Jason Pielemeier:

We recognized when we were setting up this meeting in June that we would not have these public documents available, but we wanted to make sure we had a chance to create space for conversation and input that could influence the second round, the second year of risk assessments that companies were already engaged in. We've also committed to doing a virtual follow-up session, which will happen in late January. And the intention of that event is to bring together a similar set of stakeholders once we've seen the public reports that David alluded to and be able to further deepen and enrich the discussion based on those public reports.

Gabby Miller:

David, you already mentioned this, but with the systemic risk assessments, there's been some criticism around what does systemic risk even look like? What does that mean? How do we define it? That's been top of mind, so what have those criticisms been, and how did you face them head-on in the forum?

Jason Pielemeier:

The primary criticism has been about the lack of transparency, I think, regarding what is happening within the companies as they conduct these risk assessments and what has been happening between the companies and the commission, which has already seen and been processing the first-year risk assessments. So there's been really very little visibility into that dynamic. And that's something that at the Global Network Initiative we're very concerned about, because we want to make sure that the risk assessments being conducted are meaningful and build on the body of good practice that has emerged under a variety of different frameworks, most of which draw back to something called the UN Guiding Principles on Business and Human Rights. That framework was endorsed unanimously by the UN Human Rights Council in 2011 and broadly describes the obligations that states have with regard to the regulation of private sector conduct, and the responsibilities that private sector organizations have vis-a-vis human rights, independent of how they're being regulated by nation states.

So GNI has a very specific framework for thinking about privacy and free expression. It's tailored to tech companies, but it's built on that UN Guiding Principles framework. The OECD has its own framework for responsible business conduct. All of these frameworks have guided how both companies and civil society have understood and worked to improve responsible business conduct over the years. And the DSA is one of the first pieces of regulation that seeks to build on those efforts and ensure that they are carried out seriously and across industry. We want to make sure that the systemic risk assessments are being done in a way that is effective and meaningful. We also want to make sure that the regulator, in this case the European Commission and the digital services coordinators at the EU member state level, understands its obligation to ensure that these new duties for private companies are implemented in a way that enhances rights rather than restricting them.

And there's real risk, and we can talk about this later, of both intentional and unintentional consequences of this kind of regulatory reform. The best way, in our mind, to mitigate that risk of unintended or intended overreach is transparency. So this session that we organized was an attempt to bring together a cross-section of stakeholders to open up a bit more than has been possible to date about how this is all working in practice. And we really hope that it's just the beginning. There really needs to be a lot more transparency and discussion across sectors and with the public around this new dynamic to make sure that it does what it's intended to do, which is to protect the rights of users in the European Union and then, hopefully by consequence, to generate positive spillover effects outside of the Union as well.

Gabby Miller:

Who exactly was in the room?

David Sullivan:

So we had about 80 individuals, I think, in the room over the course of two days in Brussels. That was, I think, roughly 20 or so folks from VLOPs that are members of the Digital Trust & Safety Partnership and the Global Network Initiative, and then about 60 or so folks from civil society and academia, from Europe but also from around the world. The forum was held under the Chatham House rule. Individual comments weren't attributed, but we did include a list of participants, by organization, in the report that we published.

So we don't name the names of the people who were in the room, but you can see what types of organizations were part of it. And that was leaning upon, I think, the multi-stakeholder membership of GNI, including the civil society organizations and academics who are part of GNI, to help us try to find the right people who could be in Brussels on those days. There are obviously always blind spots in terms of trying to find everyone, and you don't have enough space for everyone you would want to be part of this kind of a conversation. But I think we were able to succeed in bringing in a lot of different types of expertise that maybe hadn't been in a room together before.

Gabby Miller:

David, you mentioned earlier that there are these information asymmetries because there is a lack of transparency between civil society and the companies that are performing these risk assessments, and now you've put them in a room for this forum. So what were the dynamics going into the room, and how did it all play out?

Jason Pielemeier:

We worked with our friends at DTSP, but also with friends in civil society, including Chantal, to build the agenda for this event and make sure that it was going to be seen as an attractive and useful place for people to spend their time and energy. The reality is that confidence in tech companies and their efforts to protect digital rights is not super high around the world, and Europe is no exception to that. And there are good reasons for skepticism and doubt, but we nevertheless want to see how we can use this DSA risk assessment framework not only to improve corporate conduct, but to improve visibility for civil society into how companies think about these risks and the ways to mitigate them. So we did a lot of groundwork ahead of the event to make sure we were getting input from a broad cross-section of stakeholders and that we were getting the right people in the room, from the companies but also from civil society.

And that, for us, meant not just bringing in, obviously, people who have expertise in the risks that exist in Europe, but also thinking about a global set of civil society actors who are witnessing the spillover effects, the Brussels effects as they're sometimes called, of this landmark regulation. We spent a lot of time trying to build some trust and credibility upfront and design an agenda that made sense in terms of how we went through unpacking the relevant articles in the DSA, Articles 34 and 35. And that bore fruit in the sense that we were able to get a really good group of people into the room. The conversations that we had were, I think, really constructive. It wasn't without criticism, but I think the overall tenor was one of a desire to learn from each other and to hopefully influence the way that this all works in a more rights-respecting direction.

Chantal Joris:

I'd love to comment on this, and perhaps I also want to make a couple of remarks about the risk assessment provisions generally. We do think they offer important opportunities, which is also why it's so important to get their implementation and enforcement right. As Jason mentioned, some aspects of these risk assessments bear similarities to the UN Guiding Principles on Business and Human Rights, and we've long advocated for companies to adopt that framework and conduct human rights due diligence. Again, the DSA keeps the wording very broad. There will probably, at some point, be full libraries written just exploring what exactly systemic risks are. There is an opportunity in that it is flexible enough to be adjusted for different risks, different human rights or fundamental rights impacts, and different products and services. But there is also a certain risk that it leaves too much discretion to the companies, and also to the European Commission, as to how they interpret these provisions, which is also where civil society engagement and us being able to have a seat at the table is so important.

And I think also, when just looking at the legislative text and how regulators or companies might interpret it, one of the risks is that it is looked at as some sort of content regulation. One of the good things about the DSA, as we've said, is that it doesn't look at regulating user speech, but at how platforms operate and at their own systems and processes. This, we believe, is the correct approach, and it is not necessarily the approach that we have seen in some jurisdictions outside the EU.

We are, however, a bit worried that these risk assessment and mitigation provisions could be understood as almost demanding content-specific restrictions in the context of these due diligence obligations, rather than looking at the systemic risks stemming from the platforms' and online search engines' own systems and processes. And we have seen the former Commissioner write letters to companies, for example, following the 7 October attacks and, more recently in August, following the UK riots and Elon Musk's interview with Trump, which we would say go specifically in this direction of telling companies you need to do more to curb problematic content, rather than saying you need to check your own systems and processes.

So just this, again, to say that seeing how these provisions will be implemented in practice is almost representative of whether the DSA is going to be a success and actually something that promotes and furthers fundamental rights online, or whether it is going to be interpreted in a manner that makes it similarly dangerous and restrictive as other content regulations we have seen. Now, more specifically, to the question of stakeholder engagement and this forum. I think both David and Jason have alluded to what is really the key issue, which is transparency. It is very difficult for us to assess the quality of these risk assessments. Are they appropriate? That is virtually impossible to say if we do not have access to them. And granted, the DSA does not impose any obligation right now to make these public, but we received almost no indication of how companies went about these risk assessments.

So what the risks were that they looked at, specifically stemming from their products and services, what sort of content they looked at, and what sort of mitigation measures they have envisaged. On that basis it is very, very difficult to provide any sort of meaningful input and to scrutinize in any meaningful way. So that is something that we believe has been lacking. And at the forum in June, there were certainly some interesting conversations.

At the same time, of course, it's nice to bring so many participants together. But if I sit in the same room as 20 companies that offer completely different services and completely different products, how could I provide meaningful input into any of their risk assessments if they do not individually present how they consider these issues? They should provide information first. So far I'm only able to give input, and I think David said this earlier: stakeholders were giving information rather than being informed. It is, however, the companies' responsibility first to do their homework, and then we come in and comment. That, unfortunately, has not been the case so far. The June forum was certainly a good, well-intentioned attempt, but it has not given me any clear indication of what these risk assessments look like.

Gabby Miller:

The report notes that during the first round of systemic risk assessments, which we still haven't seen, there was some level of stakeholder engagement. Some from civil society criticized that they didn't receive any feedback; they had no idea how their engagement, or the sessions or conversations they had with these companies, was being factored into the risk assessments that were given to the commission. And that lack of feedback or willingness to engage after the fact acted, or could act, as a discouraging factor for future participation, on top of the fact that there's already low trust in terms of engagement: a lot of civil society organizations don't necessarily trust that the companies are going to take what they're saying seriously. So in terms of where we're at now and looking to the future, and maybe again this will be a good one for you, Chantal, to address, are you hopeful that anything will change after this forum?

Chantal Joris:

Let me first add the caveat that I don't believe Article 19 participated in the first round of consultation, so I was only at the one in June. I do want to say, though, that many of the issues around stakeholder consultations are not new. Civil society actors have, for a long time, engaged with companies, with some companies more than with others, and there has always been a feeling that it has been quite an extractive relationship. We get asked for our input, but then there's not really any follow-up. We don't know how our input is actually being considered. Does it make any difference? In short, is it worth our while? Is it worth the resources that we invest? And this might also depend on region: Article 19 is a globally active organization, so some of our regions and teams might have a more direct and clear interest to engage with these companies, also in regions outside the EU, in more authoritarian contexts, for example.

I can also say, from Article 19's perspective, that we have not been specifically approached by any of the companies when it comes to the DSA risk assessments. This might be different for others; I cannot speak on behalf of other partners. I can say that there was one conversation about expectations, which then went to a completely different topic. But there has not been much specific outreach from the companies to us about these DSA risk assessments. The challenges are not exactly new. Of course, with this famous Recital 90, we are hoping that the commission pays more attention to whether there has been stakeholder engagement. Recital 90 essentially encourages, I would say, or it says that companies should consult independent experts, including civil society actors, as they draw up these risk assessments and the mitigation measures. For us, this is obviously a promising basis to advocate for that to go better, but again, it's all about how meaningful such engagement is.

I also want to mention very briefly, on the transparency front, which I consider one of the main gains of the DSA, that there is also data access for researchers, where there has been, I think, an article on Tech Policy Press which described it as one step forward, three steps back. The conditions the companies impose on researchers that have already been vetted are too burdensome, and proper access has not been given. So we really need companies to step up their efforts and honor these obligations, and the commission to pay attention to this, which will be crucial to see whether the DSA can be a success.

Gabby Miller:

David and Jason, you guys were also in the room. What were other participants saying about what I just asked Chantal?

David Sullivan:

None of these companies are monoliths. And especially when we're talking about the very large companies, we're talking about companies with a lot of different functions and teams that have some bearing on these huge and consequential issues that are covered by the DSA. So in some cases, companies have human rights teams that spend a lot of time engaging with human rights organizations. They have subject matter experts who are working on some of the specific issues covered by these four broad baskets of systemic risks that are specified in the DSA. But now with this regulation, they also have compliance functions. They have folks who are specifically dedicated to doing risk assessment, preparing these reports as well as dealing with the audits. And I think those individuals are people who have not had the same kind of exposure and engagement with independent experts that we tried to bring together in this forum.

And that's not to say that there isn't much more that can, should, and must be done, but one of the things to me that was interesting was, okay, we're actually bringing in a different set of people from the companies and also helping people understand what this actually looks like inside the companies. There is this layer of risk assessment that is not about specific subject matter expertise, that is often about coordinating how these risk assessments are prepared, including then going to the people inside trust and safety teams, content policy teams, and human rights teams. And so hopefully we were able to make a little bit more clear what some of this looks like inside companies even without the reports, which will be crucial for understanding this. I also think this is a long game, or it has to be a long game, because the first reports that will be produced, again, will be from risk assessments done in 2023 that are looking ahead to 2025.

So the cycle of assessment and reporting that we've been on, again with nothing public yet based on the text of the DSA and the requirements that were enacted, is not optimal. But we need to try to figure out how does this get better over time. And it may get worse before it gets better because I would imagine that the reports that will be prepared and coming out in November will not contain all of the things that civil society organizations might be looking for.

The other thing just to mention briefly, I think that was really interesting was to hear how little feedback the companies have gotten from the commission. And one of the things that came up, in a session that I was in, is that the only feedback on these risk assessments that companies have received from the commission has taken the form of requests for information and enforcement actions. So we are not seeing yet, we don't have indications yet, that there is some process where these reports get submitted to the commission or to the digital services coordinators in the member states where they're established and that there's some sort of feedback. Instead, you are somewhat getting more information if the commission seems concerned that you haven't done a good job. And that doesn't necessarily seem like the most healthy process for making this work the way I think we all want it to work.

Jason Pielemeier:

Chantal made some really good points. And David's point that we need to look at this in the bigger picture, or with a longer-term view, is a good one. This is going to be an iterative process. These risk assessments are going to have to be conducted at least annually. There will hopefully be mechanisms for feedback, both from the commission and from civil society and other actors, that will lead to improvement over time. But at GNI, I think we're very concerned that while we have faith in the process leading to improvement over time, the window for demonstrating improvement, not just to existing and potential regulators but to the public, is really quite narrow. It's been, I think, frustrating to say the least that there hasn't been more guidance publicly developed by the commission as to what these risk assessments really should contain, how they should be conducted, and what stakeholder engagement, as a critical part of that process, should entail.

But it's really incumbent, I think, on all of us, if we care about this space and digital rights more generally, to not just sit and wait for the regulator. If the regulator isn't going to provide that clarity, we want to make sure that productive conversations can begin in advance to lay a foundation for improvement taking place more quickly than it might otherwise. And so that's why it was really important for us to have this conversation and to come in very honestly, acknowledging that there are going to be a lot of differences of views and a lot of disappointment along the way. But the sooner we can process that and make improvements, the more likely we are to have a Digital Services Act that works toward enhancing digital rights. And that is a really important piece of the broader global puzzle.

Having a piece of regulation that was democratically enacted, that builds on the existing frameworks and approaches I mentioned earlier, the UN Guiding Principles and the OECD guidelines, and that doesn't create a structure of top-down, state-led content enforcement matters, because that latter structure is the primary counter-approach that we've seen in other parts of the world. I think that the DSA has a lot of potential, and we feel very committed to trying to see that potential realized as soon as possible. It's not going to be easy. It's not going to be as quick as we would probably like. But we think these kinds of conversations are a critical catalyst for condensing that improvement-over-time window.

Chantal Joris:

I do think that the main area of agreement was probably that we would like to see more guidance from the commission. They published guidelines, and had a consultation, just on mitigation measures around elections, but that's it. Maybe they're also looking at what sort of risk assessments they receive and then, on that basis, letting themselves be inspired as to what the assessments actually should look like. But I definitely also agree that there is now a window of opportunity to set the narrative and see how the DSA is enforced and applied and how it is really to be understood. There are many regulators around the world, again, we are globally active, that reference not only the DSA but also the UK Online Safety Act, that reference different regulatory frameworks. But if they feel that these are all quite similar, that they're really all content regulations, as I said at the start, and not regulations of platforms' or search engines' own systems and processes, I think that will really be a shame. And it will mean that the quite marked improvement the DSA could potentially represent will be missed.

David Sullivan:

I think one of the other things that was interesting that we learned: between the Digital Services Act and the UK Online Safety Act, two of the most significant pieces of regulation in this space, they have taken very different approaches in terms of how they've been implemented. The DSA started with the largest platforms having to basically immediately comply with these provisions, and so we don't have the kind of guidance that would be helpful in terms of understanding how to implement them.

The UK Online Safety Act is taking a staged approach, where they've been doing these consultations and providing thousands of pages of guidance before anyone has to do an actual risk assessment under the act. So companies are using guidance from the UK, I think, to craft how they do risk assessments under the DSA, which is a weird development. And I think one of the things that I'm looking at and thinking about is, if the process in Europe is that the companies do their own risk assessments absent guidance, and the commission looks at those and says, "Oh, I like the way that this company did that. And I like the way this company did that, but I don't like the way this company did this."

There's a danger that the direction these risk assessments will go is towards putting the most resources into them, which might benefit the largest companies and leave companies that are still large, but may not have the same level of resources as the very, very largest companies, having to do the same things that those companies are doing.

Gabby Miller:

So I'm hearing that it's not just that the public needs to have faith in the companies to perform these systemic risk assessments, but also faith in the commission and the DSA itself. So much of it is built around transparency, while there's also a lack of it built in. And I don't think everyone's aware that most of the engagement these companies have had with the commission has been around public requests for information and enforcement actions. Chantal, you just mentioned that the commission issued guidelines despite mostly not engaging directly with the companies, and there's a lack of clarity around what these assessments should look like. So I'm wondering what better engagement from the commission looks like.

Jason Pielemeier:

We did meet with the commission when we were in Brussels. We had given them a heads-up about this event. We've been in touch with them, both DTSP and GNI, I think independently and together, for several years now. I do want to say that the team that works at DG Connect, the directorate-general within the commission that's in charge of DSA enforcement, has been, I think, engaged and aware of some of these challenges and frustrations that we've been articulating. We were very clear with them that we were going to hold this event in an effort, in this vacuum of guidance, to build some conversation and create some space. They were encouraging of that. We also told them that we didn't want them to attend, because we thought that would create a very different dynamic that might make it more difficult for companies to open up.

That would obviously be the case if the regulator were themselves in the room. And they understood that. Whether or not we will produce recommendations or guidance coming out of these workshops is, I think, still to be determined. I think there have been some really good recommendations already provided by civil society organizations, and we certainly encourage that and want to see more of it. I think it's really on all of us to try and come up with what we think clear public positions could look like, to help provide more guidance and clarity around these risk assessment exercises. We're very interested in being part of these kinds of conversations and continuing to help host and facilitate more of them. As I mentioned, we'll have the follow-up session in January, which I think will be really rich because we'll have much more concrete information to discuss in the form of the reports that will be produced in November. And hopefully we can continue to have more discussions after that. Just as these processes will be iterative, we hope that the surrounding conversations and spaces can also continue to become more participatory, more transparent, and more inclusive.

Chantal Joris:

If I can maybe just add, from a civil society perspective: many of us have been calling for more structured involvement at all stages of the DSA enforcement process, with essentially the same argument as we use with the companies, that we've been working on fundamental rights issues associated with online platforms and search engines for so many years that we should be able to have a seat at the table. There were a number of roundtables with the commission where many civil society organizations were present, discussing different topics. Generally speaking, there's certainly room for improvement, but again, we have to figure out how we can make this as inclusive as possible. I do also want to mention the importance, both when it comes to the platforms and the commission, of including non-EU-based voices, and also of giving access to researchers that are based outside the EU.

I would say at the forum there were, interestingly, quite a lot of US voices present, although the US does not necessarily seem to be at the forefront of jurisdictions that are super keen to adopt tight content regulation. But yeah, we had some interactions. And maybe I also want to mention that, beyond us wanting to have a seat at the table and being involved through stakeholder engagement, both by the commission and by the companies, there is still an important pure watchdog role to be preserved for civil society, where we should be scrutinizing and criticizing and holding companies and regulators accountable, and bothering everyone, basically, because they can all still do better, as can we. And then, of course, it's a common learning curve, which is certainly acknowledged.

Gabby Miller:

Are you all issuing a set of recommendations?

Chantal Joris:

Some of the discussions were, for sure, interesting. They are also discussions that we might have had in a similar way in other forums, but really what we need is as much transparency and information as possible to be able to provide useful input. So let's see what happens when these summary reports get published. I'm sure there will be lots to talk about then, and we can take it from there, I think.

David Sullivan:

To me, I think that these kinds of conversations, one, are hopefully setting the stage for the much more robust and substantial kind of stakeholder engagement that is mentioned in Recital 90 of the DSA and I think is what civil society organizations and others want out of this process. So the idea is not that you attend one forum in Brussels for a couple of days and you've done your stakeholder engagement. It's more that we have a place to talk about what good looks like here and find the right types of conversations that can help enable that. One of the things that was really interesting throughout the two days that we spent in Brussels was that depending on the configuration of who was in the room and who was talking, very different conversations emerged. There were times when it was clear that company participants who are risk assessors were not the people to talk about some of the specific expert risks on disinformation and electoral processes or something like that.

There's somebody else in the company who should be speaking to that, and they weren't there. But there were other opportunities where you had just enough people from both outside and inside companies that people from outside could ask questions and get answers they maybe hadn't heard before about, okay, this is how this actually would work inside the company. That's the beginning, but it's just the start. We now need to translate that into ongoing and robust processes that will hopefully make all of the work that goes into this very rigorous cycle of risk assessment, mitigation, audit, and reporting something that actually has clear benefits for the rights of users of these services around the world, and is not just an enormous use of time and money from everyone involved.

Jason Pielemeier:

Coming back to where I started, we knew this was not going to be an easy conversation to organize and to have. But at the end of the event, we did put out a survey to all of the participants and got generally quite positive feedback. We also offered all of the non-company organizations that participated an opportunity to not be listed organizationally as participants if they didn't feel like the summary or the event itself was worth putting their name on. Not that being listed indicates any endorsement; it's just to give the broader public a sense of who was in the room. And nobody opted out of that, which I think is a sign that there is room for these kinds of conversations. I'm not sure everyone recognized or appreciated beforehand that those kinds of conversations could be valuable. I think now hopefully there's more appreciation for that, and that we'll see more of it, and DTSP and GNI are very happy to help facilitate that.

But we also really expect, as David alluded to, there to be much more independent engagement between civil society and companies, across researchers, academics, and others. The other thing is that hopefully the company participants who attended, many of whom, as David mentioned, are not the ones who typically interface with civil society within their companies, came away with a deeper understanding of the role that civil society can play, both in identifying risks and surfacing important challenges that they need to take into account.

But also in helping them think through how to mitigate those risks. Not just reading their reports, but actually engaging with them on how you grapple with these really thorny challenges that sometimes put the free expression of different users in contrast or in tension with one another, or that have mitigations that can have positive impacts on particular communities while otherwise impacting the privacy rights of other communities. Those are the kinds of thorny challenges that companies are grappling with. And civil society has this incredible repository of expertise and experience that they can tap into, and we certainly hope that they will be doing that more and more. I'm optimistic that, over time, that value proposition will become more and more clear.

Gabby Miller:

Thank you all for this wonderful conversation.

Jason Pielemeier:

Thank you.

Chantal Joris:

Thank you very much.
