Knocking at the Door of Transparency: The Digital Services Act and Infrastructure Providers

Konstantinos Komaitis, Louis-Victor de Franssu, Agne Kaarlep / Feb 21, 2023

Konstantinos Komaitis is an Internet policy expert and author, and serves on the Advisory Council of Tremau, a technology Trust & Safety start-up located in Paris. Louis-Victor de Franssu is the co-founder and CEO of Tremau, and Agne Kaarlep is its Head of Policy and Advisory.

Given the increasing focus on trust and safety and the responsibilities of actors across the Internet ecosystem, regulation has gradually shifted its focus to transparency requirements. What processes must be in place to deal with illegal content while protecting fundamental rights and due process? The Digital Services Act (“DSA”) is quite clear: if a company's services are to be considered safe and trusted, transparency is non-negotiable.

If there is one place on the Internet where transparency can provide some much-needed insight into content moderation, it is the infrastructure. The Internet's infrastructure is a space consisting of various actors who provide the everyday services that allow users to have a seamless, reliable, and secure Internet experience; however, it generally attracts little attention because it is obscure and predominantly technical. Actors at this level provide conduit, caching, and hosting services, seen in companies such as Cloudflare and Amazon Web Services and in services such as the Google Play Store, to name a few. Their operations are crucial, yet they often seem distant from public discourse; they are often considered inaccessible and, occasionally, unaccountable to everyday users.

The question, therefore, is whether the DSA could help shed some light on the practices of these otherwise invisible actors. Does the DSA manage to create a consistent and predictable environment for infrastructure providers that could help alleviate some of the opaqueness of their content moderation practices?

What complicates things further is that, given the Internet's evolution, it is difficult to draw clear lines as to where the Internet's infrastructure begins and where it ends, since various entities across the Internet stack perform different functions. For our purposes, however, infrastructure is the part of the Internet where entities such as domain name registries and registrars, Internet Exchange Points (IXPs), Content Delivery Networks (CDNs), infrastructure-layer hosting, and other such services exist.

Traditionally, infrastructure providers never engaged in content moderation. They were supposed to be agnostic about the data they carried, and their role was limited to facilitating the movement of packets from one Internet point to the next. They were meant to be facilitators of how data travels rather than decision-makers on whether that data should travel in the first place. This agnosticism has been crucial in ensuring that there is no discrimination against the sort of data that may exist on the Internet. To this end, and as a matter of principle, “governments should not require such interventions and infrastructure companies should not intervene voluntarily.”

Things have changed, though, and over the past few years there have been increasing calls for infrastructure providers to be more proactive in making content moderation decisions. For instance, after pressure from governments participating in the Internet Corporation for Assigned Names and Numbers (ICANN), a number of registries and registrars have become signatories to a framework that seeks to address Domain Name System (DNS) abuse. According to the framework, the domain name registrars and registries “aim to reinforce the safety and security of the DNS by highlighting shared practices towards disrupting abuse of the DNS.”

Against this backdrop, content moderation by infrastructure providers has recently become increasingly common, raising challenging and complex questions about free speech and access to information. What is more, there is a lack of consistency among infrastructure providers in their responses to illegal content. Most do not have transparent policies regarding content moderation, and of those that do, many decisions seem inconsistent and taken on a whim. Unlike the services users interact with directly, most of what happens at the infrastructure level does not make the news, yet its effect is profound: in the blink of an eye, Amazon Web Services (AWS), Cloudflare, or GoDaddy can make a user’s online presence impossible.

The DSA does not stay quiet on this issue. The new European regulation continues to ensure that Internet infrastructure providers should not feel compelled to get involved in moderating content, clearly stating that services are not liable for content they have no knowledge of. Caching and hosting services comply with the law if they act expeditiously once they have actual knowledge of illegal content, meaning they have been notified of the content by a law enforcement agency or a user (user notice & action). When it comes to mere conduits, there is effectively no liability for user notices under the DSA.

A key strength of the DSA is that it requires infrastructure providers to make their moderation processes more transparent and to explain the reasoning behind their decisions. In the case of caching and mere conduit services, obligations such as enhanced Terms & Conditions, providing contact points for service recipients, and, most importantly, publishing transparency reports will inevitably change how these providers operate. It is important to note, however, that while the DSA brings a further swath of requirements for social media services, its proportionate nature exempts most infrastructure providers from them. For example, caching services (e.g., Content Delivery Networks) and mere conduit services (e.g., Internet Service Providers) do not need to have a mechanism to receive user notices, send statements of reasons, or meet any risk management obligations. These services do, however, have the obligation to respond to removal orders issued by a judge or law enforcement.

Where things get tricky, however, is that not all Internet infrastructure players are the same, nor should they be treated as such. For example, Amazon Web Services (AWS) also acts as a hosting service. Hosting services, even those at the infrastructure layer, are required to put in place mechanisms to deal with user notices and to provide accurate statements of reasons, which is consistent with how such services moderated content even before the DSA. Consider Parler, a far-right social media app: its suspension was at the sole discretion of Google and Apple, in response to the role it played in the January 6th storming of the US Capitol. While many agreed with the deplatforming of Parler, the decision rendered the app almost inaccessible to users, revealing the power of such infrastructure-layer hosts.

The DSA does attempt to respond to this concern by suggesting that moderation should happen closest to the content in question: in the Parler case, content should first be moderated by Parler itself and, only if it fails to respond, by the app store. In this respect, the DSA does not completely resolve the power imbalance between an infrastructure host and the social media service it hosts, but it at least gives the hosted service some room for an initial reaction. Time will tell how infrastructure players deal with such moderation cases in the era of DSA obligations. On the plus side, thanks to the DSA’s transparency reporting obligations, we will most definitely find out. This is important, as it will help us better understand how such services operate, allowing us to create more focused and ‘smarter’ regulation.

In this context, the DSA should be seen as an attempt to standardize procedures, giving companies fewer options to act outside pre-established policies. Currently, we know little about how infrastructure players moderate content, as they either do not publish transparency reports at all or do so for only a small number of issues. This is bound to change as infrastructure providers start receiving notices and orders to remove illegal content from judges and other administrative authorities, such as law enforcement agencies. The hope is that the DSA's extensive transparency reporting requirements will ensure that infrastructure providers become more open and transparent about how they moderate content under their terms of service; they simply won’t be able to sweep such actions under the rug. How individual companies comply with these obligations remains to be seen, but there is clear value attached to it.

One case in which this new approach would have been beneficial is Cloudflare's well-publicized change of attitude with regard to Kiwifarms, a site known for fomenting toxic behavior and online harassment that ended up spilling over into the real world. If such a situation were to occur in Europe, the DSA would ensure both a clear process for Cloudflare's action against Kiwifarms and a greater degree of clarity about its final decision.

As a whole, the DSA seems to be filling a necessary void: the need for transparent and efficient processes. However, as they say, the devil is in the implementation details. Services, whether at the infrastructure level or higher up the Internet stack, must start adapting to a new environment where opaqueness is no longer an accepted practice. The DSA could push platforms toward a future where transparency and accountability sit front and center.
