Perspective

How Pornographic Platforms Address Gender-Based Violence Under the DSA

Marie Seck, David Klotsonis / Dec 17, 2025

Europe’s Digital Services Act (DSA) represents a landmark effort to create a safer online environment. Central to this framework are due diligence and transparency provisions designed to give the public a glimpse into the inner workings of online platforms. One key provision is Article 42, which requires Very Large Online Platforms and Search Engines (VLOPSEs) to publish a yearly report detailing how they have assessed and mitigated the risks their services pose to society. While most major platforms were designated by late April 2023 and are now publishing their second round of Risk Assessment Reports (RA Reports), platforms that host pornographic content were not designated until December 2023 and only published their first RA Reports last July. In what follows, we offer a first look into the reports by Aylo, Stripchat and XVideos, with a particular focus on Gender-Based Violence (GBV).

Given the specific nature and business model of these platforms, how they assess and mitigate GBV, one of the systemic risks VLOPSEs must address under the DSA, is particularly sensitive. We have now published our findings in a brief, exploring how these three platforms define, assess and mitigate GBV.

What is ‘Tech-Facilitated Gender-Based Violence’?

Over the last few years, the Center for Democracy and Technology and CDT Europe have extensively researched and analyzed the impact of Tech-Facilitated Gender-Based Violence (TFGBV) and the (co)regulatory framework surrounding it. TFGBV is a pervasive, global issue and one of the fastest-growing forms of abuse against women and girls. It is a continuation of GBV that manifests online as harassment and abuse based on one’s gender or gender expression. TFGBV can take a range of forms, such as verbal abuse, threats of violence, non-consensual image and video sharing, stalking, theft of private data, doxing (publishing or sharing someone’s personal data online without their consent), creating and sharing fake images or videos without consent, and more. Some of these forms may constitute illegal conduct or content, particularly in the European Union, which has adopted the Directive on violence against women and domestic violence, setting minimum criminal standards in some areas.

The DSA does not provide a definition of (TF)GBV, despite it being one of several risks that providers of VLOPSEs must assess, alongside “any actual or foreseeable” negative effects on fundamental rights; on civic discourse, electoral processes, and public security; and on the protection of public health and minors, as well as on individuals’ physical and mental well-being. While the list is intentionally open-ended rather than exhaustive, this typology of systemic risk categories has attracted criticism, both because of the vagueness of some of the terms and because the categories themselves are not clearly delineated from one another.

TFGBV cuts across risk categories

In the context of tackling TFGBV, this raises concerns about how effectively platforms can use these categories to assess a type of risk that is inherently cross-cutting. An example that illustrates this is Non-Consensual Intimate Images (NCII), one form of TFGBV that will soon be criminalized across the EU under the Directive. Far from being a stand-alone risk, NCII cuts across risks to multiple fundamental rights, as well as across different risk categories under the DSA. For instance, NCII undermines the victim-survivor’s right to privacy and damages their mental well-being. But it can also threaten civic discourse when it targets public figures, and it endangers the protection of minors when the victim-survivor is a child. Overall, this means that cases of NCII may systematically affect individuals and their fundamental rights across DSA risk categories. This raises the question of how VLOPSEs in general, and platforms that host pornographic content in particular, should engage with the complex, cross-cutting risk of TFGBV under the DSA, and of what mitigation measures are effective in that context.

What the reports reveal

Overall, the reports show too little consideration of the cross-cutting nature of GBV; instead, we see a siloed, superficial understanding of that risk. Concretely, this means that risks we understand to be related to GBV are not assessed as such; this includes, for instance, NCII, but also the dissemination of content linked to criminal cases of sex trafficking. Examining GBV as a cross-cutting risk could significantly change how it is assessed and how its severity is determined. For example, in its revised typology of risks, the OECD recognizes that cross-cutting risks can have wide-ranging impacts on children’s lives. Similarly, in its RA Report, Meta identifies the risk to mental and physical health as cross-cutting, which leads it to consider potential impacts on that risk when assessing severity across all systemic risk areas. While Meta’s implementation of this approach raises issues of its own, it also results in elevated risk scores for risks deemed to have a potentially elevated impact on an individual’s physical and mental well-being. This shows that examining a risk as cross-cutting can help a platform pay more acute attention to it in its risk assessment and, subsequently, invest more in mitigating it.

Additionally, the overall lack of a gender-sensitive understanding of risks, even when they are classified as GBV, leads to a limited assessment of them. For example, in its RA Report, Stripchat assesses risks in relation to the company itself, meaning risks are framed in terms of Stripchat’s reputation and liability rather than harm to users or content creators. Not only does this relegate harms to fundamental rights to the background, it is also directly at odds with the purpose of risk assessments as set out in the DSA. In another example, Aylo’s RA Report weighs a severe risk to a small number of victim-survivors against a relatively insignificant risk to the large number of users seeing the material. As a result, a severe risk that affects only a handful of victim-survivors but leaves viewers largely, if not completely, unaffected (such as NCII) receives a lower risk rating simply because its severe impact falls on comparatively few individuals, predominantly women.
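
To make the arithmetic behind this critique concrete, the minimal sketch below shows how a naive score that multiplies severity by the share of users affected ends up ranking a devastating harm to a few victim-survivors below a mild inconvenience experienced by many. The formula and the numbers are illustrative assumptions on our part, not Aylo’s actual methodology.

```python
# Hypothetical illustration only; this is not Aylo's actual scoring formula.
# A naive score that weights severity by the share of users affected dilutes
# harms that are devastating for a few victim-survivors but barely
# perceptible to the wider audience.

def naive_risk_score(severity: float, share_of_users_affected: float) -> float:
    """Both inputs are on a 0-1 scale; higher means more severe / more widespread."""
    return severity * share_of_users_affected

# NCII: maximal severity for a handful of victim-survivors, invisible to most viewers.
ncii_score = naive_risk_score(severity=1.0, share_of_users_affected=0.001)

# A minor irritation encountered by most users.
minor_issue_score = naive_risk_score(severity=0.1, share_of_users_affected=0.9)

print(f"NCII: {ncii_score:.3f}, minor issue: {minor_issue_score:.3f}")
# NCII: 0.001, minor issue: 0.090 -- the severe, gendered harm ranks far lower.
```

Scoring severity and prevalence separately, rather than collapsing them into a single multiplied number, would avoid this dilution.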

Finally, the key critiques put forth in our High-Level Feedback to the First Round of Risk Assessment Reports, published in March in collaboration with the DSA Civil Society Coordination Group, also apply to the RA Reports of the three platforms in question. These include a) the focus on user-generated risks at the expense of design-related risks; b) the lack of consultation with civil society in the development of mitigation measures, as mandated by Recital 90 of the DSA; and, crucially, c) the fact that the reports lacked the verifiable data needed to substantiate claims about the effectiveness of mitigation measures, in particular metrics on exposure to TFGBV, user engagement with control tools, and the impact of design changes. To meaningfully assess the risk of TFGBV, the data provided should be disaggregated as much as possible, at the very least by gender. For a comprehensive and intersectional assessment, disaggregation according to other factors of discrimination should also be included.

Pathways for improvement

We identified three avenues for implementing a cross-cutting approach to TFGBV:

1. Definition:

All the platforms should establish a working definition of GBV in their RA Reports as a baseline to inform their methodological approach and outcomes. This definition should acknowledge that GBV affects the fundamental rights of victim-survivors – a baseline only met in one of the three RA Reports. Any definition should also take into account the cross-cutting nature of GBV, as the phenomenon affects the lives of women and LGBTQIA+ communities in a wide range of ways, which do not fit neatly into DSA risk categories.

2. Risk assessment methodology:

It is crucial that platforms address GBV as a cross-cutting risk in their RA Reports, and this should be reflected in their risk assessment methodology. Platforms should attribute GBV-related risk scenarios to more than a single category: GBV should be recognized as such wherever it occurs, including when an occurrence has been assigned to another risk category. Conversely, where an occurrence has been assigned to the GBV category, its relevance to other risk categories should equally be recognized.
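
For concreteness, the sketch below shows one possible way a risk register could record such cross-cutting attributions, so that GBV-related scenarios surface under every category they touch. The data structure and the example scenarios are illustrative assumptions on our part, not drawn from any platform’s report.

```python
# Hypothetical sketch of a cross-cutting risk register; the category labels
# echo the DSA's systemic risk areas, but the structure and scenarios are
# illustrative assumptions, not taken from any platform's RA Report.
from dataclasses import dataclass, field


@dataclass
class RiskScenario:
    name: str
    primary_category: str
    cross_cutting_categories: set = field(default_factory=set)


scenarios = [
    RiskScenario(
        name="Non-consensual intimate imagery targeting a public figure",
        primary_category="gender-based violence",
        cross_cutting_categories={"fundamental rights", "civic discourse"},
    ),
    RiskScenario(
        name="Dissemination of content linked to sex trafficking",
        primary_category="illegal content",
        cross_cutting_categories={"gender-based violence", "fundamental rights"},
    ),
]


def scenarios_relevant_to(category: str) -> list:
    """Return every scenario that touches a category, wherever it was first filed."""
    return [
        s.name
        for s in scenarios
        if category == s.primary_category or category in s.cross_cutting_categories
    ]


print(scenarios_relevant_to("gender-based violence"))
# Both scenarios surface, even though one is filed primarily under illegal content.
```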

3. Mitigation:

A comprehensive and intersectional approach is more effective, as it allows mitigation measures to be designed in a gender-responsive way: measures that address different risk categories (be it illegal content, risks to fundamental rights, risks to civic discourse, or risks of GBV, to minors, and to physical and mental well-being) and that respond to the specific ways in which these risks affect women and LGBTQIA+ communities. Additionally, a cross-cutting approach to assessing GBV allows for a better understanding of the issue and a big-picture approach to tackling it, enabling root causes to be taken into account rather than restricting the focus of mitigation measures to a handful of its symptoms.

In our brief, we delve deeper into exactly how these platforms that host pornographic content tackle GBV in their RA Reports and where they fall short of a satisfactory approach. It is crucial that all actors do their part to combat GBV. The DSA’s transparency provisions allow for unprecedented insight into how large platforms go about this, and should be used by platforms, including those hosting pornographic content, to demonstrate their seriousness in combating TFGBV.

Authors

Marie Seck
Marie Seck works in the Online Expression and Civic Space Programme at the Centre for Democracy and Technology Europe, focussing on key EU policy developments relating to online expression, such as the Digital Services Act (DSA) and its delegated acts, and the upcoming Digital Fairness Act (DFA). Ma...
David Klotsonis
David Klotsonis is the Deputy Director of the Online Expression and Civic Space Programme at the Centre for Democracy & Technology Europe. He focuses on research and advocacy on the Digital Services Act and the Regulation on the Targeting and Transparency of Political Advertising. David’s work has t...
