Seven Arguments for Limiting Russmedia’s Reach in European Courts
Daphne Keller / Apr 16, 2026
The European Court of Justice on Kirchberg Plateau in Luxembourg. Shutterstock
This is the second of two posts about the recent Russmedia ruling from the Court of Justice of the EU (CJEU). The first explained the remarkable obligations Russmedia placed on a small Romanian platform and discussed the legal implications for online expression and information hosted by other platforms in Europe. This post explores potential limits to Russmedia’s impact, and arguments that other platforms might make in saying the new obligations do not apply to them.
Russmedia involved Publi24, a Craigslist-like free ad hosting site. The plaintiff was the victim of a fake ad posted on the site, depicting her as a commercial sex worker and including her photos and phone number. She argued that by hosting the ad, Publi24 acted as a data controller, improperly processing her personal data in violation of the General Data Protection Regulation (GDPR).
The Court agreed with this reasoning. It held that the platform could not claim legal protections under the EU’s longstanding intermediary liability law, the eCommerce Directive (ECD). Instead, its obligations were defined by the GDPR. To meet them, the Court said, Publi24 must proactively review uploaded ads to weed out such harmful uses of sensitive personal data, confirm the identity of advertisers who sought to post such data, and take measures to prevent ads from being replicated on third party sites. In shifting from intermediary liability law to the GDPR as a basis for platform obligations, the Court also opened the door to strong arguments that platforms should no longer notify users and allow them to appeal removal decisions — rights that are guaranteed to users under the ECD’s successor law, the Digital Services Act (DSA).
The Court does not present its requirements as prescriptive measures to be taken by all platforms or data controllers. Instead, it says, the “appropriateness of such measures must be assessed in a concrete manner, taking into account the nature, scope, context and purposes of the processing in question and the likelihood and severity of the risks.” (Par. 94) That open-ended language may give comfort to observers focused on the long-term evolution of law. Ideally it will give courts and regulators flexibility to adopt better rules in the future. But in the meantime, platforms looking to understand their own obligations will find little in the ruling to distinguish their businesses’ design and operations from Publi24’s. Behaving as if Russmedia does not apply to them would be, for most platforms, a significant gamble.
Platforms’ safest course in the wake of Russmedia will be to avoid risk to themselves through a show of compliance, carrying out whatever form of content filtering is economically feasible for them. If a platform’s Terms of Service (TOS) do not already prohibit a comfortable margin of legal expression beyond the content that European law requires it to remove, Russmedia adds reason to expand TOS prohibitions, avoiding hard calls and legal disputes in the future. I think many platforms will also choose to readily remove lawful or non-TOS-violating content and settle cases to avoid going to court. The largest platforms in particular will not expect Russmedia’s standard to be interpreted in their favor, and are unlikely to want to be the next test case.
Whoever does wind up litigating these questions will be in search of arguments for limiting Russmedia’s applicability. Below, I will list seven lines of argument they might use. I think some of these, alone or in combination, could succeed before motivated courts and regulators, particularly in cases where threats to users’ rights are obvious. But none, I think, have all that much support in the reasoning of Russmedia itself.
Argument 1: The ruling only creates obligations for particularly harmful content or sensitive personal data
Russmedia’s outcome seems largely driven by the “manifestly unlawful and deeply harmful” content at issue: a fake sex ad featuring the plaintiff’s phone number along with photographs taken from her genuine social media account. (AG Opinion Par. 20) Some commenters have suggested that the case should impact only platform moderation of such deeply harmful material, or only affect sensitive special category data under GDPR Article 9 — meaning information about an individual’s race, ethnicity, sexuality, or health, as well as political opinions, religious or philosophical beliefs, or trade union membership.
But the systemic changes that Russmedia requires, while motivated by uniquely harmful content, cannot be limited to that content in practice. The only way to find worst-of-the-worst content is to monitor a much broader corpus of user uploads. Doing so, particularly if humans must review previously unseen material, risks giving platforms actual or constructive knowledge of other potentially illegal content — and stripping them of immunity for non-GDPR claims like copyright infringement, fraud, or defamation.
The DSA attempts to remove this risk under Article 7, which protects monitoring carried out to comply with law. But mandates like the one in Russmedia are, in practice, quite difficult to reconcile with laws that treat “knowledge” or “control” as grounds for losing immunity. Article 7 doesn’t try very hard, providing only that platforms can’t be stripped of immunity “solely” because of “diligent” efforts to comply with legal obligations. Any plaintiff’s lawyer worth their salt will find ways to argue that a platform’s review of content was not “diligent,” or that such review is only one of several reasons the platform should lose immunity. Platforms must design their compliance efforts in recognition that (1) they might have monitoring obligations under Russmedia, (2) monitoring might give them knowledge or control over content sufficient to lose DSA immunity, and (3) DSA Article 7 might not save them.
Argument 2: The ruling only affects services with a particularly high risk of harmful posts involving sensitive information
A somewhat more plausible reading of the case is that platforms, even if they count as data controllers for hosted content, assume Russmedia obligations only if they know that users are particularly likely to upload harmful sensitive data. That would make sense if the ruling focused on — or even acknowledged the existence of — Publi24’s “Matrimonial” section. That section’s ads mostly depict scantily clad women and appear to offer paid sexual services; presumably this is where the ad impersonating Russmedia’s plaintiff appeared. Imposing more stringent obligations for ads in this section, but not in sections for used furniture or electronics, might make sense. The problem is that the Court does not discuss or consider this. Instead, it appears to reason that operating a general-purpose advertising platform was enough to make Publi24 legally responsible for anticipating and weeding out abusive uploads.
Sites that host user posts on impersonal subjects, like recipes or real estate listings, might still opt to argue that Russmedia does not apply to them for this reason. Ordinary social media sites — where people routinely post about themselves and other people — may find it less promising. Meta, for example, reports removing tens of millions of posts for nudity each quarter. That makes serious violations like the one in Russmedia highly foreseeable, even if they are only a tiny sliver of the site’s content. It’s also unclear what other forms of sensitive personal data platforms might need to remove, and should hence treat as a source of foreseeable risk. A Reddit post disclosing a third party’s HIV status, for example, might be a harmful use of sensitive health data — or it might be a harmless celebration or expression of support, welcomed by the data subject. A conservative read of Russmedia might suggest that Reddit should monitor HIV-focused forums and demand that users who post identifiable information confirm their identities. I doubt the users of such a forum would appreciate that outcome. The same questions about sensitive GDPR data arise for anything from gossip about a celebrity pregnancy to social media posts wishing someone a happy Diwali to photos of a private individual using — or not using — a wheelchair.
Argument 3: The ruling only affects marketplaces, ads, or commercial sites
Russmedia is, of course, literally only about the specific facts addressed and questions resolved in the case. The questions referred to the CJEU by the Romanian court ask about “a website on which free or paid advertisements may be published[.]” (Par. 43). The CJEU regularly refers to the uploaded content as “advertisements” and frames the case as one about Publi24’s responsibilities for ads in “its online marketplace.” (Par. 45)
The Court’s legal analysis never seems to turn on these classifications, though. It mentions no considerations unique to advertisements, as opposed to “organic” non-advertising user posts. As Erik Tuchtfeld puts it, the ruling “does not contain any restriction according to which these principles would not be directly applicable to online platforms” of all kinds. In my quick review of law firms’ published takes on Russmedia, I saw none arguing that the ruling’s impact could be limited to ads or marketplaces.
The ruling does, however, repeatedly mention that Publi24 became a controller by processing data “for its own commercial purposes” (emphasis added). It is not clear what this refers to, beyond Publi24 being a for-profit business. But the Court’s controller determination seems grounded in the idea that the site processes personal data for “commercial or advertising purposes which go beyond the mere provision of a service which he or she provides to the user advertiser.” (Par. 66, 67) Perhaps this at least supports an argument that “commercial” purposes trigger duties under Russmedia. In this scenario, Wikipedia might be safe — dodging the bullet that might otherwise have threatened any entry about living persons.
Argument 4: The ruling only affects hosts that take too much control over user content
It is also unclear what separate purpose of its “own” Publi24 pursued when it processed user data, beyond the purpose, common to many platforms, of offering a free hosting site that earns revenue through paid ad placements.
The idea that entities should assume more legal responsibility the more control they exercise over third party data or content is familiar in both data protection and intermediary liability law. In data protection law, this is pretty straightforward: controllers are those who determine the purposes and means of data processing. In intermediary liability, it is more complicated. Exercising “control” over content may be grounds for losing immunity under the DSA, but both lawmakers and platform users generally want platforms to intervene and make some decisions about content. This can include weeding out spam, illegal material, and material prohibited by a platform’s TOS. That goal is undermined if platforms risk losing immunity every time they review or moderate users’ posts.
DSA Article 7 attempts to address this “moderator’s dilemma” by allowing platforms to engage in certain kinds of proactive moderation without losing immunities. The CJEU in Cyando similarly allowed YouTube to manage user content in some fairly standard ways without losing immunity. It held that the ECD still immunized the platform when it attempted to proactively filter out copyright-infringing uploads; “recommend[ed] videos on the basis of users’ profiles or preferences”; indexed content for search purposes; and organized videos in sub-categories such as “entertainment,” “music,” or “film and animation”. (Par. 114, 30) The Court more recently reiterated and relied on this holding in Zalando.
The indicia of control that seemingly caused Publi24 to be deemed a legally responsible controller in Paragraph 72 of Russmedia are very similar to the ones immunized in Cyando. Like YouTube, Publi24 “organises the classification” of uploaded content into pre-defined categories – its listings would not be very usable otherwise. Publi24 “sets the parameters for the dissemination of advertisements… depending on the recipients,” much as YouTube “recommend[ed] content on the basis of users’ profiles or preferences” in Cyando. Finally, the CJEU notes that Publi24 “determines the presentation” of hosted content, as well as the “duration” — which seems unsurprising for time-limited content like classified ads.
It’s not entirely clear what the CJEU thinks Publi24 does, or exactly which of its behaviors other platforms must forego if they are to avoid that site’s fate. Advocates or authorities seeking to distinguish Russmedia may be reduced to avoiding any specifics, and simply insisting that other platforms vary — in undefined ways — in their “specific architecture”, “design choices”, or “specific design”.
Argument 5: The ruling only affects hosts that claim excessive rights in their Terms of Service
The ruling repeatedly emphasizes Publi24’s TOS as evidence that the platform acts as a controller of personal data in uploaded content. (Par. 72, 73) The TOS allowed Publi24 to “copy [uploaded content], distribute it, transmit it, publish it, reproduce it, modify it, translate it, transfer it to partners and remove it at any time, without the need for any valid reason for doing so.” (Par. 42)
This is, as lawyers at TwoBirds observed, “rather standard” contract language. Even the European Commission told the CJEU that it wasn’t unusual. (AG Par. 61) Long strings of verbs like those in Publi24’s TOS are there for good reasons. For starters, it’s obvious why platforms need rights to publish and remove content. Rights to copy, distribute, transmit, and reproduce content matter for copyright licensing. For comparison, the YouTube TOS described in Cyando granted a license “to use, reproduce, distribute and create derivative works and to display and perform” uploaded videos in connection with its service. (Par. 30, emphasis added) Comparable language exists on the TOS pages of many other regional European classified ads sites.
Permission to modify or translate content could cover transcoding uploaded files into new formats, or content moderation measures like blurring or labeling images. Translating content into another language is also something that many users — as both producers and consumers of content — want and expect. And if a platform shares content via APIs or other syndication tools — which, I have argued, is an unexceptional platform function — then it needs legal rights to transfer to partners.
To my mind that only leaves one potential problem with Publi24’s TOS: it lets platform operators remove content without “any valid reason.” In a case covered by the DSA, I suspect that such a provision would be unenforceable. Platforms need valid reasons for content moderation, and users can challenge unjustified removals. A platform like Publi24 might look better if it used milder TOS language, like reserving the right to remove content “as appropriate for business purposes.” That would be a pretty superficial change. But appearances matter in a world where infelicitous TOS wording can be enough to convince the CJEU that Publi24 was actually exploiting user data “for its own advertising and commercial purposes” beyond its basic service. (Par. 67)
Argument 6: The outcome might be different under the DSA
Arguably, Russmedia should quickly become obsolete because it only interpreted the relationship between the GDPR and now-superseded provisions of the ECD. If platforms can persuade courts that the ruling no longer applies now that the DSA is in effect, it would create an opportunity for a major do-over, putting the EU’s data protection and intermediary liability laws on a more even footing.
6A: The DSA has a different relationship with the GDPR
The DSA describes its own relationship with the GDPR using different words than the ones the ECD uses. Perhaps that means that Russmedia’s critical determination that the ECD did not apply to GDPR-based takedown cases would come out differently under the DSA. I’m wary of this argument, mostly because the operative statutory language is so open to interpretation. As Advocate General Maciej Szpunar dryly observed, the laws’ wording might “cause the reader to reach various different conclusions” without any one conclusion being clearly correct. (AG Par. 178)
This “the DSA is different” argument depends on some fairly eye-glazing language in the relevant laws. The GDPR states that it shall operate “without prejudice” to intermediary liability rules of the ECD. (Art. 2(4)) The ECD describes its own status in more deferential terms, saying that its rules simply “shall not apply to” questions addressed by EU data protection law. (Art. 1(5)(b)) But the DSA, which replaced relevant portions of the ECD, seems to assert more equal legislative footing with the GDPR. It reciprocates the GDPR’s own language, saying that the DSA rules are “without prejudice” to the GDPR.
Does the newly matched “without prejudice” language in the GDPR and DSA re-set the relationship, and avoid an outcome in which the GDPR simply trumps and displaces the EU’s intermediary liability laws? I’m no expert in intertextual relations among EU legislative instruments, but I’ve heard skepticism about this from a handful of European specialists. That said, Tobias Mast makes a far more detailed and textually substantiated version of the argument, and I would love for him to be right. Of course, as Martin Husovec pointed out to me, any ruling about this kind of legislative language could have sweeping implications for similar wording in unrelated EU laws, which complicates matters further.
If this argument did succeed, courts would have an opportunity to reconcile the two bodies of law, instead of picking a winner. The AG’s Opinion in Russmedia and the one adopted in a 2014 Italian Supreme Court ruling both offer ways to do that. (Incidentally, the Italian case arose from — and vindicated — a judgment call I made as a baby platform lawyer at Google about not needing pre-publication review of all uploaded videos. My boss couldn’t travel to Italy for several years while the case played out. He was nice about it.)
6B: The DSA’s rule against general monitoring obligations is stronger
Possibly the DSA can be read to create a new and different rule for one of Russmedia’s most controversial mandates: its requirement that Publi24 identify and block prohibited advertisements “before their publication.” (Par. 97) The EU’s older intermediary liability law, the ECD, said that EU Member States could not impose general monitoring obligations on immunized platforms. The DSA goes further, effectively restricting EU-level authorities from doing so as well. As the EU policy wonk known online as Gateklons has pointed out, this arguably means that the ECD left the CJEU free to impose a monitoring obligation grounded in the GDPR’s EU-level rules, while the DSA does not. Like so many interpretive questions about Russmedia, this one is complicated by the CJEU’s confusing assertion that the obligation it imposes on Publi24 is actually not a “general monitoring” obligation.
Argument 7: The outcome would be different if more fundamental rights arguments were raised
Finally, as multiple commenters have noted, the Russmedia ruling doesn’t address any fundamental rights beyond privacy and data protection. Perhaps if someone were to remind the CJEU about other rights, including expression and information rights, the outcome would be different. The EU Charter also includes the “freedom to conduct a business,” which the CJEU has in the past considered relevant given the burden of monitoring obligations on platforms.
Of course, the AG did point out tensions with these rights, and reminded the Court of the more nuanced balance it struck in rulings about the GDPR and web search engines. The Court’s silence on fundamental rights issues may be beneficial to claimants seeking a different outcome in future cases. At least the Court didn’t consider other fundamental rights and proclaim them irrelevant. This perhaps leaves more room to maneuver for litigants in national courts, who may raise fundamental rights either as independent arguments or as grounds for different interpretations of relevant legislation.
Conclusion
None of these arguments, standing alone, strike me as entirely convincing. But the overall argument to allow the DSA to do its work, and not sweep it aside in favor of governing online content using the GDPR, is compelling. Hopefully some combination of these points, or others I’ve missed, can help move future court rulings in that direction.