How Researchers Won a Legal Fight to Access X's Data Under the DSA
Daniela Alvarado Rincón, Simone Ruf, Jürgen Bering / Mar 6, 2026
A court in Berlin delivered a consequential ruling on Feb. 17, 2026, ordering X to grant Democracy Reporting International access to its publicly available data through its API, free of quota limits, through June 30, 2026. The access is intended to enable research into systemic risks ahead of the Hungarian elections. The English translation of the ruling is available here.
The decision coincided with the second anniversary of the Digital Services Act coming into full effect. On the day of the court ruling, researchers, policymakers and civil society representatives were gathered for the DSA and Platform Governance Conference in Amsterdam to assess how the EU’s digital rulebook is working in practice.
As participants debated the private enforcement of the DSA, the Berlin ruling offered a concrete example of that very issue in motion — and underscored how urgent the question has become. Geopolitical tensions over tech governance between Washington and Brussels have been rising steadily, and with them, doubts about whether political headwinds might blunt the DSA's edge.
The Berlin ruling pushes back against that doubt — at least with respect to data access, one of the DSA's core provisions — on multiple fronts. It confirms that Article 40(12) grants researchers a subjective right to data access, offers guidance on how the eligibility criteria in Article 40(8)(b-e) should be interpreted, and confirms that national courts have international jurisdiction over data access claims, meaning researchers can sue where they are based. More broadly, it is a reminder that strong and effective rule of law frameworks are the last line of defence when political dynamics shift.
DRI vs. X: The case at a glance
Article 40 of the DSA establishes different forms of researcher data access. Article 40(4) grants access to non-public data, with requests subject to vetting by competent authorities. Article 40(12), by contrast, grants researchers access to publicly available data, with platforms themselves responsible for evaluating and approving requests. In a way, Article 40(12) formalizes what researchers can potentially obtain through scraping. On paper, the provision appears relatively uncontroversial. In practice, it has proven anything but.
Researchers have consistently reported obstacles, delays, and denials when seeking data access from X, even in clear-cut cases (for a detailed account of the challenges researchers face with VLOPs' vetting processes, see DRI's latest brief). X's record in this area was serious enough to form part of the basis for the European Commission's €120 million fine, which found that X accepts only 4.7 percent of the data access requests it receives. Most requests thus end in rejection, typically after a lengthy back and forth.
This was also DRI’s experience. X had denied previous applications for data access, and in 2024 DRI had already initiated legal proceedings over an earlier research project on the effects of online discourse on elections in Germany. That case was lost: according to the court, DRI had waited too long before turning to the courts. The court largely relied on the relatively strict German case law on interim proceedings, under which applicants are expected to act swiftly. Waiting too long (typically more than one to two months) after becoming aware of the relevant circumstances may suggest that the applicant itself did not regard the issue as pressing, which can call the asserted urgency into question. Nonetheless, the court also held that it had jurisdiction to hear the claim, rejecting X's main line of argument — a significant partial victory.
This case marked the first formal cooperation between DRI and the Gesellschaft für Freiheitsrechte (Society for Civil Rights/GFF), a Berlin-based nonprofit association focused on strategic litigation.
DRI and GFF had learned their lesson: when DRI began its research project on online discourse and the Hungarian elections, it always set X strict deadlines, not accepting unnecessary delays. Unsurprisingly, the request was once again denied, leading to immediate legal action by DRI and GFF, this time supported by the law firm Hausfeld.
To obtain interim measures, the applicant must show that the case is urgent — meaning it cannot wait for a regular civil procedure. The court will grant such measures only if, without immediate action, enforcing the claim would be frustrated or seriously hindered. In practice, this means the applicant must demonstrate that, without the requested measures, they would face a concrete and significant threat to their legal interests — a condition clearly met for research projects tied to upcoming elections.
The case was once again heard by the Regional Court of Berlin — albeit before a different judge than in the previous proceedings, who this time denied the court’s jurisdiction and dismissed the application. What can be seen as a welcome sign of judicial independence is also frustrating for researchers. But that was not the end: the first-instance decision could still be appealed. From a strategic perspective, this opened the door for DRI and GFF to bring the case before the Higher Regional Court of Berlin (Kammergericht), marking the first time an appellate court dealt with Article 40 DSA.
Importantly, the Higher Regional Court of Berlin examines the urgency and whether the requirements of Article 40(12) are met only if it first considers itself competent to hear the case. In other words, establishing jurisdiction is the essential first hurdle.
The first battle: establishing jurisdiction
Against this background, the most pertinent question before the Kammergericht concerned its own jurisdiction: are researchers able to sue in their place of research or establishment, or must they bring their claims in Ireland?
There is, of course, a law governing these kinds of questions: the Brussels Ia Regulation (Regulation (EU) No 1215/2012). As a rule, legal actions against corporations must be brought at their (European) place of establishment. In this case, that would mean bringing proceedings in Ireland, where X is registered.
Having to litigate in another country will almost always act as a deterrent. Researchers are rarely in a position to assess the procedural and financial implications of a foreign legal system. In addition, Ireland, where most Very Large Online Platforms (VLOPs) have their European headquarters, is known for comparatively high litigation costs and significant financial risks in the event of a loss. Research projects rarely have the budget to finance cross-border legal enforcement, even in less expensive jurisdictions. Enforcement in Ireland is, therefore, for most researchers, hardly a realistic option. It is therefore essential that researchers can enforce their rights against platforms within their own jurisdictions.
There are, however, exceptions to the general rule under the above-mentioned regulation. Consumers, for example, may sue in their place of residence. EU legislation can also provide for specific jurisdictional rules, as is the case under the GDPR but not under the DSA. Another exception applies in matters relating to “tort, delict or quasi-delict.” It was on this ground that the court ultimately relied.
Framing the lack of data access as a tort, the Higher Regional Court in Berlin confirmed its jurisdiction, overruling the regional court.
Since the DSA itself does not contain explicit rules on court jurisdiction for individual enforcement, decisions like this are essential. Germany does not, however, recognize the concept of stare decisis, which means that courts in Germany are not strictly bound by decisions of higher courts. Hence, another German court could decide very differently about the question of jurisdiction in other data access cases. That said, in practice, courts often follow these precedents.
The court has argued convincingly and shown a way for data access to be more than a right on paper. While the decision does not resolve data access once and for all, it marks an important step toward a functioning system and could encourage other researchers to assert their rights in court, including outside Germany.
What the Berlin ruling changes for researchers
Access to platform data is a subjective right of researchers
Most fundamentally, the ruling confirms that Article 40(12) grants researchers a subjective right to data access. In other words, data access is not merely a general provision for platforms to comply with under Commission supervision, but a right that researchers and organizations themselves hold and can enforce directly before a court. As DRI reported last year, the Berlin Court had begun moving in this direction in our previous case. This interpretation matters enormously: to our knowledge, it is the first judicially recognized right of this kind anywhere in the world, and as such could carry precedent-setting weight well beyond the EU.
The ruling also provides a meaningful interpretive framework for Article 40(12)'s eligibility criteria. In ruling in favor of DRI, the Court worked through each of the requirements set out in Article 40(8)(b-e), and it offered concrete guidance on how those requirements should be understood.
Who counts as a “researcher”?
X tried hard to narrow the answer. It argued that only natural persons, not organizations, could qualify as researchers. The Court disagreed, noting that the DSA neither defines "researcher" nor restricts it to individuals. Other provisions within Article 40, such as paragraph 11, indicate that it applies to both. X then went further, characterizing DRI as "an activist organization with political objectives" and arguing this disqualified it. The Court dismissed this, too.
Setting the data protection benchmark
X also challenged the data protection measures outlined in DRI's application, arguing they were insufficient, though without specifying what exactly was problematic. The Court rejected this objection, noting that DRI had adequately described how access to the data would be restricted to authorized personnel, secured, and handled.
The Court further confirmed that DRI's request — limited to publicly available data within the timeframe of January 1 to June 30, covering three months before and three months after election day — was appropriate, necessary, and proportionate for research purposes. Beyond this case, this may set an early benchmark for what constitutes a reasonable data access request in the context of elections, offering a useful reference point for future disputes between researchers and platforms.
Systemic risks: the ruling confirms a broad and inclusive interpretation
The Court readily accepted that DRI's research objectives, focused on identifying coordinated inauthentic behavior (CIB) and foreign information manipulation and interference (FIMI) during the Hungarian elections, fell within the systemic risk category under Article 34(2)(c) of the DSA, which covers risks to democratic processes and civic discourse.
Unresolved tensions
Further legal questions remain open. Cases of this kind often demand swift action through summary or interim proceedings, yet the precise limits of these instruments in the context of data access are still to be tested.
Article 40(12) DSA itself leaves several questions unanswered: the outer boundaries of what constitutes publicly available data, the meaning of "without undue delay," the challenges that arise once data is provided (particularly around quality and accuracy), and the required connection between a research project's objectives and the systemic risks it seeks to examine. In the present case, the link to risks surrounding elections and civic discourse was clear and direct. In future cases, however, that connection may be less obvious and open to debate.
But even if all these questions are solved in (let’s be naive) five years, none of this will eliminate the structural hurdles that researchers have to deal with, including the extreme power imbalance between themselves and platforms. Researchers will always know that platforms can hire the most expensive law firms and command far greater resources.
And, overall, the atmosphere is getting rougher: the US House Judiciary Committee has published documents with the unredacted names of disinformation researchers in Europe (including names associated with DRI), and some individuals and organizations involved in DSA enforcement are even targets of US sanctions. Even during the court proceedings, the law firm representing X argued that this research was a means to unduly interfere with elections.
These pressures are not incidental. They reflect how much is at stake in holding platforms accountable to their own standards and the law, and, in their own way, affirm the importance of this work. We take them as a reminder of why cases like this matter, and as a reason to keep going.