Why Europe Could Block X Over Grok Scandal But Probably Won’t
Owen Bennett / Jan 12, 2026

‘Demeaning and degrading’, ‘appalling’, ‘disgusting’. These are just a handful of the arresting descriptors that European Union and United Kingdom lawmakers have reached for in recent days, in response to the latest obscene content-related scandal engulfing the platform X and its owner, Elon Musk’s xAI.
The proliferation of non-consensual intimate imagery (NCII) on X, compounded by the initially unserious response of its operator, has led to calls for regulators to sanction the platform through the most invasive means available to them, namely access restrictions.
While regulators are understandably desperate to address the harm emanating from X (and to be seen to do so), they would do well to resist these calls.
What do European laws say?
Here, I look specifically at the EU’s Digital Services Act (DSA) and the UK’s Online Safety Act (OSA). While calls to probe and restrict access to X in response to this scandal have come from various parts of the world, the DSA and the OSA provide useful benchmarks for assessing the merits of access blocking in this case: they are amongst the most comprehensive online safety rulebooks in force today, and both foresee access blocking as a regulatory tool.
The DSA empowers the European Commission’s DG CNECT to take ‘interim measures’ against so-called Very Large Online Platforms (VLOPs) where ‘there is an urgency due to the risk of serious damage for the recipients of the service’ and where DG CNECT has a ‘prima facie’ belief that a compliance breach has occurred. While the nature of these ‘interim measures’ is left unelaborated in the law, a contextual reading of the DSA suggests that the legislator had access blocking in mind. In addition, the DSA grants Digital Services Coordinators (DSCs) explicit powers to seek the blocking of access to non-compliant services from their territory.
Under the UK’s OSA, the regulator Ofcom can apply for a court order requiring ISPs and other relevant intermediaries (such as search engines and payment providers) to restrict or frustrate access to services, in cases of continued non-compliance with the OSA’s regulatory duties and where, amongst other factors, the risk of harm to users warrants such a drastic intervention. As under the DSA, Ofcom can seek interim measures in urgent cases where there is a high degree of harm.
So far, so good in terms of regulatory powers.
Why this isn’t as straightforward as it might look
While internet access blocking has a longstanding judicial tradition in the EU and the UK, the legal provisions that enable it under the DSA and the OSA have not yet been applied in practice by regulators. Consequently, we have no direct precedents to tell us how courts would likely view attempts to block access to X in response to this scandal.
But by looking to the broader judicial and operational context that surrounds access blocking, we can make a reasonable assessment of how courts are likely to interpret the key legal principles governing the relevant DSA and OSA provisions, and the proportionality thresholds that they will want regulators to meet to render such enforcement actions legally sound.
A preliminary assessment suggests that courts would look skeptically upon attempts by regulators to impose access blocking on X.
1. Blocking access to X is unlikely to meet legal standards for proportionality
In implementing the DSA, the Commission’s DG CNECT and the national DSCs must adhere to the EU Charter of Fundamental Rights (CFR). Over the last 15 years, the Court of Justice of the European Union (CJEU) has developed a clear doctrine on how content restriction orders trigger fundamental rights considerations. In cases including Scarlet Extended (2011), Netlog (2012), and Google Spain (2014), the Luxembourg court has asserted that a ‘fair balance’ must be struck between all fundamental rights implicated by online content restrictions, including freedom of expression. In UPC Telekabel Wien (2014), a case which dealt directly with the access blocking of websites, the Court held that blocking measures must be ‘strictly targeted … [so as not to] impact internet users who are using the [ISP’s] services in order to lawfully access information.’
Looking specifically at the DSA, in a 2023 letter to civil society organizations, then-EU Commissioner Thierry Breton clarified that the DSA’s access blocking measures could only be used as a ‘last resort’, in the most ‘far-reaching situations’. Moreover, the legislation itself notes that access blocking, where undertaken by DSCs, ‘should not go beyond what is necessary to achieve its objective’ and should not have the effect of ‘unduly restricting lawful access to information’. Taken together, then, both the case law of the CJEU and the DSA’s inbuilt safeguards suggest that EU enforcers would have a very high threshold to meet to secure a blocking order against X.
A similar dynamic is at play in the UK. While not subject to the jurisdiction of the CJEU, UK courts must act in accordance with the European Convention on Human Rights (ECHR) and its jurisprudence. While case law at the national level around access blocking is limited, the supranational European Court of Human Rights (ECtHR) has developed a rich body of case law around freedom of expression online that has direct applicability to access blocking requirements under the OSA. In Yildirim v. Turkey (2012), the Strasbourg court found that the generalized blocking of Google-hosted blogging micro-sites, in response to illegal content found on one instance of the service, was disproportionate and incompatible with the ECHR’s provisions on freedom of expression. The Court has maintained a strong line on freedom of expression online in numerous subsequent cases, and its view has been buttressed by Recommendations of the Council of Europe (the intergovernmental organization which undergirds the ECtHR). While the ECtHR allows a ‘margin of appreciation’ for how national courts interpret the Convention’s provisions on freedom of expression, when faced with an access blocking order under the OSA, UK courts could not deviate widely from this clear jurisprudence.
The OSA itself limits the situations in which Ofcom can seek access blocking orders from UK courts. In its enforcement guidelines, the regulator concedes that business disruption measures amount to a ‘significant regulatory intervention’ and asserts that they can only be pursued in line with the regulator’s statutory responsibility to act in a manner that is ‘proportionate’ and ‘targeted’. As with the DSA, the statutory and judicial context surrounding the OSA suggests UK enforcers would have a very high bar to meet to demonstrate that blocking access to X is a proportionate response to the ongoing harm.
2. Significant technical and operational challenges to deployment
Attempts at access blocking under the DSA and the OSA give rise to notable technical and operational challenges that will likely be weighing on the minds of enforcers. Most obviously, access blocking can easily be circumvented by end users through virtual private networks (VPNs) and other privacy-enhancing technologies. The entry into force of the OSA’s age assurance requirements in July last year illustrated the extent to which content restriction efforts can be, and are, circumvented by users. Ofcom’s own research found that VPN use in the UK spiked to 1.5 million daily active users in the month following implementation of the age assurance rules. Restricting access to X in Europe, which would draw in technologically savvy adults who actively wish to access content on the service, would be so easily and widely circumvented as to amount to nothing more than an enforcement fig leaf.
Enforcers in the EU face a further, unique operational challenge in any effort to restrict access to X, arising from the DSA’s complex governance arrangements. While DG CNECT is responsible for VLOPs’ compliance with the VLOP-specific duties, restrictions on access to a service would require a high degree of coordination amongst the DSCs, who hold the relationships with ISPs in their jurisdictions. Patchwork enforcement of such a high-profile intervention, with X accessible in some Member States and not others, would damage the DSA’s credibility and the integrity of the EU’s single market. Yet the heterogeneity of judicial processes across the 27 EU Member States, and the reasonable likelihood that not all DSCs would support an access blocking intervention, make this operational hurdle a significant one for DSA enforcers.
What then is to be done?
In recent days, the Commission, Ofcom, and several DSCs have sought to regain the initiative in this unfolding crisis by issuing urgent information requests and documentation preservation orders, and by publicly signalling their willingness to use invasive enforcement powers against X.
At the time of writing, Ofcom has just announced a formal investigation into X for potential non-compliance with the OSA’s risk assessment, illegal content, and protection of children rules. It surely won’t be the last such investigation by a European (or indeed international) regulator. These redoubled efforts, which build on regulators’ already intensive supervisory engagement with X arising from prior concerns, should be allowed to run their course. There are prima facie grounds to believe that X has breached both the DSA and the OSA, and the relevant regulators have a suite of escalatory enforcement measures they can pursue before entertaining the ‘nuclear’ option of access blocking. The spectre of dawn raids, multi-million euro/pound fines, and criminal liability for senior staff should, for any rational actor, make continued non-compliance unappealing.
In any case, the drafters of the DSA and the OSA probably never foresaw a world in which the access blocking powers they ushered into being would be leveled against a service used daily by millions of Europeans, and one which remains a key conduit for politicians, public bodies, and corporate brands to engage with the public. Yet the safeguards built into the rulebooks, and the jurisprudential context within which courts would scrutinize attempts by regulators to impose blocking orders, mean such orders are likely to remain limited to the most egregious cases of non-compliance by services that systematically trade in illegal content and conduct.
Once the dust settles on this supervisory crisis, there will invariably be some soul-searching within the walls of European online safety regulatory agencies. Harms researchers had forewarned of the NCII-related risks posed by Grok, and its integration within the X platform last year should have set alarm bells ringing in regulatory corridors. The fact that both the DSA and the OSA require services like X to undertake fresh risk assessments when making significant service changes (e.g., integrating a GenAI application into the platform) is an important starting point for regulators seeking to learn from this episode. They will want to be much more attentive to similar roll-outs by regulated platforms in the future, and be ready to undertake far greater upfront scrutiny of those services’ risk assessment and mitigation efforts for GenAI integrations.
The pressure these regulators face to flex their access blocking muscles is unlikely to dissipate as more compliance failures emerge. The DSA/OSA theory of change, whereby statutory guardrails progressively stimulate a culture of procedural accountability within the operations and governance arrangements of online platforms, has not yet proven itself.
At a time when US political and corporate power is increasingly aligning in the effort to resist European platform accountability efforts, and with companies racing to integrate risky GenAI features into existing products with limited due diligence, the likelihood is that we will see more, not fewer, calls for European regulators to use the bluntest and most theatrical tool in their arsenal.
A rocky road lies ahead.