EU Tests DSA with Investigative Proceedings Against Musk’s X
Gabby Miller / Dec 19, 2023

On Monday, Dec. 18, the European Commission launched “formal infringement proceedings” against X (formerly Twitter) over suspected breaches of its transparency obligations, failures to counter illegal content and disinformation, and deceptive design practices, all of which are potential violations of the Digital Services Act (DSA). The EU’s first-ever “in-depth investigation” into a Very Large Online Platform (VLOP) or Very Large Online Search Engine (VLOSE) comes after months of public warnings hinting that the Commission would throw the full weight of the DSA at X and its mercurial owner, Elon Musk.
The formal proceedings stem from a preliminary investigation conducted using X’s mandated risk assessment and transparency reports published this fall, as well as a request for information regarding illegal content on its platform amid the Israel-Hamas war, according to a European Commission press release.
The DSA and the Digital Markets Act (DMA) are a pair of laws passed by the European Union last year intended to create a safer, more open internet and level the playing field for businesses. The European Commission issued its first round of designation decisions this spring to seventeen platforms and two search engines that each reach at least 45 million monthly active users. In February, X reported to the Commission that it had 112 million monthly active users in Europe. Platforms were required to submit their risk assessments to the Commission by Sept. 1, and the transparency reports were published in early November.
In May, Thierry Breton, the European Commissioner who oversees digital policy, took aim at X after the company abandoned the European Union’s Code of Practice on disinformation. Breton tweeted that Twitter can run but “can’t hide” from its legal obligations under the DSA. The situation was later seemingly defused after Breton traveled to Silicon Valley to meet with Musk and CEO Linda Yaccarino. At the time, he announced that X was the first designated platform to undergo a voluntary “stress test” ahead of the August compliance deadline. Breton said X leadership took the exercise “very seriously,” and Musk later vowed to “obey the law” in an interview with the television channel France 2.
Later in the year, after Hamas launched a full-fledged assault on Israel on Oct. 7, graphic images and videos of abductions, killings, and more proliferated across social media platforms, along with misinformation and manipulated content. Musk, who has taken a more laissez-faire approach to content moderation since purchasing the platform, quickly faced intense scrutiny over X’s failure to efficiently remove violent and fake content, some of which is considered illegal in Europe.
Just days after the initial attack, EU Commissioner Thierry Breton posted a letter to X informing Musk that “the DSA sets very precise obligations regarding content moderation” and that the platform must take down illegal content in a “timely, diligent and objective” manner. A public back-and-forth between Musk and Breton ensued, with Musk calling for more specificity from the EU on its alleged violations, and Breton doubling down that Musk was “well aware” of “users’ — and authorities’ — reports on fake content and glorification of violence.” Since the onset of the Israel-Hamas war, Musk has several times amplified and endorsed antisemitic and hateful content on his platform.
“The opening of formal proceedings empowers the Commission to take further enforcement steps, such as interim measures, and non-compliance decisions,” read the Commission’s Monday press release. For Breton, this marks the end of the era in which big online platforms could behave as if they were “too big to care,” as he put it in the same press release. “We now have clear rules, ex-ante obligations, strong oversight, speedy enforcement, and deterrent sanctions and we will make full use of our toolbox to protect our citizens and democracies.”
Others noted that the investigation of X could have existential consequences for the company. “The European Commission will need discipline and resolve to assess what exact violations took place,” said Marietje Schaake, international policy director at the Stanford University Cyber Policy Center and a former Member of the European Parliament, in a statement to Tech Policy Press. “I hope this accountability mechanism will have positive ripple effects to protect the public interest from careless or even malicious corporate governance practices.”
The DSA outlines a number of procedural steps Europe can take to rein in uncooperative VLOPs and VLOSEs, ranging from noncompliance penalties to a temporary suspension of the platform in Europe. This includes providing a “reasonable period,” which is not defined in the DSA, for the company to remedy its violations and draw up an action plan for how it will address the infringements. If the Commission finds that the plan is insufficient or poorly implemented, the company may face penalties, including fines of up to six percent of its total worldwide annual turnover or periodic penalty payments.
Suspending a platform is the most extreme scenario and would require the Commission to first coordinate with the Member State where the provider is mainly located – Ireland in X’s case – to repeat the process of drawing up an action plan for remediation. However, as a last resort, a Member State can turn to its national judicial or administrative authorities to temporarily suspend a platform that repeatedly fails to comply with the DSA.
What’s playing out is, at least in part, a result of the pressure the EU Commission is under regarding its first enforcement actions, according to Julian Jaursch, project director at the not-for-profit tech policy think tank Stiftung Neue Verantwortung (SNV), based in Berlin. “The Commission had intrinsic motivation and faced external pressure to move fast but also had the need and desire for an airtight legal case. Now the Commission apparently felt it had a strong enough case to push ahead, somewhat earlier than expected,” Jaursch told Tech Policy Press in an emailed statement. Although other VLOPs have previously faced EU scrutiny over their business practices, it comes as “little surprise” to Jaursch that the Commission is kicking off DSA enforcement action with X after months of heightened attention.
How the Commission moves forward in this case may clarify how it intends to balance concerns over disinformation and illegal content online with other European values. “The European Commission’s recent moves against X are pursuing this point very aggressively, treating the DSA as an anti-disinformation law without yet clarifying how this can be reconciled with freedom of expression,” said Paddy Leerssen, postdoctoral researcher at the University of Amsterdam’s DSA Observatory, in a written statement to Tech Policy Press. The investigation could be an opportunity for the Commission to clarify how best to combat disinformation without resorting to restricting speech, according to Leerssen. “At worst, it could be the Commission’s way of bringing in censorship through the backdoor,” he said.
For now, the Commission says it will continue to gather evidence by sending additional requests for information and conducting interviews or inspections.