The EU’s Digital Omnibus Must Be Rejected by Lawmakers. Here is Why.
Itxaso Domínguez de Olazábal / Dec 3, 2025
Source: People vs Big Tech
Last month, the European Commission released the Digital Omnibus. Some early media coverage, including commentary arguing that both the Commission and civil society misunderstand the package, frames the debate around competitiveness and burdens. But this lens misses what the proposal actually does. Key parts of the text weaken the General Data Protection Regulation (GDPR) and the ePrivacy framework, two laws that anchor fundamental rights across Europe and underpin the rest of the EU digital rulebook.
A leaked draft circulated before publication revealed attempts at a dangerous rewrite of these laws. Public criticism forced the Commission to roll back some of these elements. The final text appears to avoid the most extreme ideas, but it still embeds structural weakening. Some observers present this as a pragmatic recalibration. That framing ignores the broader deregulatory momentum shaping the proposal and the political context in which it was introduced. Removing the worst ideas does not fix the larger problems, which concern both substance and process. To top it all off, the Omnibus was launched alongside the opening of the Digital Fitness Check, which leaves room to amend any law with a digital component.
EDRi and civil society across Europe oppose the Digital Omnibus and its rollback of data protection and privacy. Here is why.
1. The Omnibus is not evidence-based and lacks a rights impact assessment
Changing core rules on privacy and data protection requires clear evidence and rigorous analysis. The Commission has provided neither. It did not publish a fundamental rights impact assessment, nor did it present data showing that current rules harm responsible firms. It also failed to explain why enforcement challenges would justify weakening rights.
Even with proper procedure, some changes would remain unacceptable. Laws that give effect to Article 7 (the protection of private and family life) and Article 8 (the protection of personal data) of the Charter of Fundamental Rights of the European Union safeguard people against both state and corporate intrusion, and should not be lowered under any circumstances.
2. It does not really help small and medium enterprises (SMEs)
Smaller businesses that the Omnibus claims to protect have already invested significant resources to comply with the GDPR. What they asked for was practical support: templates, standardized forms, consistent guidance and predictable enforcement. They did not ask for new legal bases, new exceptions or narrower definitions, and they did not ask for broader room for interpretation.
The Omnibus introduces complexity instead of clarity. Smaller firms will face uncertainty, while larger actors gain more room to interpret rules. This contradicts the stated aim of supporting European businesses and ignores the multiple factors behind the EU’s lack of competitiveness.
3. It narrows the scope of personal data and fragments protection
The GDPR relies on an objective test: if a person can be identified by anyone using reasonable means, the information is personal data. This avoids subjectivity and ensures a consistent baseline.
Stretching the interpretation of a recent Court of Justice of the EU (CJEU) ruling far beyond its context, the Omnibus moves toward a controller-specific view: a firm may argue that information is not personal if it believes it cannot identify anyone, even if others could. Two firms could classify the same data differently, and regulators would have to evaluate claims rooted in internal processes they cannot fully audit. The proposal also allows the Commission to decide when certain pseudonymized datasets should be reclassified as non-personal.
This creates fragmentation. It lets firms treat behavioral signals, pseudonymized identifiers and other high-value datasets as non-personal, reducing the reach of the GDPR and undermining the clarity it was designed to ensure. Far from simplifying the rules, the change multiplies classifications and invites divergent interpretations.
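To see the problem concretely, consider a minimal sketch (the controllers, data and hashed pseudonym here are hypothetical, purely for illustration): the same token is unlinkable for one firm but trivially re-identifiable for another, so a controller-specific test yields two classifications for identical bytes, while the GDPR’s objective test treats the token as personal data in both hands.

```typescript
import { createHash } from "node:crypto";

// Illustration only: pseudonymize an email address with a one-way hash.
const pseudonymize = (email: string): string =>
  createHash("sha256").update(email.toLowerCase()).digest("hex");

const token = pseudonymize("alice@example.com");

// Controller A holds only tokens. Under a controller-specific test, A
// could argue the token is not personal data, since A alone cannot
// trace it back to anyone.
const controllerA: Set<string> = new Set([token]);

// Controller B kept the email-to-token mapping. For B, the identical
// token resolves straight back to a person.
const controllerB: Map<string, string> = new Map([
  [token, "alice@example.com"],
]);

console.log(controllerA.has(token)); // true, yet A would call it "anonymous"
console.log(controllerB.get(token)); // "alice@example.com": clearly personal

// Under the GDPR's objective test, the token is personal data for both
// controllers, because someone (here, controller B) can identify Alice
// by reasonable means.
```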
4. It paves new routes for AI training with personal and sensitive data
The Omnibus broadens the use of “legitimate interest” as a legal basis for AI. Under current law, this ground is tightly constrained and difficult to invoke for large-scale processing. The proposal reframes AI development and operation as an activity that firms can justify as serving their own interest.
This turns legitimate interest into a far easier route for AI training than the GDPR ever intended. New clauses allow personal and sensitive data to remain in training datasets whenever removal is labelled disproportionate, normalizing the presence of sensitive traits in those datasets and making the right to object ineffective in practice.
This opens wide channels for scraping and reuse. People often do not know that their data has entered training pipelines, and when they do, they cannot object in time. Removal from trained models is almost impossible. These changes erode purpose limitation and reduce people’s control over sensitive information. They do not address any enforcement problem; instead, they remove the guardrails that limit how AI systems absorb personal and sensitive data.
5. It weakens access and transparency, restricting oversight by the public
Transparency and access are gateway rights: they allow people to understand how firms use their data. They help journalists, workers, litigants and researchers to uncover unfair or unlawful practices. The Omnibus introduces grounds for refusing access, including claims that a person already knows the information or has an inappropriate motive. These subjective and vague tests allow firms to limit oversight. This undermines accountability and contradicts case law that interprets transparency broadly as essential to data protection.
6. It expands automated decision-making
Automated systems shape access to credit, jobs, insurance and public services. Article 22 of the GDPR provides strict limits, ensuring that people are not subject to significant decisions made solely by automated systems without meaningful human involvement.
The Omnibus reframes Article 22 as a broader permission for automation, giving firms more room to deploy automated systems in sensitive contexts where human review matters. It lowers the threshold for what counts as meaningful human involvement and turns a prohibition with exceptions into an authorisation subject to conditions, weakening safeguards against unfair decisions. This shift makes it easier to rely on automation by default and harder for people to challenge decisions that significantly affect their lives.
7. It redefines “scientific research” in ways that weaken purpose limitation
The GDPR grants research specific flexibility because it serves the public interest. These safeguards depend on a clear boundary between scientific research and general product development. The Omnibus expands the definition to include a wide set of industrial and commercial activities, blurring that line and allowing firms to justify broad reuse of data without full compatibility assessments. This weakens purpose limitation and gives large firms room to rely on research-based arguments for data reuse that is not tied to rigorous scientific standards.
8. It creates conflicts across the digital rulebook
EU digital laws are interdependent. The Digital Services Act relies on the GDPR for rules on targeting. The Digital Markets Act relies on clear limits on cross-service profiling. The AI Act relies on safeguards for sensitive data and automated decision-making. The Cyber Resilience Act depends on strong protection for device integrity. By weakening core GDPR principles, the Omnibus destabilizes every law in the EU digital rulebook, creating contradictions that reduce legal certainty and hamper enforcement.
These changes reinforce one another, creating a system that favours wide reuse of sensitive data and weakens deletion rights. The implications are global: many countries have adopted data protection laws modelled on the GDPR. If the EU lowers its standards, it weakens its leverage abroad and signals that rights-based digital governance is negotiable, benefiting actors who seek to dilute protections in other regions.
9. It creates a domino effect
The danger lies not in any single clause but in how several changes across the Omnibus align and shift how data is handled. Each change might look technical in isolation, but together they reshape how data enters AI systems and how people can contest those uses.
A good example is how the package expands the use of sensitive data in AI systems. The new Article 9(2)(k) GDPR lets companies keep sensitive data in training or testing sets whenever removing it would take “disproportionate” effort, which in practice makes retention the default. The AI Omnibus then adds a new Article 4a to the AI Act and removes its Article 10(5), giving providers a broad “public interest” legal basis to use sensitive data for bias detection across many types of systems, not only high-risk ones. At the same time, AI training remains linked to legitimate interest and to an expanded definition of scientific research. These changes work together: they make it easier to reuse sensitive data throughout AI pipelines and shift oversight away from the stricter data protection rules people rely on.
The combined effect is clear. More sensitive data will enter AI models. Removal becomes harder. Oversight moves from data protection regulators toward AI governance bodies with narrower mandates. People will face more profiling based on race, health, religion or sexuality, with weaker tools to challenge systems that rely on those inferences.
10. It weakens privacy (yes, it’s a different fundamental right to data protection in the EU!)
ePrivacy protects the privacy of communications. It stops companies and states from looking into what your phone does, who you talk to or when you interact with a service unless you give clear consent. This includes metadata such as who you contacted, when and for how long. These details reveal sensitive patterns about your life even without reading the content of a message.
The Omnibus weakens this protection. It moves parts of ePrivacy into the GDPR. This matters because the two laws do different things: ePrivacy protects the act of accessing your device, while the GDPR regulates how data is used once access has already happened. Shifting rules into the GDPR removes the strong consent requirement and replaces it with weaker obligations. It also creates new exceptions that allow access to device data without consent. This makes it easier for firms to track how people use their phones, apps and connected devices.
Many people think ePrivacy is only about cookie banners. These pop-ups were created by the tracking industry to preserve its profiling model. They push people to click quickly, interrupt browsing and make refusal harder than acceptance. The real issue was never the cookies; it was the tracking system behind them, which followed people across sites and built behavioural profiles. The Omnibus does not solve this. Consent fatigue remains, banners remain and the new exceptions widen the space for non-consensual access to device data, making it harder for people to control how they are tracked online. One element of the package, however, offers a bit of good news.
The silver lining
The only element that resembles genuine simplification is the reference to privacy signals. These replace thousands of fragmented consent interactions with a single, standardized choice made at the device or browser level: instead of every website forcing people through pop-ups, banners or dark patterns, a machine-readable signal expresses one refusal across the entire ecosystem. Browser and device-based signals already show how people can refuse tracking globally, with no banners; they strengthen control and ease enforcement. But they only deliver this when they are binding and enforceable, and the Omnibus leaves major gaps in scope and timing that must be corrected. Nor do the signals compensate for the new exceptions that enable tracking without consent, or for carve-outs such as the media exemption, which directly contradict their purpose.
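As one concrete illustration of how such a machine-readable signal can work in practice, the sketch below honours the existing Global Privacy Control (GPC) header on the server side. GPC is a real browser-level signal, but the Omnibus does not prescribe it or any specific protocol, and the minimal Express setup here is illustrative only.

```typescript
// Sketch: honoring a browser-level privacy signal on the server.
// Global Privacy Control (GPC) is used as an existing example of a
// machine-readable opt-out; the Omnibus does not mandate this protocol.
import express from "express";

const app = express();

app.use((req, res, next) => {
  // Participating browsers send "Sec-GPC: 1" with every request,
  // expressing one standing refusal instead of per-site banner clicks.
  const optedOut = req.header("Sec-GPC") === "1";
  res.locals.trackingAllowed = !optedOut;
  next();
});

app.get("/", (_req, res) => {
  // A binding signal means no banner and no tracking: the refusal was
  // already expressed once, at the browser or device level.
  res.send(
    res.locals.trackingAllowed
      ? "No opt-out signal received."
      : "GPC signal honored: tracking disabled, no banner shown."
  );
});

app.listen(3000);
```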
Still, the idea shows what a rights-centred approach to simplification looks like: a single, global way for people to refuse tracking, paired with clear duties for controllers. That makes the rest of the package stand out for the wrong reasons, since the other changes weaken protection and expand room for actors who already shape the digital ecosystem to their advantage.