January 28, 2026

EU Omnibus: Key Proposed Changes for Global Businesses


The European Commission’s EU Omnibus Proposal package represents an attempt to streamline Europe’s digital regulatory framework. The package comprises two legislative proposals: the “Digital Omnibus”, which proposes targeted amendments to the GDPR and consolidates data economy rules, and the “Digital Omnibus on AI”, which proposes targeted simplifications to the EU AI Act. Rather than creating new standalone laws, the Proposal seeks to consolidate and harmonise rules across the digital regulatory landscape.

Practically speaking, if adopted, the EU Omnibus Proposal may bring regulatory relief in certain areas, but it would also require organisations to review their existing processes, documentation, and preparedness in areas such as breach reporting, privacy compliance (including legal bases for processing, disclosures, and use of personal data in the context of AI), and data-reuse governance.

Reactions to the proposal from EU institutions and industry have been split, with some expressing support and others criticism. EU Member States have signalled caution regarding the scope of the Commission’s ambitions, and the Cyprus Presidency of the EU Council has drafted an initial compromise on the AI Act simplifications that reinstates several obligations the Commission had proposed to remove, including with regard to timelines. We expect negotiations to be heated.

In this Legal Update, we first provide an overview of the key proposed changes across four sections: Data Privacy, AI, Cybersecurity, and the Data Act; we then provide commentary on the most significant proposals and their likely impact on organisations.

Changes to Data Privacy Rules

The EU Omnibus Proposal seeks to update the GDPR to address the use of AI, research requirements, pseudonymisation, transparency fatigue, and fragmented breach-reporting requirements.

GDPR Art. 4(1) – definition of personal data
Current position: Personal data includes information relating to an identified or identifiable natural person; identifiability is assessed by reference to the “means reasonably likely to be used.”
Proposed change: Clarifies that whether information constitutes personal data is assessed on a relative, entity-specific basis. Data may be treated as non-personal for a particular controller where that entity lacks reasonable means to identify the individual, while the same data may still constitute personal data for another recipient that does have such means. Empowers the Commission to specify criteria for determining when data resulting from pseudonymisation should no longer be treated as personal data for certain entities. This is intended to reflect existing CJEU case law and provide greater legal certainty for low-risk processing scenarios, while preserving GDPR protections where re-identification remains realistically possible.

GDPR Art. 6(1)(f) – legitimate interests
Current position: Processing is lawful where necessary for the purposes of the legitimate interests pursued by the controller or a third party.
Proposed change: Expands “legitimate interests” to include:
  • AI development and operation, subject to balancing and safeguards, with an unconditional right to object; and
  • scientific research, which does not exclude that the research may also aim to further a commercial interest.

GDPR Art. 9(2) – special categories
Current position: Processing is prohibited unless an Art. 9(2) condition applies; there are no AI-specific carve-outs, and biometric verification generally requires a derogation.
Proposed change: Adds new exceptions permitting:
  • processing of special-category personal data for AI development and operation, subject to strict measures to avoid, detect, and remove such data or, where removal is disproportionately difficult, to protect it against use or disclosure; and
  • biometric verification where necessary and under the data subject’s sole control.

GDPR Art. 12(5) – manifestly unfounded or excessive requests
Current position: Controllers may refuse, or charge a fee for, manifestly unfounded or excessive requests, and bear the burden of proof.
Proposed change: Clarifies that abusive Art. 15 access requests (e.g., for leverage or harassment) may justify refusal or fees, and reduces the controller’s burden of showing that a request is “excessive” (as distinct from “manifestly unfounded”).

GDPR Art. 13(4) – transparency information
Current position: Controllers must provide prescribed information when collecting personal data, with limited exceptions.
Proposed change: Exempts controllers from this duty where there is a clear, limited relationship, low-intensity processing, and a reasonable assumption that data subjects already have the information, except in cases of transfers, automated decision-making, or high-risk processing.

GDPR Art. 33 – breach notification
Current position: Notify the supervisory authority (SA) without undue delay and within 72 hours, unless the breach is unlikely to result in a risk to rights and freedoms.
Proposed change: Aligns the SA notification threshold with the data-subject notification threshold (notification only where the breach is likely to result in a high risk). Extends the deadline to 96 hours. Establishes a single EU reporting entry point. Tasks the EDPB with proposing an EU common notification template and a list of circumstances indicating “high risk”.

GDPR Art. 35 – DPIAs
Current position: National SAs issue lists of processing operations requiring or not requiring DPIAs.
Proposed change: Harmonises via EU-level lists, with the EDPB to propose single EU lists plus a common DPIA template and methodology.

GDPR Art. 88b (new) – machine-readable signals
Current position: Not applicable.
Proposed change: Establishes a framework for standardised, automated, machine-readable consent/objection signals.

Changes Related to EU AI Act

The Digital Omnibus on AI proposes targeted amendments to the EU AI Act to address implementation challenges and reduce administrative burdens for AI providers and deployers.

Art. 4 – AI literacy
Current position: Providers and deployers must take measures to ensure a sufficient level of AI literacy among their staff and other relevant persons.
Proposed change: Transforms the obligation into a requirement for the Commission and Member States to “encourage” providers and deployers to ensure AI literacy, rather than a direct duty on providers and deployers.

Art. 6(3)-(4) – registration of non-high-risk AI systems
Current position: Providers of AI systems listed in Annex III must register in the EU database even where the provider concludes the system is not high-risk under Article 6(3).
Proposed change: Removes the registration obligation for AI systems where the provider has determined the system does not pose a significant risk, though documentation of the assessment must still be maintained and made available on request.

Art. 10(5) / Art. 4a (new) – processing special categories of personal data
Current position: Providers of high-risk AI systems may exceptionally process special categories of personal data where “strictly necessary” for bias detection and correction.
Proposed change: Extends this exception to providers and deployers of all AI systems and models (not just high-risk), and relaxes the standard from “strictly necessary” to “necessary”.

Art. 57 – AI regulatory sandboxes
Current position: Member States must establish national AI regulatory sandboxes; there are no EU-level sandboxes.
Proposed change: Introduces the possibility for the AI Office to establish EU-level regulatory sandboxes for AI systems based on general-purpose AI models or integrated into very large online platforms.

Art. 75 – AI Office supervision
Current position: The AI Office supervises AI systems based on a general-purpose AI model where the model and system are developed by the same provider.
Proposed change: Extends the AI Office’s exclusive competence to AI systems that constitute, or are integrated into, designated very large online platforms or very large online search engines.

Art. 113 – implementation timeline
Current position: High-risk AI system rules under Annex III apply from 2 August 2026; Annex I rules apply from 2 August 2027.
Proposed change: Delays application until 6–12 months after the Commission confirms that adequate compliance support measures are available, but no later than 2 December 2027 (Annex III) or 2 August 2028 (Annex I).

Changes Related to Cybersecurity

The EU Omnibus Proposal seeks to eliminate duplicative incident-reporting obligations across the various regulations by introducing a unified reporting channel and aligning templates.

ePrivacy Directive 2002/58/EC Art. 4 – security and breach notifications
Current position: Requires providers of publicly available electronic communications services to safeguard security and notify personal data breaches.
Proposed change: Repeals this obligation as obsolete in light of NIS2 risk-management and GDPR security and breach rules; security and breach notifications shift to the GDPR/NIS2/DORA/eIDAS according to sectoral scope.

NIS2 / eIDAS / DORA / CER / DGA – incident reporting
Current position: Each of these regimes has its own reporting platform.
Proposed change: Establishes an EU single entry point, operated by ENISA, for incident reporting under these multiple acts.

NIS2 – Cyber Resilience Act (CRA) linkage
Current position: NIS2 establishes timelines and reporting content for significant incidents; the CRA sets separate duties for vulnerability and severe incident notifications by manufacturers.
Proposed change: Inserts an explicit cross-recognition: where a manufacturer has notified a severe incident under the CRA and that notification already contains the information NIS2 requires, it will be deemed to fulfil the NIS2 submission for that incident.

Changes Related to Data Act

Much of the EU Omnibus Proposal focuses on consolidating data-economy rules into the Data Act.

Articles 4(8) and 5(11) of the Data Act – trade secrets
Current position: Users of connected products and related services can access and share “readily available” data. Data holders may refuse disclosure only if they can demonstrate they are “highly likely to suffer serious economic damage” from disclosure of trade secrets.
Proposed change: Adds a new explicit ground for refusal: a data holder may refuse disclosure where it “poses a high risk of unlawful acquisition, use, or disclosure” to entities in, or under the control of, third-country jurisdictions offering weaker data protections.

Articles 14 to 22 of the Data Act – business-to-government (B2G) data sharing for “exceptional need”
Current position: These articles allow public sector bodies to request data held by private entities in situations of “exceptional need”, which include public emergencies and other non-emergency public-interest tasks.
Proposed change: Narrows these provisions from “exceptional need” to “public emergencies”. Also adds a targeted complaints regime for B2G data sharing, enabling data holders and requesting public bodies to bring complaints to the competent authority about refusals, modifications, compensation levels, or onward transmission decisions.

Article 36 of the Data Act – essential requirements for smart contracts used to execute data-sharing agreements
Current position: This provision sets “essential requirements” for smart contracts.
Proposed change: Deletes Article 36 entirely and removes the essential-requirements regime for smart contracts, citing legal uncertainty, cost and innovation concerns, and a lack of standards.

Chapter VII a-c of the Data Act (new)
Current position: Not applicable.
Proposed change: Consolidates under the Data Act several rules currently spread across various regulations concerning data intermediation services, the free flow of non-personal data, and the re-use of open data held by public sector bodies, and streamlines them (e.g., via common definitions).

Article 32x of the Data Act (new) – transfers of non-personal protected data to third countries by re-users
Current position: Not applicable; comparable transfers are currently addressed only under Chapter II of the Data Governance Act for certain protected public-sector data.
Proposed change: Creates a dedicated third-country transfer rule for re-users, requiring prior notification of intended transfers and contractual commitments to uphold trade-secret, IP, and statistical-confidentiality protections under Member State jurisdiction and model clauses.

Article 42 – role of the European Data Innovation Board (EDIB)
Current position: The EDIB (established under the DGA) supports consistent application of the data framework and coordination among authorities.
Proposed change: Integrates and strengthens the EDIB within the Data Act to:
  • expand its remit to advise on consistent enforcement across the newly consolidated chapters;
  • support capacity-building;
  • coordinate policy across data economy domains; and
  • facilitate guidance on interoperability and international data flows.

Commentary on the Proposals

We comment below on how the proposals have been received so far, the changes we consider most significant, and their likely impact on organisations.

Reactions to the Proposed Package

National governments and industry voices generally back the simplification thrust of the EU Omnibus Proposal, with targeted caveats. Digital ministers from Member States have urged that streamlining must truly clarify overlapping obligations rather than create new burdens, reflecting concerns about regulatory coherence.

With regard to the AI Act changes, in their Joint Opinion of 20 January 2026, the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) expressed support for the objective of addressing implementation challenges under the AI Act, but cautioned that administrative simplification must not lower the protection of fundamental rights. Key recommendations included maintaining strict necessity standards for processing special categories of personal data for bias detection, preserving registration obligations for AI systems that could be high-risk, and ensuring Data Protection Authorities remain closely involved in EU-level AI regulatory sandboxes.

Member States have signalled caution regarding the scope of the Commission’s ambitions. The Cyprus Presidency of the EU Council has drafted an initial compromise on the AI Act simplifications that reinstates several obligations the Commission had proposed to remove, including registration requirements for non-high-risk AI suppliers and the ‘strict necessity’ standard for processing special categories of personal data. The compromise also maintains the original 2027 timeline for high-risk AI system obligations and clarifies the enforcement powers of the AI Office directly in the legislative text rather than through delegated acts.

Changes to the Definition of Personal Data

The proposed amendments to the definition of “personal data” represent one of the more legally significant GDPR clarifications in the EU Omnibus Proposal. Rather than redefining the concept, the Commission seeks to codify the Court of Justice’s established “relative” approach to identifiability, under which information may fall inside or outside GDPR scope depending on the practical means available to a specific entity.

In principle, this clarification could reduce compliance burdens for organisations that process data in a form that is effectively non-identifying for them, particularly in data-sharing, analytics, and AI development contexts. At the same time, the entity-specific framing may introduce new factual and evidentiary questions in enforcement, particularly where data moves between multiple parties and supervisory authorities must assess differing re-identification capabilities across a processing chain.

Personal Data Breach Timeline and Risk Threshold Changes

The proposed changes to the GDPR personal data breach regime are also among the most consequential elements of the EU Omnibus Proposal. Under the current rules, a breach must be notified to the supervisory authority unless it is unlikely to result in a risk to individuals’ rights and freedoms. The proposal would invert that presumption: a breach would be reportable to both authorities and individuals only where it is likely to result in a high risk. This change would substantially narrow the scope of notifiable breaches and reduce defensive over-reporting of low-risk incidents.

This change – combined with the extension of the notification deadline from 72 to 96 hours, the introduction of EU-level breach notification templates, and the guidance on what constitutes “high risk” – is intended to promote greater consistency and legal certainty across Member States. If adopted, the new rules would alleviate the regulatory burden on organisations handling a breach. However, these reforms also shift more responsibility onto controllers to make robust and defensible risk assessments at an early stage of an incident, with potentially greater scrutiny of those assessments by supervisory authorities.

Taken together, these changes signal a move toward a more explicitly risk-based and harmonised breach notification framework, with significant implications for incident response planning, internal governance, and documentation practices.

Single EU Notification Channel for Incidents

The proposal to introduce a single EU-level entry point for incident and breach notifications, operated by ENISA, has generated mixed reactions. While some support the objective of reducing duplicative reporting across GDPR, NIS2, DORA, eIDAS and related regimes, there has been notable concern that a centralised reporting hub could add complexity rather than remove it if it is not carefully designed, especially considering the different definitions, reporting thresholds, and templates under multiple rules.

Financial industry bodies and sectoral groups have cautioned that a single-entry point risks becoming an additional procedural layer, rather than a true simplification, if it does not integrate seamlessly with existing sector-specific timelines, escalation processes, and supervisory expectations. There are concerns that operational or capacity constraints at the EU level could delay the flow of time-sensitive information to national authorities, potentially undermining incident-response effectiveness. Other concerns include the likelihood that such a hub could become a high-value target for malicious actors.

Overall, a single reporting channel would alleviate the regulatory burden on organisations responding to a cyber incident, but ENISA and the competent authorities will need detailed reporting flows and processes for handling reports, to avoid sharing information with authorities that lack competence and sending unnecessary follow-up questions to reporting entities, which could otherwise increase complexity rather than reduce it.

Conclusion

The EU Omnibus Proposal aims to create a coherent, predictable, and interoperable regulatory system across data privacy, cybersecurity, and digital regulation. Practically speaking, if the proposal is adopted, organisations will have to review their practices, documentation, and preparedness in areas such as breach reporting, privacy compliance (including legal bases for processing, disclosures, and use of personal data in the context of AI), and data-reuse governance – but will also benefit from regulatory relief in some of these areas.

Overall, if the proposal is adopted, organisations can expect:

  • more centralised EU-level oversight;
  • harmonised and hopefully simplified incident-reporting and DPIA processes;
  • less scrutiny on personal data processing in AI, but stricter controls on sensitive-data handling in AI; and
  • an attempted unified data-economy framework, with the Data Act as the key driver.
