The Digital Omnibus: Plans to revise EU digital laws
Is Europe on a collision course between cutting red tape and protecting fundamental rights?
There’s been plenty of chatter about the European Commission’s Digital Omnibus. The leaked text has been pored over and now the official draft has been published. For some it raises concerns that weakened regulation will undermine people’s fundamental rights. For others it offers hope that the burden of compliance will be eased and innovation unleashed.
The EC is very much pitching this as “innovation friendly AI rules” and an “innovation friendly privacy framework”.
I suspect the UK Government will be watching developments across the Channel closely and could find itself wishing it had been bolder with the Data (Use and Access) Act 2025 (DUAA).
What is the Digital Omnibus?
This is not a new law, nor a complete overhaul of existing legislation, but an EC proposal to streamline, align and introduce specific legislative updates to existing digital rules such as the EU AI Act, GDPR, ePrivacy Directive, Data Act and Data Governance Act. It’s an attempt to remove duplication and inconsistencies, along with alleviating some of the burden of compliance for European organisations and others who operate within the EU.
What’s potentially on the cards?
AI Act
Key proposals include a delay in the applicable date for obligations relating to high-risk AI systems, reduced AI literacy obligations, the removal of the obligation for providers to register on the EU’s public database and reduced penalties for small and medium-sized businesses.
GDPR and ePrivacy
Key legislative adjustments which could be ushered in include the following:
Personal data
A narrower definition is proposed whereby information would not be considered personal data for a given entity when that entity does not have ‘means reasonably likely’ to identify individuals.
This could ease the considerable issues and debates currently caused by assessing whether people are ‘indirectly’ identifiable. The existing GDPR definition states: ‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly…
Interestingly, a similar tweak was proposed in the UK under the previous Conservative Government’s data reform plans, but it wasn’t carried over into the DUAA.
Special category data
The idea is that data would only be classified as special category data if it ‘directly reveals’ information about an individual’s health, sex life, racial or ethnic origin, political opinions, trade union membership, or religious or philosophical beliefs. If introduced, this would mark a step change away from the current broader inference-based rule and is likely to be particularly contentious.
The following two new exemptions are also proposed to the existing prohibitions on processing special category data:
1) allowing for the ‘residual processing’ of special category data for the development and operation of an AI system or AI model – subject to certain conditions.
2) permitting the processing of biometric data when necessary to confirm someone’s identity, and where the data and means of verification are under the sole control of that individual, i.e. where the biometric data is stored on the user’s device.
Right of Access – Data Subject Access Requests
‘Abusive’ requests could be rejected, or a fee charged, if a controller considers the request is being used for purposes other than the ‘protection of their personal data’. Enhanced clarification is also expected on the conditions under which a request can be deemed excessive.
This recognises the growing issue of DSARs being ‘weaponised’ and used for other purposes, such as litigation. I imagine there are plenty of organisations hoping this proposal will not be ditched during negotiations. I for one would welcome this move and know plenty of UK organisations that would benefit from a similar legislative amendment.
Personal Data Breaches
It’s proposed the requirement to report data breaches to a supervisory authority would only kick in where there was a ‘high risk’ rather than the current threshold of ‘risk’. This would align the threshold for both reporting to regulators and notification to affected individuals. The deadline for reporting could also be extended from 72 to 96 hours.
Data Protection Impact Assessments
In a move to ensure a consistent approach across the EU, the European Data Protection Board (EDPB) is expected to be tasked with creating harmonised lists of processing activities requiring a DPIA and those which would be exempt. The EDPB would also develop a common template and methodology for conducting DPIAs.
Automated decision making
We could see more freedom to rely on entirely automated decisions with legal or similarly significant effect when necessary for a contract, even if the same decision could be made manually by a human.
Cookies and similar technologies
In an attempt to alleviate the confusion and annoyance for users, as well as the cost to businesses, the EC is proposing to simplify the rules. The stated aim is to reduce the number of times cookie banners pop up and allow users to indicate their consent with ‘one-click’, with preferences saved via their browser or operating system settings.
Any processing of personal data is expected to be governed solely by the GDPR, not the ePrivacy Directive. It’s also proposed that certain purposes which pose a low risk to people’s rights and freedoms will not require consent, for example when cookies and similar technologies are used for security or aggregated audience measurement.
EU legislators may find themselves looking across the pond to California’s new “Opt Me Out Act”. From January 2027 this requires web browsers to offer a one-click opt-out which automatically tells websites not to sell or share the user’s personal information. While just one state’s law, it is expected to have a far wider impact: it will be simpler for browsers to roll this feature out across the board, as they won’t know whether the organisation which runs a particular website is based in California or not.
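As a purely illustrative aside, the plumbing for this kind of browser-level signal already exists in the form of the Global Privacy Control (GPC) proposal, which sends a Sec-GPC: 1 request header and exposes navigator.globalPrivacyControl to scripts. The TypeScript sketch below is an assumption about how a site might honour such a signal; the helper names are hypothetical, and nothing in the Omnibus or the Californian Act mandates this exact mechanism.

```typescript
// Illustrative sketch only: honouring a browser-level privacy signal such as
// Global Privacy Control (GPC). Helper names are hypothetical; neither the
// Omnibus nor the Opt Me Out Act prescribes this particular mechanism.

// Client-side: browsers that implement GPC expose a boolean on navigator.
// It is not yet in TypeScript's standard DOM typings, hence the cast.
function userHasOptedOut(): boolean {
  const nav = navigator as Navigator & { globalPrivacyControl?: boolean };
  return nav.globalPrivacyControl === true;
}

// Server-side: the same preference arrives as the "Sec-GPC: 1" request header,
// so non-essential tags could be skipped before any banner is ever rendered.
function shouldLoadMarketingTags(headers: Record<string, string | undefined>): boolean {
  return headers["sec-gpc"] !== "1";
}

// Example: only load marketing scripts if no opt-out signal is present.
if (!userHasOptedOut()) {
  // loadMarketingScripts(); // hypothetical loader
}
```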
AI and legitimate interests
A new provision could be introduced confirming that legitimate interests can be relied on as a lawful basis for processing personal data to train AI models. It’s highly likely this would still be subject to a balancing test.
Privacy notices
Providing a privacy notice to an individual may no longer be necessary if a controller believes the individual already knows the organisation’s identity, its purposes for processing and how to contact any Data Protection Officer.
What next?
None of the above is set in stone, and all of it is subject to change. For those of you who remember the years of wrangling over attempts to replace the ePrivacy Directive, which ultimately came to nothing, there’s a long road of negotiation and lobbying ahead. Ultimately, will technological advances continue to streak ahead while legislators struggle to keep up?
Also see EC Press Release and EC Digital Omnibus proposals