DUA Act – next steps

July 2025

When will provisions under the Data Use and Access Act 2025 (DUAA) take effect, and when can we anticipate guidance being published by the Information Commissioner’s Office?

The DUAA received Royal Assent on 19 June 2025. While limited provisions came into effect immediately, the majority will be phased in over the coming months up to June 2026, with some requiring secondary legislation to be passed.

To be crystal clear, the DUAA does not replace UK GDPR, the Data Protection Act 2018 or the Privacy and Electronic Communications Regulations (PECR). The Act brings in amendments to these core pieces of legislation, much in the same way PECR was amended in 2009 with the so-called ‘cookie law’.

Commencement of DUAA provisions

One provision which has come in with immediate effect is clarification that, when responding to Data Subject Access Requests (the right of access), organisations only need to undertake a “reasonable and proportionate search”. This change simply gives a statutory footing to case law and existing guidance from the ICO.

At present we don’t know precisely when other specific provisions will commence, such as the soft opt-in for charities, changes to the cookie rules and recognised legitimate interests, but we’ll update this article as and when we hear more. For a top-level summary of the Act see DUAA 2025: 15 key changes ahead.

ICO guidance

The ICO has published a timeline of when we can expect updated or new guidance covering the changes the DUAA ushers in.

Summer 2025

Data Subject Access Requests – update to detailed Right of Access guidance
Substantial public interest conditions – a new interactive tool
Cookies & similar technologies (Part 1) – update to the ‘cookie guidance’, renamed ‘guidance on storage and access technologies’

Winter (2025/26)

Direct marketing and Privacy and Electronic Communications Regulations guidance – update to existing guidance
Complaints procedures – new guidance for organisations on how to handle data protection complaints
Lawful basis of recognised legitimate interests – new guidance
Legitimate interests – update to existing guidance
International data transfers guidance – update to existing guidance
Cookies & similar technologies (Part 2) – (‘guidance on storage and access technologies’).
The purpose limitation principle – updated and enhanced guidance
Anonymisation and pseudonymisation for research purposes – guidance

Spring 2026

Automated Decision Making (ADM) and Profiling – updated guidance
Research, archiving and statistics provisions – updated guidance
SME data essentials – guidance

More detail and other updates from the ICO can be found here: plans for new and updated guidance.

Codes of practice

The ICO will also in due course be producing codes of practice on edtech and artificial intelligence.

There’s lots to watch out for and we’ll try our best to keep you up to date with developments as and when they happen.

Data Protection Nuggets Part 1

DUA Act and Legitimate Interests

July 2025

The Data Use and Access Act (DUAA) introduces changes to the concept of legitimate interests under UK GDPR. Once provisions take effect there will be a seventh lawful basis of recognised legitimate interests and legal clarity on activities which may be considered a legitimate interest.

Recognised Legitimate Interests

The DUAA amends Article 6 of UK GDPR to expand the six lawful bases for processing to seven, adding recognised legitimate interests. While a necessity test will still be required, for the following recognised legitimate interests there will no longer be a requirement for an additional balancing test (a Legitimate Interests Assessment):

Disclosures to public bodies, or bodies carrying out public tasks where the requesting body has confirmed it needs the information to carry out its public task.

This means private and third sector organisations which work in partnership with public bodies will just need confirmation that the public body needs the information to carry out its public task. This is likely to give more confidence to organisations (such as housing associations and charities) when sharing information with public sector partners.

Data Sharing Agreements, Records of Processing Activities (RoPAs) and privacy notices may need to be updated to reference recognised legitimate interests as the lawful basis where appropriate. Staff training may also need updating.

Safeguarding vulnerable individuals – this allows for the use of personal data for safeguarding purposes. There are also definitions given for the public interest condition of “safeguarding vulnerable individuals”, which the ICO has written more about here.

Crime – this allows use of personal information where necessary for the purposes of detecting, investigating or preventing a crime; or apprehending or prosecuting offenders.

National security, public security and defence – this allows the use of personal information where necessary for the purposes of safeguarding national security, protecting public security or defence.

Emergencies – this allows the use of personal information where necessary when responding to an emergency. An emergency is defined by the Civil Contingencies Act 2004 and means an event or situation which threatens serious damage to human welfare or the environment, or war or terrorism which threatens serious damage to the security of the UK.

The ICO is planning to publish guidance on recognised legitimate interests over Winter 2025/26. For a timeline of when we can anticipate other DUAA-related guidance from the ICO see DUAA – Next Steps.

Types of processing that may be considered a legitimate interest

There are some examples of activities which may be considered a legitimate interest in the recitals of UK GDPR. As such, they provide an interpretation of the law but are not legally binding. The DUAA moves the following examples of legitimate interests from the recitals into the body of the law:

direct marketing
intra-group sharing of data for internal administrative purposes, and
processing to ensure network and information security.

This may give organisations more confidence when relying on the lawful basis of legitimate interests. However, unlike recognised legitimate interests, the above will still be subject to a Legitimate Interests Assessment.

The core rules under the Privacy & Electronic Communications Regulations (PECR) are not changing – unless you’re a charity wishing to benefit from the ‘soft opt-in’. For direct marketing activities, legitimate interests will still only be an option for specific marketing activities which don’t require specific and informed consent under PECR.

An update to both the ICO’s Legitimate Interests Guidance and PECR guidance is expected in Winter 2025/26.

Data Protection Basics: The 6 lawful bases

June 2025

A quick guide to the six lawful bases for processing personal data

One of the fundamental data protection principles is that our handling of personal data must be ‘lawful, fair and transparent’. To be lawful, clearly, we shouldn’t do anything illegal in general terms. But what else does it mean to be lawful?

We’re given six lawful bases to choose from under UK/EU GDPR. For each purpose we use personal data for, we need to match it with an appropriate lawful basis.

For example, a purpose might be:

  • Sending marketing emails to our customers
  • Profiling our audience to better target our marketing
  • Handling staff payroll data to pay salaries
  • Handling customer enquiries about our services
  • Delivering a product a customer has requested
  • Implementing measures to prevent fraud

We need to select the most appropriate lawful basis and meet its own specific requirements. Each basis is equally valid, but one may be more appropriate than others for any specific task. We’re legally obliged to set out the lawful bases we rely on in our privacy notices.
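
To make this mapping exercise concrete, here’s a minimal sketch in Python of recording a lawful basis against each purpose, in the spirit of a Record of Processing Activities. The pairings are illustrative assumptions, not definitive answers:

```python
# Illustrative sketch only: map each processing purpose to its chosen
# lawful basis, RoPA-style. The right basis always depends on your own
# circumstances, so treat these pairings as examples, not answers.
PURPOSE_TO_LAWFUL_BASIS = {
    "Sending marketing emails to our customers": "Legitimate interests",  # PECR rules still apply
    "Profiling our audience to better target our marketing": "Legitimate interests",
    "Handling staff payroll data to pay salaries": "Contract",
    "Handling customer enquiries about our services": "Legitimate interests",
    "Delivering a product a customer has requested": "Contract",
    "Implementing measures to prevent fraud": "Legitimate interests",
}

def lawful_basis_for(purpose: str) -> str:
    """Return the documented lawful basis for a purpose, or flag the gap."""
    try:
        return PURPOSE_TO_LAWFUL_BASIS[purpose]
    except KeyError:
        # No documented basis: question whether you should process at all.
        raise ValueError(f"No lawful basis recorded for: {purpose!r}")
```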

If none of them seem to work, you may want to question whether you should be doing what you’re planning to do.

Quick guide to the 6 lawful bases

(This is not intended to be exhaustive; do check the ICO’s Lawful Basis Guidance.)

1. Contract

This lawful basis will be appropriate if you need to process an individual’s personal information to deliver a service to them, or if you need to collect certain details to take necessary steps before entering into a contract or agreement.

Example 1: An individual purchases a product from you and you need to handle specific personal information about them in order to deliver that product, including when you acknowledge their order, provide essential information, and so on.

Example 2: Someone asks you to give them a quote for your services, and you need certain information about them in order to provide that quote.

Contract tips:

  • It doesn’t apply to other, non-essential purposes you may use the data for.
  • It’s most likely to be used when people are agreeing to T&Cs, although it can also be used where a verbal agreement or request for information is made.
  • The person whose data you’re processing must be party to the contract or agreement with you. It doesn’t apply if you want to process someone’s details, but the contract is with someone else, or with another business.

2. Legal obligation

There may be circumstances where you are legally obliged to conduct certain activities, which will involve processing personal data. This could be to comply with common law or to undertake a statutory obligation.

Example 1: You are offering someone a job and need to check they have the right to work in the UK, as this is a legal obligation.

Example 2: Airlines and tour operators collect and process Advance Passenger Information (API), as this is a legal requirement for international air travel.

Legal obligation tips

  • Legal obligation shouldn’t be confused with contractual obligations
  • Document your decision. You should be able to either:
    a) identify the specific legal provision you are relying on,
    or
    b) point to the source of advice/guidance which sets out your obligation.

3. Vital interests

You can collect, use or share personal data in emergency situations, to protect someone’s life.

Example: A colleague collapses at work, is unable to talk, and you need to tell a paramedic they have a medical condition. Common sense should prevail.

Vital interest tips

  • It’s very limited in scope, and should generally only apply in life and death situations.
  • It should only be used when you manifestly can’t rely on another basis. For example, if you could seek consent, you can’t rely on vital interests.

4. Public task

You can process personal data if necessary for public functions and powers that are set out in law, or to perform a specific task in the public interest.

Most often this basis will be relied upon by public authorities and bodies, but it can apply in the private sector where organisations exercise official authority, or carry out tasks in the public interest.

Public task tips

  • If you could reasonably perform your tasks or exercise powers in a less intrusive way this basis won’t be appropriate. The processing must be necessary.
  • Document your decisions, specify the task, function or power, and identify the statutory or common law basis.

5. Legitimate Interests

This is the most flexible lawful basis, but don’t just assume what you’re doing is legit. It’s most likely to be appropriate when you use people’s data in a way they’d reasonably expect, where there is minimal impact on them, or where you have a compelling justification.

Legitimate interests must be balanced: you must weigh the organisation’s interests against the interests, rights and freedoms of individuals. If your activities are beyond people’s reasonable expectations or would cause unjustified harm, their rights and interests are likely to override yours. Legitimate interests – when it isn’t legit

Legitimate Interests tips

  • Conduct and document a Legitimate Interests Assessment (LIA). This may be relatively simple and straightforward, or more complex.
  • Consider whether you can provide people with an easy way to object. This is not essential in all situations (e.g. fraud protection).
  • Be open about where you rely on legitimate interests, so it’s likely to be within people’s reasonable expectations.
  • Remember to include what your legitimate interests are in your privacy notice.
  • Check the ICO’s guidance on when legitimate interests can be relied upon for marketing activities.
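
As a rough illustration of documenting an LIA, here’s a minimal sketch assuming the ICO’s familiar three-part structure (purpose test, necessity test, balancing test). The field names are illustrative, not a prescribed format:

```python
from dataclasses import dataclass

@dataclass
class LegitimateInterestsAssessment:
    # The ICO's three-part test: purpose, necessity, balancing.
    purpose: str        # what legitimate interest are you pursuing?
    necessity: str      # why is this processing necessary to achieve it?
    balancing: str      # why don't individuals' rights override your interest?
    easy_opt_out: bool  # can people easily object? (not essential in all cases)

# Example: a simple, documented assessment for postal marketing.
lia = LegitimateInterestsAssessment(
    purpose="Postal marketing of our own services to existing customers",
    necessity="Direct mail is a proportionate way to reach this audience",
    balancing="Low impact, within customers' reasonable expectations",
    easy_opt_out=True,
)
```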

Important note: In June 2025 the UK Data (Use and Access) Act introduced a new lawful basis for processing into the UK GDPR. This lawful basis of ‘recognised legitimate interests’ can be relied upon by organisations for specific purposes without being required to conduct a balancing test (i.e. a Legitimate Interests Assessment). The list of recognised legitimate interests includes the following (and may be expanded):

  • Disclosures to public bodies, where it is asserted personal data is necessary to fulfil a public function.
  • Disclosures for national or public security or defence purposes, and emergencies.
  • Disclosures for prevention or detection of crime, and safeguarding vulnerable individuals.

6. Consent

This is when you choose to give individuals a clear choice to use their personal details for a specific purpose and they give their clear consent for you to go ahead. The law tells us consent must be a ‘freely given, specific, informed and unambiguous’ indication of someone’s wishes given by a ‘clear affirmative action’.

Consent is all about giving people a genuine choice and putting them in control. They must be able to withdraw their consent at any time, without a detrimental impact on them. Consent, getting it right.

Consent tips:

  • It should be clear what people are consenting to
  • Consent shouldn’t be bundled together for different purposes, each purpose should be distinct
  • It must not be conditional – people shouldn’t be ‘forced’ to consent to an activity as part of signing up to a service.
  • Consent is unlikely to be appropriate where there may be an imbalance of power. For example, if an employee would feel they have no option but to give consent to their employer (or might feel they could be penalised for not giving it).
  • Sometimes the law requires consent. For example, under the electronic marketing rules consent is often a requirement.

In summary, consider all the purposes you have for processing personal data. Assign a lawful basis to each purpose and check you’re meeting the specific requirements for each basis. Tell people in your privacy notice the lawful bases you rely on, and specifically explain your legitimate interests.

Finally, don’t forget, if you’re processing special category data (for example data revealing racial or ethnic origin, health data or biometric data) you’ll need a lawful basis, plus you’ll need to meet one of the conditions under UK GDPR Article 9. For criminal convictions data you’ll need a lawful basis, plus one of the conditions under UK GDPR Article 10.

DUA Act 2025: 15 key changes ahead

June 2025

The Data Use and Access Act 2025 received Royal Assent on 19 June. Implementation of the new law will commence in phases with most provisions expected to come into force within two to six months, while some may take up to a year.

The key objectives of the DUA Act involve enabling data sharing through smart data schemes and introducing digital verification services. Alongside this, we’ll see amendments to UK GDPR, the Data Protection Act 2018 and the Privacy & Electronic Communications Regulations (PECR). The level of impact will very much depend on your sector and data processing activities.

No radical shake-up

While significant, this legislation does not usher in radical changes and organisations do not face a big shake up of their approach to data protection compliance. This is not GDPR 2.0. The fundamental principles and obligations for data protection remain unchanged. We predict it will be business as usual for the majority of organisations, with some changes here and there.

Time to prepare

While limited provisions may take immediate effect, there will be time to prepare before the majority of provisions take effect, possibly up to 15 months after the law is enacted. The precise timescales have yet to be published, and we’d advise keeping abreast of developments, and ICO guidance as it comes out. Nothing needs to be done right away.

AI transparency and copyright not included

It’s worth noting the House of Lords lost its battle on AI. A key sticking point, which stalled progress of the Bill until now, was the Lords introducing successive amendments to transparency requirements for data used to train AI models, and the use of copyright materials to train AI. In the end these attempts failed, but an agreement was reached with the Government to publish a report on copyright and AI proposals in the coming months.

15 key changes ahead

1. Solely automated decision-making 

UK GDPR currently places strict restrictions on solely automated decision-making (including profiling) which produces legal or similarly significant effects. This will be relaxed so the strict rules only apply to automated decisions using special category data. For any other personal data, there will be a requirement to put in place certain safeguards, such as giving individuals the ability to contest decisions and request human intervention.

This change will give organisations more flexibility to make automated decisions using personal data (but not special category data), for example when utilising AI systems. To prepare for this change, re-assess your use of solely automated decision-making and look to review relevant processes and policies.

As part of the recently launched ICO AI and Biometrics Strategy, the regulator has committed to:

updating its guidance on automated decision making (ADM) and profiling by autumn 2025
a public consultation on this updated guidance
developing a statutory code of practice on AI and ADM

2. Data Subject Access Requests (DSARs) 

Provisions to be introduced on DSAR handling give a statutory footing to existing ICO guidance. In practice this is unlikely to mean any significant changes if you’re already following regulatory guidance, but it does give a degree of extra confidence by being written into UK law. The key points are:

the timescale for responding within one calendar month does not start until the organisation is satisfied the requester is who they say they are
when seeking clarification, the clock can be paused while awaiting the individual’s response (see the sketch below)
organisations can conduct a “reasonable and proportionate” search for personal data.
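
As a rough illustration of how these timing rules interact, here’s a simple sketch. It assumes a simplified model where the one-month clock runs from identity confirmation and is extended by any days paused awaiting clarification; check the ICO’s guidance for the precise calculation:

```python
from datetime import date, timedelta
import calendar

def add_calendar_month(start: date) -> date:
    """Add one calendar month, clamping to the last day of a short month."""
    year = start.year + (1 if start.month == 12 else 0)
    month = 1 if start.month == 12 else start.month + 1
    day = min(start.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def dsar_deadline(identity_confirmed: date, days_paused_for_clarification: int = 0) -> date:
    # The clock runs from the point the requester's identity is confirmed,
    # extended by any days it was paused while awaiting clarification.
    return add_calendar_month(identity_confirmed) + timedelta(days=days_paused_for_clarification)

# e.g. identity confirmed 1 July, paused 5 days -> respond by 6 August
print(dsar_deadline(date(2025, 7, 1), days_paused_for_clarification=5))
```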

When withholding information is based on legal professional privilege or client confidentiality, a new requirement will mean organisations have to explicitly inform individuals about the specific exemption being applied and the reasons. Individuals will also have the right to request the ICO reviews how these specific exemptions have been applied.

To prepare, you can start to review your current DSAR procedure, plan how to update response templates to include more explicit information where relevant, and bolster the internal documentation used to justify reliance on these exemptions.

3. The right to be informed 

The obligation to provide privacy information to individuals (e.g. under Article 14, UK GDPR) will not apply if providing this information “is impossible or would involve disproportionate effort”.

This is most likely to be particularly relevant where organisations have gathered personal data indirectly, i.e. not directly from the individuals. This was a point of contention in the Experian vs ICO case, where Experian argued it would be disproportionate effort to notify and provide privacy information to the millions of people whose data they process from the Edited Electoral Roll.

4. New Complaints procedure

The legislation includes a new right for individuals to raise complaints related to use of their personal data. These new rules will require controllers to make sure they have clear procedures to facilitate complaints, including providing a complaint form. Complaints will require a response within 30 days. Alongside this, certain organisations may also be obligated to notify the ICO of the number of privacy-related complaints they receive during a specified time period.

Some sectors, such as financial services and those which fall in scope of FOI requests, are already obliged to have complaints procedures in place to meet their legal obligations. These may need adapting to cover these new requirements, while for others procedures will need to be put in place. Privacy notices will also need to be updated to reflect this change.

5. Legitimate Interests & direct marketing

“The processing of personal data for direct marketing purposes may be regarded as carried out for a legitimate interest”. This not insignificant line currently rests in a GDPR recital, and as such it’s not legally binding and simply provides a helpful interpretation of the law. However, the DUA Act will set in stone, unambiguously, that legitimate interests is an acceptable lawful basis for direct marketing purposes.

While there are concerns this will lead to more ‘spam’ marketing, I’d stress the direct marketing rules under PECR will still apply, so legitimate interests will only be an option when the law doesn’t require consent.

6. Recognised legitimate interests

The concept of ‘recognised legitimate interests’ is to be introduced, whereby organisations will not be required to conduct a balancing test (i.e. a Legitimate Interests Assessment) when relying on this lawful basis – but only for specific, recognised purposes. The list of recognised legitimate interests includes the following (and may be expanded):

Disclosures to public bodies, where it is asserted personal data is necessary to fulfil a public function.
Disclosures for national or public security or defence purposes, and emergencies.
Disclosures for prevention or detection of a crime, and safeguarding vulnerable individuals.

In preparation, you can start by reviewing processing activities which rely on legitimate interests and assess if any will become ‘recognised’. I can see this being particularly helpful for private and third sector organisations which have direct relationships with public bodies involving the sharing of personal data.

7. Charities and the marketing ‘soft opt-in’

The use of the ‘soft opt-in’ exemption to consent for electronic marketing will be extended to charities. This means charities will be able to provide supporters and donors with an ‘opt-out’ mechanism rather than an ‘opt-in’ to marketing emails (and/or SMS), as long as the following specific conditions are met:

The sole purpose of the direct marketing is for the charity’s own charitable purpose(s).
Contact details were collected when the individual expressed an interest in the charity’s purpose(s) or offered or provided support to further the charity’s purpose(s).
An opportunity to refuse/opt-out is given at the point of collection, and in every subsequent communication.
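
Treated as a checklist, the conditions might be sketched like this (the condition names are my own shorthand, not statutory wording):

```python
def soft_opt_in_available(
    solely_for_charitable_purposes: bool,
    details_collected_when_support_or_interest_expressed: bool,
    opt_out_offered_at_collection_and_every_message: bool,
) -> bool:
    """The exemption only applies if all three conditions hold."""
    return (
        solely_for_charitable_purposes
        and details_collected_when_support_or_interest_expressed
        and opt_out_offered_at_collection_and_every_message
    )

# e.g. no opt-out offered in each message -> consent is still required
assert soft_opt_in_available(True, True, False) is False
```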

To prepare, charities can consider whether they wish to switch from consent, and assess whether this will be relatively straightforward to implement in practice or not. Pros and cons of the ‘soft opt-in’.

8. Cookies & similar technologies

The DUA Act will extend the exceptions to consent from only ‘strictly necessary’ to include other specific types of ‘low risk’ cookies and similar technologies. The exemption will be permitted for certain statistical purposes and optimising website appearance, as long as clear information is provided and users are given a straightforward ability to opt out.

Alongside these changes under the DUA Act, the ICO is reviewing PECR consent requirements to, in its words, “enable a shift towards privacy-preserving advertising models”. This autumn, a statement is expected on ‘low risk’ advertising activities which in the ICO’s view are unlikely to cause harm or trigger enforcement action. You can read more about this in the ICO’s package of measures to drive economic growth.

In preparation, you can conduct a cookie audit to identify which of the cookies you use may qualify as ‘low risk’, and prepare to update your consent management platform (CMP) and the cookie information you provide.
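
A cookie audit can be as simple as a structured record per cookie. Here’s a minimal sketch; which purposes will ultimately qualify as ‘low risk’ depends on the final provisions and ICO guidance, so the categories below are assumptions:

```python
from dataclasses import dataclass

# Assumed exemption candidates under the new rules: strictly necessary,
# certain statistics, and website-appearance cookies. Verify against the
# final provisions and ICO guidance before relying on this.
EXEMPTION_CANDIDATES = {"strictly_necessary", "statistics", "appearance"}

@dataclass
class CookieRecord:
    name: str
    provider: str
    purpose: str        # e.g. "strictly_necessary", "statistics", "advertising"
    lifespan_days: int

def still_needs_consent(cookie: CookieRecord) -> bool:
    """Anything outside the exemption candidates keeps needing consent."""
    return cookie.purpose not in EXEMPTION_CANDIDATES

# e.g. an advertising cookie remains consent-based in the CMP
ad_cookie = CookieRecord("_adid", "ad-network.example", "advertising", 390)
assert still_needs_consent(ad_cookie)
```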

9. PECR Fines

Fines for infringements of the Privacy & Electronic Communications Regulations, which govern electronic direct marketing, cookies and similar technologies, are set to significantly increase.

The maximum fine under PECR is currently capped at just £500,000. The limits will be brought in line with the much more substantial fines which can be levied under UK GDPR – up to a maximum of £17,500,000, or 4% of the organisation’s total annual worldwide turnover for the preceding financial year, whichever is higher.
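
The ‘whichever is higher’ mechanic is straightforward arithmetic. A tiny sketch (the turnover figures are made-up examples):

```python
def max_pecr_fine(annual_worldwide_turnover: float) -> float:
    """New PECR cap, aligned with UK GDPR: the higher of a fixed
    £17.5m or 4% of total annual worldwide turnover."""
    return max(17_500_000.0, 0.04 * annual_worldwide_turnover)

# A firm with £1bn turnover: 4% is £40m, which exceeds £17.5m
assert max_pecr_fine(1_000_000_000) == 40_000_000.0
# A small firm with £10m turnover: the £17.5m fixed cap is higher
assert max_pecr_fine(10_000_000) == 17_500_000.0
```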

Bear in mind the ICO issues more fines under PECR than under UK GDPR or the DPA, so the message is clear: make sure you comply with the PECR rules, as the cost of enforcement action could be far higher.

It’s also worth noting what constitutes ‘spam’ is to be extended to include emails and text messages which are sent, but not received by anyone. This will mean the ICO will be able to consider much larger volumes in any enforcement action.

10. Compatible processing

Currently, UK GDPR makes it tricky to reuse personal data for new purposes, and the DUA Act aims to make this slightly easier by listing specific compatible purposes for which organisations will not need to undertake a compatibility assessment.

11. Scientific research

There are detailed changes in relation to scientific research. To briefly summarise, the definition of ‘scientific research’ is to be clarified and will explicitly state research can be a commercial or non-commercial activity. Consent for scientific research is to be adapted, in part driven by a desire to make it easier for personal data collected for specific research to be reused for other scientific research purposes.

12. Data protection by design to protect children

When assessing appropriate ‘technical and organisational measures’ in relation to online services likely to be accessed by children, organisations will be legally obliged to consider how children can best be protected right from the design phase, recognising that children merit additional protection and have different needs at different ages and stages of development. Such measures strengthen the need to adhere to the UK Children’s Code.

13. Smart Data Schemes

The DUA Act will give the Government the ability to pass secondary legislation to enable business data sharing. The aim is to implement smart data schemes to grow the UK economy, encourage competition and benefit consumers. Currently we have data sharing models for open banking, and the plan is for similar models to be extended to other sectors such as telecoms, healthcare, insurance and energy.

14. Digital verification services

The Act will create a framework to enable the introduction of trusted digital verification services. The idea is people will be able to prove their identity via trusted digital identity providers, without having to provide a physical form of ID or other documentation.

Digital ID verification has been adopted successfully by certain businesses, but take up is patchy and the Government is keen to accelerate progress. It’s hoped this new framework will simplify processes such as registering births and deaths, starting a new job, and renting a home.

15. New Information Commission

The Information Commissioner’s Office is set to be replaced by an Information Commission, which will be structured in a similar way to the FCA, OFCOM and the CMA – as a body corporate with an appointed Chief Executive. It’s anticipated this change will come into effect in 2027.

What about UK adequacy?

The DUA Act will be carefully scrutinised by the European Commission when it reviews adequacy decisions for the UK. These currently allow for the free flow of personal data between the EEA and UK, without the need for additional risk assessments or safeguard measures. The outcome of the EC review of these decisions is expected in December 2025. It’s hoped there’s nothing to scare the horses and UK adequacy will be renewed. Nonetheless, this is one to watch.

In summary, although reform has its critics, the changes to be introduced by the DUA Act are not overly dramatic. More detail and regulatory guidance will gradually become available, and I’d stress there’s no need to do anything immediately. Over the coming months we’ll be sure to keep you updated on developments.

Why Data Protection Officer isn’t just a title

Misunderstandings linger about DPOs

When GDPR came into force more than seven years ago, it made it mandatory for certain organisations to appoint a Data Protection Officer (DPO) – certainly not all organisations. As a result there are more than 500,000 organisations with Data Protection Officers registered across Europe, according to IAPP research.

But even after so long, a good deal of confusion remains about which organisations need to appoint a DPO, and what the role actually entails. The DPO isn’t just a title you can dish out to whoever you choose.

When a DPO is mandatory

The law tells us you must appoint a DPO, whether you’re a Controller or a Processor, if any of the following apply:

you’re a public authority or body (except for courts acting in their judicial capacity); or
your core activities require large-scale, regular and systematic monitoring of individuals (for example, online behaviour tracking); or
your core activities consist of large-scale processing of special categories of data or data relating to criminal convictions and offences.

This raises questions about what’s meant by ‘large-scale’ and what happens if your organisation falls within the criteria above but fails to appoint a DPO. When it comes to interpreting ‘large-scale’ activities, the European Data Protection Board Guidelines on Data Protection Officers provide some useful examples.

Despite the previous Conservative government’s data reform proposals including the removal of the DPO role, I should stress that under the soon-to-be-enacted Data (Use & Access) Act these requirements remain unchanged.

What to do if it’s not mandatory to appoint a DPO

Many small to medium-sized organisations won’t fall within the set criteria for mandatory appointment of a DPO. For many organisations, their processing is neither ‘large scale’ nor particularly sensitive in nature.

The ICO tells us all organisations need to have ‘sufficient staff and resources to meet the organisation’s obligations under the UK GDPR’. So, if you assess you don’t fall under the mandatory requirement, you have a choice:

voluntarily appoint a DPO, or
appoint an individual or team to be responsible for overseeing data protection. You can take a proportionate approach, based on the size of your organisation and the nature of the personal data you handle.

The DPO’s position

Many organisations don’t realise the law sets out the DPO’s position and their specific responsibilities. If you have a DPO, their responsibilities are not optional or up for debate. The law tells us DPOs must:

report directly to the highest level of management
be an expert in data protection
be involved, in a timely manner, in all issues relating to data protection
be given sufficient resources to be able to perform their tasks
be given the independence and autonomy to perform their tasks

It’s worth stressing appointing a DPO places a duty on the organisation itself (particularly senior management), to support the DPO in fulfilling their responsibilities. As you can see above, this includes providing resources, and enabling independence and autonomy.

Not just anybody can be your DPO. While they can be an internal or external appointment, and one person can represent several different organisations, steps should be taken to make sure there are no conflicts of interest. A CEO or a Head of Marketing acting as DPO would be obvious examples of where a conflict could easily arise.

The law sets out that the DPO must perform their role in an independent manner. Their organisation shouldn’t influence which projects they should be involved in, nor interfere with how they execute their role. A DPO therefore needs to be someone of character and resilience who can stand their ground, even in the face of potential conflict.

When it comes to being an ‘expert’, there’s a judgement call to make, as the law doesn’t specify particular credentials or qualifications. The level of experience and specialist skills can be proportionate to the type of organisation and the nature of the processing.

The tasks a DPO should perform

The formal set of tasks a DPO is required to perform is as follows:

inform and advise the organisation and its employees about their obligations under GDPR and other data protection laws. This includes laws in other jurisdictions which are relevant to the organisation’s operations.

It’s worth noting the DPO is an advisory role, i.e. to advise the organisation and its people. Their role is not to make decisions on the processing activities. There should be a clear separation between advisor and decision-maker roles. The organisation doesn’t need to accept the advice of their DPO, but the DPO would be wise to document when their advice is ignored. In many smaller organisations people may undoubtedly be spinning multiple plates and will need to do some (or plenty) of the ‘doing’ work.

monitor the organisation’s compliance with the GDPR and other data protection laws. This includes ensuring suitable data protection policies are in place, training staff (or overseeing this), managing data protection activities, conducting internal reviews & audits, and raising awareness of data protection issues & concerns so they can be tackled effectively. This doesn’t mean a DPO has to write every data protection related policy, or stand up and deliver training.

advise on, and monitor, data protection impact assessments (DPIAs).

be the first point of contact for individuals in relation to data protection and for liaison with the ICO.

A DPO must also be easily accessible, for individuals, employees and the ICO. Their contact details should be published, e.g. in your privacy notice (this doesn’t have to include their name) and the ICO should be informed you’ve appointed a DPO.

A DPO shouldn’t be penalised for carrying out their duties. The ICO points out a DPO’s tasks cover all the organisation’s processing activities, not just those which required a DPO to be appointed (such as ‘large scale processing of special category data’). However, the ICO accepts a DPO should prioritise and focus on riskier activities. ICO Data Protection Officer Guidance.

We’d always advise making sure a DPO’s responsibilities are clearly set out in a job description, to save any debate about the role. It’s helpful to make sure the management team and key stakeholders are briefed on the DPO’s legal role.

What’s clear is being a DPO requires many qualities, and a broad skill set, which we’ve written more about here: What does it take to do the job?

AI Risk, Governance and Regulation

June 2025

The Artificial Intelligence landscape’s beginning to remind me of a place Indiana Jones might search for hidden treasure. The rewards are near-magical, but the path is littered with traps. Although, in the digital temple of ‘The New AI’, he’s not going to fall into a pit of snakes or be squished by a huge stone ball. No, Indy is more likely to face other traps. Leaking sensitive information. Litigation. Loss of adventuring advantage to competing explorers. A new, looming regulatory environment, one even Governments have yet to determine.

And the huge stone ball? That will be when the power of the Lost AI goes awry, feeding us with incorrect information, biased outcomes and AI hallucinations.

Yes, regulation is important in such a fast-moving international arena. So is nimble decision-making, as even the European Commission considers pausing its AI Act. Nobody wants to be left behind. Yet, as China and the US vie for AI supremacy, are countries like the UK sitting on the fence?

AI has an equal number of devotees and sceptics, very broadly divided along generational lines. Gen Z and Gen X are not as enamoured with AI as Millennials (those born between 1981 and 1996). A 2025 McKinsey report found Millennials to be the most active AI users. My Gen Z son says of AI, ‘I’m not asking a toaster a question.’ He also thinks AI’s insatiable thirst for energy will make it unsustainable in the longer term.

Perhaps he has a point, but I think every industry will somehow be impacted, disrupted and – perhaps – subsumed by AI. And as ever, with transformational new technologies, mistakes will be made as organisations balance risk versus advantage.

How, in this ‘Temple of the New AI,’ do organisations find treasure… without falling into a horrible trap?

How to govern your organisation’s use of AI

While compliance with regulations will be a key factor for many organisations, protecting the business and brand reputation may be an even bigger concern. The key will be making sure AI is used in an efficient, ethical and responsible way.

The most obvious solution is to approach AI risk and governance with a clear framework covering accountability, policies, ongoing monitoring, security, training and so on. Organisations already utilising AI may well have embedded robust governance. For others, here are some pointers to consider:

Strategy and risk appetite

Senior leadership needs to establish the organisation’s approach to AI: your strategy and risk appetite. Consider the benefits alongside the potential risks associated with AI and implement measures to mitigate them.

AI inventory

Create an inventory to record what AI systems are already in use across the business, the purposes they are used for, and why.
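
There’s no prescribed format for an AI inventory. As a starting point, here’s a minimal sketch of a record, with illustrative fields you would likely extend (owner, risk rating, review date and so on):

```python
from dataclasses import dataclass, field

@dataclass
class AISystemRecord:
    name: str                      # e.g. "Third-party CV screening tool"
    supplier: str                  # vendor, or "internal" if built in-house
    business_purpose: str          # what it's used for, and why
    departments: list[str] = field(default_factory=list)
    processes_personal_data: bool = False

# The inventory itself is just a list of records, kept up to date.
inventory = [
    AISystemRecord(
        name="Chat assistant",
        supplier="Example SaaS vendor",
        business_purpose="Drafting first-line customer support replies",
        departments=["Customer Service"],
        processes_personal_data=True,
    ),
]
```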

Stakeholders, accountability & responsibilities

Identify which key individuals and/or departments are likely to play a role in governing how AI is developed, customised and/or used in your organisation. Put some clear guard rails in place. Determine who is responsible and accountable for each AI system. Establish clear roles and responsibilities for AI initiatives to make sure there’s accountability for all aspects of AI governance.

Policies and guidelines

Develop appropriate policies and procedures, or update existing policies so people understand internal standards, permitted usage and so on.

Training and AI literacy

Provide appropriate training. Consider if this needs to be role-specific, and factor in ongoing training in this rapidly evolving AI world. Remember, the EU AI Act includes a requirement for providers and deployers of AI systems to make sure their staff have sufficient levels of AI literacy.

If you don’t know where to start, Use AI Securely provide a pretty sound free introductory course.

AI risk assessments

Develop and implement a clear process for identifying potential vulnerabilities and risks associated with each AI system.

For many organisations which are not developing AI systems themselves, this will mean a robust method for assessing the risks associated with third-party AI tools, and how you intend to use those tools. Embedding an appropriate due diligence process when looking to adopt (and perhaps customise) third-party AI SaaS solutions is crucial.

Clearly not all AI systems or tools will pose the same level of risk, so having a risk-based methodology to prioritise risk will also prove invaluable.
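
One simple approach, offered here as an illustration rather than a prescribed method, is likelihood-times-impact scoring, which produces an ordered queue of tools to scrutinise first:

```python
def risk_score(likelihood: int, impact: int) -> int:
    """Classic likelihood x impact scoring, each rated 1 (low) to 5 (high)."""
    return likelihood * impact

# Illustrative tools and ratings, to be replaced with your own assessments.
tools = {
    "CV screening tool": risk_score(likelihood=4, impact=5),      # 20: review first
    "Meeting transcription": risk_score(likelihood=3, impact=3),  # 9
    "Code autocomplete": risk_score(likelihood=2, impact=2),      # 4
}

# Review the highest-scoring tools first.
for name, score in sorted(tools.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {score}")
```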

Information security

Appropriate security measures are of critical importance. Vulnerabilities in AI models can be exploited, input data can be manipulated, malicious attacks can target training datasets, unauthorised parties may access sensitive, personal and/or confidential data. Data can be leaked via third party AI solutions.

We also need to be mindful of how online criminals exploit AI to create ever more sophisticated and advanced malware. For example, to automate phishing attacks. On this point, the UK Government has published a voluntary AI cyber security code of practice.

Transparency and explainability

Are you being open and up front about your use of AI? Organisations need to be transparent about how AI is being used, especially when it impacts on individuals or makes decisions that affect them. A clear example here is AI tools being used for recruitment – is it clear to job seekers you’re using AI? Are they being fairly treated? Using AI Tools in Recruitment

Alongside this there’s the crucial ‘explainability’ piece – the ability to understand and interpret the decision-making processes of artificial intelligence systems.

Audits and monitoring

Implement a method for ongoing monitoring of the AI systems and/or AI tools you are using.

Legal and regulatory compliance

Keep up to date with the latest developments, and with how to comply with relevant laws and regulations in the different jurisdictions where you operate.

My colleague Simon and I recently completed the IAPP AI Governance Professional training, led by Oliver Patel. I’d highly recommend his Substack which is packed with tips and detailed information on how to approach AI Governance.

Current regulatory landscape

European Union

The EU AI Act entered into force in August 2024, and is coming into effect in stages. Some people fear this comprehensive and strict approach will hold back innovation and leave Europe languishing behind the rest of the world. It’s interesting the European Commission is considering pausing its entry into application, which DLA Piper has written about here.

On 2nd February this year, rules came into effect in relation to AI literacy requirements, the definition of an AI system, and a limited number of prohibited AI use cases which the EU determines pose an unacceptable risk.

Like GDPR, the AI Act has extra-territorial scope, meaning it applies to organisations based outside the EU (as well as inside) where they place AI products on the market or put them into service in the EU, and/or where outputs produced by AI applications are used by people within the EU. We’ve already seen how EU regulation has led organisations like Meta and Google to exclude the EU from use of their new AI products for fear of enforcement under the Act.

The European Commission has published guidelines alongside prohibited practices coming into effect. Guidelines on Prohibited Practices & Guidelines on Definition of AI System

UK

For the time being it looks unlikely the UK will adopt comprehensive EU-style regulation. A ‘principles-based framework’ is favoured, for sector-specific regulators to interpret and apply. Specific legislation for those developing the most powerful AI models looks the most likely direction of travel.

The Information Commissioner’s Office published a new AI and biometrics strategy on 5th June, with a focus on promoting compliance with data protection law and preventing harm, while also enabling innovation. Further ICO activity will include:

Developing a statutory code of practice for organisations developing or deploying AI
Reviewing the use of automated decision making (ADM) systems for recruitment purposes
Conducting audits and producing guidance on the police’s use of facial recognition technology (FRT)
Setting clear expectations to protect people’s personal information when used to train generative AI foundation models
Scrutinising emerging AI risks and trends.

The soon-to-be-enacted Data (Use and Access) Act will, to a degree, relax the current strict rules on automated decision-making which produces legal or similarly significant effects. The ICO, for its part, is committed to producing updated guidance on ADM and profiling by autumn 2025. DUA Act: 15 key changes ahead

Other jurisdictions are also implementing or developing a regulatory approach to AI, and it’s worth checking the IAPP Global AI Regulation Tracker.

AI is here. It’s transformative and far-reaching. To take the fullest advantage of AI’s possibilities, keeping abreast of developments along with agile and effective AI governance will be key.

Are you collecting more data than you need?

Five good reasons to apply the data minimisation principle

How often when completing an online form, or downloading a new app, do you think, “why do they need this information?”

I often do. I get frustrated when I can’t fathom out why certain fields are mandatory, like phone number or date of birth. Okay, so I work in data protection. I’m highly tuned to being affronted by this stuff, but I doubt I’m alone.

Sometimes we’re forced to grit our teeth and soldier on (standing in the rain, desperately trying to download yet another parking app, forced to hand over our vital details).

But in other situations we can choose not to engage with companies because they ask for too much of our personal information, or immediately delete an app for the same reason. Alternatively, we may be tempted to provide bogus details, where we can’t see any reasonable purpose for the request (or suspect a phone number will purely be used to badger us).

Faced with yet another data-hungry form this week, I began thinking (again) about the benefits of minimising the personal information collected.

Yes, it’s a core data protection principle under GDPR / UK GDPR, meaning organisations are legally required to collect only personal data which is adequate, relevant and limited to what’s necessary for the purpose(s) it’s being used for. But it’s also a sound approach for other reasons….

Here are five more reasons for streamlining data collection…

1. Build trust

If people think you’re collecting more information than necessary, they may be sceptical, not trust you, and decide to disengage. People are more likely to put their trust in organisations who collect data responsibly.

2. Reduce data breach risks

Minimising personal data mitigates the severity of any impact if you suffer a data breach. This could not only reduce the risk for those affected but lessen the negative impact on your organisation. It could even be the difference between a reportable breach and one that’s unlikely to pose a risk. A data breach of purely names and email addresses won’t routinely be as serious as a breach which also includes telephone numbers, dates of birth, postal addresses and so on.

3. Improve accuracy

Data minimisation can improve the quality of your data, reducing the risk of holding outdated and inaccurate information. This in turn helps to meet another data protection principle: personal data must be accurate and kept up to date.

4. Prevent other uses

If you collect more personal details than you need, you’re leaving the door open to employees (perhaps unwittingly) deciding to use it for other, unintended or unauthorised purposes, or a purpose which you haven’t been transparent about, which may lead to complaints or regulatory action. And yes, this helps to meet another principle: purpose limitation.

5. Save time and complexity of privacy rights requests

Minimising the data held can make the process of handling privacy rights requests more efficient. For example, there’s less data to sift through when responding to a DSAR, or less data to erase. It also saves awkward questions like, “why do you have this information?”

These points all apply more broadly than simply to information collected via online forms or apps. The principle of data minimisation applies to all the personal data an organisation collects, uses and stores. But as a starter for ten? Why not streamline those data collection forms – they’re a window into your attitude to people’s information, and the first thing your potential customers see.
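
To make that concrete, here’s a minimal sketch of a streamlined sign-up form schema, where every field must justify itself against the purpose or be dropped. The fields and justifications are illustrative:

```python
# Each field must carry its own justification; anything without a clear
# purpose is made optional or removed entirely (e.g. date of birth here).
SIGNUP_FORM = {
    "email": {"required": True,  "why": "needed to create the account"},
    "name":  {"required": True,  "why": "needed to address the customer"},
    "phone": {"required": False, "why": "not needed to deliver the service"},
}

def missing_mandatory_fields(submission: dict) -> list[str]:
    """Validate a submission against only the genuinely necessary fields."""
    return [f for f, spec in SIGNUP_FORM.items()
            if spec["required"] and not submission.get(f)]

# e.g. a submission without a phone number is perfectly fine
assert missing_mandatory_fields({"email": "a@example.com", "name": "Ana"}) == []
```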