Managing data deletion, destruction and anonymisation

How to keep what you need and get rid of what you don't

Clearing out personal data your business no longer needs is a simple concept, but in practice it can be tricky to achieve. It throws up key considerations, such as whether to anonymise the data, or how to make sure it's deleted or securely destroyed. Let’s take a look at the key considerations and how to implement a robust plan.

Data retention requirements and risks

Data protection law stipulates organisations must only keep personal data for as long as necessary, and only for the purposes they have specified. There are risks associated with both keeping personal data for too long and not keeping it long enough. These risks include, but are not limited to:

  • causing the impact of a personal data breach to be significantly worse – i.e. it involves personal data which an organisation has no justification for keeping. Regulatory enforcement action could be more severe and the damage to an organisation’s reputation worse. This also raises the risk of class actions or individual compensation claims.
  • falling foul of relevant laws by failing to keep records for legally-defined periods.
  • an inability to respond to complaints, litigation or regulatory enforcement because data necessary to meet contractual or commercial terms has not been kept.

Data retention policy and schedule

To manage this legal obligation successfully, you’ll need to start with an up-to-date data retention policy and schedule. These should clearly identify which types of personal data your business processes, for what purposes, how long each should typically be kept and under what circumstances you might need to hold it for longer.
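Some teams find it useful to hold the retention schedule in machine-readable form, so that retention checks can later be automated. A minimal sketch of the idea is below; the data categories and periods are entirely hypothetical, and real periods must come from your own legal and business analysis.

```python
from datetime import date, timedelta
from typing import Optional

# Hypothetical retention schedule: data category -> retention period in days.
# Illustrative values only; take real periods from your retention schedule.
RETENTION_SCHEDULE = {
    "job_applications": 365,         # e.g. 12 months after recruitment ends
    "payroll_records": 6 * 365,      # e.g. 6 years for tax purposes
    "marketing_responses": 2 * 365,  # e.g. 2 years after last contact
}

def retention_expired(category: str, last_needed: date,
                      today: Optional[date] = None) -> bool:
    """True if the agreed retention period for this category has passed."""
    today = today or date.today()
    period = timedelta(days=RETENTION_SCHEDULE[category])
    return today > last_needed + period
```

Holding the schedule as data rather than only as a policy document makes it easier to flag records due for review automatically.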

If your data retention policy or schedule is lacking, first focus on bringing these up to scratch. Our Data Retention Guidance has some useful templates.

5 Key steps when the retention period is reached

When an agreed retention period is reached (as per your retention schedule), we’d recommend taking the following steps:

  1. Identify the relevant records which have reached their retention period
  2. Notify the relevant business owner to confirm the data is no longer needed
  3. Consider any changes in circumstances which may require longer retention of the data
  4. Make a decision on what happens to the data
  5. Document the decision and keep evidence of the action
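The final step, documenting the decision, is the one most often skipped. As a sketch of what such an evidence trail might look like in a system of record (all field names here are illustrative, not a prescribed format):

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical disposal log entry covering steps 4 and 5: what was decided,
# who confirmed it, and when. Field names are illustrative only.
@dataclass
class DisposalRecord:
    record_id: str
    category: str
    decision: str      # e.g. "delete", "anonymise", "destroy" or "retain"
    approved_by: str   # the business owner who confirmed the decision
    reason: str
    decided_at: datetime = field(default_factory=datetime.utcnow)

disposal_log: list = []

def record_decision(record_id, category, decision, approved_by, reason):
    """Append an auditable record of the disposal decision."""
    entry = DisposalRecord(record_id, category, decision, approved_by, reason)
    disposal_log.append(entry)
    return entry
```

Keeping this kind of log means you can evidence, later, that data was disposed of deliberately and in line with your schedule.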

Making the right decision when the retention period is reached

There are different approaches an organisation can take when the data retention period is reached, such as:

  • Delete it – usually the default option
  • Anonymise it
  • Securely destroy it – for physical records, such as HR files

Deletion of records might seem the obvious choice, and it’s often the best one too, but take care how you delete data. Sometimes deleting whole records can affect key processes on your systems such as reporting, algorithms and other programs. Check with your IT colleagues first.


Most organisations want to extract increasing amounts of information and value from their digital assets. In some situations, it can be helpful to remove the personal identifiers so you can keep the data that remains after the retention period has been reached. For example:

  • You might want to continue to provide management information or historical analysis, which you can do in an anonymised form. This is quite common
  • If you have data of historic marketing campaign responders, you may wish to keep certain non-personal campaign data in an anonymised form for reporting or analytical purposes, such as response volumes by segment, phasing of responses, and so on
  • If you hold records of job applicants you may wish to keep certain demographics (such as gender or diversity information) in an anonymised form. This might support your equal opportunities endeavours

To be clear, anonymisation is the process of removing ALL information which could be used to identify a living person, so the data that remains can no longer be attributed back to any unique individuals.

Once these personal identifiers are deleted, data protection laws do not apply to the anonymised information that remains, so you may continue to hold it. But you have to make sure it is truly anonymised.

The ICO stresses you should be careful when attempting to anonymise information. For the information to be truly anonymised, you must not be able to re-identify individuals.  If at any point reasonably available means could be used to re-identify the individuals, the data will not have been effectively anonymised, but will have merely been pseudonymised. This means it should still be treated as personal data.

Whilst pseudonymising data does reduce the risks to data subjects, in the context of retention it is not sufficient for personal data you no longer need to keep.
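The distinction can be made concrete with a small sketch, using entirely made-up records. Hashing an identifier is pseudonymisation: the output still corresponds one-to-one to an individual, so it remains personal data. Aggregation, by contrast, can produce genuinely anonymous statistics with no route back to anyone.

```python
import hashlib

# Hypothetical campaign records (made-up data for illustration).
records = [
    {"email": "ann@example.com", "segment": "A", "responded": True},
    {"email": "bob@example.com", "segment": "A", "responded": False},
    {"email": "cat@example.com", "segment": "B", "responded": True},
]

# Pseudonymisation: the identifier is replaced with a hash, but the mapping
# could be recreated by reasonably available means - STILL personal data.
pseudonymised = [
    {**r, "email": hashlib.sha256(r["email"].encode()).hexdigest()}
    for r in records
]

# Anonymisation by aggregation: only counts per segment survive, with no
# link back to any individual.
anonymised = {}
for r in records:
    seg = anonymised.setdefault(r["segment"], {"responses": 0, "total": 0})
    seg["total"] += 1
    seg["responses"] += r["responded"]
```

Note that even aggregated data can fail to be anonymous if the groups are very small, so the "reasonably available means" test above still applies.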

How to manage deletion

There are software methods of deleting data, which may involve removing whole records from a dataset or overwriting them – for example, using zeros and ones to overwrite the personal identifiers in the data.

Once the personal identifiers are overwritten, that data will be rendered unrecoverable, and therefore it’s no longer classed as personal data.
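As a simplified sketch of the overwriting idea at the application level (field names are hypothetical, and note this illustrates the concept only: securely erasing data at the storage level is a job for specialist tooling):

```python
# Hypothetical set of fields treated as personal identifiers.
IDENTIFIER_FIELDS = {"name", "email", "phone", "address"}

def overwrite_identifiers(record: dict) -> dict:
    """Replace identifier values with a fixed pattern of zeros,
    leaving non-identifying fields intact."""
    return {
        key: "0" * len(str(value)) if key in IDENTIFIER_FIELDS else value
        for key, value in record.items()
    }
```

The remaining non-identifying fields can then be retained for reporting, provided they cannot be combined to re-identify anyone.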

This deletion process should include backup copies of data. Whilst personal data may be instantly deleted from live systems, it may still remain within the backup environment until it is overwritten.

If the backup data cannot be immediately overwritten it must be put ‘beyond use’, i.e. you must make sure the data is not used for any other purpose and is simply held on your systems until it’s replaced, in line with an established schedule.

Examples of where data may be put ‘beyond use’ are:

  • When information should have been deleted but has not yet been overwritten
  • Where information should have been deleted but it is not possible to delete this information without also deleting other information held in the same batch

The ICO (for example) will be satisfied that information is ‘beyond use’ if the data controller:

  • is not able, or will not attempt, to use the personal data to inform any decision about any individual or in a way that affects them;
  • does not give any other organisation access to the personal data;
  • has in place appropriate technical and organisational security; and
  • commits to permanently deleting the information if, or when, this becomes possible.

Destruction of physical records

Destruction is the final action for about 95% of most organisations’ physical records. Physical destruction may include shredding, pulping or burning paper records.

Destruction is likely to be the best course of action for physical records when the organisation no longer needs to keep the data, and when it does not need to hold data in an anonymised format.

Controllers are accountable for the way personal data is processed and consequently, the disposal decision should be documented in a disposal schedule.

Many organisations use other organisations to manage their disposal or destruction of physical records. There are benefits of using third parties, such as reducing in-house storage costs.

Remember, third parties providing this kind of service will be regarded as a data processor, therefore you’ll need to make sure an appropriate contract is in place which includes the usual data protection clauses.

Destruction may be carried out remotely following an agreed process. For instance, a processor might provide regular notifications of batches due to be destroyed in line with documented retention periods.

Don’t forget unstructured data!

Retention periods will also apply to unstructured data which contains personal identifiers. The most common examples are electronic communications records, such as emails, instant messages, call recordings and so on.

As you can imagine, unstructured data records present some real challenges. You’ll need to be able to review the records to find any personal data stored there, so it can be deleted in line with your retention schedules, or for an erasure request.

Depending on the size of your organisation, you may need to use specialist software tools to perform content analysis of unstructured data.
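At its simplest, content analysis means scanning free text for patterns that look like personal identifiers. A toy sketch of the idea, using two illustrative patterns (real discovery tools use far richer pattern sets plus context checks to cut false positives):

```python
import re

# Illustrative patterns for two common identifier types in unstructured text.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "uk_phone": re.compile(r"\b(?:0|\+44)\d{9,10}\b"),
}

def find_personal_data(text: str) -> dict:
    """Return matches per identifier type found in a block of text."""
    return {
        label: pattern.findall(text)
        for label, pattern in PATTERNS.items()
        if pattern.findall(text)
    }
```

Flagged documents can then be routed for human review before deletion, rather than deleted automatically.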

In summary, whilst data retention as a concept appears straightforward, it does require some planning, clearly assigned responsibilities for implementing retention periods, and the technical means to do so effectively.

UK Data Protection and Digital Information Bill Summary

Could the DPDI Bill pass into law before a general election?

There’s a very real possibility the UK’s Data Protection and Digital Information Bill (DPDI) may sneak through onto the statute books before a general election.

The Government’s stated aim in reforming UK data laws is to ease the burden on businesses, particularly smaller ones. GDPR is perceived by some to be overly burdensome, onerous, and at times a ‘box-ticking’ exercise. While the DPDI Bill has been welcomed in some quarters, it faces fierce criticism in others.

The Bill sets out amendments to UK GDPR, the Data Protection Act (DPA 2018) and the Privacy and Electronic Communications Regulations (PECR).

Here’s a reminder of the changes which might come into effect and what they’d mean in practice.

In our opinion here at the DPN there’s nothing massively radical about the DPDI Bill. The core data protection principles, individual privacy rights and controller/processor obligations will remain the same. Yes, there’ll still be a need for detailed contracts between clients and their suppliers. How to manage suppliers and service providers

For many larger organisations which operate across EU / global markets, as well as the UK, it could be mostly business as usual with GDPR remaining the benchmark.

There’s unlikely to be a huge impact on most small to medium sized businesses whose processing is not particularly large scale or sensitive. Existing law already provides extra flexibility for these SMEs; for example, they may not need to appoint a Data Protection Officer, or to create and maintain a Record of Processing Activities.

For others, depending on their size, the nature of their business and their operational structure, it may necessitate changes and potential efficiencies. Remember, nothing is set in stone yet!

8 key areas of data reform

The Bill is over 200 pages long, so we’ve selected some broad top-level points, summarising what’s proposed and our take on these potential changes.

1. Records of Processing Activities (RoPA)

Currently organisations (both controllers and processors) are required to keep a RoPA, and GDPR sets out the requirements for what information must be included in your RoPA. There’s a limited exemption for organisations with fewer than 250 employees where the processing is not high risk and does not involve special category or criminal convictions data.

What’s proposed?

The requirement to have a RoPA as stipulated under GDPR will be removed. Organisations which carry out ‘high risk’ processing would be required to keep ‘appropriate records’. Other organisations would still be under an accountability obligation to make sure appropriate measures are in place to comply with data protection law and protect personal data.

Our take on scrapping the RoPA requirement

A RoPA is a valuable business asset, to identify and keep track of what data you have and where, what it’s used for, your lawful basis, any international data transfers and so on. It’s fundamental to many other data protection processes. It can prove invaluable in getting to grips with the full scope of your processing, identifying data risks, assisting with transparency requirements (e.g. privacy notices), fulfilling individual privacy rights requests and handling data breaches.

However, we know from DPN audience surveys creating and maintaining a RoPA can be a real headache for organisations. Many say their current records don’t fully meet GDPR requirements or ICO expectations. For some businesses, creating the RoPA can lead to duplication of effort and many businesses have taken a risk-based approach, focusing on their main risk areas.

We wouldn’t recommend ditching any hard work you may have already done, because you can still gain benefit from it. If your RoPA isn’t complete, these proposed changes could take the pressure off somewhat.

For smaller businesses (below the current RoPA threshold) we would always recommend keeping some form of record of your activities, and our advice wouldn’t change. Listen back to our webinar on data discovery and record keeping.

2. Data Protection Risk Assessments

Currently organisations are required to conduct a Data Protection Impact Assessment (DPIA) for ‘high-risk’ processing activities. The ICO and EU regulators provide a list of examples of when a DPIA must be conducted (and when it might be a good idea). EU/UK GDPR sets out what criteria should be included in these assessments.

What’s proposed?

The specific requirements relating to a DPIA will be removed. Organisations will need to conduct risk assessments for ‘high risk’ processing, but will have more flexibility and won’t be tied to specific DPIA requirements or templates.

Our take on scrapping DPIAs

Increased flexibility for organisations regarding when and how they conduct risk assessments should be welcomed. However, if you currently have an effective risk screening process and DPIA template which works for your organisation, and many do, you may decide there’s no reason to ‘fix something that’s not broken’. Also, don’t forget you may still be under an obligation to conduct DPIAs if subject to EU GDPR.

DPIAs are a well-established method to identify and mitigate privacy risks prior to the launch of any project involving personal data. We recognise some organisations may choose to benefit from this new flexibility and look for efficiencies by adopting a streamlined and perhaps bespoke process for risk assessments. Quick guide to DPIAs

3. Senior Responsible Individual for data protection

Currently some (but certainly not all) organisations fall within the mandatory requirement to appoint a Data Protection Officer. Others have voluntarily chosen to appoint one. It’s worth noting a DPO’s position within the business, responsibilities and tasks are mandated under EU/UK GDPR.

What’s proposed?

The requirement to appoint a DPO will be scrapped. Public authorities and other organisations carrying out ‘high risk’ processing will be required to appoint a Senior Responsible Individual (SRI) – someone accountable in the business for data protection compliance. This individual must be a member of senior management.

The proposed changes are also likely to impact on what ‘accountability’ looks like, and what businesses would be expected to have in place to demonstrate their compliance with data protection law. Currently the ICO has a detailed accountability framework. We understand a new ‘risk-based accountability framework’ will be introduced, requiring organisations to have in place a Privacy Management Programme, with flexibility to tailor this to suit the scale and nature of the organisation’s specific processing activities. It’s thought likely any existing accountability measures in place to comply with GDPR would not have to be changed.

Our take on DPO changes

There’s been plenty of confusion since GDPR came into force in 2018 about which organisations are required to appoint a DPO. Some businesses have felt they needed to appoint one when in fact they didn’t need to. Others have appointed DPOs virtually in name only, without fully appreciating the legal obligations relating to the role. DPO myth buster

This change could give organisations more flexibility, but equally it could muddy the waters and potentially lead to conflicts of interest. The independent advice a DPO should give may be lost.

4. Vexatious Data Subject Access Requests

Currently requests under the Right of Access (aka DSARs/SARs) can be refused, in part or in full, if they are judged to be ‘manifestly unfounded’ or ‘manifestly excessive’.

What’s proposed?

A concept of ‘vexatious or excessive’ will replace ‘manifestly unfounded or excessive’. Controllers will be permitted to take into account whether a request is intended to cause distress, is made in bad faith or is an abuse of process.

Our take on vexatious DSARs

Anecdotally we know of many cases where DSARs are being seen to be ‘weaponised’; not submitted to benefit the individual, but used primarily as a means to cause problems for an organisation. We welcome changes giving businesses increased grounds to decline inappropriate requests, where it’s clear the individual is not genuinely making the request because they want a copy of their personal data. DPN DSAR Guide

5. Recognised Legitimate Interests

Currently organisations can rely on the lawful basis of legitimate interests when the processing is considered to be necessary and balanced against the interests, rights and freedoms of individuals. There’s a requirement to conduct a balancing test; a Legitimate Interests Assessment (LIA).

What’s proposed?

The concept of ‘recognised’ legitimate interests is planned, where there will be an exemption from the requirement to conduct a balancing test (LIA) in certain situations. These ‘recognised’ legitimate interests cover purposes such as national security, public security, defence, emergencies, preventing crime, safeguarding and democratic engagement.

The Bill also includes other examples where legitimate interests could be appropriate, but would require a balancing test. Examples include direct marketing, intra-group transmission for administrative purposes, and security of network and information systems (although we are a little surprised the latter didn’t make it onto the list of recognised legitimate interests).

Our take on ‘recognised’ legitimate interests

We welcome this change, as it makes sense to reduce the paperwork required for activities which are straightforward or very clearly in the interests of both the organisation and individuals.

The fact direct marketing may be carried out as a legitimate interest is not new; it’s already in GDPR Recital 47, and it’s reinforced by its presence in the Bill. This is a welcome clarification, but we would caution that under the UK’s Privacy and Electronic Communications Regulations (PECR) there will still be certain circumstances where consent is required. Quick Guide to GDPR and PECR

6. Extension of the ‘soft-opt-in’ exemption under PECR for charities & other not-for-profits

Currently under PECR it’s a requirement to have consent to send electronic marketing, for example email or text marketing messages, unless you can rely on and meet the requirements of the so-called ‘soft opt-in’ exemption. This exemption is only available where the data is used for commercial purposes. Its use by charities is limited to the context of a sale, for example selling goods in a charity shop, and can’t, for example, be used in the context of donations.

What’s proposed?

The soft opt-in exemption will be extended to non-commercial organisations, covering where the direct marketing is:

  • solely for the purpose of furthering charitable, political or other non-commercial objectives (i.e. including donations!)
  • where the contact details have been obtained during the course of a recipient expressing an interest or providing support, and
  • where the recipient is given a clear and simple means of objecting to direct marketing at the point their details were collected, and in every subsequent communication.

Our take on extending use of soft opt-in

We welcome the move to allow charities to take advantage of an exemption which has been available for commercial purposes for years. Clearly, it will be for each charity to decide whether they stick with consent or change to soft opt-in. It can only be used going forward – it’s not an opportunity to re-contact those who didn’t give consent or opted-out in the past!

Charities will have to carefully think through the pros and cons of moving to soft opt-in and would be wise to check if their CRM systems could store multiple permission statuses for legacy data alongside new data gathered under soft opt-in. What could the marketing soft opt-in mean for charities?

7. Cookies and similar technologies

Currently informed consent is required under PECR for all cookies and similar technologies deployed onto a user’s device. There is a limited exemption for ‘strictly necessary’ cookies.

What’s proposed?

There are provisions to expand the categories of cookies which don’t require consent, for example website analytics. There’s also a desire to reduce or eliminate the need for cookie pop-ups, but it’s not yet clear how exactly this will be achieved.

Our take on cookies

Many businesses would welcome easing the existing requirements, although we anticipate few websites will, in reality, be able to compliantly get rid of cookie banners, unless radical changes are made! We look forward to clarification on exactly how the proposed changes might work in practice to benefit businesses and the public.

8. Increased fines under PECR

Currently, fines for violations under UK PECR are capped at £500,000.

What’s proposed?

Bringing the level of maximum fines in line with UK GDPR, meaning the ICO could issue fines of up to circa £17 million, or 4% of a business’s global turnover.

Our take on increased PECR fines

The ICO tends to take a proportionate approach to enforcement, and we envisage substantial fines would be reserved for spammers and rogue telemarketing businesses who flagrantly disregard the rules. If this goes some way to deterring bad operators and protecting the public, this could be a good thing.

Other DPDI Bill points worth noting

Scientific research

The Bill includes specific changes in relation to using personal data for scientific research, and what qualifies as scientific research. (This area could be an article in itself!)

International data transfers

The Bill doesn’t propose any significant changes to the international data transfer regime. It makes it clear mechanisms entered into before the Bill takes effect will continue to be valid. At last, some welcome news for all those grappling with the UK IDTA or the EU’s SCCs with UK addendum! International Data Transfers Guide


The above just touches on key proposals; as we said, it’s a very lengthy document! In our view the UK’s Data Protection and Digital Information Bill marks a significant but not giant step away from GDPR. There are good reasons why the Government is keen not to diverge too far. It does not want to risk the current European Commission ‘adequacy decision’ for the UK being overturned. This adequacy decision allows for the free flow of personal data between the EU and UK, and there could be a significant negative impact for many businesses if UK adequacy is revoked. We don’t know yet if the European Commission will view the Bill as a step too far.

What next?

The DPDI’s third reading is set for mid June, and there’s a possibility it will gain Royal Assent – i.e. be enacted into law before Parliament’s Summer Recess (which starts on 24 July).

Tackling AI and data protection

Applying key data protection principles to AI models

The growth of AI continues at a tremendous rate. While many people are jumping in with both feet, others have growing concerns about the implications for individuals and their personal data.

Generative AI and Large Language Models

Generative artificial intelligence relates to algorithms, such as ChatGPT, which can be used to create new content like text, images, video, audio, code and so on. Recent breakthroughs in generative AI have huge potential to affect our whole approach to creating content.

ChatGPT, for instance, relies on a type of machine learning called Large Language Models (LLMs). LLMs are usually VERY large deep neural networks, trained on giant datasets such as published webpages. Recent technology advances have enabled LLMs to become much faster and more accurate.

What are the main concerns?

With increased capabilities and the growth in adoption of AI come existing and emergent risks. We are at a tipping point, where governments and industry alike are keen to realise the benefits to drive growth. The public too are inspired to try out AI models for themselves.

There’s an obvious risk of jobs being displaced, as certain tasks carried out by humans are replaced by AI technologies. Concerns recognised in the technical report accompanying GPT-4 include:

  • Generating inaccurate information
  • Harmful advice or buggy code
  • The proliferation of weapons
  • Risks to privacy and cyber security

Others fear the risks posed when training models using content which could be inaccurate, toxic or biased – not to mention illegally sourced!

The full scope and impact of these new technologies is not yet known, and new risks continue to emerge. But there are some questions that need to be answered sooner rather than later, such as:

  • What kinds of problems are these models best capable of solving?
  • What datasets should (and should not) be used to create and train generative AI models?
  • What approaches and controls are required to protect the privacy of individuals?
  • What are the main data protection concerns?

Data inputs

The datasets used to train generative AI systems are likely to contain personal data that was not lawfully obtained. In many AI models, the data may have been obtained by “scraping” (the automated gathering of data online), which often conflicts with core privacy principles.

Certain information may have been used without consideration of intellectual property rights, where the owners have not been approached nor given their consent for use.

The Italian Data Protection Authority (Garante) blocked ChatGPT, citing its illegal collection of data and the absence of systems to verify the age of minors. Some observers have pointed out these concerns are broadly similar to why Clearview AI received an enforcement notice.

Data outputs

AI not only ingests personal data, but may also generate it. Algorithms can produce new data that may unexpectedly expose personal details, leaving individuals with limited control over their data.

There are many other concerns, such as transparency, algorithmic bias, inaccurate predictions and the risk of discrimination. Fundamentally, there are concerns that appropriate accountability for AI is often lacking.

Key considerations for organisations looking to adopt AI

We need to understand what people across the business are already doing with AI, or planning to do. Get clarity about any personal data they are using, particularly any sensitive or special category data. Make sure they are aware of the potential risks and know what questions to ask, rather than diving straight in.

We suggest you start by talking to business leaders and their teams to identify emerging uses of AI across your business. It’s a good idea to carry out a Data Protection Impact Assessment (DPIA) to assess privacy risks and identify proportionate privacy measures.

Rather than adopting huge ‘off-the-shelf’ generative AI models like ChatGPT (and what may come next), businesses may consider adopting smaller, more specialised AI models trained on the most relevant, compliantly gathered datasets.

Differing regulatory approaches

EU – The EU has adopted the world’s first Artificial Intelligence Act. Its aim is to ban unacceptable use of artificial intelligence and introduce specific rules for AI systems proportionate to the risk they pose. It’s taking a ‘harm and risk’ approach which will impose extensive requirements on those developing and deploying high-risk AI systems, yet be lighter touch for low risk/low harm AI applications.
Some have questioned whether existing data protection and privacy laws are appropriate for addressing AI risk, which can increase privacy problems and add new complexities to them. IAPP EU AI Cheat Sheet

UK – Despite calls for targeted regulation, the UK has no EU-equivalent legislation and currently looks unlikely to get one in the foreseeable future. The Government says it’s keen not to rush in and legislate on AI, fearing specific rules introduced too swiftly could quickly become outdated or ineffective. For the time being the UK is sticking to a non-statutory principles-based approach, focusing on the following:

  • Safety, security, and robustness;
  • Appropriate transparency and explainability;
  • Fairness;
  • Accountability and governance; and
  • Contestability and redress.

Key regulators such as the Information Commissioner’s Office (ICO), the Financial Conduct Authority (FCA) and others are being asked to take the lead. Alongside this, a new advisory service, the AI and Digital Hub, has been launched.

There’s a recognition advanced General Purpose AI may require binding rules. The government’s approach is set out in its response to the consultation on last year’s AI Regulation White Paper. ICO guidance can be found here: Guidance on AI and data protection. Also see Regulating AI: The ICO’s strategic approach April 2024

US – In the US a number of AI guidelines and frameworks have been published. The National AI Research and Development Strategic Plan was updated in 2023. This stresses a co-ordinated approach to international collaboration in AI research.

As for the rest of the world, the IAPP has helpfully published a Global AI Legislation Tracker 

Wherever you operate, it is vital data protection professionals seek to understand how their organisations are planning to use AI, now and in the future. Evaluate how the models work and assess any data protection and privacy risks before adopting them.

Data Protection Officers Myth Buster

March 2024

Why we don't ALL need a DPO!

Most small organisations, and many medium-sized businesses, don’t have to appoint a Data Protection Officer. This is only a mandatory requirement under GDPR, and its British spin-off UK GDPR, if your organisation’s activities meet certain criteria.

However, this doesn’t mean you can’t voluntarily choose to appoint a DPO. It is worth bearing in mind, though, that the role of a Data Protection Officer is clearly defined in law. EU/UK GDPR sets out the position of a DPO, the specific tasks they’re responsible for, and how the organisation has a duty to support the DPO to fulfil their responsibilities.

In the UK there are controversial plans to remove the role from data protection legislation. Whether this comes into effect all depends on the progress of the UK Data Protection and Digital Information Bill. I’ll come onto this later.

The DPO Confusion!

I believe GDPR (perhaps inadvertently, through media coverage and elsewhere) created a degree of confusion about who needed a DPO and what the role actually entails.

It led many businesses to voluntarily appoint one, thinking they really should. It led clients to include ‘do you have a DPO?’ in their due diligence questionnaires. Suppliers to think, ‘oh we better have one.’

Some organisations understood the DPO requirements, others perhaps less so. Many will have correctly informed the ICO (or relevant EU regulator) who their DPO is, others won’t.

Some DPOs will be striving to fulfil their designated tasks, others won’t have the resources to do this, some may be blissfully unaware of the legal obligations their role carries with it.

When is it currently mandatory to have a DPO?

The law tells us you NEED to appoint a DPO if you’re a Controller or a Processor and any of the following apply:

  • you’re a public authority or body (except for courts acting in their judicial capacity); or
  • your core activities require large-scale, regular and systematic monitoring of individuals (for example, online behaviour tracking); or
  • your core activities consist of large-scale processing of special categories of data or data relating to criminal convictions and offences.

This raises questions about what’s meant by ‘large-scale’ and what happens if you are found not to have appointed a DPO when you should have. The truth is many smaller businesses and not-for-profits don’t have to have one.

When it comes to interpreting ‘large-scale’, the European Data Protection Board Guidelines on Data Protection Officers provide some examples.

What are your current options if you don’t fall under mandatory requirements?

The ICO tells us all organisations need to have ‘sufficient staff and resources to meet the organisation’s obligations under the GDPR’. If you don’t fall under the mandatory requirement, you currently have a choice:

  • voluntarily appoint a DPO, or
  • have a team or individual responsible for overseeing data protection, in a proportionate way based on the size of your organisation and the nature of the personal data you handle.

What is the ‘position’ of the DPO?

If you appoint a DPO, UK/EU GDPR tells us they must:

  • report directly to the highest level of management
  • be given the independence and autonomy to perform their tasks
  • be given sufficient resources to be able to perform their tasks
  • be an expert in data protection
  • be involved, in a timely manner, in all issues relating to data protection.

In short, not just anybody can be your DPO.

They can be an internal or external appointment. In some cases a single DPO can be appointed to represent several organisations. They can perform other tasks, but there shouldn’t be a conflict of interests. For example, a Head of Marketing also being the DPO might be an obvious conflict.

A DPO must also be easily accessible to individuals, employees and the ICO. Their contact details should be published (e.g. in your privacy notice – this doesn’t have to include their name) and the ICO should be informed who they are.

What tasks should a DPO fulfil?

The DPO role currently has a formal set of accountabilities and duties, laid down within the GDPR.

  • Duty to inform and advise the organisation and its employees about their obligations under GDPR and other data protection laws. This includes laws in other jurisdictions which are relevant to the organisation’s operations.
  • Duty to monitor the organisation’s compliance with the GDPR and other data protection laws. This includes ensuring suitable data protection policies are in place, training staff (or overseeing this), managing data protection activities, conducting internal reviews & audits and raising awareness of data protection issues & concerns so they can be tackled effectively.
  • Duty to advise on, and to monitor data protection impact assessments (DPIAs).
  • Duty to be the first point of contact for individuals whose data is processed, and for liaison with the ICO.

In short, you can’t appoint a DPO in name only.

It’s also worth noting that if you don’t follow the advice of your DPO, you should document why you didn’t act on their recommendations. A DPO also cannot be dismissed or penalised for performing their duties.

What changes are on the cards in the UK?

The mandatory requirement to appoint a DPO is set to be dropped, IF the Data Protection and Digital Information Bill becomes law (without changes to the current draft text being made). Instead, the DPDI Bill includes a new requirement to appoint a ‘senior responsible individual’ (SRI) for data protection, who is part of the organisation’s senior management.

It irks me somewhat that the removal of this requirement is cited as a way of easing the legislative burden on small businesses. As said, many small and medium-sized businesses don’t fall under the current requirement to appoint one.

It seems this role won’t have the strict independence requirements of a DPO under GDPR, and the proposed change raises a number of questions. What happens to existing DPOs? Will they need to be appointed to senior management? Or will a member of the senior management team need to be appointed as SRI and be able to delegate tasks to the existing DPO? What about organisations who operate in Europe and need a DPO under EU GDPR?

Clarity on this would be very welcome. But it remains to be seen whether the DPDI Bill will become law.

Cookies – Consent or Pay?

March 2024

UK and EU data protection regulators are grappling with the compliance of the so-called ‘consent or pay’ model, also known as ‘pay or okay’. Put simply, this model means accessing online content or services is dependent on users either consenting to being tracked for advertising purposes (using cookies or similar technologies), or paying for access without tracking and ads.

This model – and the varying approaches to it – raises questions about whether this can be fair, and whether consent can be ‘freely given’. But it also touches on far more than data protection. It speaks to acceptable business practices, competition models, consumer protection laws, accessible credible journalism and more.

Ad-funded online content and services

‘Consent or pay’ is one of a number of solutions intended to address issues surrounding online advertising and its use of cookies. None of them, it has to be said, are perfect.

This is all coming to a head as data protection regulators in Europe and the UK push for compliance with cookie laws (e.g. PECR in the UK). For example, the UK’s ICO says that for the necessary consent to be valid, website operators must make sure it’s as easy for people to ‘Reject all’ advertising cookies as it is to ‘Accept all’. More UK companies to be targeted for non-compliant cookies

This causes a problem. As increasing numbers click ‘Reject all’, advertising revenues will take a significant hit. And advertising matters. When a US Senator asked Mark Zuckerberg how Facebook remained free, he famously and simply answered: “We run ads”.

It’s a point that can be made more broadly – we’ve all enjoyed a vast amount of free online content and services because of personalised advertising. Lots of the content and services we routinely access online are ad-funded and rely on a large percentage of users accepting cookies to target these ads. It’s why we can waste time (or relax) playing online games for free.

Online content and service providers have to pay people to create content, run websites, create apps and so on. Commercial businesses also want to turn a profit. The balance lies between the quality, value and integrity of the content they offer, and the advertising revenues which can be gained by personalised advertising.

We’ve all been tracked and served adverts as we browse the internet. Personalised ads mean we have a better chance of being shown ads for products and services which match our interests and needs. Yes, some of this activity is annoying, trades on our habits and may sometimes even be downright harmful. That isn’t to say all of it is problematic; again, this is a question of balance. Regulators have to tread a delicate line between protecting end-users without hampering business from offering us fair products, content and services.

We may not want to be tracked, but online publishers and service providers can’t be expected to provide something for nothing. Businesses aren’t under any obligation to provide us with stuff completely for free.

Which brings us back to the concept of ‘consent or pay’. This concept hit the headlines last year when Meta introduced a payment option to users of Facebook and Instagram in the EU (not in the UK), offering an ad-free experience for a fee. This is currently the subject of complaints by consumer rights groups in Europe. Meanwhile the ‘consent or pay’ approach has been adopted by some of Germany’s major newspapers, and others.

Just pay

Another option is for all content to be put behind a paywall. For example, in the UK you have to subscribe and pay to read online articles published by the Telegraph, The Times and the Spectator magazine. Often a limited number of free articles are provided before you have to pay.

Cookie-free solutions

Other cookie-less ad solutions are being rapidly developed, such as contextual advertising. You can read more about the options here: Life after cookies

But with solutions which don’t use third-party tracking cookies still in their infancy, and concerns they won’t be able to produce the same return on investment as cookie-driven advertising, there’s a need to plug the funding gap fast.

‘Consent or pay’ – compliant or not compliant?

In the UK, the ICO hasn’t decreed whether ‘consent or pay’ is a fair approach or not. It’s asked for feedback, and in doing so set out its initial ‘view’.

While stating UK data protection law doesn’t prohibit ‘consent or pay’, the Regulator says organisations must focus on people’s interests, rights and freedoms, making sure people are fully aware of their options in order to make free and informed choices. It’s worth noting that in the EU, ‘consent or pay’ is not prohibited either.

The ICO has set out four areas which need to be addressed when adopting this model, and has asked for feedback on any other factors which should be taken into account.

1. Imbalance of power

The ICO says consent for advertising will not be freely given in situations where people have little or no choice about whether to use a service or not. This could be where the provider is a public service or has a ‘position of market power’.

2. Equivalence of services

If the ad-free service bundles in other additional ‘premium’ extras, this could affect the validity of consent for the ad-funded service.

3. Appropriate fee

Consent for targeted advertising is, in the ICO’s view, unlikely to be freely given if the alternative is an “unreasonably high fee”. The Regulator is suggesting the fee should be set at a level which gives people a realistic choice between the options.

4. Privacy by design

Any consent request choices should be presented equally and fairly. The ICO says people should be given clear, understandable information about each option. Consent for advertising is unlikely to be freely given if people don’t understand how their personal information is going to be used.

Another key consideration is how people can exercise their right to withdraw their consent. The ICO reiterates it must be as easy for people to withdraw their consent as it is to give it. Organisations also need to make sure users can withdraw their consent without detriment. This may be a tricky circle to square.

In all of this there’s an important point – whilst consent must be ‘freely given’ under EU/UK data protection law, this doesn’t translate into meaning people must get content and services free too. The ‘consent or pay’ model, essentially offers a choice between pay with your data, or pay with your money.

Etienne Drouard is a Partner at Hogan Lovells (Paris) and his view is: “The very nature of consent is being offered an informed choice. ‘Pay or OK’ (‘Pay or Consent’) is, per se, a valid alternative. It requires a case-by-case and multi-disciplinary analysis. Not a ban.”

Have your say – UK ICO Call for Feedback on Consent or Pay

Time to plan ahead

Fedelma Good, Data Protection and ePrivacy Consultant, and former board member of the UK Data & Marketing Association, urges advertisers and publishers to plan ahead: “To say that online advertising is entering a period of turmoil is putting it mildly. Combining the issues of ‘consent or pay’ with Google’s cookie deprecation plans and you have an environment of uncertainty which advertisers and publishers alike will ignore at their peril. My advice to anyone reading this article is not only to track developments in these areas carefully, but perhaps more importantly to make sure you understand your own circumstances and options and plan ahead.”

Privacy and consumer rights groups

It’s clear privacy and consumer rights groups are pushing for change. Back in 2021 cookie banners were the focus, with privacy rights group noyb firing off hundreds of complaints to companies for using ‘unlawful banners’. The group developed software to recognise various types of unlawful banners and automatically generate complaints.

Max Schrems, Chair of noyb said: “A whole industry of consultants and designers develop crazy click labyrinths to ensure imaginary consent rates. Frustrating people into clicking ‘okay’ is a clear violation of the GDPR’s principles. Under the law, companies must facilitate users to express their choice and design systems fairly. Companies openly admit that only 3% of all users actually want to accept cookies, but more than 90% can be nudged into clicking the ‘agree’ button.”

Now attention has turned to ‘consent or pay’. Meta’s use of this model has led to eight consumer rights groups filing complaints with different European data protection authorities. The claims focus on concerns that Meta makes it impossible for consumers to know how the processing changes if they choose one option or another. It’s argued the choice given is meaningless.

The fundamental right to conduct business

There’s a complex balance here between people’s fundamental privacy rights and the fundamental right to conduct business. For publishers and other online services, advertising is a crucial element of conducting business. In the distant past, advertising was expensive.

As Sachiko Scheuing, European Privacy Officer at Acxiom and Co-Chairwoman of FEDMA, succinctly puts it: “Advertising used to be a privilege enjoyed by huge brands. Personalised advertisement democratised advertising to SMEs and start-ups.”

The growth of the internet and the advent of personalised advertising technologies have undoubtedly made digital advertising affordable and effective for smaller businesses and not-for-profits.

Well-established brands are more likely to be able to put up a paywall. People already trust their content, or enjoy their service and are prepared to pay. There’s a risk lesser-known brands and start-ups won’t be able to compete.

Is credible journalism under threat?

A Data Protection Officer at one premium UK publisher, who wishes to remain anonymous, fears the drive for cookie compliance risks damaging the ability to produce high quality journalism.

“In the face of unprecedented industry challenges, as more content is consumed on social media platforms, the vital ad revenues that support public interest journalism are under threat from cookie compliance, of all things. It seems like data regulators either don’t understand, or don’t care, about the damage they’re already inflicting on the news media’s ability to invest in journalism.

If publishers comply and implement “reject all” they lose ad revenue through decimated consent rates. If they fight their corner, they face enforcement action. Either way, publishers are emptying already dwindling coffers on legal fees, or buying novel consent or pay solutions.

Unless legislative change comes quickly, or the regulators realise that cookie compliance should not be an enforcement priority, local and national publishers may disappear, just at a time when trusted sources of news have never been more needed.”

Broader societal considerations

There’s a risk as more content hides behind paywalls, we’ll create a world where only those who can afford to pay will be able to access quality, trustworthy content.

‘Consent or Pay’ may be far from perfect, but it does allow people who can’t afford to pay to have equal access to content and online services, albeit they get tracked, while those who have money to spend can choose to pay and go ad-free.

If the consent or pay model fails, and cookie-less solutions fail to deliver a credible alternative, I fear more decent journalism will go completely behind paywalls, if that’s the only option to plug the funding gap.

I am in my mid-50s and can afford to pay. My son, in his late teens, can’t. I worry poor quality journalism, fake news and AI-generated dross might soon be all he and his generation will be able to access. That’s not to say there isn’t some great user-generated content out there. But it does mean having difficult and honest conversations about regulation and the right of businesses to make a profit in an age of politicised, fraudulent and bogus online content.

International Data Transfers Guide

March 2024

A top-level overview of international data transfers

There are restrictions under UK and EU data protection law when transferring personal data to organisations in other countries, and between the UK and EU.

The rules regarding restricted transfers can be an enigma to the uninitiated and their complexity has been magnified by Brexit and by an infamous 2020 European Court ruling known as ‘Schrems II’.

This guide aims to give an overview of what international data transfers are and the key data protection considerations. It does not cover all the intricacies, nor data transfers for immigration and law enforcement purposes. Also please be aware there may be specific restrictions in place under laws in other territories around the world.

As a general rule, controllers based in the UK or EU are responsible for making sure suitable measures are in place for restricted transfers to other controllers, or to processors. A processor will be responsible when they initiate the transfer, usually to a sub-processor.

Some might be thinking: what would be the impact if we just put all of this into the ‘too difficult’ tray? It’s certainly an area which many feel has become unduly complicated and an onerous paperwork exercise.

However, getting the detail right will pay off should things go wrong. For example, if a supplier you use based overseas suffers a data breach, the consequences may be more significant if you have not covered off legal requirements surrounding restricted transfers. It’s an area likely to come under regulatory scrutiny, in the event of a breach or should a complaint be raised.

What is an international data transfer?

An international data transfer refers to the act of sending or transmitting personal data from one country to another. It also covers when an organisation makes personal data available to another entity (‘third party’) located in another country; in other words, the personal data can be accessed from overseas.

There are specific rules about the transfer of personal data from a UK sender to a receiver located outside the UK (under UK GDPR) and similar transfers from EEA senders (under EU GDPR); these are known as restricted transfers. A receiver could be a separate company, public body, sole trader, partnership or other organisation.


Personal data can flow freely within the European Economic Area (EEA). A restricted transfer takes place when personal data is sent or accessible outside the EEA. Where such a transfer takes place, specific safeguards should be in place to make the transfer lawful under EU GDPR.


Similarly, under UK GDPR, a restricted transfer takes place when personal data is transmitted, sent or accessed outside the UK, and safeguards should be in place to ensure the transfer is lawful.

The reason for these rules is to protect people’s legal rights, as there’s a risk people could lose control over their personal information when it’s transferred to another country.

Examples of restricted transfers would be:

  • Sending paper or electronic documents, or any kind of record containing personal data, by email or post to another country
  • Giving a supplier based in another country access to personal data
  • Giving access to UK/EU employee data to another entity in the same corporate group, based in another country.

There are some notable exceptions:

  • Our own employees: A restricted transfer does not take place when sending personal data to someone employed by your company, or them accessing personal data from overseas. However, it does cover the sending, transmitting or making personal data available to another entity within the same corporate group, where entities operate in different countries.
  • Data in transit: Where personal data is simply routed via several other countries, but there is no intention that the data will be accessed or manipulated while it is being routed, this won’t represent a restricted transfer. ICO guidance says: “Transfer does not mean the same as transit. If personal data is just electronically routed through a non-UK country, but the transfer is actually from one UK organisation to another, then it is not a restricted transfer.”

What are the safeguards for restricted transfers?

A. Adequacy

Adequacy is when the receiving country has been judged to have a similar level of data protection standards in place to the sender country. An Adequacy Decision allows for the free flow of personal data without any additional safeguards or measures.

Transfers from the EEA
The European Commission has awarded adequacy decisions to a number of countries including the UK, Japan, New Zealand, Uruguay and Switzerland. A full list can be found on the European Commission website – Adequacy Decisions.

Therefore personal data can flow freely between EEA countries and an ‘adequate’ country. These decisions are kept under review. There are some concerns UK Government plans to reform data protection law could potentially jeopardise the UK’s current EC adequacy decision.

EU-US Data Privacy Framework: The EC adopted this framework for transfers from the EU to the US in July 2023. It allows for the free flow of personal data to organisations in the US which have self-certified and meet the principles of the DPF. A list of self-certified organisations can be found on the U.S. Department of Commerce DPF website.

Transfers from the UK
There are provisions which permit the transfer of personal data between the UK and the EEA, and to any countries which are covered by a European Commission ‘adequacy decision’ (as of January 2021). Therefore personal data can flow freely between UK and EEA and any of the countries awarded adequacy by the EC.

The UK Government has the power to make its own ‘adequacy decisions’ on countries it deems suitable for transfers from the UK. More information about UK adequacy decisions can be found here.

UK-US Data Bridge: The UK-US ‘Data Bridge’ was finalised on 21st September 2023 and went live on 12th October 2023. Like the EU-US Data Privacy Framework, organisations based in the US must self-certify to the DPF, but they must also sign up to the ‘UK extension’. Read more about the Data Bridge

B. EU Standard Contractual Clauses

In the absence of an EC adequacy decision, Standard Contractual Clauses (SCCs) can be used which the sender and the receiver of the personal data both sign up to. These comprise a number of specific contractual obligations designed to provide legal protection for personal data when transferred to ‘third countries’.

SCCs can be used for restricted transfers from the EEA to other territories (including those not covered by adequacy). The European Commission published new SCCs in 2021 which should be used for new and replacement contracts. The SCCs cover specific clauses which can be used for different types of transfer:

  • controller-to-controller
  • controller-to-processor
  • processor-to-processor
  • processor-to-controller

There’s an option for more than two parties to join and use the clauses through a docking clause. More information can be found on the European Commission website – Standard Contractual Clauses

Two points worth noting:

  • The deadline to update contracts which use the old SCCs has passed – 27th December 2022.
  • Senders in the UK cannot solely rely on EU SCCs, see the point below about the UK Addendum.

C. UK International Data Transfer Agreement (IDTA) or Addendum to EU SCCs

Senders in the UK (post Brexit) have two possible options here as a lawful tool to comply with UK GDPR when making restricted transfers.

  • The International Data Transfer Agreement, or
  • The Addendum to the new EU SCCs

ICO guidance stresses that the new EU SCCs are not valid for restricted transfers under UK GDPR on their own, but using the Addendum allows you to rely on them. In other words, the UK Addendum works to ensure the EU SCCs are fit for purpose in a UK context.

In practice, if the transfer is solely from the UK, the UK IDTA would be appropriate. If the transfer includes both UK and EU personal data, the EU SCCs with the UK Addendum would be appropriate, to cover the protection of the rights of EU as well as UK citizens.

It’s worth noting, contracts signed on or before 21 September 2022 can continue to use the old SCCs until 21 March 2024. Contracts signed after 21 September 2022 must use the IDTA or the Addendum to new EU SCC, in order to be effective. See ICO Guidance

The additional requirement for a risk assessment

The ‘Schrems II’ ruling in 2020 invalidated the EU-US Privacy Shield (predecessor of the Data Privacy Framework) and raised concerns about the use of EU SCCs to protect personal data. These concerns included the potential access to personal data by law enforcement or national security agencies in receiver countries.

As a result of this ruling there’s a requirement when using the EU SCCs or the UK IDTA to conduct a written risk assessment to determine whether personal data will be adequately protected. In the EU this is known as a Transfer Impact Assessment, and in the UK, it’s called a Transfer Risk Assessment (TRA).

The ICO has published TRA Guidance, which includes a TRA tool; a template document of questions and guidance to help businesses carry out a TRA.

D. Binding Corporate Rules (BCR)

BCRs can be used as a safeguard for transfers within companies in the same group. While some global organisations have gone down this route, completing BCRs can be incredibly onerous and time-consuming.

BCRs need to be approved by a Supervisory Authority (for example the ICO in the UK, or the CNIL in France). This has been known to take years, so many groups have chosen to use EU SCCs (with the UK Addendum if necessary) or the IDTA, in preference to going down the BCR route.

E. Other safeguards

Other safeguard measures include:

  • Approved codes of conduct
  • Approved certification mechanisms
  • Legally binding and enforceable instruments between public authorities or bodies.

What are the exemptions for restricted transfers?

It may be worth considering whether an exemption applies to your restricted transfer. These can be used in limited circumstances and include:

  • Explicit consent – the transfer is done with the explicit consent of the individual whose data is being transferred, and where they are informed of possible risks.
  • Contract – where the transfer is necessary for the performance of a contract between the individual and the organisation or for necessary pre-contractual steps.
  • Public interests – the transfer is necessary for important reasons of public interest.
  • Legal necessity – the transfer is necessary for the establishment, exercise or defence of legal claims.
  • Vital interests – the transfer is necessary to protect people’s vital interests (i.e. in a critical life or death situation) where the individual cannot legally or physically give their consent.

The ICO makes the point most of the exemptions include the word ‘necessary’. The Regulator says this doesn’t mean the transfer has to be absolutely essential, but that it “must be more than just useful and standard practice”. An assessment needs to be made as to whether the transfer is objectively necessary and proportionate, and can’t be reasonably achieved another way.

The regulatory guidance says exemptions, such as contractual necessity, are more likely to be proportionate for occasional transfers, a low volume of data and where there is a low risk of harm when the data is transferred.

The above is not an exhaustive list of the exemptions, further details can be found here.

There is no getting away from it: international data transfers are a particularly complex and onerous area of data protection law! It pays to be familiar with the requirements and understand the potential risks.

Sometimes organisations will have little control over the terms under which they do business with others. For example, large technology providers might be unwilling to negotiate international transfer arrangements and will only proceed if you agree to their existing safeguards. A balance might need to be struck here between the necessity of entering the contract and the potential risks should restricted transfers not be adequately covered.

Life after cookies

March 2024

“The past is a foreign country: they do things differently there”.

I’m pretty certain when LP Hartley wrote this wistful line the changing world of advertising, data and privacy wasn’t foremost in his mind. However, five years from now, when all the current arguments surrounding the elimination of third-party cookies are long gone, that’s likely how we’ll view the universal use (and abuse) of a simple text file and the data it unlocked.

From one perspective, life after third-party cookies is very simple.

The majority of media is transacted without third-party cookies already. Whether by media type, first-party user preferences, device or regulatory mandates, lots of money already moves around without reference to third-party cookies. As the saying goes, “The future is already here, it’s just not very evenly distributed”.

That’s deliberately rather glib. Some sections of the media still rely upon third-party cookies and not every media owner has an obvious opportunity to build a first-party relationship with consumers. The advantages of an identifier that allows streamlining of experience for consumers whilst delivering audience targeting and optimisation for media owners and advertisers haven’t gone away.

When we look to life after third-party cookies, we need to understand the ways replacement identifiers have evolved to ameliorate the worst aspects of cookies, whilst leaving some advantages in place. One leader I interviewed on this topic back in 2020 said “It’s not the fault of the cookie, it’s what you did with the data” and that’s a useful measure to have in mind when looking at any alternative solutions.

Put very simply, the choices for a brand post the third-party cookie are:

  • Use a different identity approach
  • Buy into use of a walled/fenced garden toolset
  • Use another signal to match between media and audience that isn’t anchored directly to the user, such as contextual.

Alternative identity solutions

The advantage of these is they come with some aspect of permissioning and consumer controls – after the cookie arguments and much legislation in the UK, Europe and US, the industry has learnt these tools are critical. However, it remains a moot point as to whether consumers have much knowledge around any consent or legitimate interest options that are put in front of them – the ICO in the UK is currently clamping down on consent practices. More cookie action

Equally moot is whether the majority of consumers are really that bothered. Much consent gathering is viewed by both parties as an unwanted hurdle in a customer journey. The basic requirements for a consumer to know who has their data, for what purposes and for how long remain, but how to achieve the requisite communication and control is still work in progress.

On a global scale these identity solutions revolve either around a “daisy chain,” using hashed email as the ID link, or use a combination of signals from a device with other attributes to have some certainty around individual identity. Any linkage built with a single identity variable risks being fractured by a single consent withdrawal.

The solutions built on a combination of signals have potentially more durability because they are less dependent on any single signal as the anchor of their fidelity, but many device signals are controlled by browser or operating system vendors, who may obscure or withdraw access to these as Apple has done in recent years.
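The hashed-email ‘daisy chain’ described above can be sketched in a few lines of Python. This is an illustrative sketch only: the function name and normalisation rules are assumptions, not any vendor’s actual specification, and real identity solutions layer salting, consent records and contractual controls on top.

```python
import hashlib

def email_to_id(email: str) -> str:
    """Derive a pseudonymous ID from an email address.

    Each party in the 'daisy chain' applies the same normalisation and
    hash, so records can be matched on the resulting ID without the raw
    address being exchanged.
    """
    normalised = email.strip().lower()  # normalisation rules must match across all parties
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

# Two parties hashing the same address independently get the same ID.
print(email_to_id("Jane.Doe@example.com") == email_to_id(" jane.doe@example.com "))  # → True
```

It also shows why the linkage is fragile: if the user withdraws consent with one party, that party must stop deriving the ID, and every match built on it breaks.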

Walled garden toolset

There is much discussion around Google’s Privacy Sandbox initiative. This is Google’s ambition to deliver some of the advantages of third-party cookies within the Chrome browser whilst not revealing individual-level data.

It’s been a much longer journey than envisaged when Google first made its announcement in 2020. Google’s commitment, made under the scrutiny of competition regulators, has been that it will not remove third-party cookies from the Chrome ecosystem until the UK competition regulator, the CMA, has approved its plans.

As of March 2024, those closely following the travails of Google, the CMA and the opinions tabled by the IAB Tech Lab (amongst others) would be hard pressed to give a cast-iron opinion that the current timescale will be met. Privacy and competitive advantage have become inextricably intertwined in these arguments, which is fair. However, slicing through this Gordian knot was probably not on the CMA or Google’s agenda when they signed up to this process. But that’s about timing, not a permanent stay of execution for the third-party cookie.

Non-user signals

The final approach is to use tools that do not rely on individual level signals. What an individual reads or consumes online says much about them – more than a century of classified advertising is testament to this.

The contextual solutions of 2024 are faster, smarter and better integrated than ever before. They have their downsides – closed-loop measurement is a significant challenge, hampering some of the campaign optimisations that became commonplace in the era of the third-party cookie. And they became commonplace because they were easy and universal. However, to paraphrase the aphorism, what was measured came to matter, when it should really be the other way round.
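As a toy illustration of the contextual idea (all category names and keywords here are invented), a page’s text can be scored against ad categories by keyword overlap – real contextual engines use NLP models, taxonomies and brand-safety layers on top of this basic notion:

```python
# Toy contextual matching: score ad categories against page text by
# keyword overlap. Categories and keywords are made up for illustration.
PAGE_TEXT = "Review of the best trail running shoes for muddy winter routes"

CATEGORIES = {
    "running": {"running", "shoes", "trail", "marathon"},
    "travel": {"flights", "hotel", "holiday", "routes"},
    "finance": {"mortgage", "savings", "loan"},
}

def score_categories(text: str) -> dict[str, int]:
    """Count keyword hits per category for a given page's text."""
    words = set(text.lower().split())
    return {name: len(words & keywords) for name, keywords in CATEGORIES.items()}

scores = score_categories(PAGE_TEXT)
best = max(scores, key=scores.get)
# "running" matches three keywords here, so it wins the page.
```

Note that nothing in this process identifies the reader – the signal is the page, not the person.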

And here we come into the greatest change that is being ushered in by the gradual demise of third-party cookies. Measuring what actually matters.

In the late 2010s, when cookies were centre stage as the de facto identifier of choice in media and advertising, their invisible synchronisation gave almost universal, if imperfect, coverage. One simple solution, accessible to all.

As we enter 2024, many alternative identifiers struggle to get much beyond 30% coverage. Contextual solutions can deliver 100% coverage but have their own measurement challenges. This has driven a greater interest in a combination of broad business- and commercial-objective-based approaches such as Marketing Mix Modelling (MMM) and attribution-based metrics where appropriate. Advances in data management and analysis have enabled MMM to deliver more frequent insights than the traditional annual deep dive, making it a core component of post-cookie media management.
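A minimal sketch of the regression at the heart of MMM, using invented weekly figures constructed so the true channel effects are known in advance:

```python
import numpy as np

# Synthetic weekly data, constructed so that
# sales = 50 (baseline) + 5 x TV spend + 4 x search spend, exactly.
tv_spend = np.array([10.0, 12.0, 8.0, 15.0, 11.0, 9.0])
search_spend = np.array([5.0, 4.0, 6.0, 7.0, 5.0, 6.0])
sales = 50.0 + 5.0 * tv_spend + 4.0 * search_spend

# Design matrix: an intercept column (baseline sales) plus one
# column per media channel.
X = np.column_stack([np.ones_like(sales), tv_spend, search_spend])

# Ordinary least squares recovers the known effects. Real MMM work
# adds adstock/saturation transforms, seasonality and far more data
# than this toy example.
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
baseline, tv_effect, search_effect = coef
```

The appeal for post-cookie measurement is that nothing here depends on tracking an individual – only on aggregate spend and outcome data the business already holds.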

Underpinning any and all of these solutions is the need for first-party data. Whether to build models for customer targeting, collaborate with media and other partners to access first-party data assets or measure more efficiently and effectively, having a structured, accessible and usable set of tools around first-party data is critical to working in the current landscape of solutions.

The growth of cloud storage solutions takes some of the burden away from making this a reality, but the applications used to understand and activate that data asset are many and various. Taking time and advice to build understanding in this area is critical to prospering after the third-party cookie.

Life beyond the third-party cookie is far from fully defined.

Some of the longer-term privacy and competition elements are not that hard to envisage, but exactly how the next 24 months play out is much, much harder to predict. It’s still very much a work in progress, especially around measurement and optimisation. For the user of data in advertising and marketing, it’s essentially “back to basics”.

Your customer data is more valuable than anyone else’s, so capture and hold it carefully. Test many things in a structured way because the future is about combinations. And know what matters to your business and work out how to measure it properly, not just easily.

Guide to identifying and managing data protection risks

March 2024

Data protection risks come in all shapes, sizes and potential severities. We need to be able to identify the risks associated with our use of personal data, manage them and where necessary put appropriate measures in place to tackle them.

How can we make sure good risk management practices are embedded in our organisation? In this short guide we cover the key areas to focus on to make sure you’re alert to, and aware of, risks.

1. Assign roles and responsibilities

Organisations can’t begin to identify and tackle data risks without clear roles and responsibilities covering personal data. Our people need to know who is accountable and responsible for the personal data we hold and the processing we carry out.

Many organisations apply a ‘three lines of defence’ (3LoD) model for risk management. This model is not only used for data protection, but is also effective for handling many other types of risk a business may face.

  • 1st line: Leaders of the business functions that process data are appointed as ‘Information Asset Owners’ and ‘own’ the risks from their function’s data processing activities.
  • 2nd line: Specialists like the DPO, CISO & Legal Counsel support and advise the 1st line, helping them understand their obligations under data protection laws, so they can make well-informed decisions about how best to tackle any privacy risks. They also provide clear procedures for the 1st line to follow.
  • 3rd line: An internal or external audit function provides independent assurance.

3 lines of defence for data protection

For example, risk owners, acting under advice from a Data Protection Officer or Chief Privacy Officer, must make sure appropriate technical and organisational measures are in place to protect the personal data they’re accountable for.

In this model, the second line of defence should never become risk owners. Their role is to provide advice and support to the first-line risk owners. They should remain independent and not make decisions on behalf of their first-line colleagues.

2. Decide if you should appoint a DPO

Under the GDPR, a Data Protection Officer’s job is to inform their organisation about its data protection obligations and advise the organisation on risks relating to its processing of personal data.

The law tells us you need to appoint a DPO if your organisation is a Controller or Processor and one or more of the following applies:

  • you are a public authority or body (except for courts acting in their judicial capacity); or
  • your core activities require large-scale, regular and systematic monitoring of individuals (for example, online behaviour tracking); or
  • your core activities consist of large-scale processing of special categories of data or data relating to criminal convictions and offences.

In reality, most small organisations are unlikely to fall under the current UK or EU GDPR requirements to appoint a DPO. In fact, many medium-sized businesses won’t necessarily need a DPO either. Find out more in our DPO myth buster.
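The three-part test above can be sketched as a simple check (illustrative only – borderline cases, such as what counts as “large-scale”, need proper legal analysis):

```python
def dpo_required(
    is_public_authority: bool,
    core_large_scale_monitoring: bool,
    core_large_scale_special_category: bool,
) -> bool:
    """Illustrative sketch of the GDPR mandatory-DPO test.

    A DPO is mandatory if the organisation is a controller or
    processor and any one of the three conditions applies. This is
    a simplification of Article 37, not legal advice.
    """
    return (
        is_public_authority
        or core_large_scale_monitoring
        or core_large_scale_special_category
    )

# A small retailer with no large-scale monitoring or special
# category processing would not meet the mandatory test.
assert dpo_required(False, False, False) is False
assert dpo_required(False, True, False) is True
```

Even where the test is not met, organisations can still choose to appoint a DPO voluntarily.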

3. Conduct data mapping & record keeping

Mapping your data and creating a Record of Processing Activities (RoPA) is widely seen as the best foundation for any successful privacy programme. After all, how can you properly look after people’s data if you don’t have a good handle on what personal data you hold, where it’s located, what purposes it’s used for and how it’s secured?

Even smaller organisations, which may benefit from an exemption from creating a full RoPA, still have basic record keeping responsibilities which should not be overlooked and could still prove very useful. Also see Why is data mapping so crucial?
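As an illustration of what a single data-mapping or RoPA entry might capture (field names and the example values are illustrative, not a prescribed format – Article 30 GDPR sets out the required content for a full RoPA):

```python
from dataclasses import dataclass, field

@dataclass
class RopaEntry:
    """Illustrative Record of Processing Activities entry.

    Field names are examples only; Article 30 GDPR lists the
    information a full RoPA must contain.
    """
    processing_activity: str        # e.g. "Payroll"
    purpose: str                    # why the data is processed
    lawful_basis: str               # e.g. "Contract", "Legal obligation"
    data_categories: list[str]      # types of personal data involved
    data_subjects: list[str]        # whose data it is
    recipients: list[str]           # who it is shared with
    retention_period: str           # how long it is kept
    security_measures: list[str] = field(default_factory=list)

payroll = RopaEntry(
    processing_activity="Payroll",
    purpose="Paying employees and meeting tax obligations",
    lawful_basis="Legal obligation",
    data_categories=["name", "bank details", "salary"],
    data_subjects=["employees"],
    recipients=["payroll bureau", "HMRC"],
    retention_period="6 years after employment ends",
    security_measures=["access controls", "encryption at rest"],
)
```

Whether held in a spreadsheet or a dedicated tool, it is this per-activity structure – what, why, whose, with whom, for how long – that makes the rest of a privacy programme workable.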

4. Identify processing risks

Under data protection laws, identifying and mitigating risks to individuals (e.g. employees, customers, patients, clients etc) is paramount.

Risks could materialise in the event of a data breach, failure to fulfil individual privacy rights (such as a Data Subject Access Request), complaints, regulatory scrutiny, compensation demands or even class actions.

We should recognise our service and technology providers, who may handle personal data on our behalf, could be a risk area. For example, they might suffer a data breach and our data could be affected, or they might not adhere to contractual requirements.

It’s good to be mindful of commercial and reputational risks too, which can arise from an organisation’s use of personal or non-personal data.

International data transfers are another area where due diligence is required to make sure these transfers are lawful, and where they are not, to recognise this as a risk.

Data-driven marketing activities could also be a concern, if these activities are not fully compliant with ePrivacy rules – such as the UK’s Privacy and Electronic Communications Regulations (known as PECR). Even a single complaint to the ICO could result in a business facing a PECR fine and the subsequent reputational damage. See our GDPR, marketing & cookies guide.

Data protection practitioners share tips on identifying and assessing risks

5. Risk assessments

In the world of data protection, we have grown used to, or even grown tired of, the requirement to carry out a Data Protection Impact Assessment (DPIA), or a Privacy Impact Assessment (PIA) as it is called in some jurisdictions.

Build in a process for assessing whether projects would benefit from a DPIA, or legally require one. DPIAs are a great way to pinpoint risks and mitigate them early on, before they become a bigger problem.

See The value of risk assessments in the world of data protection compliance and our Quick Guide to DPIAs.

6. Issues arising from poor governance or lack of data ownership

In the real world, the three lines of defence model can come under strain. Sometimes those who should take responsibility as risk owners can have slippery shoulders and refuse to take on the risks.

Some processing doesn’t seem to sit conveniently with any one person or team. Things can fall through the cracks when nobody takes responsibility for making key decisions. On these occasions a DPO might come under pressure to take risk ownership themselves. But should they push back?

Strictly speaking, DPOs shouldn’t ‘own’ data risks; their role is to inform and advise risk owners. GDPR tells us: “data protection officers, whether or not they are an employee of the controller, should be in a position to perform their duties and tasks in an independent manner” (Recital 97).

The ICO, in line with European (EDPB) guidelines, says: “…the DPO cannot hold a position within your organisation that leads him or her to determine the purposes and the means of the processing of personal data. At the same time, the DPO shouldn’t be expected to manage competing objectives that could result in data protection taking a secondary role to business interests.”

So, if the DPO takes ownership of an area of risk, and plays a part in deciding what measures and controls should be put in place, could they be considered to be ‘determining the means of the processing’? This could lead to a conflict of interest when their role requires them to act independently.

Ultimately, accountability rests with the organisation. It’s the organisation which collects the data, uses the data and runs with it. Not the DPO.

7. Maintain an up-to-date risk register

When you identify a new risk it should be logged and tracked on your Data Risk Register. The ICO expects organisations to: “identify and manage information risks in an appropriate risk register, which includes clear links between corporate and departmental risk registers and the risk assessment of information assets.”

To do this you’ll need to integrate any outcomes from risk assessments (such as DPIAs) into your project plans, update your risk register(s) and keep these registers under continual review by the DPO or responsible individuals.
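As a sketch of what a minimal risk register entry might track (fields, scales and scoring are illustrative – adapt them to your organisation’s own register format):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class RiskEntry:
    """Illustrative data risk register entry; fields are examples only."""
    risk_id: str
    description: str
    owner: str                # 1st-line Information Asset Owner
    likelihood: int           # e.g. 1 (rare) to 5 (almost certain)
    impact: int               # e.g. 1 (negligible) to 5 (severe)
    mitigations: str
    review_date: date
    status: str = "Open"

    @property
    def score(self) -> int:
        """Simple likelihood x impact score used to prioritise risks."""
        return self.likelihood * self.impact

risk = RiskEntry(
    risk_id="R-014",
    description="Supplier holds customer data without an up-to-date DPA",
    owner="Head of Procurement",
    likelihood=3,
    impact=4,
    mitigations="Renew data processing agreement; run vendor due diligence",
    review_date=date(2024, 9, 1),
)
assert risk.score == 12
```

The review date and named owner are the parts that keep the register alive rather than a one-off exercise – each entry should come back around for reassessment on schedule.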