GDPR: Consent and why records are crucial

The ICO has fined a telemarketing firm £90k for its inability to demonstrate that valid and specific consent was collected from the people it had contacted. Data was collected directly, via the telemarketer’s website, and via a third-party survey company.

Crucially, the firm couldn’t produce evidence of consent. This led me to think about other organisations; you may have gone to great efforts to make sure the consent you collect meets the GDPR standard, but are you keeping adequate records? The old legal adage applies – ‘If it isn’t written down, it didn’t happen.’

If your consent is subject to regulatory scrutiny, proof is highly likely to be requested. A customer might ask for evidence, and could escalate a complaint if you’re unable to produce it.

So, what records do we need to keep?

Here’s a refresher on the consent rules and how to retain adequate evidence. For simplicity’s sake, when I refer to GDPR in this article I mean both GDPRs – the EU and UK flavours.

Consent is ONE of SIX lawful bases for processing

Consent is just one of six lawful bases. GDPR requires organisations to select an appropriate lawful basis for each purpose for processing personal data. They’re all equally valid; no single basis is better than another. You should choose the most appropriate basis for each activity. Often consent might not be appropriate, but sometimes consent is required by law for certain activities.

Just be mindful: don’t rely on consent if another lawful basis would be more appropriate. But also be careful not to shoehorn your activities into another lawful basis (such as legitimate interests) when consent really would be the best approach, or is legally required.

What constitutes valid consent

GDPR defines consent as “any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her”.

Let’s break this down…

Freely given consent

People must be given a genuine choice
People should be able to refuse to give their consent without detriment
Consent should be easy to withdraw
Consent shouldn’t be bundled into T&Cs, unless necessary for the service

It’s also sometimes important to weigh up any ‘imbalance of power’ over the individual whose consent you seek. For example, consent may not be freely given if the individual feels they don’t really have a choice. Consent can therefore be tricky in employer-employee relationships, if staff might feel a degree of pressure, or feel they will be penalised or treated differently if they refuse.

Specific and informed consent

It must be clear who people are giving their consent to. The organisation relying on the consent must be clearly identified. If you want to rely on consent collected for you by a third party, your organisation must be named at the time consent is collected.
Consent must specifically cover all of the purposes for which it’s being collected. Separate consent should be collected, wherever possible, for different activities. For example, collecting separate marketing consents for different marketing channels. This isn’t a hard and fast rule and isn’t required if it would be unduly disruptive, or the activities are clearly interdependent.
It must be clear people can withdraw their consent at any time (and the ICO advises you include details of how to do so).

Remember, there’s specific information you’ll always need to provide when you collect people’s personal details. There are distinct transparency requirements and people have the right to be informed. You may choose to take a layered approach, and it’s advisable to always have a clear link to a Privacy Notice (aka Privacy Policy), or details of how to access this.

Consent by an unambiguous indication and clear affirmative action

Consent must be given by a deliberate and specific action to opt in or agree. For example: ticking an opt-in box, clicking ‘submit’, signing a statement, or giving verbal confirmation. Failing to opt out is not consent. Pre-ticked boxes are not consent.

For more information see ICO consent guidance, which covers how to collect consent, how to manage requests to withdraw, and more.

Evidence of consent

GDPR states: “Where processing is based on consent, the controller shall be able to demonstrate that the data subject has consented to processing of his or her personal data.”

This means organisations must have an audit trail to meet their accountability obligations. This is what the telemarketing firm failed to grasp. In practice, this means keeping records of the following (a simple record structure is sketched after the list):

Who consented e.g. their name or other identifier.
When they consented e.g. an online time stamped record, a copy of a dated document or a note of the time and date verbal consent was given.
What they were told e.g. a copy of the consent statement used, along with any separate privacy notice or other privacy information in use at the time.
How consent was given e.g. a copy of the data capture form or a note of a verbal conversation.
Any withdrawal of consent, and when.
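
To make this concrete, here’s a minimal sketch of what a consent audit record could look like if captured as a data structure. The ConsentRecord class and its field names are illustrative assumptions, not a prescribed format; any CRM field layout capturing the same facts would serve.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ConsentRecord:
    # Illustrative audit record; field names are assumptions, not a prescribed format
    subject_id: str                          # who consented: name or other identifier
    consented_at: datetime                   # when they consented (timestamped)
    consent_statement_version: str           # what they were told: version of the consent statement shown
    privacy_notice_version: str              # version of the privacy notice in force at the time
    method: str                              # how consent was given, e.g. "web form", "signed document", "verbal"
    withdrawn_at: Optional[datetime] = None  # any withdrawal of consent, and when

    def is_active(self) -> bool:
        # Consent stands only until it is withdrawn
        return self.withdrawn_at is None

# Example: a record created when a customer ticks an (unticked) opt-in box
record = ConsentRecord(
    subject_id="customer-10432",
    consented_at=datetime(2024, 3, 1, 14, 32),
    consent_statement_version="marketing-consent-v3",
    privacy_notice_version="privacy-notice-2024-01",
    method="web form",
)
```

Storing version identifiers rather than full text only works if you also archive each statement and notice version with the dates it was operative, which is exactly the practice recommended below.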

This is why we recommend, when you’re updating consent statements or privacy notice(s), keeping copies of older versions and the dates they were operative. This doesn’t need to extend to keeping copies of every web form, but records held on your CRM or other relevant system need to be accurate. The ICO guidance on keeping records of consent is a useful resource.

Consent isn’t easy

Collecting valid consent can feel like a minefield. It means carefully ticking off requirements and keeping evidence. This isn’t hard once you’ve established a routine and got into the habit of thinking ‘that needs keeping hold of.’ Getting this right means you’ll breathe a sigh of relief if you’re ever subjected to scrutiny.

For more detail on when consent is legally required under UK ePrivacy law for marketing activities see our guides to the email marketing rules and telemarketing rules.

International Data Transfers: When a Transfer Risk Assessment is required

A recent €530 million fine, issued to TikTok by the Irish Data Protection Commission (DPC) for failing to meet international data transfer rules, demonstrates why cross-border transfers of personal data must be effectively managed.

While this was an EU fine, UK organisations are not immune to data transfer requirements, nor to the potential fallout of non-compliance. Organisations need to be mindful that people risk losing their protection under UK data protection laws if their personal data is transferred outside the UK.

Unless one of the following four conditions can be met, organisations must make sure appropriate safeguards are in place to protect ‘restricted transfers’ overseas:

You have the specific consent of individuals for the international transfer
The transfer is absolutely necessary to perform a contract with the individual
You can rely on an ‘Article 49 derogation’ – where the specific transfer is necessary for important reasons in the public interest, for litigation or for a public register
There’s an approved Code of Conduct between members, e.g. members of a trade association

Often these conditions won’t be met, and appropriate safeguards will be necessary. In certain circumstances, there’s also a requirement to conduct a Transfer Risk Assessment (TRA). In the EU this is called a Transfer Impact Assessment (TIA) and this requirement was overlooked by TikTok.

A restricted transfer is where an organisation shares personal data with another organisation (i.e. a separate controller), or with a vendor/service provider/supplier (i.e. a processor), and the processing will take place in another country. This includes overseas data sharing between companies which are part of the same group – for example, one based in the UK and one in the USA. When data is anonymised, so that it’s no longer ‘personal’ data, it’s not classed as a restricted transfer.

Both controllers and processors also need to consider any further transfers in the supply chain to ‘sub-processors’ located in other countries.

Crucially, we need to recognise a ‘transfer’ will take place if there’s ‘access to’ personal data. For example:

A UK based controller permitting a supplier based in India to access the personal data of its customers would represent a restricted transfer.
A UK based processor permitting one of its suppliers (a ‘sub-processor’) based in France to access the personal data of its client (the controller).
An EU based controller sharing personal data with a separate controller based in San Francisco, USA.

For more detail on what constitutes a restricted transfer see our International Data Transfers Guide or the ICO Guidance.

Do we need to make a restricted transfer?

Before making a restricted transfer, organisations should consider whether they can achieve their requirements without sharing ‘personal’ data. If you share data in an anonymised form, so it’s never possible to identify individuals, it is no longer personal data, so the restrictions do not apply.

Why did TikTok get fined?

The DPC inquiry found the social media platform had infringed GDPR on the following three key points:

Equivalent protection: There was a failure to verify, guarantee and demonstrate that personal data of EEA users, remotely accessed by staff in China, was afforded a level of protection essentially equivalent to that guaranteed within the EU.
Transfer Impact/Risk Assessment: The necessary assessments were not undertaken to address potential access by Chinese authorities to EEA personal data under Chinese anti-terrorism, counter espionage and other laws, which were considered to materially diverge from EU standards.
Transparency: TikTok’s 2021 Privacy Policy (aka Privacy Notice) did not meet necessary transparency requirements to inform EEA users that personal data was stored in servers in the United States and Singapore and was remotely accessible by entities in a number of other countries including China, Malaysia and the Philippines. An updated 2022 Privacy Policy rectified this particular infringement.

When is a Transfer Risk Assessment (TRA) required?

A TRA is not always required; it depends on the appropriate safeguard mechanism an organisation intends to rely on for a restricted transfer.

Adequacy decision (Article 45): No TRA required

Adequacy status is awarded to specific countries judged to have a similar level of data protection standards as those in the UK. An adequacy decision essentially allows for the free flow of personal data between the UK and the other country. The UK Government refers to these as ‘data bridges’. When you rely on adequacy, a TRA is not required.
Currently there is reciprocal adequacy between the UK and the EEA. You can check which other countries have adequacy in the ICO data transfer guidance.

Other safeguard mechanisms (Article 46): TRA required

The requirement to conduct a risk assessment came into effect following the EU’s Schrems II ruling in 2020, and will apply, for example, if you intend to rely on the following safeguard mechanisms (a minimal decision sketch follows the list):

ICO’s International Data Transfer Agreement (IDTA)
EU Standard Contractual Clauses (SCCs) with the UK Addendum
Binding Corporate Rules (BCRs)
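
To illustrate the decision logic above, here’s a hedged sketch in Python. The mechanism names mirror those in this article; the function is a simplification of the rules, not a substitute for assessing a specific transfer.

```python
# Simplified sketch: does the safeguard mechanism relied on trigger a TRA?
# Illustrative only - real transfers need case-by-case analysis.

NO_TRA_MECHANISMS = {
    "Adequacy decision (Article 45)",   # includes UK 'data bridges'
}

TRA_MECHANISMS = {
    "ICO IDTA",
    "EU SCCs with UK Addendum",
    "Binding Corporate Rules",
}

def tra_required(mechanism: str) -> bool:
    if mechanism in NO_TRA_MECHANISMS:
        return False  # relying on adequacy: no TRA required
    if mechanism in TRA_MECHANISMS:
        return True   # Article 46 safeguards: TRA required
    raise ValueError(f"Unrecognised safeguard mechanism: {mechanism}")

print(tra_required("EU SCCs with UK Addendum"))  # True
```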

What’s the purpose of a Transfer Risk Assessment?

A TRA aims to help organisations to consider if the relevant protections for people under UK data protection law will be undermined when their personal data is transferred overseas. The ICO explains there are two broad types of risks to be considered:

Risks to people’s rights arising in the destination country from third parties not bound by the Article 46 transfer mechanism accessing the information, in particular government and public bodies.
Risks to people’s rights arising from difficulties enforcing the Article 46 transfer mechanism.

It’s worth bearing in mind if a processor is making a restricted transfer, for example to a sub-processor, it’s their responsibility to conduct the TRA. A controller should still carry out reasonable and proportionate checks to make sure these transfers are compliant with UK GDPR.

When onboarding a new processor, some controllers may request to see copies of their processors’ TRAs to sub-processors.

More information is available in the ICO TRA Guidance.

How to conduct a TRA for a transfer from the UK

The ICO sets out three distinct options for conducting the risk assessment.

Option 1: ICO TRA tool

This is a specific risk-assessment tool. It enables you to evaluate any increased risk to people’s privacy and other human rights as a result of the transfer, compared with the position if the data remained in the UK.

In our view, the ICO has gone to considerable efforts to make this (Word document) tool as straightforward as possible. It helpfully provides a list of common categories of personal information with an initial risk score. You don’t have to use this specific template and can record your answers to six key questions in other ways.

However, to the uninitiated the TRA tool can be tricky to complete. If the circumstances of a specific transfer require a more detailed investigation, this will involve a level of research into the legal system, respect for the rule of law and the human rights record in the destination country.

Option 2: EDPB approach

This assessment compares the laws and practices of the UK with those of the destination country (where the ‘data importer’ is based). In particular, it means looking at the safeguards in place in relation to third-party access to the information, particularly by governments. The safeguards don’t need to be identical but need to be sufficiently similar to those in the UK.

Option 3: Reliance on published UK Government analysis in making adequacy regulations

As mentioned above, the UK Government can make adequacy decisions (known as ‘data bridges’). In making these decisions there are specific considerations the Government must take account of when assessing another country or territory. This includes an assessment of risks similar to the assessment which would be undertaken when using options 1 and 2. Therefore, if there’s relevant published UK Government analysis which judges standards of data protection to be satisfactory, this can be relied upon. Notably, in 2023 the Department for Science, Innovation and Technology (DSIT) published such analysis for the United States (see the DSIT Analysis).

Transfers from the UK to the United States

It’s worth taking a look specifically at transfers from the UK to the US. These are a common type of restricted transfer, especially for UK based organisations considering utilising the services of US based technology/SaaS providers.

Adequacy: the EU-US Data Privacy Framework, plus US-UK ‘data bridge’ extension

There’s an adequacy decision which UK organisations may be able to rely on, meaning a TRA is not required. However, unlike other adequacy decisions for specific countries (such as Japan, Israel and New Zealand), the ability to rely on the adequacy decision for the United States depends on whether the specific US company you are transferring data to has self-certified to the Data Privacy Framework and the UK extension to this framework. You can check if an organisation is certified here.

To give some commonly used examples, at the time of writing, Google LLC, Microsoft, Salesforce and Mailchimp are signed up to the Framework and UK extension (‘data bridge’).

Other safeguard measures and TRA

If an organisation isn’t listed as a signatory to the Data Privacy Framework and UK extension, it’s likely you’ll need to rely on the ICO’s IDTA, EU SCCs with the UK Addendum, or BCRs (for intra-group transfers). And options 1, 2 and 3 outlined earlier for conducting a TRA will be in play.

I’d encourage you to read the ICO’s guidance on transfers to the US, which sets out the potential to streamline the TRA process by relying on UK Government analysis (e.g. Option 3). The ICO states: “a significant part of the analysis relates to broader issues not specific to the US data bridge but analyses the application of relevant US laws and practices more generally. It is equally relevant to personal information transferred using an Article 46 transfer mechanism.”

The ICO’s guidance sets out in more detail how you can rely on this analysis, as an alternative to using the TRA Tool or the EDPB approach.

To conclude, international data transfer rules are not simple! They can often feel overly complex, with tricky compliance hurdles. Nonetheless, it’s both legally and ethically the right thing to do to make sure people don’t lose the rights they are entitled to under UK data protection law.

In practice, a risk-based approach is frequently adopted, applying more rigour to more risky transfers. For example, a transfer of a list of employees’ work email addresses is unlikely to pose as much risk as transferring more sensitive personal information. As ever, the devil is in the detail.

Rising cyber threats but data breaches aren’t always obvious

The UK Government and National Cyber Security Centre have issued warnings about significant and growing cyber threats, with the expectation of increased ransomware attacks, state-sponsored cyber activity and sophisticated cybercrime. Do take heed: the retail sector has already seen a number of damaging attacks.

Sometimes, it’s obvious a data breach has taken place. However, this isn’t always the case, especially when cyber criminals take steps to cover their tracks. A recent example illustrates the consequences for organisations who fail to fully appreciate the significance of a malicious attack.

The ICO has issued a £60k fine to law firm DPP, following a 2022 cyber-attack. The attack led to highly sensitive and confidential personal information being published on the dark web. The ICO investigation discovered lapses in IT security practices, leaving information vulnerable to unauthorised access. Hackers were able to exploit a user account which did not have Multi-Factor Authentication (MFA), enabling them to move laterally across the firm’s systems.

Let’s be clear: MFA is now a must-have on all relevant data systems.

Announcing the fine, the ICO said: “DPP only became aware of the attack when the National Crime Agency contacted the firm to advise information relating to their clients had been posted on the dark web. DPP did not consider that the loss of access to personal information constituted a personal data breach, so did not report the incident to us until 43 days after they became aware of it.”

A personal data breach is defined as ‘a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data.’ That’s a broad scope.

The ICO enforcement notice accepts actions taken by the attackers made DPP’s response to the incident difficult. Unfortunately, DPP’s initial assessment indicated no personal data had been exfiltrated and didn’t consider loss of access to personal data to be a breach – therefore the firm didn’t report it.

You can check out the full enforcement notice, but bear in mind it’s reported DPP disputes some of the ICO’s conclusions and may appeal.

Any organisation suffering a cyber-attack has my sympathy. Attacks are becoming more frequent, sophisticated and harder to track. They can severely disrupt day-to-day operations. Ascertaining the cause and consequences of an attack can be difficult. Indeed, in some cases the consequences might never be clearly established. And when it becomes public knowledge, the organisation needs to work decisively, not just to get operations back up and running and mitigate any harms to those affected, but also to manage PR.

As I write, we’re witnessing M&S battle a significant ransomware attack, which has left store shelves empty. Cyber criminals have also reportedly told the BBC their attack on the Co-op is more serious than the company had previously admitted.

Organisations are legally required to report personal data breaches to the ICO (or another relevant Data Protection Authority) within 72 hours of becoming aware, unless the breach is unlikely to result in a risk to individuals. When it comes to ransomware attacks, it may be best to assume that (more likely than not) personal information is affected. The ICO states in a research paper: ‘If you become a victim of ransomware, you should assume the information has been exfiltrated (extracted).’

In other words, it would be wise to submit an initial data breach report. It’s understood you won’t know all the facts immediately and you may need to bring in digital forensics expertise. In this situation, you can submit an initial report and update the Regulator when more facts become known. The risk can subsequently be upgraded or downgraded as you continue your investigations. We’ve written more about how to assess the risks posed by a data breach here.
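
As a trivial illustration of the reporting window, the 72 hours run from the point of awareness, not from the attack itself. A minimal sketch (the timestamps are invented for the example):

```python
from datetime import datetime, timedelta

# When the organisation became aware of the breach (example value)
aware_at = datetime(2025, 5, 1, 9, 30)

# Initial report to the ICO is due within 72 hours of awareness
report_deadline = aware_at + timedelta(hours=72)
print(report_deadline)  # 2025-05-04 09:30:00
```

The initial report can then be updated, and the risk rating raised or lowered, as the investigation progresses.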

It’s important, even for small-to-medium sized businesses, to have sufficient knowledge about what constitutes a personal data breach, and the threats we all face. Here’s a refresher of some common ways a personal data breach can occur.

Cyber security incidents

We often hear about ransomware attacks where hackers gain unauthorised access to databases, exfiltrating or altering personal information, and making a demand for payment. There are also other forms of malicious attack, such as:

Brute force – this is where hackers use algorithms to ‘guess’ username and password credentials, testing multiple combinations to try to gain access to user accounts. It’s understood this is how hackers initially got into DPP Law’s systems. Clearly, these attacks are more successful when passwords are easy to guess and when MFA is not in place.

Denial of Service (DoS) – this works by overloading a computer network or website and can result in degraded performance, or render the system completely inaccessible. DoS attacks may result in full or partial loss of access (availability) to personal data records. And as we said above, that’s classed as a data breach.

Supply chain attacks – these attacks target vulnerabilities in third-party services your organisation is using. In 2023 the BBC, British Airways and Boots were among many organisations impacted by the well-publicised MOVEit supply chain breach. More recently the ICO issued a £3 million fine to an IT software company which provided services to many UK organisations including the NHS.

Phishing – this is when criminals use scam emails to trick people into clicking on a malicious link. Phishing attacks can trick people into sharing sensitive information, such as payment card details or login credentials. As well as email, phishing can be spread via text messages or over the phone.

I’d urge you to read the ICO’s ‘Learning from the mistakes of others’ report, which provides detailed information on the types of cyber-attacks organisations can suffer and ways to mitigate the risk.

Loss or theft of devices or hard copy documents

This is pretty self-explanatory; a smartphone, laptop or other device containing personal data is lost or stolen. When devices are not encrypted this can lead to the exposure of potentially sensitive personal information. Alternatively, a data breach can occur when physical documents are lost or stolen.

Disclosure of personal information

This type of incident can occur in a number of different ways, for example:

An email sent to the wrong recipient(s).

Accidentally using the CC field (rather than BCC) when emailing multiple recipients, thereby revealing every email address to all recipients. In some cases this can just be embarrassing, but in others, like the Central YMCA breach, much more serious.

Information is posted to the wrong person, such as a hospital sending medical records to the wrong recipient.

Publishing confidential information on a public website.

Sharing personal data with unauthorised third parties.

Unauthorised Disclosure

This type of incident may occur due to a malicious attack such as ransomware, or it may be an insider breach, as illustrated by these cases:

In 2023 two former Tesla employees leaked confidential and personal information relating to employees and customers.

Back in 2014 a Morrisons employee leaked his colleagues’ payroll details in what was seen as an act of revenge after being given a verbal warning – a case which resulted in years of legal wrangling over whether Morrisons was liable for the actions of a rogue employee.

This type of incident also includes ‘employee snooping.’ For example, a member of staff with access to a customer database browses the personal data of others without a legitimate business purpose. Or a police officer or council official looks up and discloses information without authority.

Improper disposal of records

Insecure disposal of electronic or paper records might lead to a data breach. For example, if a company disposes of old paper files containing customer details without shredding them, and a third party finds them.

The above is by no means an exhaustive list, but provides those less experienced in data breaches with a steer on what risks to be aware of.

Not all security incidents will be personal data breaches; they could involve commercially sensitive information, but no personal data. While these don’t need to be reported to the ICO, they still have the potential to cause considerable fallout.

Privacy violations

In other circumstances there may be a violation of data protection law which is not a data breach. As an example, I’ve been asked before whether an email marketing campaign accidentally sent to customers who’ve unsubscribed needs to be reported as a breach. While a clear violation of the right to object to direct marketing, this doesn’t represent a breach of security: there’s been no destruction, loss, alteration, unauthorised disclosure of, or access to personal data. The individuals’ personal data remains secure. Efforts therefore need to focus on trying to minimise the risk of complaints escalating, and making sure it never happens again.

To conclude, the DPP Law case is instructive; it’s not a big company, employing fewer than 250 people, but it handles highly sensitive information relating to its clients. The attack it suffered sends a clear message: any business can fall victim to cyber-attacks and personal data breaches. The more sensitive the data your organisation handles, the more damaging a breach could be. Not only must cyber security be treated as a priority, but so must robust data breach procedures to guide your team through any potential attack.

ICO fines software company £3.07m after cyber-attack

First UK processor fine is a stark reminder of supply chain risks

The Information Commissioner’s Office has fined Advanced Computer Software Group Ltd (Advanced) £3.07 million following a cyber-attack in 2022 which put the personal information of nearly 80,000 people at risk. This marks the first fine issued under UK GDPR to a processor.

Advanced, which provides IT and software services to organisations including the NHS, was found to have failed to implement appropriate technical and organisational measures to protect its systems.

In the ransomware attack, hackers managed to access certain systems of Advanced’s health and care subsidiary. This was done via a customer account which, notably, did not have Multi-Factor Authentication (MFA). The attack caused massive disruption to critical NHS services and healthcare staff were left unable to access patient records. Advanced was found to have insufficient measures in place, including:

Gaps in deployment of Multi-Factor Authentication
A lack of mature vulnerability management scanning mechanisms
Inadequate security patch management

A provisional fine of £6.09 million was reduced to £3.07 million after Advanced’s proactive engagement with the National Cyber Security Centre, the National Crime Agency and the NHS. Advanced has agreed to pay the fine without appeal. You can read the ICO enforcement notice here.

Key learnings from this case

This action serves as a timely reminder for both controller organisations and service providers to make sure robust measures are in place to protect personal data and ensure systems are secure throughout the supply chain.

Supplier due diligence

While this fine has been imposed on a processor, organisations which engage other parties to provide services have a duty to make sure they work with suppliers who can demonstrate robust standards in data protection and information security.

In our experience, controllers need to make sure they’re asking the right questions before they onboard any new supplier who’d be processing personal data on their behalf – whether this be cloud computing providers, SaaS solutions or other technology providers. To give a simple illustration:

Do they have a DPO or another individual in the business who oversees data protection compliance?
Do they have an Information Security Officer, or other related role?
Can they provide evidence of data protection and info sec policies and procedures?
Have they experienced a data breach before?
What information security measures do they have in place?
Are security measures regularly tested, and how?

Suppliers, for their part, need to be prepared to meet clients’ due diligence requests, including being able to provide detailed information on data location(s) and the security measures and controls in place to protect client data.

We’d stress a proportionate, risk-based approach should be taken to this: the more sensitive the data, the more robust the checks should be.

Seven quick information security tips

1. Restrict access to your data and services and use Multi Factor Authentication where possible
2. Choose secure settings for your network, devices and software
3. Protect yourself from viruses and other malware
4. Keep your devices and software up to date
5. Keep logs and monitor them
6. Restrict or prevent use of USB / memory drives
7. Back up your data

The ICO has published ransomware and compliance guidance which provides information on how to best protect systems.

Controller-processor contracts

Once satisfied with a prospective supplier’s approach to data protection and information security it’s then vital to make sure contractual terms cover core requirements under UK GDPR. Often covered in a Data Processing Agreement/Addendum, these shouldn’t be overlooked. We’ve written about supplier agreements here.

It’s worth noting liability clauses in such agreements are facing increasing scrutiny, reflecting the increased cost of non-compliance and the fallout from data breaches. Irina Beschieriu, Deals Counsel for Atos IT Solutions, has written an interesting article on this for the IAPP and says: “General limitations of liability clauses are no longer considered sufficient to address the specific risks associated with data privacy. Instead, we have seen the rise of dedicated provisions meticulously crafted to address data privacy liabilities specifically. Negotiations surrounding these provisions are now more intense, more detailed, and carry higher stakes than ever before.” See: The growing burden of data privacy liability in tech contracts

While ICO fines are not commonplace, we’d urge both controllers and processors to take heed of this action. Announcing the enforcement action, Information Commissioner John Edwards said: “With cyber incidents increasing across all sectors, my decision today is a stark reminder that organisations risk becoming the next target without robust security measures in place. I urge all organisations to ensure that every external connection is secured with MFA today to protect the public and their personal information - there is no excuse for leaving any part of your system vulnerable.”

UK Data Reform – key changes ahead

March 2025

What data protection teams need to know

Plans to reform the UK’s data laws are making speedy progress through Parliament, with the Data (Use and Access) Bill expected to be passed in April or May.

When enacted, the new law will usher in significant amendments to the Data Protection Act 2018, UK GDPR and the Privacy & Electronic Communications Regulations (PECR), as well as measures which go beyond the realms of data protection and ePrivacy.

Controversial plans to amend UK GDPR’s accountability obligations, led by the previous Conservative Government, are not included. So, requirements in relation to Data Protection Officers, Data Protection Impact Assessments and Records of Processing Activities remain the same.

Some new provisions are likely to make data protection compliance efforts slightly easier, although others will impose increased obligations. Here’s our summary of some key changes ahead, with the caveat there’s still time for further amendments.

Individual privacy rights

New right to complain

People will have the right to raise complaints related to the use of their personal data. This will require controllers to make sure they have clear procedures to facilitate complaints, for instance by providing a complaint form. Complaints will require a response within 30 days. Alongside this, organisations may also be obliged to notify the ICO of the number of privacy-related complaints they receive during a specified time period.

In practice this means individuals will first have to seek a resolution directly with an organisation, before escalation to the regulator. This is aimed in part at reducing the volume of complaints the ICO receives.

Some sectors, such as financial services and those who receive FOI requests, will already have complaints procedures in place to meet other legal obligations. For others, these procedures will need to be established.

It’s likely privacy notices will need to be updated to reflect this change. If notification to the ICO of complaint volumes is required, this raises questions about how complaints are categorised and what additional records organisations will be required to keep.

Timescales and seeking clarification

Amendments will clarify the time period for compliance with privacy rights requests. The clock does not start until the organisation is satisfied the requester is who they say they are (i.e. proof of identity has been received). If an organisation reasonably requests further information to clarify a request, the timescale for responding can be paused (i.e. the ‘clock stops’) until this information is provided. These changes are unlikely to have much operational impact, as they simply provide statutory footing to existing ICO guidance on this subject.

Reasonable and proportionate searches

It’s confirmed organisations should conduct a “reasonable and proportionate” search for personal data in responding to Data Subject Access Requests (DSARs). Again, this gives current ICO guidance a statutory footing, and may prove helpful for organisations handling particularly demanding requests.

Court procedures

Where there’s a legal dispute over the information provided (or not provided) in response to a DSAR, a court will be able to request organisations make such information available for the court to inspect and assess. This means organisations will need to make sure they clearly document non-disclosure decisions, including their justifications. This is something we’d strongly advise doing already.

Right to be informed

The obligation to provide privacy information to individuals (i.e. under Article 13 and 14 of UK GDPR) will not apply if providing this information “is impossible or would involve disproportionate effort”. This is most likely to be particularly relevant where organisations have gathered personal data indirectly, i.e. not directly from the individuals. This was a point of contention in the Experian vs ICO case, where Experian argued it would be disproportionate effort to notify and provide privacy information to the millions of people whose data they process from the Edited Electoral Roll.

Legitimate Interests

Direct marketing

Legitimate interests will be confirmed in law as an acceptable lawful basis where necessary for direct marketing purposes. While there are concerns in some quarters this will lead to more ‘spam’ marketing, I’d stress the direct marketing rules under PECR will still apply, so legitimate interests will remain an option only when the law doesn’t require consent.

Recognised legitimate interests

The concept of ‘recognised legitimate interests’ is to be introduced, whereby organisations will not need to conduct a balancing test (i.e. a Legitimate Interests Assessment) when relying on this lawful basis for certain purposes. The list of recognised legitimate interests includes the following (and may be expanded):
Disclosures to public bodies, where it is asserted personal data is necessary to fulfil a public function.
Disclosures for national or public security or defence purposes, and in emergencies.
Disclosures for the prevention or detection of crime, and safeguarding vulnerable individuals.

International Data Transfers

There are amendments to risk assessment requirements for international data transfers. Currently, where there’s no ‘adequacy’ decision for the destination country, organisations need to undertake a Transfer Risk Assessment. Moving forward, organisations transferring data overseas will need to “reasonably and proportionately” consider if the data protection standards in the destination country will be materially lower than those in the UK. This gives potential room to streamline assessment procedures, especially to reduce the burden for low-risk transfers.

Reforms to UK data laws will be scrutinised by the EU Commission when it reviews its adequacy decisions for the UK. These currently allow for the free flow of personal data between the EEA and UK, without the need for additional risk assessments or safeguard measures. The EC review of these decisions was due in June this year, but has been delayed until December. The general consensus is there’s hopefully nothing radical enough to scare the horses, and UK adequacy will be renewed. Nonetheless, this is one to watch.

Special Category Data

A mechanism is included allowing for future introduction of newly defined special categories of personal data. An example given is ‘neurodata’, which is information gathered from the human brain and/or from the nervous system. As the requirements for processing special category data are restricted under UK GDPR, introducing new types has the potential to lead to significant implications in some sectors.

Automated decision-making

A noteworthy amendment is to be made to Article 22 of UK GDPR, which currently places strict restrictions on automated decision-making (including profiling) which results in legal or similarly significant effects. These strict restrictions will be relaxed so that they apply only to automated decisions using special category data. For any other personal data, there will instead be a requirement to put in place certain safeguards, such as giving individuals the ability to contest decisions and requiring human intervention.

This change will give organisations more flexibility to make automated decisions using ‘normal’ personal data, for example when utilising AI systems. However, there are concerns it could have a negative impact on people’s rights. This also represents a marked distinction between the UK and the EU approaches, which may be a key consideration in the EU’s review of UK adequacy.

Steve Wood, Founder of PrivacyX Consulting and former UK Deputy Information Commissioner says: “This creates a real importance on the Code that will be produced by the ICO, covering how the safeguards should be applied in practice. A current priority for the ICO is use of AI in recruitment and this is an emerging area of risk, including the use of AI in fire and hire decisions in the gig economy. Time will tell whether it was premature to remove the precautionary approach of Article 22 when the implications of using AI for automated decision making are still being assessed.”

‘High risk’ AI decisions

People will have the right to request information where a decision is either solely, or in part, based on automated processing (including AI and machine learning), and has a legal or similarly significant effect on them. Controllers will be required to provide an explanation of the criteria used to reach the decision along with a description of the key factors (or features) which most significantly influenced the decision. Individuals will be able to request human review or details of how to appeal the decision.

Data protection by design to protect children

Amendments to existing law make specific reference to additional protections for children (anyone under the age of 18). When assessing appropriate ‘technical and organisational’ measures in relation to online services likely to be accessed by children, organisations will be legally obliged to take account of how children can best be protected, recognising that children merit additional protection and have different needs at different ages and stages of development. Such measures strengthen the need to adhere to the UK Children’s Code.

Charities and the marketing ‘soft opt-in’

The use of the ‘soft opt-in’ exemption to consent for electronic marketing is to be extended to charities. This means charities will be able to provide people with an ‘opt-out’ mechanism rather than an ‘opt-in’ to marketing emails (and/or SMS), as long as the following conditions are met (a simple eligibility sketch follows the list):

The sole purpose of the direct marketing is for the charity’s own charitable purpose(s)
Contact details were collected when the individual:
a) expressed an interest in the charity’s purpose(s); or
b) offered or provided support to further the charity’s purpose(s).
An opportunity to refuse/opt-out is given at the point of collection, and in every subsequent communication.
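
As an illustration of these three conditions, here’s a minimal eligibility sketch. The parameter names paraphrase the list above rather than quoting statutory wording, and any real decision should be checked against the legislation once enacted.

```python
# Illustrative check of the charity 'soft opt-in' conditions listed above.
# Paraphrased conditions, not statutory text - take legal advice on real cases.

def soft_opt_in_available(
    marketing_solely_for_charitable_purposes: bool,
    details_collected_on_interest_or_support: bool,
    opt_out_offered_at_collection_and_every_message: bool,
) -> bool:
    # All three conditions must hold for the soft opt-in to apply
    return all([
        marketing_solely_for_charitable_purposes,
        details_collected_on_interest_or_support,
        opt_out_offered_at_collection_and_every_message,
    ])

# Example: supporter signed up at a fundraising event; opt-out offered every time
print(soft_opt_in_available(True, True, True))  # True
```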

We’ve written about the pros and cons of switching to the ‘soft opt-in’ here.

PECR Fines

Fines for infringements of the Privacy & Electronic Communications Regulations, which govern direct marketing and cookies, are set to increase significantly. Currently the maximum fine under PECR is capped at £500k, but the limits will be brought in line with the much more substantial fines which can be levied under UK GDPR. Reckless disregard for marketing and cookie rules is about to get more costly.

Spam emails and texts

What constitutes ‘spam’ is to be extended to include emails and text messages which are sent, but not received by anyone. This will mean the regulator will be able to consider much larger volumes in any enforcement action, which may result in much higher fines – SPAMMERS BEWARE!

Cookies & similar technologies

Exemptions are set to be introduced from the requirement to collect consent for certain types of cookies and similar technologies, as long as a clear opportunity to opt-out is provided. This will be permitted for purposes such as website analytics and optimising content. I envisage much reconfiguring of the array of website consent management platforms which have been implemented in recent years. But remember, targeting/advertising cookies (including social media targeting pixels) will still need consent.

Alongside these changes the ICO is reviewing PECR consent requirements to “enable a shift towards privacy-preserving advertising models”. This autumn a statement is expected identifying ‘low-risk’ advertising activities which in the ICO’s view are unlikely to cause harm or trigger enforcement action. You can read more about this in the ICO’s package of measures to drive economic growth.

Research

Purpose limitation and provision of privacy information

Currently, UK GDPR makes it tricky to reuse personal data for new purposes, yet research projects can often move into areas which weren’t anticipated when data was originally collected. A new exemption is to be introduced in relation to the provision of privacy information. Amendments are also set to be made to the purpose limitation principle so that further processing for ‘RAS purposes’ is compatible with the original purpose. Both these changes are subject to ‘appropriate safeguards’. (‘RAS purposes’ covers processing for scientific and historic research, archiving in the public interest, and statistical purposes.)

Scientific research

The definition of ‘scientific research’ is to be clarified and will explicitly state research can be a commercial or non-commercial activity. Consent for scientific research is to be adapted, in part driven by a desire to make it easier for personal data collected for specific research to be reused for other scientific research purposes.

Commenting on these changes Ellie Blore, Data Protection Officer at Best Companies says; “The aims are to provide greater flexibility for commercial research and innovation. It expands the definition of ‘Scientific Research’ to include certain privately funded and commercial research activities, meaning that some private AI training and research will now be classified under Scientific Research. Furthermore, secondary processing of data for Scientific Research and Development purposes will be considered compatible with the original purpose of data collection, provided the appropriate safeguards are in place. There are exemptions added here, and this will undoubtedly be an area to watch as the Secretary of State will have the power to further vary those safeguards.”

Smart data schemes

Provisions are being introduced to support the growth of new ‘smart data schemes’. The right to portability under UK GDPR currently allows individuals to obtain and reuse their personal data. Moving forward, this will be expanded to allow consumers to request their data is directly shared with authorised and regulated third parties. This will be underpinned by a framework with data security at its core. It’s hoped this will allow for the growth of smart data schemes, enabling data sharing in areas such as energy, telecoms, mortgages and insurance.

Healthcare information

Ever been to hospital and found your GP has no record of your treatment, or the hospital can’t access your GP’s notes? The government is hoping data reform will pave the way for a more consistent approach to information standards and technology infrastructure, so systems can ‘talk’ to each other. For example, allowing hospitals, GP surgeries, social care services, and ambulance services to have real-time access to information such as patient appointments, tests, and pre-existing conditions.

Department Board Appointments

A new measure is to be introduced requiring digital leaders to be represented at executive level within Government departments and other bodies, such as NHS Trusts. At least one of the following roles will need to be appointed to a departmental board or equivalent body: a Chief Information Officer, Chief Technology Officer, Chief Digital Information Officer, Service Transformation Leader or other equivalent role.

Digital verification services

The aim is to create a framework for trusted digital verification services, moving the country away from paper-based and in-person tasks. For example, proposals allow for digital verification services aimed at simplifying processes such as registering births and deaths, starting a new job and renting a home.

New Information Commission

The Information Commissioner’s Office is set to be replaced by an Information Commission. This is to be structured in a similar way to the FCA, OFCOM and the CMA, as a body corporate with an appointed Chief Executive. There’s also provision for the Government to have considerable influence over the operations of the new Commission.

In summary, reform of UK data law has its critics. Among other matters they fear a watering down of people’s rights and an increased ability for personal data to be shared, perhaps recklessly, with and within the public sector. However, the changes are not overly radical, having varying degrees of impact depending on your sector and organisation’s core activities.

Chris Combemale, Director of Policy and Public Affairs at the Data & Marketing Association, welcomes the changes ahead; “The DMA strongly supports the DUA Bill and has worked tirelessly for almost five years to achieve reforms that balance innovation and privacy in accordance with the principles laid out in recital 4 of GDPR. We particularly welcome the greater certainty on the use of legitimate interests as a lawful basis for direct marketing, the extension of the email soft opt-in to charities, exemptions to consent for some types of cookies, greater clarity in Article 22 for automated decision making and the obligation for the ICO to consider innovation and competition alongside privacy.”

Privacy X Consulting’s Steve Wood doesn’t believe the impact will be hugely significant; “The DUA Bill represents an evolution of UK GDPR that should not drive many changes for multi-national companies’ DP governance, which is likely to remain focused around the EU GDPR standard. The more interesting opportunities may lie in the confidence that is provided to the take up of federated digital identity by the statutory underpinning for the Trust Framework and opportunities for data intermediary businesses in relation to the Smart Data provisions.”

UPCOMING ONLINE EVENT – UNWRAPPING UK DATA REFORM

Join a great line-up of speakers on 29 April who’ll be discussing the changes under the DUA Bill and taking your questions. BOOK YOUR PLACE

DPIAs: how to get organisational buy-in

March 2025

Data Protection Impact Assessments (DPIAs) can get a bad rap. Project managers, team leaders and others may not understand them, complain they’re too onerous to complete or say they ‘slow things down’. The result – data protection risks may not be identified or mitigated. Assessments may get overlooked, conducted in a less than thorough way, or get started but remain incomplete.

To banish the negative vibes we need to shout about the benefits of DPIAs. Make sure relevant teams know what they are, when and how to conduct them, and most importantly make sure the process is clearly explained and straightforward to follow.

When used well in the right situations, they can be one of the most useful tools in your organisation’s data protection toolkit. It can’t be stressed enough – DPIAs help to identify, assess and tackle risks before they see the light of day. They help you protect the rights and interests of your customers and employees, protect your business reputation, meet your GDPR accountability obligations and demonstrate how you comply with data protection laws.

Let’s take a look at how we breathe new life into the DPIA process. But first a quick recap on what the law requires…

When DPIAs are mandatory

Sometimes there’s no choice and a DPIA is a ‘must do’. Under GDPR/UK GDPR it is mandatory to conduct a DPIA when projects are likely to represent a ‘high risk’ to those whose personal data is involved. The law gives us three examples:

Large scale use of special category data
Systematic and extensive profiling with significant effect
Public monitoring on a large scale

The above activities are far from routine, so thankfully the UK’s Information Commissioner’s Office (ICO) and other European Data Protection Authorities have published their own lists of processing ‘likely to result in high risk’. For example, the ICO sets out the following:

1. Using innovative technologies or the novel application of existing technologies (including AI).

2. Any decisions which could lead to denial of service; processing which makes decisions about an individual’s access to a product, service, opportunity or benefit which is based to any extent on automated decision-making (including profiling) or involve processing special category data.

3. Large-scale profiling of individuals.

4. Any processing of biometric data, where this is used for identification purposes.

5. Any processing of genetic data (unless by an individual GP or health professional for the provision of health care directly to the person concerned).

6. Combining, comparing or matching personal data gathered from multiple sources.

7. Any invisible processing – this is where personal data is not collected directly from individuals, and they are not aware of how it’s being used (i.e. the effort of providing privacy information to individuals would be disproportionate).

8. Tracking individuals’ geolocation or behaviour.

9. Targeting children or other vulnerable individuals.

10. Risk of physical harm – where a personal data breach could jeopardise the physical health or safety of individuals.

For more detail please see the ICO DPIA Guidance.

How to assess ‘high risk’

DPIAs aren’t required for every new activity or change of activity, and insisting teams undertake them too often can turn them into a needless box-ticking exercise and feed a general air of malaise.

Judgement calls need to be made to assess what counts as ‘high risk’ and ‘large scale’, and a method established for evaluating where the threshold falls. This will differ depending on sector, the nature of data handled, organisational risk appetite and so on. Regulated sectors, such as financial services and telecoms, have more to think about and may adopt a cautious approach. Also, bear in mind a DPIA can be a helpful risk assessment exercise even when a project doesn’t fall under the mandatory requirements.

Adopt a screening process

In my experience, embedding a straightforward screening questionnaire is a great way to effectively sift through change projects and decide which need a more detailed assessment and which don’t. You can either ask teams to complete the questionnaire, or set aside 30 minutes to lead them through the screening. Then the DPO or data protection leader can make the call. A screening process may include questions such as the following (a simple screening sketch appears after the list):

What does the project/activity hope to achieve?
What personal information is involved?
Does this include more sensitive data (like financial details) or special category data?
Where did we source the data from?
Does the activity involve children’s data or others who would be considered vulnerable?
Will data be shared with other organisations?
Could what we’re doing be considered innovative or cutting edge?
Are we using personal details for a new purpose?

This is not an exhaustive list; there are other pertinent questions to ask, but try not to make it too long.
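
As an illustration, the screening can be captured as a simple structure that flags when to escalate to a full DPIA. The questions below mirror the list above, and the escalation rule (any ‘yes’ triggers a full assessment) is an assumption for the sketch; the judgement call stays with the DPO or data protection lead.

```python
# Illustrative DPIA screening sketch; the escalation rule is an assumption.

SCREENING_QUESTIONS = {
    "sensitive_or_special_category_data": "Does this include sensitive or special category data?",
    "children_or_vulnerable_people": "Does the activity involve children's data or other vulnerable people?",
    "sharing_with_other_organisations": "Will data be shared with other organisations?",
    "innovative_or_cutting_edge": "Could what we're doing be considered innovative or cutting edge?",
    "new_purpose": "Are we using personal details for a new purpose?",
}

def needs_full_dpia(answers: dict[str, bool]) -> bool:
    # Escalate if any high-risk screening answer is 'yes'
    return any(answers.get(question_id, False) for question_id in SCREENING_QUESTIONS)

# Example: a project sharing children's data with a new supplier
answers = {
    "sensitive_or_special_category_data": False,
    "children_or_vulnerable_people": True,
    "sharing_with_other_organisations": True,
    "innovative_or_cutting_edge": False,
    "new_purpose": False,
}
print(needs_full_dpia(answers))  # True - escalate to a full DPIA
```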

Engage with your teams

First rule of DPIA Club is… we MUST talk about it!

Build relationships with the people who ‘do new stuff’ with your data: the people who run development projects and the key stakeholders, such as heads of the main functions which process personal data across your business, e.g. Marketing, Operations, HR, etc. If you have a Procurement team, then target them too.

Ask what projects they have on the horizon which could affect the way personal data is used. The aim is to make them aware of DPIA requirements and ask them to give you an early ‘heads up’ if they are looking to onboard a new service provider or use data for an innovative new project.

Let them know tech projects and system migrations almost always involve some form of personal data processing. They should be mindful of the potential for this to lead to privacy risks.

If they think about data protection from the outset it will save valuable time and money in the long run, and avoid unwelcome hiccups down the line. Give them examples of how things have gone wrong or could go wrong.

You could raise awareness across the business using your intranet, email reminders, posters, drop-in clinics … whatever it takes to get the message across. ‘Training’ sessions with key stakeholders can also really help to enhance their risk assessment skills.

Use a good DPIA template

In my opinion too many businesses use complex and jargon-filled DPIA templates, which many people find hard to understand. They ask questions in ‘GDPR-talk’ which people find hard to grasp and answer, and they often don’t really help people to identify what privacy risks actually look like.

Take a look at your DPIA template with fresh eyes. If you don’t like it, use a better one, or adapt it to fit your business’s ways of working.

Be prepared for Agile working

Many development projects use Agile methodology, breaking projects into smaller manageable cycles called sprints. These allow teams to adapt quickly to changes and deliver incremental gains sooner. This means adapting your assessment approach. You won’t get all the answers you need at the start. Stay close to the project as it evolves and be ready to update your DPIA in line with scheduled sprints.

I hope this has given you some ideas for how to engage your colleagues and freshen up the DPIA process. Dispelling the myth DPIAs are a waste of time, too complex or too onerous is a fight worth winning.

Online services face scrutiny over use of children’s data

March 2025

The importance of compliance with the UK Children’s Code

Social media platforms. Content streaming services. Online gaming. Apps. Any online services popular with children carry an inherent privacy risk.

Along with growing concerns over protecting them from harmful content, there’s an increasing focus on children’s privacy. Beyond the companies you’d expect to be impacted, it’s worth remembering these issues can affect a growing number of other organisations.

We know children are being exposed to inappropriate and harmful content. Some content is illegal, like child sexual abuse images or content promoting terrorism, but other material can still cause harm – such as content promoting eating disorders, content which is inappropriate for the age of children viewing it, or content which is overly influential.

The Information Commissioner’s Office (ICO) recently launched an investigation into TikTok, amid concerns over how the platform uses children’s data – specifically how their data is used to deliver content into their feeds. The regulator is also investigating the image sharing website Imgur and the social media platform Reddit, in relation to their use of children’s data and their age verification practices.

These investigations are part of wider interventions into how social media and video sharing platforms use information gathered about children. The ICO says it’s determined to continue its drive to make sure companies change their approach to children’s online privacy, in line with the Children’s Code, which came into force in 2021.

This all serves as a timely reminder of the need to comply with this Code.

What is the Children’s Code?

The Children’s Code (aka ‘Age-Appropriate Design Code’) is a statutory code of practice aimed at protecting children’s privacy online. It sets out how to approach age-appropriate design and gives fifteen standards organisations are expected to meet. These are not necessarily technical standards, but more principles and required privacy features.

Who does the Children’s Code apply to?

A wide range of online services are within scope, including apps, search engines, online games, online marketplaces, connected toys and devices, news and educational sites, online messaging services and much more.

If children are ‘likely’ to access your online service(s), even if they are not your target audience, the code applies. For example, free services, small businesses, not-for-profits and educational sites are all in scope. Companies need to ask themselves – is a child likely to use our product or service online?

What ages does it apply to?

The code adopts the definition of a child under the UN Convention on the Rights of the Child (UNCRC) which is anyone under the age of 18. This means it applies to online services likely to be accessed by older children aged 16 and 17, not just young children. (This shouldn’t be confused with the age of consent for a child, which for online services is 13 in the UK).

Who does the Children’s Code not apply to?

Some public authority services are out of scope. For example, an online public service which is not provided on a commercial basis, or a police force with an online service which processes personal data for law enforcement purposes. Preventative or counselling services are also not in scope, such as websites or apps which specifically provide online counselling or other preventative services to children. However, more general health, fitness and wellbeing apps are in scope.

How do you assess ‘likely to be accessed’ by a child?

Crucially, the Code covers services which may not be specifically ‘aimed or targeted’ at children, but are ‘likely’ to be accessed by them. Each provider will need to assess this, and the Code provides some questions to help you:

Is the possibility of children using your service more probable than not?
Is the nature and content of the service appealing to children, even if not intended for them? (Remember, this includes older children aged 16 and 17)
Do you have measures in place to prevent children gaining access to an adult-only service?

This assessment may not always be clear cut, but it’s worth noting the Code states:

If you decide that your service is not likely to be accessed by children and that you are therefore not going to implement the code then you should document and support your reasons for your decision. You may wish to refer to market research, current evidence on user behaviour, the user base of similar or existing services and service types and testing of access restriction measures.

If you initially judge that the service is not likely to be accessed by children, but evidence later emerges that a significant number of children are in fact accessing your service, you will need to conform to the standards in this code or review your access restrictions if you do not think it is appropriate for children to use your service.

This means there’s a clear expectation that online services should hold evidence to support any decision that they do not need to conform with the Code.

The 15 standards of the Children’s Code

The Code is extremely detailed. Here’s a summary of the salient points:

1. Best interests of the child – consider the needs of children using your service and how best to support those needs. The best interests of the child should be a primary consideration when designing and developing an online service.

2. Data Protection Impact Assessments (DPIAs) – these should be carried out to identify and minimise any data protection risks to children.

3. Age-appropriate application – assess the age range of your audience. Remember, the needs of children of different ages should be central to design and development. Make sure children are given an appropriate level of protection in how their information is used.

4. Transparency – UK GDPR specifically states privacy information must be easy for children to find and easy for them to understand. The Code says you should consider bite-sized ‘just in time’ notices when collecting children’s data.

5. Detrimental use of data – don’t use children’s information in a way which would be detrimental to children’s physical or mental health and well-being.

6. Policies and community standards – if children provide you with their personal information when they join or access your service, you must uphold your own published T&Cs, policies and community standards. Children should be able to expect the service to operate in the way you say it will and for you to do what you say you are doing.

7. Default settings – privacy settings for children must be set to ‘high’ by default, unless you can demonstrate a compelling reason for taking a different stance. It’s not enough to allow children to activate high privacy settings, you need to provide these by default.

8. Data minimisation – only collect and keep the minimum amount of data about children that’s necessary for you to provide your service.

9. Data sharing – do not share children’s data unless you can demonstrate a compelling reason to do so. The word ‘compelling’ is significant here; it sets the bar for justifying any sharing very high.

10. Geolocation – any geolocation privacy settings should be switched off by default. Children should have to actively change the default setting to allow their geolocation to be used (unless you can demonstrate a compelling reason for this to be switched on).

11. Parental controls – make it clear to children if parental controls are in place and if they are being tracked or monitored by their parents.

12. Profiling – options which use profiling should be switched off by default (again, unless you can demonstrate a compelling reason for these to be switched on). This doesn’t mean profiling is banned, but if it is conducted, measures should be in place to protect children from harmful effects. If profiling is part of the service you provide, you need to be sure this is completely necessary.

13. Nudge techniques – don’t use techniques which lead or encourage children to activate options which mean they give you more personal information, or turn off privacy protections.

14. Connected toys and devices – you need to conform with the code if you sell toys or devices (such as talking teddy bears, fitness bands or home-hub interactive speakers) which collect personal data and transmit via a network connection. If your electronic toys or devices do not connect to the internet and only store data in the device itself, the code does not apply.

15. Online tools – it must be easy for children to exercise their privacy rights and report concerns. The right to erasure is particularly relevant when it comes to children. Mechanisms used to help children exercise their rights should be age appropriate, prominent and easy to use.
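Standards 7, 9, 10 and 12 all point in the same direction: privacy-intrusive features should be off unless there’s a compelling reason otherwise. For development teams, that might translate into something like the sketch below – a hypothetical settings object with high-privacy defaults. The field names are illustrative assumptions, not anything the Code mandates.

```python
from dataclasses import dataclass

@dataclass
class ChildAccountSettings:
    """Hypothetical defaults reflecting standards 7, 9, 10 and 12:
    high privacy, no sharing, geolocation off and profiling off unless
    there's a compelling reason (and the child actively opts in)."""
    profile_visibility: str = "private"   # standard 7: high privacy by default
    data_sharing_enabled: bool = False    # standard 9: no sharing by default
    geolocation_enabled: bool = False     # standard 10: off by default
    profiling_enabled: bool = False       # standard 12: off by default

settings = ChildAccountSettings()
print(settings)  # every protection is on until the child actively changes it
```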

The ICO investigations into TikTok, Imgur and Reddit signal a direction of travel for us all: how the regulator intends to treat compliance around children’s online privacy, welfare and safeguarding.

If your service uses children’s data even tangentially, it’s worth re-examining the Code and considering how it might impact your business.

Right to erasure in the spotlight: how to manage requests

March 2025

10 tips to tackle erasure requests

The European Data Protection Board (EDPB) has announced a year-long focus on the right to erasure. Data Protection Authorities across Europe are taking part in this Coordinated Enforcement Framework initiative and will be contacting a number of organisations from different sectors, either launching formal investigations or undertaking fact-finding exercises.

The EDPB chose to focus on the right to erasure as it’s one of the most frequently exercised GDPR rights and one DPAs often receive complaints about. In my work I find organisations are not always handling these requests appropriately and often don’t have clear and comprehensive procedures in place.

While this is not a UK specific initiative, this right can raise a number of questions wherever your organisation is located. When can we refuse? What data should we erase? And, on a technical level, how do we make sure everything that needs to be erased is actually destroyed, especially when the data is held on multiple systems?

It can raise complex challenges. Add to this the tight timeframe to action individual requests and the dreaded bulk requests from third parties, and it can turn into a bit of a minefield. We’ve got some tips to help you navigate the mines. But first, a little refresher on what the right to erasure means.

What is the right to erasure?

As the name suggests, a person has the right to request their personal data is erased from your systems if you no longer have a compelling lawful reason to keep it. This applies to ALL systems, back-ups and even data held in the cloud.

You may hear it referred to as the ‘Right to be Forgotten’. This stems from a decision in 2014 by the Court of Justice of the EU which recognised the right of EU citizens to request the removal of links to personal information on search engines. GDPR took this ruling a step further and enshrined a broader right into EU law, taking it beyond the context of publicly available personal information. Under the UK GDPR the right remains the same as its EU counterpart.

Crucially, the right to erasure is not an absolute right. Organisations may have a clear justification for denying a request either in part or in full.

When does the right to erasure apply?

You need to fulfil a person’s request for erasure in the following circumstances:

It’s no longer necessary for your organisation to retain the personal data for the purposes it was collected;
They gave you their consent to use their personal data for a specific purpose (or purposes) and they have now withdrawn their consent;
You’re relying on legitimate interests as your lawful basis to handle their data, they object to this, and you have no compelling and overriding legitimate interest to continue to hold it;
You’re fulfilling a legal ruling or legal obligation to erase the data;
You’re processing a child’s data to provide information services (i.e. online services) and an appropriate party is making the request, be this a parent or guardian, or the child themselves if they are of a competent age to decide for themselves;
You’re handling their data unlawfully.

The last point, a general ‘catch-all’, is a tricky one to balance, as there may be many reasons why personal data could be processed unlawfully. For example, processing might be considered unlawful if the data is inaccurate, or even if necessary information about your processing activities was not provided in a privacy notice.

When can an erasure request be refused?

The law specifically tells us the right to erasure will not apply when you’re holding personal data for the following reasons:

to exercise the right of freedom of expression and information;
to comply with a legal obligation;
for the establishment or defence of legal claims;
to perform a task carried out in the public interest or when exercising an organisation’s official authority;
for public interest in the area of public health;
for archiving purposes in the public interest, scientific or historical research or statistical purposes (where erasure would make this impossible or seriously impair your objectives).

Under UK GDPR and the Data Protection Act 2018 there are two specific circumstances where the right to erasure doesn’t apply to special category data. Further information about these exemptions can be found in the ICO erasure guidance.

It’s also important to consider whether you have a contract in place with the individual which necessitates the continued processing of their data. There may also be grounds for refusing a request where you can justify that it’s manifestly unfounded or excessive.

There are many variables at play and each request needs to be assessed on a case-by-case basis. This is where the devil really is in the detail. In more complex cases you’ll need to consider the potential fallout should you delete personal data and subsequently discover you really needed to keep it. If you have a robust justification for needing to keep personal data, then you should keep it and document the reason(s) for your decision. This highlights the requirement for accurate record keeping, not only for erasure requests but for all privacy rights requests.

If you refuse to comply with a request (either in part or in full), you must explain why and tell the individual they have the right to raise a complaint with the UK’s Information Commissioner’s Office, or other relevant Data Protection Authority.

10-point checklist for handling erasure requests

1. Awareness

An individual can request their personal data is erased either in writing or verbally. They might make this request to anyone in your organisation. So, everyone in your organisation needs to know how to recognise this type of request, what to do if they receive one, and who to direct it to. Awareness campaigns, training and easy-to-understand policies and guides all play their part in getting the message across to all staff.

2. Identity verification

You clearly don’t want to delete someone’s details unless you are absolutely sure they are who they say they are. Sometimes this will be obvious, but in other circumstances you’ll need to ask for verification of identity. However, if the deletion has no negative impact on the individual – for example, they are only on your marketing list – asking for proof of identity is likely to be a disproportionate step.

When asking for proof of identity only ask for the minimum amount of information necessary to confirm identity. Don’t accumulate additional personal information such as copies of passports or driving licences, unless it’s truly justified, and remember to destroy these too!

If a request is received via another organisation, make sure the third party genuinely has the authority to act on behalf of the individual in question. The responsibility lies with the third party to provide any necessary evidence to prove this.

3. Technical measures

Your customers might think deleting their data is as simple as clicking a button. If only it were that easy!

It can be difficult to locate, identify, assess and properly destroy data – especially if it’s held on many different systems. You might hold records in emails, on backed-up systems, in the cloud… all must be deleted.

Make sure your systems, applications and databases allow easy identification and deletion of individuals. You may also need to assess the implications of deletion; it can impact on how different software works.

This is where the concept of Data Protection by Design really supports businesses. If from the outset of any new project or onboarding of new technology systems you factor in how to successfully manage all individual privacy rights, it will make life much easier in the long run.

It’s worth reiterating – the right to erasure extends to deleting data from backups. However, the ICO recognises the inherent difficulties here and says, “the key issue is to put the backup data ‘beyond use’, even if it cannot be immediately overwritten.”
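One technique sometimes used to put backed-up data ‘beyond use’ is crypto-shredding: encrypt each person’s records under their own key, then destroy the key when an erasure request is fulfilled, leaving any copies lingering in backups unreadable. The sketch below illustrates the idea with the cryptography library’s Fernet; it’s a simplified illustration under assumed key management, not a complete deletion strategy, and whether it satisfies ‘beyond use’ in your context is a judgement call.

```python
# Crypto-shredding sketch: destroying the per-person key renders every
# encrypted copy (including those sitting in backups) unreadable.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

user_keys: dict[str, bytes] = {}  # in practice, a secured key store

def store_record(user_id: str, record: bytes) -> bytes:
    """Encrypt a record under a per-user key before it is persisted."""
    key = user_keys.setdefault(user_id, Fernet.generate_key())
    return Fernet(key).encrypt(record)

def erase_user(user_id: str) -> None:
    """Fulfil an erasure request by destroying the user's key."""
    user_keys.pop(user_id, None)

ciphertext = store_record("user-123", b"email=jane@example.com")
erase_user("user-123")
# Without the key, the ciphertext -- wherever it's been copied -- can't be read.
```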

4. Timeline

You don’t have long to comply with erasure requests, so keeping track of time is crucial. The request must be actioned ‘without undue delay,’ and in any case within one calendar month of receiving it. You may be able to extend this by up to two months if it’s particularly complex. If you need to extend, make sure you tell the individual before the first month is up, giving them clear reasons for the delay.
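If you log requests in a tracking system, the statutory clock can be computed automatically. A minimal sketch, using the python-dateutil package and assuming the ‘corresponding date next month’ convention (clamping to month-end where no corresponding date exists):

```python
# Deadline sketch for erasure (and other rights) requests.
# Requires: pip install python-dateutil
from datetime import date
from dateutil.relativedelta import relativedelta

def response_deadline(received: date, extended: bool = False) -> date:
    """One calendar month to respond, or three if a complex request is
    extended (the extension must be communicated within the first month)."""
    months = 3 if extended else 1
    return received + relativedelta(months=months)

print(response_deadline(date(2025, 1, 31)))                # 2025-02-28 (clamped)
print(response_deadline(date(2025, 3, 5), extended=True))  # 2025-06-05
```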

5. Who else holds their data?

The right to erasure doesn’t just apply to the records your organisation holds. You’re also expected to inform your suppliers (processors) and any other controllers you’ve shared the data with.

Having a clear understanding of all your suppliers and the other organisations you share personal data with, such as in your Record of Processing Activities, means you can efficiently contact them and inform them of erasure requests. You don’t have to do this if it would prove impossible or involve disproportionate effort, but you may need to be able to justify that this is genuinely the case.
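As a simple illustration, if your Record of Processing Activities captures which processors and controllers each dataset is shared with, working out who to notify becomes a lookup rather than a scramble. The structure below is a hypothetical example, not a prescribed RoPA format.

```python
# Hypothetical RoPA extract: dataset -> organisations it is shared with.
ROPA_RECIPIENTS = {
    "crm_contacts": ["MailingHouse Ltd (processor)", "Analytics Co (processor)"],
    "loyalty_scheme": ["Partner Retailer (controller)"],
}

def recipients_to_notify(datasets: list[str]) -> set[str]:
    """Collect every recipient holding the requester's data."""
    return {org for ds in datasets for org in ROPA_RECIPIENTS.get(ds, [])}

# A requester whose data sits in both datasets:
print(recipients_to_notify(["crm_contacts", "loyalty_scheme"]))
```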

6. Public domain data

The right to erasure also applies to personal data which has been made public in an online environment (‘the right to be forgotten’). So take note if you publish personal data, or pass it on for others to publish.

You need to be ready to take reasonable steps to inform other organisations who are handling the personal data; asking them to erase links to, copies of, or replication of the data. What’s ‘reasonable’ is another judgement call, and the expectation scales with size; the bigger your organisation and the more resources you have, the more you’ll be expected to do.

7. Children’s data erasure rights

Children have special protection under data protection law, and the right to erasure is particularly relevant when a child has given their consent (or their parent/guardian has) and at a later stage (even when they’re an adult) they want their personal information removed, especially if it’s available on the internet. Baking in the ability to delete children’s information from the start is crucial.

8. Exemptions

It’s helpful to have a clear checklist of the exemptions which might apply and be relevant for your organisation. They don’t all apply in the same way, so be sure to examine each exemption on a case-by-case basis. The ICO exemptions guide is a good starting point, and it’s likely you’ll also need to reference the Data Protection Act 2018.

9. Maintain an erasure log

How do we delete someone, but also prove we’ve done it? Feels paradoxical, doesn’t it? However, organisations are required to keep a log of erasure requests, the actions taken and the justifications for these, to demonstrate compliance. The key is recording only the minimum amount of information necessary to meet this obligation, and keeping it secure. I know some organisations which have taken the step of pseudonymising this log for extra protection.
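A pseudonymised log squares this circle: it records that a request was fulfilled without retaining the identity in plain form. A minimal sketch, assuming an HMAC of the identifier under a secret key stored separately from the log:

```python
# Pseudonymised erasure-log sketch: proves a request was actioned without
# keeping the requester's identity in plain form.
import hmac, hashlib, json
from datetime import datetime, timezone

LOG_KEY = b"store-me-in-a-secrets-manager"  # assumed secret, held apart from the log

def log_erasure(email: str, action: str, justification: str) -> str:
    """Return a log entry with the identifier replaced by a keyed hash."""
    pseudonym = hmac.new(LOG_KEY, email.lower().encode(), hashlib.sha256).hexdigest()
    return json.dumps({
        "subject": pseudonym,  # no direct identifier retained in the log itself
        "action": action,
        "justification": justification,
        "completed_at": datetime.now(timezone.utc).isoformat(),
    })

print(log_erasure("jane@example.com", "erased from CRM and mailing list",
                  "consent withdrawn; no overriding retention need"))
```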

10. Minimisation and retention

The right to erasure (and indeed other privacy rights, such as DSARs) can be less complex if you stick to two of the core data protection principles: data minimisation and storage limitation (data retention). Collecting ‘just enough’ data in the first place, using it in specified ways and only keeping it for as long as you need it means there’s less data to trawl through when an erasure request comes in.

Sounds simple; it’s less easy in practice, but worth the effort. For useful tips, tools and templates see our Data Retention Guide.

It’s well worth developing a clear procedure for those responsible for handling erasure requests – one which covers the key considerations and how to actually fulfil requests in practice. With the right elements in place you’ll be in a much better position to handle the right to erasure effectively, within the statutory timescale and with less risk of mistakes.