Court of Appeal rejects appeal against ICO fine

December 2024

The very first fine the ICO issued under the GDPR was back in 2019. It was issued to a pharmacy for storing unlocked boxes containing sensitive medical information in the yard behind its offices. More than five years later, the fine has yet to be paid.

The initial penalty notice was for £275,000 against Doorstep Dispensaree, a pharmacy in Edgware, North London. The company appealed, arguing the ICO’s actions were disproportionate and failed to take the firm’s financial hardship into consideration. It also argued less personal information was affected than originally thought: 67,000 documents were involved, rather than the 500,000 the original enforcement notice cited. Furthermore, the pharmacy claimed its backyard storage area was largely secure from public access.

The fine was subsequently reduced to £92,000.

As an aside, I’d suggest this is still a huge number of records to store in unlocked boxes. The data concerned involved customers’ names, addresses, dates of birth, NHS numbers, medical information and prescriptions.

This wasn’t the end of it. Doorstep Dispensaree raised a subsequent appeal, arguing the judge in the previous appeal had failed to recognise that the burden of proof lay with the ICO, and that undue weight had been given to the ICO’s reasons for imposing and setting the penalty.

In a decision welcomed by the ICO, the Court of Appeal has now dismissed this appeal. It ruled the burden of proof lies with the appellant, Doorstep Dispensaree, and that tribunals hearing subsequent appeals aren’t required to ignore the original monetary penalty notice when making their decisions.

Responding to the news, Information Commissioner John Edwards said, “I welcome the Court of Appeal’s judgment in this case as it provides clarity for future appeals. We defended our position robustly and are pleased that the court has agreed with our findings.”

The ICO has been much criticised for its lack of enforcement action under GDPR. It’s issued multiple fines under the Privacy and Electronic Communications Regulations (PECR), but fewer under GDPR (now UK GDPR). This may be because PECR violations tend to be more clear-cut. While much of the criticism may be fair, I believe this case demonstrates the legal hurdles the Regulator can face when taking enforcement action. However, the more cases we get, the more case law we’ll have for UK GDPR.

2024’s Data Protection Milestones

December 2024

Each year sees significant developments across the data protection landscape. I’ve asked industry insiders for their ONE most significant data protection or ePrivacy related milestone of 2024. Interestingly, everyone offered a different take. And all of these milestones will remain significant well into 2025.

UK GENERAL ELECTION AND DATA BILLS

Chris Combemale, Chair of the Data and Marketing Association

The most significant event for me was the General Election. For three years the DMA worked hard with the former government to ensure key reforms were included in the DPDI Bill, including certainty around legitimate interest as a lawful basis for direct marketing. At the time the election was called, DPDI was in the final stages of passage in the House of Lords. The DMA campaigned throughout the election to persuade the new government to pick up the mantle, including a joint letter to all political parties from the DMA, Tech UK and other members of the Business Advisory Group which I chaired. Our efforts paid off and the Data (Use and Access) Bill is now at Committee Stage in the House of Lords. DUA brings forward the best parts of DPDI while dropping the most controversial reforms, salvaging years of work and creating a better Bill that will transform public services, contribute to growth in the economy and maintain high levels of data protection.

Simon Blanchard, Data Protection Consultant, DPN Associates

The DUA Bill includes plans for Smart Data Schemes, which allow consumers and businesses to safely share personal information with regulated and authorised third parties, for example, to generate personalised market comparisons. There are plans to create a framework for trusted identity verification services which could simplify processes like starting a new job, renting a home, and registering births and deaths. For me it’s significant there are now no plans to dilute accountability obligations under UK GDPR (e.g. no removal of the Data Protection Officer role and no changes to DPIA and RoPA requirements). DUA will give a statutory footing to many commonly used practices regarding Data Subject Access Requests. Certain legitimate interests will become ‘recognised’, such as national security, safeguarding and emergency response. The Bill’s progress is definitely one to watch in 2025.

Updated DPN Legitimate Interests Guidance v3

DOORS OPENED TO EU PRIVACY ‘CLASS ACTIONS’

Fedelma Good, Data Protection and ePrivacy Consultant

Top of my list was definitely going to be the news that Australia had introduced a law banning social media use for under-16s, not least because of all the attendant concerns that it will actually backfire, driving teenagers to the dark web or making them feel more isolated. Well, at least this was top of my list right up until the announcement on 3rd December that the privacy rights group noyb had been approved in Austria and Ireland – but with validity throughout the EU – as a so-called ‘Qualified Entity’ able to bring collective redress actions in courts throughout the European Union. I would really love a crystal ball to see whether, a few years from now, the comment from Max Schrems, chair of noyb, that “So far, collective redress is not really on the radar of many – but it has the potential to be a game changer” will rank as the understatement of the decade.

AI & DATA PROTECTION COMPLIANCE

Steve Wood, Consultant and Researcher, Privacy X and former Deputy Commissioner, ICO

In 2024 our community has dug deeper into the key implications of AI for data protection compliance. We’ve seen a range of consultations from data protection regulators globally, addressing issues such as whether large language models are classed as personal data, when legitimate interests can apply as a lawful basis, how data subjects’ rights apply to AI models and what safeguards can mitigate DP risks. Given the pivotal role the EU GDPR plays in global data protection governance, the key event for me will come right at the end of the year, just before 23 December (some Xmas holiday reading!), when the EDPB will release its GDPR Article 64(2) Opinion on AI models, requested by the Irish Data Protection Authority. The Opinion will provide a significant regulatory framing for the approach companies need to take to AI governance for the coming years, noting the breadth of application of the GDPR compared to the focus of the EU AI Act on high-risk systems.

GLOBAL ADOPTION OF DATA PROTECTION PRINCIPLES

Robert Bond, Senior Counsel, Privacy Partnership Law

The one most significant data protection event in 2024 for me was the number of countries around the world passing and updating data protection laws significantly influenced by the GDPR. From Kenya to Sri Lanka, from Australia to Saudi Arabia and from China to many states in the USA, the similarities around data protection principles, data subject rights and data transfer restrictions are considerable. Whilst these global developments may not apply to smaller organisations, in the case of multinationals the ROI for all the hard work invested in complying with the GDPR is that complying with data protection laws in other parts of the world is getting somewhat easier.

UNLAWFUL INTERNATIONAL DATA TRANSFERS

Eduardo Ustaran, Partner Hogan Lovells International LLP

An issue which has returned as a top priority for regulators is cross-border data transfers. Due to geopolitical tensions, the resulting increase in surveillance and the populist appeal of data localisation, the legal restrictions on international data transfers have attracted implacable scrutiny and enforcement. A worrying concern in this area is that there seems to be no room for a balanced assessment of the risk in practice, as the mere possibility of access to data by law enforcement or intelligence agencies is leading regulators to conclude that such transfers are unlawful. This regulatory line of thinking poses a real test for everyone seeking to apply a pragmatic, risk-based approach to legitimising global data flows.

CASE LAW & THE DEFINITION OF ‘PROCESSING’

Claire Robson, Governance Director, Chartered Insurance Institute

An interesting development in case law came in the decision of the Court of Appeal in Farley v Paymaster (trading as Equiniti), a case about infringement of data protection rights through misdirected post. Over 450 current and former police officers took action against their pension administrator after statements were sent to out-of-date addresses. The High Court dismissed many of the claims, stating there was not enough evidence to show the post (pension benefit statements) had been seen by a third party, so no processing had occurred. The Court of Appeal overturned this, granting the claimants permission to appeal. It felt there was a prospect of success in arguing processing had taken place through the extraction of the information from the database and its electronic transfer to the paper document bearing the mistaken address, without needing to rely on a third party reading the statement. An interesting one for Data Controllers to watch, both in how it develops and in what it means for the definition of, and limits to, ‘processing’.

LACK OF ICO ENFORCEMENT

Emma Butler, Data Protection Consultant, Creative Privacy

For me, sadly, the most significant event of 2024 has been the decline of data protection enforcement. Yes, we have seen fines for marketing breaches and some enforcement notices, but there has been a long list of serious compliance breaches with significant impacts on people that have only received a reprimand. This leads me to wonder how bad it has to get before there is serious enforcement action to change behaviours. I have seen a corresponding lessening of the importance placed on compliance among organisations: increased risk appetites for non-compliance, and a feeling they can ‘get away with’ practices because ‘everyone else is doing it’ and they see no consequences from the ICO. I have also noticed a decrease in DPO / senior roles and more combining of the DP role with other functions, as well as low salaries for the roles that exist. Not a vintage year.

REJECT ALL COOKIES

For my part, a significant change this year has been the ‘reject all’ button springing up on so many UK websites, giving people a clear option to reject all non-essential cookies. (Albeit this is certainly not universal, and I’m not sure clicking ‘reject all’ always works in practice.) This change followed an ICO warning late in 2023 to the operators of some of the country’s most popular websites demanding compliance with the cookie rules. Focused particularly on advertising/targeting cookies, the warning told website operators they had to make it as easy to reject all as it is to accept all. We then saw some websites moving to the controversial consent or pay model, which gives users a choice: 1) pay for an ad-free service, 2) consent to cookies, or 3) walk away. I’ll be watching closely for the ICO’s hotly awaited views on the legitimacy of this approach. I’m also pleased it looks like the DUA Bill will pave the way for first-party website analytics cookies to be permitted without consent.

As you can see, from the DUA Bill to AI, global privacy laws to data transfers and the real possibility of EU ‘class actions’, these milestones are likely to keep the industry busy well into 2025 and beyond. And we’ll continue to keep you updated on the most significant developments as they happen.

Using AI tools for recruitment

November 2024

How to comply with GDPR

AI tools offer dynamic, efficient solutions for streamlining recruitment processes. AI is capable of speedily identifying and sourcing potential candidates, summarising their CVs and scoring their suitability for the role.

What’s not to like?

Nonetheless, these processes must be fair and lawful. Is there a potential for bias and/or inaccurate outputs? How else will AI providers use jobseekers’ personal details? What data protection compliance considerations are baked into the AI’s architecture?

The Information Commissioner’s Office (ICO) is calling on AI providers and recruiters to do more to make sure AI tools don’t adversely impact applicants. People could be unfairly excluded from potential jobs and/or have their privacy compromised. Why undo the good work HR professionals undertake to satisfy legal requirements and best practice by using questionable technology?

The ICO recently ran a consensual audit of several developers and providers of AI recruitment tools. Some of the findings included:

Excessive personal data being collected
Data being used for incompatible purposes
A lack of transparency for jobseekers about how AI uses their details

The AI Tools in Recruitment Audit Report provides several hundred recommendations. The unambiguous message is that using AI in recruitment processes shouldn’t be taken lightly. Of course, this doesn’t mean recruiters shouldn’t embrace new technologies, but it does mean sensible checks and balances are required. Here’s a summary of key ICO recommendations, with some additional information and thoughts.

10 key steps for recruiters looking to engage AI providers

1. Data Protection Impact Assessment (DPIA)

DPIAs are mandatory under GDPR where a type of processing is likely to result in high risk. The ICO says ‘processing involving the use of innovative technologies, or the novel application of existing technologies (including AI)’ is an example of processing they would consider likely to result in a high risk.

Using AI tools for recruitment purposes squarely meets these criteria. A DPIA will help you to better understand, address and mitigate any potential privacy risks or harms to people. It should help you to ask the right questions of the AI provider. It’s likely your DPIA will need to be agile: revisited and updated as the processing and its potential impacts evolve.

ICO DPIA recommendations for recruiters:

Complete a DPIA before commencing processing that is likely to result in a high risk to people’s rights and freedoms, such as procuring an AI recruitment tool or other innovative technology.
Ensure DPIAs are comprehensive and detailed, including:
– the scope and purpose of the processing;
– a clear explanation of relationships and data flows between each party;
– how the processing will comply with UK GDPR principles; and
– consideration of alternative approaches.
Assess the risks to people’s rights and freedoms clearly in a DPIA, and identify and implement measures to mitigate each risk.
Follow a clear DPIA process that reflects the recommendations above.

2. Lawful basis for processing

When recruiting, organisations need to identify a lawful basis for this processing activity. You need to choose the most appropriate of the six lawful bases, such as consent or legitimate interests.

To rely on legitimate interests you will need to:
1. Identify a legitimate interest
2. Assess the necessity
3. Balance your organisation’s interests with the interests, rights and freedoms of individuals.

This is known as the ‘3-stage test’. We’d highly recommend you conduct and document a Legitimate Interests Assessment. Our recently updated Legitimate Interests Guidance includes an LIA template (in Excel). Your DPIA can be referenced in this assessment.

3. Special category data condition

If you will be processing special category data, such as health information or Diversity, Equity and Inclusion data (DE&I), alongside a lawful basis you’ll need to meet a specific special category condition (i.e. an Article 9 condition under UK GDPR).

It’s worth noting some AI providers may infer people’s characteristics from candidate profiles rather than collecting them directly. This can include predicting gender and ethnicity. This type of information, even if inferred, will be special category data. It also raises questions about ‘invisible’ processing (i.e. processing the individual is not aware of) and a lack of transparency. The ICO recommends not using inferred information in this way.

4. Controller, processor or joint controller

Both recruiters and AI providers have a responsibility for data protection compliance. It should be clear who is the controller or processor of the personal information. Is the AI provider a controller, joint-controller or processor? The ICO recommends this relationship is carefully scrutinised and clearly recorded in a contract with the AI provider.

If the provider is acting as a processor, the ICO says ‘explicit and comprehensive instructions must be provided for them to follow’. The regulator says this should include establishing how you’ll make sure the provider is complying with these instructions. As a controller, your organisation should be able to determine the purpose and means of the processing and tailor it to your requirements. If not, the AI provider is likely to be a controller or joint controller.

5. Data minimisation

One of the core data protection principles is data minimisation. We should only collect and use personal information which is necessary for our purpose(s). The ICO’s audit found some AI tools collected far more personal information than necessary and retained it indefinitely to build large databases of potential candidates without their knowledge. What might make perfect sense to AI or the programmers creating such technology might not be compliant with data protection law!

Recruiters need to make sure the AI tools they use only collect the minimum amount of personal information required to achieve their purpose(s). (A purpose or purposes which should be clearly defined in your DPIA and, where relevant, your LIA.)

There is also an obligation to make sure the personal details candidates are providing are not used for other incompatible purposes. Remember, if the AI provider is retaining data and using this information for its own purposes, it will not be a processor.

6. Information security and integrity

As part of the procurement process, recruiters need to undertake meaningful due diligence. This means asking the AI provider for evidence that appropriate technical and organisational controls are in place. These technical and organisational controls should also be documented in the contract. The ICO recommends regular compliance checks are undertaken while the contract is in place, to make sure effective controls remain in place.

7. Fairness and mitigating bias risks

Recruiters need to be confident the outputs from AI tools are accurate, fair and unbiased. The ICO’s audit of AI recruitment providers found evidence tools were not processing personal information fairly. For example, in some cases they allowed recruiters to filter out candidates with protected characteristics. (Protected characteristics include: age, disability, race, ethnic or national origin, religion or belief, sex and sexual orientation.) This should be a red flag.

You should seek clear assurances from the AI provider that they have mitigated bias, asking to see any relevant documentation. The ICO has published guidance on this: How do we ensure fairness in AI?

8. Transparency

Are candidates aware an AI tool will be used to process their personal details? Clear privacy information needs to be provided to jobseekers explaining how and why the AI tool is being used. The ICO says this should extend to explaining the ‘logic involved in making predictions or producing outputs which may affect people’. Candidates should also be told how they can challenge any automated decisions made by the tool.

The regulator recommends producing a privacy notice specifically for candidates on your AI platform which covers relevant UK GDPR requirements.

9. Human involvement in decision-making

There are strict rules under GDPR for automated decision-making (including profiling). Automated decision-making is the process of making a decision by automated means without any human involvement. A recruitment process wouldn’t be considered solely automated if someone (i.e. a human in the recruitment team) weighs up and interprets the result of an automated decision before applying it to the individual.

There needs to be meaningful human involvement in the process to prevent solely automated decisions being made about candidates. The ICO recommendations for recruiters include:

Ensure that recruiting managers do not use AI outputs (particularly ‘fit’ or suitability scores) to make automated recruitment decisions, where AI tools are not designed for this purpose.
Offer a simple way for candidates to object to or challenge automated decisions, where AI tools make automated decisions.

10. Data Retention

Another core data protection principle is ‘storage limitation’. This means not keeping personal data for longer than necessary for the purpose(s) it was collected for. It’s important to assess how long the data inputted and generated from AI tools will be kept for. Information about retention periods should be provided in relevant privacy information provided to job applicants (e.g. in an Applicant Privacy Notice provided on your AI platform).

The ICO says data retention periods should be detailed in contracts, including how long each category of personal information is kept and why, plus what action the AI provider must take at the end of the retention period.

Summary

The ICO acknowledges the benefits of AI and doesn’t want to stand in the way of those seeking to use AI driven solutions. It does, however, ask recruiters to consider the technology’s compatibility with data protection law.

AI is a complex area for many and it’s easy to see how unintended misuse of personal data, or unfairness and bias in candidate selection, could ‘slip through the cracks’ in the digital pavement. HR professionals and recruiters can avoid problems later down the line by addressing these as Day One issues when considering AI.

Fairness and respect for candidate privacy are central principles of HR best practice and necessary for data protection compliance. Applying these to new technological opportunities shouldn’t come as a surprise. Including your data protection team in the planning stage can help to mitigate and possibly eliminate some risks. A win-win which would leave organisations more confident in reaping the benefits AI offers.

DPN Legitimate Interests Guidance and LIA Template (v 3.0)

Published in November 2024, this third version of our established Legitimate Interests Guidance aims to help organisations assess whether they can rely on legitimate interests for a range of processing activities, whether routine or more complex, such as those involving the use of AI. First published in 2017, this updated version includes an improved LIA template (in Excel) to use when conducting your own Legitimate Interests Assessments.

Legitimate Interests Guidance from the Data Protection Network

Many thanks to PrivacyX Consulting and Privacy Partnership Law for working with us on this latest version. We’d also like to thank the original Legitimate Interests Working Group of 2017/2018, comprising representatives from a wide range of companies and institutions, who collaborated to produce previous versions.

UK Data (Use & Access) Bill: Key Proposals

October 2024

What DPOs and data protection teams need to know 

The Government’s Data (Use & Access) Bill was introduced to Parliament and had its first reading in the House of Lords on 23 October. This is a new name for the Digital Information and Smart Data Bill announced in the King’s Speech back in July. With the acronym DUA, this Bill revives some, but certainly not all, aspects of the previous Government’s Data Protection & Digital Information Bill (DPDI), which fell by the wayside when the general election was announced.

At 262 pages it’s a lengthy document, so we’ve provided a summary of some key proposals likely to be of interest to those working in data protection-related roles. Of course, at this stage everything is subject to change as the Bill progresses through Parliament.

DATA PROTECTION (UK GDPR & DPA 2018)

1. Accountability requirements NOT changed

The previous DPDI’s controversial plans to amend accountability obligations under UK GDPR have not been carried over into DUA. There are no plans to remove the requirement for organisations which meet certain criteria to appoint a Data Protection Officer, nor are there any planned changes relating to Data Protection Impact Assessments or Records of Processing Activities.

Some organisations may be disappointed more flexibility is not planned in these areas. However, we’d stress UK GDPR is already littered with the words ‘proportionate’ and ‘appropriate’. Small-to-medium-sized businesses are not currently expected to put in place measures as robust as those expected of larger organisations, unless the nature of their business activities and the sensitivity of the personal data they handle warrant it.

2. Data Subject Access Requests (DSARs)

In the main, the proposals in relation to the Right of Access (aka Data Subject Access Requests) aim to give a statutory footing to practices already commonly applied, such as confirming:

Organisations can ask the requester for details of the information or activities a DSAR relates to, and can pause the time period for responding to the request while seeking this information. For example, the ability to seek clarification when the organisation “processes a large amount of information concerning the data subject”.

The time period for compliance with a DSAR does not begin until the organisation is satisfied the requester is who they say they are; i.e., any necessary proof of identity has been received.

The search for personal data in response to a DSAR would only need to be “reasonable and proportionate”.

Making these points crystal clear in law would create certainty for organisations, who currently rely on guidance from the Information Commissioner’s Office. Many organisations may be disappointed the concept of ‘vexatious’ requests has not been revived from the abandoned DPDI bill.

3. Privacy notices & the right to be informed

The DUA Bill proposes the obligation to provide privacy information to individuals under Articles 13 and 14 (e.g. via a privacy notice) will not apply if providing this information ‘is impossible or would involve disproportionate effort’. This move could be viewed as an attempt to water down requirements to notify individuals of the processing taking place. This was a particular point of contention in the Experian v ICO case: in relation to its processing of the Edited Electoral Roll, Experian argued it would be disproportionate effort to notify and provide privacy information to millions of people.

4. Recognised legitimate interests

The concept of ‘recognised legitimate interests’ is revived from the DPDI Bill. It’s proposed organisations would be exempt from conducting a full Legitimate Interests Assessment (LIA) for certain specified purposes, such as national security, emergency response and safeguarding. The DUA Bill also looks to confirm legitimate interests as an acceptable lawful basis where necessary for direct marketing purposes. Clearly, legitimate interests will only be an option where the law doesn’t require consent, for example under the Privacy and Electronic Communications Regulations (PECR).

5. Automated decision-making

Noteworthy changes are proposed aimed at making it easier for organisations to use automated decision-making more widely, for example using artificial intelligence (AI) systems. Currently, Article 22 of UK GDPR places strict restrictions on automated decision-making (including profiling) which could produce legal or similarly significant effects. The new Bill seeks to reduce the scope of Article 22 to only cover automated decisions made using special category data. There is likely to be concern this will have a negative impact on people’s rights in relation to automated decisions made about them using other personal data. It may also put the UK out of kilter with the EU.

6. Data protection complaints procedure

It’s proposed organisations would be obliged to have clear procedures so people can raise complaints in connection with the use of their personal data. For example, organisations would need to:

Facilitate people’s ability to make complaints (for instance by providing a complaint form).

Respond to complaints within 30 days of receipt.

Notify the Information Commissioner of the number of complaints received in specified periods.

PRIVACY & ELECTRONIC COMMUNICATIONS REGULATIONS (PECR)

The following changes to PECR are proposed:

1. PECR Fines

Significantly increasing potential fines for infringements of PECR to bring them in line with the level of fines under UK GDPR. Currently, the maximum fine under PECR is capped at £500k.

2. Analytics cookies

Permitting the use of first-party cookies and similar technologies for website analytics without a requirement to collect consent. Also included is a provision to allow for the introduction of other circumstances in which cookie consent would not be required.

3. Spam emails and texts

Expanding what constitutes ‘spam’ to include emails and text messages which are sent but not received by anyone. This will mean the ICO can consider much larger volumes in any enforcement action. In conjunction with higher fines – SPAMMERS BEWARE!

THE INFORMATION COMMISSION

The plan is for the Information Commissioner’s Office to be replaced by an Information Commission. This would be structured in a similar way to the Financial Conduct Authority and the Competition and Markets Authority, with an appointed Chief Executive. There’s also provision for the Government to have considerable influence over the operations of the new Commission. For example, this could include determining the number of Commission members and a requirement for the Commission to consult the Government on the appointment of a Chief Executive.

SMART DATA SCHEMES

The Government announcement states: ‘the Bill will create the right conditions to support the future of open banking and the growth of new smart data schemes, models which allow consumers and businesses who want to safely share information about them with regulated and authorised third parties, to generate personalised market comparisons and financial advice to cut costs.’

The Right to Portability under UK GDPR currently allows individuals to obtain and reuse their personal data. DUA aims to expand this to allow consumers to request their data is directly shared with authorised and regulated third parties. The hope is this will allow for the growth of smart data schemes to enable data sharing in areas such as energy, telecoms, mortgages, and insurance. It’s proposed this would be underpinned by a framework with data security at its core.

HEALTHCARE INFORMATION

Ever been to hospital and found your GP has no record of your treatment, or the hospital can’t access your GP’s notes? The government is hoping proposals in the Bill will support plans for a more uniform approach to information standards and technology infrastructure, so systems can ‘talk’ to each other. For example, allowing hospitals, GP surgeries, social care services, and ambulance services to have real-time access to information such as patient appointments, tests, and pre-existing conditions.

SCIENTIFIC RESEARCH

There are proposed changes to scientific research provisions, including clarifying the definition of scientific research and amending consent for scientific research. This is in part driven by a desire to make it easier for personal data collected for specific research to be reused for other scientific research purposes.

DIGITAL VERIFICATION SERVICES

There’s an aim to create a framework for trusted identity verification services, moving the country away from paper-based and in-person tasks. For example, proposals allow for digital verification services aimed at simplifying processes such as registering births and deaths, starting a new job, and renting a home.

In summary

The DUA Bill revives some old ideas and introduces some new ones, some proposals more controversial than others. But unlike the DPDI, it does not present any significant softening of data protection compliance obligations under UK GDPR. All proposals will be scrutinised and could be amended before the Bill is enacted. However, unlike the previous Tory bill, this Bill is highly likely to become law.

In all of this, the Government will have a close eye on EU-UK adequacy. The European Commission’s adequacy decision for the UK is up for review in 2025 and there’s a recognition losing adequacy status would have a significantly negative impact on organisations which share data between the UK and EU. It will be hoped dropping controversial plans to dilute accountability requirements under UK GDPR will mean the European Commission will find the DUA Bill more palatable and less contentious.

The Bill as introduced can be found here. For quick reference these are the key parts of DUA:

Part 1: Access to customer data and business data
Part 2: Digital Verification Services
Part 3: National underground asset register
Part 4: Registers of births and deaths
Part 5: Data Protection and Privacy
Chapter 1: Data Protection
Chapter 2: Privacy and Electronic Communications Regulations (PECR)
Part 6: The Information Commission
Part 7: Other Provisions
Part 8: Final Provisions

Why data protection matters

October 2024

How to make data protection engaging for others

I remember, many years ago, an exercise at school. The idea was to build confidence in public speaking. The teacher would give us a mundane object and say, ‘right, tomorrow you’re giving the class a two-minute talk on the biro, or the board rubber (I’m that old), or the wastepaper bin’. The surprising thing was how many people were genuinely good at it. One classmate had us laughing at the history of chalk, on the face of it not a particularly exciting topic. It hinged on delivery, yes, but also on explaining why an everyday object was remarkable in and of itself.

It’s entirely possible to do exactly the same thing with data protection. Two things though: (1) data protection is usually more important than chalk, and (more controversially) (2) data protection is more interesting than chalk!

So, if you’re a Data Protection Officer, or someone in your organisation given responsibility for data protection compliance, fear not. If you feel like you’re struggling to get people to take an interest or if you’re concerned they aren’t taking data protection seriously, you won’t be alone. The buzz around GDPR has fizzled out in the six long years since it was implemented.

It can be difficult to get traction, but the risks remain. The secret is to explain why it’s important, why it can be straightforward and (crucially) how data protection is a process to be worked with, not a straitjacket.

DPOs and privacy teams can’t do this on their own. As Claire Robson, Governance Director at the Chartered Insurance Institute, says, your people play a crucial role:

Data protection is all about us, as individuals. Therefore, it matters because our colleagues, customers, members, and stakeholders matter. We are in a position of trust, therefore we need to be trusted and to trust others, and if we don’t look after the personal information given to us in good faith, use it appropriately and keep it as safe and secure as possible, people could be subjected to harm. The best way to get others in the business engaged is to help them understand their rights as individuals, and the importance of their role as custodians of personal information. Ask them to put their “customer” (interchange this to suit your business!) hat on and think about it from the end user’s perspective. Most importantly, offer your support, understanding, and expertise to help them navigate through the maze of legislation and regulation, to find an end that supports the organisation to meet its purpose respectfully.

Matt Kay, Group DPO & Head of Privacy at Shawbrook Bank Ltd, stresses the need to make data protection relevant to people’s day-to-day work:

With consumers becoming increasingly ‘tech-savvy’ and following several recent high-profile cyber-attacks and data losses, individuals are now acutely aware of the impact which mismanagement of personal data can have on their lives. Given the challenges posed and the increased regulatory scrutiny following the introduction of the GDPR, organisations must place a keen focus on compliance with applicable data protection laws. A key component of this is taking a pragmatic approach to risk management through understanding the needs of the business, the risks posed and how these impact on the rights and freedoms of individuals. Alongside this, it’s also essential to explain the requirements in language that your colleagues understand – make it simple, straightforward and applicable to their work.

So how can we breathe new life into our data protection programme? It can help to step back and remind people why we have data protection legislation in the first place.

Why data protection laws exist

GDPR has faced plenty of criticism for being a box-ticking exercise, but in reality much of the legislation is about taking a proportionate approach and is based on sound principles. Principles which not only provide necessary protection and security, but also make good business sense. These principles are often based on past transgressions and mistakes.

Here’s where the point I started my article with comes in, because the reasons we have data protection are genuinely interesting (as is the biro – Google it!). We all have a fundamental right to privacy – our customers, students, patients, employees, job applicants and so on. The ‘right to be let alone’ was written about as far back as 1890 by two US lawyers.

A key point came just after World War Two, with the Universal Declaration of Human Rights including the right to privacy as its Article 12. It’s not hard to envisage why this was considered important in the 1940s. This is also where the concept of special category data stems from. People had been persecuted for their religion, their ethnicity, their sexual orientation and more. These characteristics needed, and indeed still need, protecting.

Then came the development of rules, principles and country specific laws aimed at protecting people’s personal information and awarding people privacy rights. As technology advanced (personal computers, email, the internet, mobile phones…), new laws and regulations were introduced to protect us against new threats. Fast forward to 2018, and GDPR was seen as a game changer – not only cementing people’s fundamental privacy rights, but also making organisations more accountable for how they handle the personal data entrusted to them.

It can help if employees see this through the prism of their own personal experience. We all have privacy rights and share data about ourselves with multiple organisations, often in return for products or services. How do we expect others to look after our personal information, and the personal details of our children, our parents, our grandparents? Shouldn’t we apply the same standards to the personal data our organisation holds about others?

Let’s look at some core requirements under data protection legislation, and how we can ‘sell’ their importance.

Why data protection risk assessments are important

Yes, a Data Protection Impact Assessment (DPIA) will be mandatory for high-risk processing, and yes, they can take time to complete. But used well, DPIAs are a really useful risk management tool. Started early, they’ll alert teams to potential risks before they materialise, preventing unnecessary issues further down the line. DPIAs protect customers, employees and anyone else whose data is being handled, as well as protecting the organisation itself.

Why a Record of Processing Activities is not a box-ticking exercise

Yes, many organisations will need a Record of Processing Activities. Yes, there are a lot of fields to complete. BUT without a record of what data you hold, what it’s used for, what systems it sits on and so on, it can be difficult, from the outset, to meet your legal obligations. How can you protect data you don’t know you have, or don’t know where it’s located? Also, an up-to-date RoPA has the following benefits:

Data breaches – a RoPA helps you to quickly locate the source, the systems, the data affected etc.
Retention – a RoPA helps you to clearly flag data which is no longer needed and can be deleted.
Privacy notices – if you don’t have a clear record of your purposes for processing, your lawful basis and the suppliers you use, your privacy notice is unlikely to provide a true reflection of what you do.
Privacy rights – a RoPA helps you to identify necessary search criteria for Data Subject Access Requests (DSARs) and helps you locate data for erasure requests.

Why the right of access (aka DSAR) should be respected

Data Subject Access Requests can be time-consuming and sometimes downright tricky to fulfil. But let’s not forget this right empowers all of us to ask organisations what personal data they hold about us, and why. It gives us a level of control over our personal data. Where would society be without the power to exercise our legal privacy rights? While your staff may be handling requests, one day they might have a genuine wish to exercise this right themselves.

From a more straightforward point of view, DSARs also serve to remind us of the importance of good customer service. Happy customers seldom submit requests for a copy of their personal data!

Why data retention is important

Under GDPR there’s a legal requirement not to keep personal data for longer than required. Yes, this means having a retention schedule which is actually implemented in practice (tricky, I know). There are also other solid benefits in meeting this core principle. Remind people of the risks of over-retention, or indeed of not keeping personal data long enough:

The impact of a personal data breach could be significantly worse if personal data has been held on to for too long, affecting more individuals, potentially leading to more severe enforcement action and raising the prospect of increased complaints (plus more DSARs and erasure requests!)
Certain personal data may need to be kept to meet contractual or commercial terms. The associated risks in not keeping this data include difficulty responding to complaints or litigation from customers, or regulatory enforcement.

Why privacy notices are important

We recognise the privacy notice is the Siberia of your website – uninviting, cold and seldom visited. But essentially it is your shop window. Done well, a privacy notice clearly demonstrates your commitment to taking data protection seriously, and may be an indicator of how you act internally. Those who do take a peek may discover it’s not fit for purpose – and a suspicion of that is probably why they strapped on their snowshoes in the first place! This could be someone preparing to lodge a complaint, or another business running due diligence. Your privacy notice is also likely to be one of the first places a regulator looks if you come under scrutiny. Details matter.

Why robust supplier management is important

Supply-chain breaches are becoming common. Too common. It can be helpful to remind ourselves why it’s important to make sure contractual terms with our processors are robust. This helps protect all parties up and down the supply chain.

When people give you their personal details, they are entrusting you to look after them appropriately. When you allow another company to access this data in order to provide you with a service, you’re exposing those individuals to risk. GDPR requires organisations to put an agreement in place which protects the individuals whose data is ‘transferred’, in the event your supplier suffers a data breach or otherwise violates the GDPR.

Think about an external payroll provider – all employees will want their data to be protected and for there to be legal recourse should something go wrong. Ultimately the law is in place to enshrine and fully protect the rights of individuals in all situations.

Making data protection relevant

Gerald Coppin, Deputy DPO at Springer Nature London, says it’s important to make your people aware of the real-world implications should matters go wrong:

To engage others in the business, those in data protection roles can start by highlighting the real-world implications of data breaches. Sharing case studies and statistics about breaches that led to significant financial and reputational damage can serve as a wake-up call. By illustrating the potential consequences of negligence, data protection professionals can make the issue relatable and urgent. This approach helps colleagues see that data protection isn’t just a box to check, but an integral part of their daily responsibilities.

Gerald also suggests bringing data protection alive through games or competitions:

Incorporating gamification into training programs can also pique interest. By turning learning about data protection into a game or competition, organizations can foster a more engaging atmosphere. This approach not only makes the learning process enjoyable but also reinforces the importance of attention to data privacy in a memorable way. Recognizing and rewarding employees for their commitment to data protection can further encourage ongoing participation.

Policies, training and awareness

Data protection training plays an important part in getting core messages across, as long as the training content itself is engaging and fit for purpose. Policies and procedures play an important role as long as you make sure they’re easy to read and at hand to reference. For me, though, the key is raising awareness on an ongoing basis. This needn’t be too time consuming, but sharing internal near-misses and external cases which will resonate with your people is more likely to foster engagement and keep data protection top of mind. Share reminders in different formats, via the intranet or email newsletter. Experiment!

Ultimately, as Robert Bond, Senior Counsel at Privacy Partnership Law, says, we are all legally obliged to take this seriously:

Whether you are a UK business or a multinational, compliance with data protection law is essential, if not mandatory. Having an appropriate compliance programme demonstrates accountability and coupled with training helps to minimise loss of control of personal data. Remember that if data is the new oil of the internet, please don’t have a gusher.

Right, where’s that wastepaper bin? I’m doing a quick chat on the subject. Did you know bin collections were first suggested to English local councils in 1875?

Five top causes of data breaches

October 2024

And how to mitigate the risks

Data breaches are like booby traps in the movies; some are like the huge stone ball that chases Indiana Jones down a tunnel, others are sneaky, like the poisoned darts Indy dodges (before he gets chased by the big stone ball!). Nonetheless, like booby traps in Hollywood movies, there are common themes when it comes to data breaches. None of them, to my knowledge, involve being chased by a giant stone ball. And, unlike Indiana Jones, you don’t have to rely on supernatural luck and a sympathetic screenwriter to prevent these breaches occurring.

Back to the real world. While the threat of cyber-attacks continues to loom large, here’s an interesting fact: 75% of breaches reported to the Information Commissioner’s Office (ICO) are non-cyber related – caused by ‘human error’. Or, to put it another way, they’re often attributable to a lack of training and of robust procedures to prevent someone making a mistake.

We’ve delved into ICO reporting figures, and put together a top five of the most common causes of data breaches, together with some top tips on how to mitigate the risk of these occurring in your organisation.

Our data breach countdown…

Number 5: Ransomware

Ransomware is malicious software used by bad actors to encrypt an organisation’s system folders or files. Sometimes the data may be exfiltrated (exported) too. A ransom demand often follows, asking for payment. The attacker will say this can be paid in exchange for the decryption key and an assurance the data they claim to hold will be deleted; in other words, that it will not be published on the dark web or shared with others. But there are no guarantees, even if you choose to pay the ransom. It’s worth noting the ICO and National Cyber Security Centre discourage paying ransoms.

Ransomware attacks can cause a personal data breach, but this may be only one of a number of risks to the business, such as financial, legal, commercial and reputational. These attacks are becoming increasingly sophisticated. It’s now possible for a bad actor to buy an ‘off the shelf’ cyber-attack via the dark web, or tailor a package to suit their needs.

How to mitigate ransomware risks

Appropriate steps need to be taken to protect systems from these types of attacks. Often this will mean investing more time and money into security measures. Here are just some of the ways to try and prevent attacks:

Implementing Multifactor Authentication (MFA)
Installing antivirus software and firewalls
Use of complex passwords
Keeping all systems and software updated
Running regular cyber security and penetration testing
Monitoring logs to identify threats
Cyber awareness training

Also, crucially, making sure you have up-to-date, separate backups is the most effective way to recover quickly from a ransomware attack.

Number 4: Postal errors

This is a simple administrative error, which can have minor or significant consequences. An item containing personal data is posted to the wrong person. This could be an invoice sent to the incorrect person, exam results put in the wrong envelope or medical information sent to the wrong patient. Breaches of this nature can happen by:

using incorrect addresses
using old addresses
mistakenly including more than one letter in the same envelope
mistakenly attaching documents relating to another person to a letter

How to mitigate post breach risks

Robust training and regular reminders!
Using a checklist, e.g. Step 1) Check the address is correct when drafting a letter. Step 2) Check again after printing. Step 3) Check again before it goes in the envelope.

Number 3: Unauthorised access

As the name suggests, this is someone gaining access to personal information they shouldn’t have access to. This can be an external or internal threat. To give some examples:

Exploiting software vulnerabilities: Attackers can exploit software vulnerabilities to gain unauthorised access to applications, networks, and operating systems.
Password guessing: Cybercriminals can use special software to automate the guessing process, targeting details such as usernames, passwords and PINs.
Internal threats: Unauthorised access and use of personal data by employees or ex-employees.

Here are some real-life cases:

2022 – a former staff advisor for an NHS Foundation was found guilty of accessing patient records without a valid reason.
2023 – a former 111 call centre advisor was found guilty and fined for illegally accessing the medical records of a child and his family.
2024 – a former management trainee at a car rental company was found guilty and fined for illegally obtaining customer records. Accessing this data fell outside his role at the time.

How to mitigate unauthorised access risks

Here are just some of the ways of reducing your vulnerability to these types of breaches:

Applying the ‘principle of least privilege’ – this sets a rule that employees should have only the minimum access rights needed to perform their roles.
Strong password management e.g. make sure systems insist on complex passwords and prevent users sharing their access credentials.
Monitoring user activity

Number 2: Phishing attacks

Phishing is when attackers send scam emails or text messages containing links to malicious websites. Often they try to trick users into revealing sensitive information (such as login credentials) or transferring money.

Any size of organisation is a potential target for phishing attacks. A mass campaign could indiscriminately target thousands of inboxes, or an attack could specifically target your company or an individual employee.

Attacks are becoming increasingly sophisticated, and scam messages are made to look very realistic. Sometimes they will know who you do business with, and change just one letter in an email address, so you think it’s from an organisation you know.

Mitigating phishing attack risks

Here are a few ways you can reduce the risk of falling victim to a phishing attack.

Training and awareness to help employees identify spoof emails and texts
Setting up DMARC (Domain-based Message Authentication, Reporting and Conformance) to prevent bad actors spoofing your email domain
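
For illustration, a DMARC policy is simply a DNS TXT record published on your sending domain. A minimal sketch, where the domain, policy level and reporting address are all placeholders rather than recommendations:

    _dmarc.example.com.  IN  TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"

Here p= tells receiving mail servers what to do with messages that fail authentication checks (none, quarantine or reject), and rua= is where aggregate reports about your domain’s mail are sent.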

Also see NCSC phishing guidance

Number One: Email Errors

Yup, the top cause of data breaches is still email: emails sent to the wrong recipient(s), or accidentally using CC for multiple recipients (thereby revealing their details to all recipients). A breach of this nature can be embarrassing and/or have serious consequences. To give an example:

The Central YMCA sent emails to individuals participating in a programme for people living with HIV. The CC field was used by accident, thereby revealing the email addresses to all recipients. People on the list could be identified or potentially identified from their email addresses and it could be inferred they were likely to be living with HIV.

Mitigating email breach risks

Here are some of the ways you can try and prevent email errors occurring:

Don’t broadcast to multiple people using BCC (it is too easy to make a mistake). Instead, use alternative, more secure bulk email solutions.
Set rules to provide alerts to warn employees when they use the CC field.
Turn off the auto-complete function to prevent the system suggesting recipients’ email addresses.
Set a delay, to allow time for errors to be corrected before the email is sent.
Make sure staff are trained in security measures for sending bulk communications.
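
To illustrate the ‘more secure bulk email solutions’ point, here’s a minimal Python sketch of one approach: sending each person their own individual message, so no recipient ever sees another’s address. The SMTP host and email addresses are placeholders, and at any real scale a dedicated mailing platform is usually the better choice.

    import smtplib
    from email.message import EmailMessage

    # Placeholder values – swap in your own server and addresses
    recipients = ["alice@example.com", "bob@example.com"]

    with smtplib.SMTP("smtp.example.com") as server:
        for address in recipients:
            msg = EmailMessage()
            msg["From"] = "updates@example.com"
            msg["To"] = address  # one recipient per message; no CC/BCC field to get wrong
            msg["Subject"] = "Service update"
            msg.set_content("Hello, here is our latest update.")
            server.send_message(msg)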

One of the biggest weapons in the data protection arsenal is training and awareness. We recently worked with a client who was using an excellent cyber-security training module, which staff had to complete not once, but twice a year. However, training on its own is unlikely to be enough. Regular reminders and updates are needed too. Near-misses and high-profile cases in the media can be used to get the message through.

Here’s a real-life example of a genuine disaster, one I would definitely share. You can just imagine how this happened. The Police Service of Northern Ireland (PSNI) experienced a horrendous, life-changing data breach entirely of its own making. Hidden fields in a spreadsheet disclosed in response to a Freedom of Information request revealed the personal details of its entire workforce, including their job descriptions and places of work. It was assumed the list subsequently fell into the hands of paramilitary organisations, leading to an enormously disruptive and expensive personal security review. ICO PSNI fine

The PSNI case also illustrates how some of the worst data protection hazards are those we set for ourselves. Not a big stone ball or poison darts. Simply a human error on a spreadsheet, an error which adequate in-house procedures would have prevented or identified.

How many such hazards are spread across your organisation?

ICO fine for Police Service of Northern Ireland

October 2024

What went wrong and what can we learn from this data breach?

You may recall the awful data breach last summer by the Police Service of Northern Ireland (PSNI). The personal details of its entire workforce (9,483 officers and staff) were accidentally exposed in response to a Freedom of Information request. The dreadful mistake left many fearing for their safety with an assumption the information shared got into the hands of dissident republicans.

This was a simple mistake involving a spreadsheet, which ALL organisations should take heed of.

The ICO has announced a £750,000 fine and says simple-to-implement procedures could have prevented this serious breach. Had the ICO not applied its discretionary approach for the public sector, the fine would have been £5.6 million. In assessing the level of the fine, the current financial position of the PSNI and a desire not to divert public money from where it’s needed were taken into account. A commercial organisation would have faced a much heftier financial penalty.

What went wrong?

The PSNI received two Freedom of Information requests in August 2023 from the same person. These came via WhatDoTheyKnow (WDTK), a platform which helps people submit requests and publishes responses. The requests were for information about the number of officers at each rank and the number of staff at each grade, plus some other details.

This information was downloaded from the PSNI’s HR system in the form of an Excel file and included personal data relating to all employees. During the analysis, multiple other worksheets were created within the same file. Once the analysis was complete, all visible worksheets were deleted.

But when the file was subsequently uploaded to the WDTK website, it emerged a hidden worksheet containing personal details remained. This had gone unnoticed, despite quality assurance. More detail is available in the ICO Penalty Notice.

In this case, the distress and harm caused by the data breach were evident. The ICO has published some of the comments from police officers affected, including: “How has this impacted on me? I don’t sleep at night. I continually get up through the night when I hear a noise outside to check that everything is ok. I have spent over £1000 installing modern CCTV and lighting around my home, because of the exposure.”

In announcing the penalty, UK Information Commissioner John Edwards said: “I cannot think of a clearer example to prove how critical it is to keep personal information safe… Let this be a lesson learned for all organisations. Check, challenge and change your disclosure procedures to ensure you protect people’s personal information.”

What lessons can we learn?

While this is a particularly serious case, the ICO says mistakes when disclosing information via spreadsheets are nothing new. Public authorities in particular are being urged to put robust measures in place to keep personal information safe and reduce the risk of human error. The regulator has published a useful checklist for any disclosures made using Excel:

Delete hidden columns, rows and worksheets that are not pertinent to the request
Remove any linked data from pivot tables, charts and formula which are not part of the request
Remove all personal data and special category data which is not necessary to provide to fulfil the request
Remove any meta data
Make sure the file size is as you’d expect for the volume of data being disclosed
Convert files to CSV
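
To make the hidden-content checks on this list concrete, here’s a minimal Python sketch using the openpyxl library (an assumed tool choice, with a placeholder file name) which flags hidden worksheets, columns and rows before a file is released:

    from openpyxl import load_workbook

    def flag_hidden_content(path):
        # Load the workbook and report anything not visible to a casual reviewer
        wb = load_workbook(path)
        for ws in wb.worksheets:
            # sheet_state is 'visible', 'hidden' or 'veryHidden'
            if ws.sheet_state != "visible":
                print(f"Hidden worksheet: {ws.title} ({ws.sheet_state})")
            for letter, dim in ws.column_dimensions.items():
                if dim.hidden:
                    print(f"{ws.title}: hidden column {letter}")
            for number, dim in ws.row_dimensions.items():
                if dim.hidden:
                    print(f"{ws.title}: hidden row {number}")

    flag_hidden_content("foi_response.xlsx")  # placeholder file name

A check like this supplements, rather than replaces, the final step of converting to CSV, which strips out hidden sheets and metadata entirely because CSV can only hold a single sheet of plain values.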

More information is available in an ICO Advisory Note

Crucially, organisations need to make sure all staff involved in the disclosure process have been given appropriate training. It’s too easy to point the finger at individuals for making mistakes, when it’s often a lack of robust procedures, training and final ‘pre-send’ checks which are ultimately to blame.