Using AI tools for recruitment

November 2024

How to comply with GDPR

AI tools offer dynamic, efficient solutions for streamlining recruitment processes. AI is capable of speedily identifying and sourcing potential candidates, summarising their CVs and scoring their suitability for the role.

What’s not to like?

Nonetheless, these processes must be fair and lawful. Is there a potential for bias and/or inaccurate outputs? How else will AI providers use jobseekers’ personal details? What data protection compliance considerations are baked into the AI’s architecture?

The Information Commissioner’s Office (ICO) is calling on AI providers and recruiters to do more to make sure AI tools don’t adversely impact applicants. People could be unfairly excluded from potential jobs and/or have their privacy compromised. Why undo the good work HR professionals undertake to satisfy legal and best-practice requirements by using questionable technology?

The ICO recently ran a consensual audit of several developers and providers of AI recruitment tools. The findings included:

Excessive personal data being collected
Data being used for incompatible purposes
A lack of transparency for jobseekers about how AI uses their details

The AI Tools in Recruitment Audit Report provides several hundred recommendations. The unambiguous message is that using AI in recruitment processes shouldn’t be taken lightly. Of course, this doesn’t mean recruiters shouldn’t embrace new technologies, but it does mean sensible checks and balances are required. Here’s a summary of key ICO recommendations, with some additional information and thoughts.

10 key steps for recruiters looking to engage AI providers

1. Data Protection Impact Assessment (DPIA)

DPIAs are mandatory under GDPR where a type of processing is likely to result in high risk. The ICO says ‘processing involving the use of innovative technologies, or the novel application of existing technologies (including AI)’ is an example of processing they would consider likely to result in a high risk.

Using AI tools for recruitment purposes squarely meets these criteria. A DPIA will help you to better understand, address and mitigate any potential privacy risks or harms to people. It should help you to ask the right questions of the AI provider. It’s likely your DPIA will need to be agile; revisited and updated as the processing and its potential impacts evolve.

ICO DPIA recommendations for recruiters:

Complete a DPIA before commencing processing that is likely to result in a high risk to people’s rights and freedoms, such as procuring an AI recruitment tool or other innovative technology.
Ensure DPIAs are comprehensive and detailed, including:
– the scope and purpose of the processing;
– a clear explanation of relationships and data flows between each party;
– how processing will comply with UK GDPR principles; and
– consideration of alternative approaches.
Assess the risks to people’s rights and freedoms clearly in a DPIA, and identify and implement measures to mitigate each risk.
Follow a clear DPIA process that supports the recommendations above.

2. Lawful basis for processing

When recruiting, organisations need to identify a lawful basis for this processing activity. You need to choose the most appropriate of the six lawful bases, such as consent or legitimate interests.

To rely on legitimate interests you will need to:
1. Identify a legitimate interest
2. Assess the necessity
3. Balance your organisation’s interests with the interests, rights and freedoms of individuals.

This is known as the ‘3-stage test’. We’d highly recommend you conduct and document a Legitimate Interests Assessment. Our recently updated Legitimate Interests Guidance includes an LIA template (in Excel). Your DPIA can be referenced in this assessment.
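The 3-stage test lends itself to being captured as a structured record. Below is a minimal, hypothetical Python sketch of how an LIA’s three stages might be logged alongside mitigations; the class and field names are illustrative assumptions, not the layout of the DPN template.

```python
# Illustrative sketch only: capturing the 3-stage legitimate interests
# test as a structured record. Field names are assumptions.
from dataclasses import dataclass, field

@dataclass
class LegitimateInterestsAssessment:
    purpose: str        # stage 1: identify a legitimate interest
    necessity: str      # stage 2: assess the necessity of the processing
    balancing: str      # stage 3: balance against individuals' rights
    mitigations: list = field(default_factory=list)

    def summary(self) -> str:
        """A one-line summary suitable for a compliance register."""
        return (f"Interest: {self.purpose} | Necessity: {self.necessity} | "
                f"Balance: {self.balancing} | Mitigations: {len(self.mitigations)}")

lia = LegitimateInterestsAssessment(
    purpose="Shortlisting candidates efficiently",
    necessity="Manual review of all CVs is impractical at volume",
    balancing="Low risk given human review of all AI outputs",
    mitigations=["DPIA completed", "bias-testing evidence obtained"],
)
```

Keeping the record structured (rather than free text) makes it easier to revisit and update the assessment as the processing evolves, as the DPIA section above recommends.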

3. Special category data condition

If you will be processing special category data, such as health information or Diversity, Equity and Inclusion data (DE&I), alongside a lawful basis you’ll need to meet a specific special category condition (i.e. an Article 9 condition under UK GDPR).

It’s worth noting some AI providers may infer people’s characteristics from candidate profiles rather than collecting them directly. This can include predicting gender and ethnicity. This type of information, even if inferred, will be special category data. It also raises questions about ‘invisible’ processing (i.e. processing the individual is not aware of) and a lack of transparency. The ICO recommends not using inferred information in this way.

4. Controller, processor or joint controller

Both recruiters and AI providers have a responsibility for data protection compliance. It should be clear who is the controller or processor of the personal information. Is the AI provider a controller, joint-controller or processor? The ICO recommends this relationship is carefully scrutinised and clearly recorded in a contract with the AI provider.

If the provider is acting as a processor, the ICO says ‘explicit and comprehensive instructions must be provided for them to follow’. The regulator says this should include establishing how you’ll make sure the provider is complying with these instructions. As a controller your organisation should be able to direct the means and purpose of the processing and tailor it to your requirements. If not, the AI provider is likely to be a controller or joint-controller.

5. Data minimisation

One of the core data protection principles is data minimisation. We should only collect and use personal information which is necessary for our purpose(s). The ICO’s audit found some AI tools collected far more personal information than necessary and retained it indefinitely to build large databases of potential candidates without their knowledge. What might make perfect sense to AI or the programmers creating such technology might not be compliant with data protection law!

Recruiters need to make sure the AI tools they use only collect the minimum amount of personal information required to achieve their purpose(s). (A purpose or purposes which should be clearly defined in your DPIA and, where relevant, your LIA.)

There is also an obligation to make sure the personal details candidates are providing are not used for other incompatible purposes. Remember, if the AI provider is retaining data and using this information for its own purposes, it will not be a processor.

6. Information security and integrity

As part of the procurement process, recruiters need to undertake meaningful due diligence. This means asking the AI provider for evidence that appropriate technical and organisational controls are in place. These technical and organisational controls should also be documented in the contract. The ICO recommends regular compliance checks are undertaken while the contract is in place, to make sure effective controls remain in place.

7. Fairness and mitigating bias risks

Recruiters need to be confident the outputs from AI tools are accurate, fair and unbiased. The ICO’s audit of AI recruitment providers found evidence tools were not processing personal information fairly. For example, in some cases they allowed recruiters to filter out candidates with protected characteristics. (Protected characteristics include: age, disability, race, ethnic or national origin, religion or belief, sex and sexual orientation.) This should be a red flag.

You should seek clear assurances from the AI provider that they have mitigated bias, asking to see any relevant documentation. The ICO has published guidance on this: How do we ensure fairness in AI?
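One practical way to sanity-check outputs is to compare selection rates across groups. The sketch below illustrates a common ‘four-fifths’ style disparity check; the ICO guidance doesn’t prescribe this specific test, and the groups, data and 0.8 threshold here are illustrative assumptions only.

```python
# Minimal sketch of a selection-rate ("four-fifths") disparity check.
# Groups, outcomes and the 0.8 threshold are illustrative assumptions,
# not part of the ICO's recommendations.

def selection_rates(outcomes):
    """outcomes: dict mapping group -> list of booleans (selected or not)."""
    return {g: sum(v) / len(v) for g, v in outcomes.items()}

def disparate_impact_flags(outcomes, threshold=0.8):
    """Flag any group whose selection rate falls below `threshold`
    times the highest group's selection rate."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: r / best < threshold for g, r in rates.items()}

outcomes = {
    "group_a": [True, True, False, True],    # 75% selected
    "group_b": [True, False, False, False],  # 25% selected
}
flags = disparate_impact_flags(outcomes)
# group_b's rate (0.25) is below 0.8 x 0.75 = 0.6, so it is flagged
```

A flag here doesn’t prove bias on its own, but it’s the kind of evidence you could reasonably ask an AI provider to produce as part of their bias-mitigation documentation.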

8. Transparency

Are candidates aware an AI tool will be used to process their personal details? Clear privacy information needs to be provided to jobseekers which explains how and why the AI tool is being used. The ICO says this should extend to explaining the ‘logic involved in making predictions or producing outputs which may affect people’. Candidates should also be told how they can challenge any automated decisions made by the tool.

The regulator recommends producing a privacy notice specifically for candidates on your AI platform which covers relevant UK GDPR requirements.

9. Human involvement in decision-making

There are strict rules under GDPR for automated decision-making (including profiling). Automated decision-making is the process of making a decision by automated means without any human involvement. A recruitment process wouldn’t be considered solely automated if someone (i.e. a human in the recruitment team) weighs up and interprets the result of an automated decision before applying it to the individual.

There needs to be meaningful human involvement in the process to prevent solely automated decisions being made about candidates. The ICO recommendations for recruiters include:

Ensure that recruiting managers do not use AI outputs (particularly ‘fit’ or suitability scores) to make automated recruitment decisions, where AI tools are not designed for this purpose.
Offer a simple way for candidates to object to or challenge automated decisions, where AI tools make automated decisions.

10. Data Retention

Another core data protection principle is ‘storage limitation’. This means not keeping personal data for longer than necessary for the purpose(s) it was collected for. It’s important to assess how long the data inputted and generated from AI tools will be kept for. Information about retention periods should be provided in relevant privacy information provided to job applicants (e.g. in an Applicant Privacy Notice provided on your AI platform).

The ICO says data retention periods should be detailed in contracts, including how long each category of personal information is kept and why, plus what action the AI provider must take at the end of the retention period.
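In practice, a documented retention schedule can be enforced programmatically. The following is a hedged illustration of how a schedule of categories and periods might drive a deletion check; the categories and six-month periods are made-up examples, not recommended retention periods.

```python
# Illustrative sketch only: a retention schedule mapping data categories
# to retention periods, and a helper flagging records due for deletion.
# The categories and periods are made-up examples, not legal advice.
from datetime import date, timedelta

RETENTION_SCHEDULE = {
    "unsuccessful_candidate_cv": timedelta(days=180),  # e.g. six months
    "interview_notes": timedelta(days=180),
}

def deletion_due(category, collected_on, today=None):
    """Return True once a record has exceeded its retention period."""
    today = today or date.today()
    return today > collected_on + RETENTION_SCHEDULE[category]

# A CV collected over six months ago is due for deletion:
deletion_due("unsuccessful_candidate_cv", date(2024, 1, 1),
             today=date(2024, 11, 1))  # True
```

The same schedule can feed both your contract wording (how long each category is kept and why) and the privacy information you give applicants.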

Summary

The ICO acknowledges the benefits of AI and doesn’t want to stand in the way of those seeking to use AI driven solutions. It does, however, ask recruiters to consider the technology’s compatibility with data protection law.

AI is a complex area for many and it’s easy to see how unintended misuse of personal data, or unfairness and bias in candidate selection, could ‘slip through the cracks’ in the digital pavement. HR professionals and recruiters can avoid problems later down the line by addressing these as Day One issues when considering AI.

Fairness and respect for candidate privacy are central principles of HR best practice and necessary for data protection compliance. Applying these to new technological opportunities shouldn’t come as a surprise. Including your data protection team in the planning stage can help to mitigate and possibly eliminate some risks. A win-win which would leave organisations more confident in reaping the benefits AI offers.

DPN Legitimate Interests Guidance and LIA Template (v 3.0)

Published in November 2024, this third version of our established Legitimate Interests Guidance aims to help organisations assess whether they can rely on legitimate interests for a range of processing activities, from routine tasks to more complex ones such as those involving AI. First published in 2017, this updated version includes an improved LIA template (in Excel) to use when conducting your own Legitimate Interests Assessments.

Legitimate Interests Guidance from the Data Protection Network

Many thanks to PrivacyX Consulting and Privacy Partnership Law for working with us on this latest version. We’d also like to thank the original Legitimate Interests Working Group of 2017/2018, comprising representatives from a wide range of companies and institutions, who collaborated to produce previous versions.

UK Data (Use & Access) Bill: Key Proposals

October 2024

What DPOs and data protection teams need to know 

The Government’s Data (Use & Access) Bill was introduced to Parliament and had its first reading in the House of Lords on 23 October. This is a new name for the Digital Information and Smart Data Bill, announced in the King’s Speech back in July. With the acronym DUA, this Bill revives some, but certainly not all aspects of the previous Government’s Data Protection & Digital Information Bill (DPDI) which fell by the wayside when the general election was announced.

At 262 pages it’s a lengthy document, so we’ve provided a summary of some key proposals likely to be of interest to those working in data protection-related roles. Of course, at this stage everything is subject to change as the Bill progresses through Parliament.

DATA PROTECTION (UK GDPR & DPA 2018)

1. Accountability requirements NOT changed

The previous DPDI’s controversial plans to amend accountability obligations under UK GDPR have not been carried over into DUA. There are no plans to remove the requirement for organisations which meet certain criteria to appoint a Data Protection Officer, nor are there any planned changes relating to Data Protection Impact Assessments or Records of Processing Activities.

Some organisations may be disappointed more flexibility is not planned in these areas. However, we’d stress UK GDPR is already littered with the words ‘proportionate’ and ‘appropriate’. Small-to-medium sized businesses are not currently expected to put in place measures as robust as those of larger organisations, unless the nature of their business activities and the sensitivity of the personal data they handle warrants it.

2. Data Subject Access Requests (DSARs)

In the main, the proposals in relation to the Right of Access (aka Data Subject Access Requests) aim to give a statutory footing to practices already commonly applied, such as confirming:

Organisations can ask the requester for details of the information or activities a DSAR relates to, and can pause the time period for responding to the request while seeking this information. For example, the ability to seek clarification when the organisation “processes a large amount of information concerning the data subject”.

The time period for compliance with a DSAR does not begin until the organisation is satisfied the requester is who they say they are; i.e., any necessary proof of identity has been received.

The search for personal data in response to a DSAR would only need to be “reasonable and proportionate”.

Making these points crystal clear in law would create certainty for organisations, who currently rely on guidance from the Information Commissioner’s Office. Many organisations may be disappointed the concept of ‘vexatious’ requests has not been revived from the abandoned DPDI bill.
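To illustrate how the clarified clock would work, here’s a simple sketch of the deadline arithmetic, where the response period runs from the latest of receipt, identity verification, and any requested clarification. The one-calendar-month period is simplified to 30 days here, and the function is a hypothetical illustration rather than a statutory formula.

```python
# Hedged sketch: computing a DSAR response deadline where the clock
# starts (or restarts) at identity verification or clarification, as
# the Bill proposes to confirm. One month is simplified to 30 days.
from datetime import date, timedelta

def dsar_deadline(received, id_verified=None, clarified=None):
    """The response period runs from the latest of: receipt,
    identity verification, and any requested clarification."""
    start = max(d for d in (received, id_verified, clarified) if d)
    return start + timedelta(days=30)

# ID verified a week after receipt: the deadline runs from verification.
dsar_deadline(date(2024, 11, 1), id_verified=date(2024, 11, 8))
# -> date(2024, 12, 8)
```

In a real system you would use calendar-month arithmetic and log each pause, but even this simple model shows why recording verification and clarification dates matters.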

3. Privacy notices & the right to be informed

The DUA Bill proposes the obligation to provide privacy information to individuals under Articles 13 and 14 (e.g. via a privacy notice) will not apply if providing this information ‘is impossible or would involve disproportionate effort‘. This move could be viewed as an attempt to water down requirements to notify individuals of the processing taking place. This was a particular point of contention in the Experian vs ICO case. In relation to its processing of the Edited Electoral Roll, Experian argued it would be disproportionate effort to notify and provide privacy information to millions of people.

4. Recognised legitimate interests

The concept of ‘recognised legitimate interests’ is revived from the DPDI Bill. It’s proposed organisations would be exempt from conducting a full Legitimate Interests Assessment (LIA) for certain specified purposes; such as national security, emergency response, and safeguarding. The DUA Bill also looks to confirm legitimate interests as an acceptable lawful basis where necessary for direct marketing purposes. Clearly, legitimate interests will only be an option when the law doesn’t require consent, for example under the Privacy and Electronic Communications Regulations (PECR).

5. Automated decision-making

Noteworthy changes are proposed aimed at making it easier for organisations to use automated decision-making more widely, for example using artificial intelligence (AI) systems. Currently, Article 22 of UK GDPR places strict restrictions on automated decision-making (including profiling) which could produce legal or similarly significant effects. The new Bill seeks to reduce the scope of Article 22 to only cover automated decisions made using special category data. There is likely to be concern this will have a negative impact on people’s rights in relation to automated decisions made about them using other personal data. This may also put the UK out of kilter with the EU.

6. Data protection complaints procedure

It is proposed organisations will be required to have clear procedures in place so people can raise complaints in connection with the use of their personal data. For example, organisations would need to:

Facilitate people’s ability to make complaints (for instance by providing a complaint form).

Respond to complaints within 30 days of receipt.

Notify the Information Commissioner of the number of complaints received in specified periods.

PRIVACY & ELECTRONIC COMMUNICATIONS REGULATIONS (PECR)

The following changes to PECR are proposed:

1. PECR Fines

Significantly increasing potential fines for infringements of PECR to bring them in line with the level of fines under UK GDPR. Currently, the maximum fine under PECR is capped at £500k.

2. Analytics cookies

Permitting the use of first-party cookies and similar technologies for website analytics without a requirement to collect consent. Also included is a provision to allow for the introduction of other circumstances in which cookie consent would not be required.

3. Spam emails and texts

Expanding what constitutes ‘spam’ to include emails and text messages which are sent but not received by anyone. This will mean the ICO can consider much larger volumes in any enforcement action. In conjunction with higher fines – SPAMMERS BEWARE!

THE INFORMATION COMMISSION

The plan is for the Information Commissioner’s Office to be replaced by an Information Commission. This would be structured in a similar way to the Financial Conduct Authority and the Competition and Markets Authority, with an appointed Chief Executive. There’s also provision for the Government to have considerable influence over the operations of the new Commission. For example, this could include determining the number of Commission members and a requirement for the Commission to consult with the Government on the appointment of a Chief Executive.

SMART DATA SCHEMES

The Government announcement states: ‘the Bill will create the right conditions to support the future of open banking and the growth of new smart data schemes, models which allow consumers and businesses who want to safely share information about them with regulated and authorised third parties, to generate personalised market comparisons and financial advice to cut costs.’

The Right to Portability under UK GDPR currently allows individuals to obtain and reuse their personal data. DUA aims to expand this to allow consumers to request their data is directly shared with authorised and regulated third parties. The hope is this will allow for the growth of smart data schemes to enable data sharing in areas such as energy, telecoms, mortgages, and insurance. It’s proposed this would be underpinned by a framework with data security at its core.

HEALTHCARE INFORMATION

Ever been to hospital and found your GP has no record of your treatment, or the hospital can’t access your GP’s notes? The government is hoping proposals in the Bill will support plans for a more uniform approach to information standards and technology infrastructure, so systems can ‘talk’ to each other. For example, allowing hospitals, GP surgeries, social care services, and ambulance services to have real-time access to information such as patient appointments, tests, and pre-existing conditions.

SCIENTIFIC RESEARCH

There are proposed changes to scientific research provisions, including clarifying the definition of scientific research and amending consent for scientific research. This is in part driven by a desire to make it easier for personal data collected for specific research to be reused for other scientific research purposes.

DIGITAL VERIFICATION SERVICES

There’s an aim to create a framework for trusted identity verification services, moving the country away from paper-based and in-person tasks. For example, proposals allow for digital verification services aimed at simplifying processes such as registering births and deaths, starting a new job, and renting a home.

In summary

The DUA Bill revives some old ideas and introduces some new ones. Some proposals are more controversial than others. But unlike the DPDI, it does not present any significant softening of data protection compliance obligations under UK GDPR. All proposals will be scrutinised and could be amended before the Bill is enacted. However, unlike the previous Tory bill, this Bill is highly likely to become law.

In all of this, the Government will have a close eye on EU-UK adequacy. The European Commission’s adequacy decision for the UK is up for review in 2025 and there’s a recognition losing adequacy status would have a significantly negative impact on organisations which share data between the UK and EU. It will be hoped dropping controversial plans to dilute accountability requirements under UK GDPR will mean the European Commission will find the DUA Bill more palatable and less contentious.

The Bill as introduced can be found here. For quick reference these are the key parts of DUA:

Part 1: Access to customer data and business data
Part 2: Digital verification services
Part 3: National underground asset register
Part 4: Registers of births and deaths
Part 5: Data Protection and Privacy
Chapter 1: Data Protection
Chapter 2: Privacy and Electronic Communications Regulations (PECR)
Part 6: The Information Commission
Part 7: Other Provisions
Part 8: Final Provisions

Why data protection matters

October 2024

How to make data protection engaging for others

I remember, many years ago, an exercise at school. The idea was to build confidence in public speaking. The teacher would give us a mundane object and say, ‘right, tomorrow you’re giving the class a two-minute talk on the biro, or the board rubber (I’m that old), or the wastepaper bin’. The surprising thing was how many people were genuinely good at it. One classmate had us laughing at the history of chalk, on the face of it not a particularly exciting topic. It hinged on delivery, yes, but also on explaining why an everyday object was remarkable in and of itself.

It’s entirely possible to do exactly the same thing with data protection. Two things though; (1) Data protection is usually more important than chalk, and (more controversially) (2) Data protection is more interesting than chalk!

So, if you’re a Data Protection Officer, or someone in your organisation given responsibility for data protection compliance, fear not. If you feel like you’re struggling to get people to take an interest or if you’re concerned they aren’t taking data protection seriously, you won’t be alone. The buzz around GDPR has fizzled out in the six long years since it was implemented.

It can be difficult to get traction, but the risks remain. The secret is to explain why it’s important, why it can be straightforward and (crucially) how data protection is a process to be worked with, not a straitjacket.

DPOs and privacy teams can’t do this on their own. As Claire Robson, Governance Director at the Chartered Insurance Institute, says, your people play a crucial role:

Data protection is all about us, as individuals. Therefore, it matters because our colleagues, customers, members, and stakeholders matter. We are in a position of trust, therefore we need to be trusted and to trust others, and if we don’t look after the personal information given to us in good faith, use it appropriately and keep it as safe and secure as possible, people could be subjected to harm. The best way to get others in the business engaged is to help them understand their rights as individuals, and the importance of their role as custodians of personal information. Ask them to put their “customer” (interchange this to suit your business!) hat on and think about it from the end user’s perspective. Most importantly, offer your support, understanding, and expertise to help them navigate through the maze of legislation and regulation, to find an end that supports the organisation to meet its purpose respectfully.

Matt Kay, Group DPO & Head of Privacy at Shawbrook Bank Ltd stresses the need to make data protection relevant to people’s day to day work:

With consumers becoming increasingly ‘tech-savvy’ and following several recent high-profile cyber-attacks and data losses, individuals are now acutely aware of the impact which mismanagement of personal data can have on their lives. Given the challenges posed and the increased regulatory scrutiny following the introduction of the GDPR, organisations must place a keen focus on compliance with applicable data protection laws. A key component of this is taking a pragmatic approach to risk management through understanding the needs of the business, the risks posed and how these impact on the rights and freedoms of individuals. Alongside this, it’s also essential to explain the requirements in language that your colleagues understand – make it simple, straightforward and applicable to their work.

So how can we breathe new life into our data protection programme? It can help to step back and remind people why we have data protection legislation in the first place.

Why data protection laws exist

GDPR has faced plenty of criticism for being a box-ticking exercise, but in reality, much of the legislation is about taking a proportionate approach and is based on sound principles. Principles which not only provide necessary protection and security, but also make good business sense. These principles are often based on past transgressions and mistakes.

Here’s where the point I started my article with comes in, because the reasons we have data protection are genuinely interesting (as is the Biro, Google it!). We all have a fundamental right to privacy – our customers, students, patients, employees, job applicants and so on. The ‘right to be left alone’ was written about as far back as 1890 by two US lawyers.

A key point came just after World War Two, with the Universal Declaration of Human Rights including Article 12 – the Right to Privacy. It’s not hard to envisage why this was considered important in the 1940s. This is also where the concept of special category data stems from. People had been persecuted for their religion, their ethnicity, their sexual orientation and more. These characteristics needed, and indeed still need, protecting.

Then came the development of rules, principles and country specific laws aimed at protecting people’s personal information and awarding people privacy rights. As technology advanced (personal computers, email, the internet, mobile phones…), new laws and regulations were introduced to protect us against new threats. Fast forward to 2018, and GDPR was seen as a game changer – not only cementing people’s fundamental privacy rights, but also making organisations more accountable for how they handle the personal data entrusted to them.

It can help if employees see this through the prism of their own personal experience. We all have privacy rights and share data about ourselves with multiple organisations, often in return for products or services. How do we expect others to look after our personal information, and the personal details of our children, our parents, our grandparents? Shouldn’t we apply the same standards to the personal data our organisation holds about others?

Let’s look at some core requirements under data protection legislation, and how we can ‘sell’ their importance.

Why data protection risk assessments are important

Yes, a Data Protection Impact Assessment (DPIA) will be mandatory for high-risk processing, and yes, they can take time to complete. But used well, DPIAs are a really useful risk management tool. Started early, they’ll alert teams to potential risks before they materialise, preventing unnecessary issues further down the line. DPIAs protect customers, employees and anyone else whose data is being handled, as well as protecting the organisation itself.

Why a Record of Processing Activities is not a box-ticking exercise

Yes, many organisations will need a Record of Processing Activities. Yes, there are a lot of fields to complete. BUT without a record of what data you hold, what it’s used for, what systems it sits on etc., it can be difficult, from the outset, to meet your legal obligations. How can you protect data you don’t know you have, or don’t know where it’s located? Also, an up-to-date RoPA has the following benefits:

Data breaches – a RoPA helps you to quickly locate the source, the systems, the data affected etc.
Retention – a RoPA helps you to clearly flag data which is no longer needed and can be deleted.
Privacy notices – if you don’t have a clear record of your purposes for processing, your lawful basis and the suppliers you use, your privacy notice is unlikely to provide a true reflection of what you do.
Privacy rights – a RoPA helps you to identify necessary search criteria for Data Subject Access Requests (DSARs) and helps you locate data for erasure requests.

Why the right of access (aka DSAR) should be respected

Data Subject Access Requests can be time-consuming and sometimes downright tricky to fulfil. But let’s not forget this right empowers all of us to ask organisations what personal data they hold about us, and why. It gives us a level of control over our personal data. Where would society be without the power to exercise our legal privacy rights? While your staff may be handling requests, one day they might have a genuine wish to exercise this right themselves.

From a more straightforward point of view, DSARs also serve to remind us of the importance of good customer service. Happy customers seldom submit requests for a copy of their personal data!

Why data retention is important

Under GDPR there’s a legal requirement not to keep personal data for longer than necessary. Yes, this means having a retention schedule which is actually implemented in practice (tricky, I know). There are also other solid benefits to meeting this core principle. Remind people of the risks of over-retention, or indeed of not keeping personal data long enough:

The impact of a personal data breach could be significantly worse if personal data has been held on to for too long, affecting more individuals, potentially leading to more severe enforcement action and raising the prospect of increased complaints (and more DSARs and erasure requests!).
Certain personal data may need to be kept to meet contractual or commercial terms. The associated risks in not keeping this data include difficulty responding to complaints or litigation from customers, or regulatory enforcement.

Why privacy notices are important

We recognise the privacy notice is the Siberia of your website – uninviting, cold and seldom visited. But essentially it is your shop window. Done well, a privacy notice clearly demonstrates your commitment to taking data protection seriously, and may be an indicator of how you act internally. Those who do take a peek may discover it’s not fit for purpose. That’s probably why they strapped on their snowshoes in the first place! This could be someone set to launch a complaint, or another business running due diligence. Your privacy notice is likely to be one of the first things examined if you come under regulatory scrutiny. Details matter.

Why robust supplier management is important

Supply-chain breaches are becoming common. Too common. It can be helpful to remind ourselves why it’s important to make sure contractual terms with our processors are robust. This helps protect all parties up and down the supply chain.

When people give you their personal details, they are entrusting you to look after them appropriately. When you allow another company to access this data in order to provide you with a service, you’re exposing them to risk. GDPR requires organisations to put an agreement in place which protects individuals whose data is ‘transferred’ in the event your supplier suffers a data breach or otherwise violates the GDPR.

Think about an external payroll provider – all employees will want their data to be protected and for there to be legal recourse should something go wrong. Ultimately the law is in place to enshrine and fully protect the rights of individuals in all situations.

Making data protection relevant

Gerald Coppin, Deputy DPO at Springer Nature London says it’s important to make your people aware of the real-world implications should matters go wrong:

To engage others in the business, those in data protection roles can start by highlighting the real-world implications of data breaches. Sharing case studies and statistics about breaches that led to significant financial and reputational damage can serve as a wake-up call. By illustrating the potential consequences of negligence, data protection professionals can make the issue relatable and urgent. This approach helps colleagues see that data protection isn’t just a box to check, but an integral part of their daily responsibilities.

Gerald also suggests bringing data protection alive through games or competitions:

Incorporating gamification into training programs can also pique interest. By turning learning about data protection into a game or competition, organizations can foster a more engaging atmosphere. This approach not only makes the learning process enjoyable but also reinforces the importance of attention to data privacy in a memorable way. Recognizing and rewarding employees for their commitment to data protection can further encourage ongoing participation.

Policies, training and awareness

Data protection training plays an important part in getting core messages across, as long as the training content itself is engaging and fit for purpose. Policies and procedures play an important role as long as you make sure they’re easy to read and at hand to reference. For me, though, the key is raising awareness on an ongoing basis. This needn’t be too time consuming, but sharing internal near-misses and external cases which will resonate with your people is more likely to foster engagement and keep data protection top of mind. Share reminders in different formats, via the intranet or email newsletter. Experiment!

Ultimately as Robert Bond, Senior Counsel at Privacy Partnership Law says, we are all legally obliged to take this seriously:

Whether you are a UK business or a multinational, compliance with data protection law is essential, if not mandatory. Having an appropriate compliance programme demonstrates accountability and coupled with training helps to minimise loss of control of personal data. Remember that if data is the new oil of the internet, please don’t have a gusher.

Right, where’s that wastepaper bin? I’m doing a quick chat on the subject. Did you know bin collections were first suggested to English local councils in 1875?

Five top causes of data breaches

October 2024

And how to mitigate the risks

Data breaches are like booby traps in movies: some are like the huge stone ball that chases Indiana Jones down a tunnel, others are sneaky, like the poisoned darts Indy dodges (before he gets chased by a big stone ball!). Nonetheless, like booby traps in Hollywood movies, there are common themes when it comes to data breaches. None of them, to my knowledge, involve being chased by a giant stone ball. And, unlike Indiana Jones, you don’t have to rely on supernatural luck and a sympathetic screenwriter to prevent these breaches occurring.

Back to the real world. While the threat of cyber-attacks continues to loom large, here’s an interesting fact: 75% of breaches reported to the Information Commissioner’s Office (ICO) are non-cyber related – caused by ‘human error’. Or, to put it another way, they’re often attributable to a lack of training and robust procedures to prevent someone making a mistake.

We’ve delved into ICO reporting figures, and put together a top five of the most common causes of data breaches, together with some top tips on how to mitigate the risk of these occurring in your organisation.

Our data breach countdown…

Number 5: Ransomware

Ransomware is malicious software used by bad actors to encrypt an organisation’s system folders or files. Sometimes the data may be exfiltrated (exported) too. A ransom demand often follows, asking for payment. The attacker will say this can be paid in exchange for the decryption key and an assurance the data they claim to have will be deleted. In other words, it will not be published on the dark web or shared with others. But there are no guarantees even if you choose to pay the ransom. It’s worth noting the ICO and National Cyber Security Centre discourage paying ransoms.

Ransomware attacks can cause a personal data breach, but this may be only one of a number of risks to the business, such as financial, legal, commercial and reputational. These attacks are becoming increasingly sophisticated. It’s now possible for a bad actor to buy an ‘off the shelf’ cyber-attack via the dark web, or tailor a package to suit their needs.

How to mitigate ransomware risks

Appropriate steps need to be taken to protect systems from these types of attacks. Often this will mean investing more time and money into security measures. Here are just some of the ways to try and prevent attacks:

Implementing Multifactor Authentication (MFA)
Installing antivirus software and firewalls
Use of complex passwords
Keeping all systems and software updated
Running regular cyber security and penetration testing
Monitoring logs to identify threats
Cyber awareness training

Also, crucially, making sure you have up-to-date backups kept separate from your main systems is the most effective way of recovering quickly from a ransomware attack.

Number 4: Postal errors

This is a simple administrative error, which can have minor or significant consequences. An item containing personal data is posted to the wrong person. This could be an invoice sent to the incorrect person, exam results put in the wrong envelope or medical information sent to the wrong patient. Breaches of this nature can happen by:

using incorrect addresses
using old addresses
mistakenly including more than one letter in the same envelope
mistakenly attaching documents relating to another person to a letter

How to mitigate post breach risks

Robust training and regular reminders!
Using a checklist e.g. Step 1) Check the address is correct when drafting a letter. Step 2) Check again after printing. Step 3) Check again before it goes in the envelope.

Number 3: Unauthorised access

As the name suggests, this is someone gaining access to personal information they shouldn’t have access to. This can be an external or internal threat. To give some examples:

Exploiting software vulnerabilities: Attackers can exploit software vulnerabilities to gain unauthorised access to applications, networks, and operating systems.
Password guessing: Cybercriminals can use special software to automate the guessing process, targeting details such as usernames, passwords and PINs.
Internal threats: Unauthorised access and use of personal data by employees or ex-employees.

Here are some real-life cases:

2022 – a former staff advisor for an NHS Foundation was found guilty of accessing patient records without a valid reason.
2023 – a former 111 call centre advisor was found guilty and fined for illegally accessing the medical records of a child and his family.
2024 – a former management trainee at a car rental company was found guilty and fined for illegally obtaining customer records. Accessing this data fell outside his role at the time.

How to mitigate unauthorised access risks

Here are just some of the ways of reducing your vulnerability to these types of breaches:

Applying the ‘principle of least privilege’ – this sets a rule that employees should have only the minimum access rights needed to perform their roles.
Strong password management e.g. make sure systems insist on complex passwords and prevent users sharing their access credentials.
Monitoring user activity
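The ‘principle of least privilege’ above can be sketched in code. This is a minimal, deny-by-default illustration – the roles and permission names are hypothetical, and a real system would sit on top of your identity provider or access management tooling:

```python
# Hypothetical role-to-permission mapping illustrating 'least privilege':
# each role is granted only the access rights needed for the job, nothing more.
ROLE_PERMISSIONS = {
    "call_handler": {"read_own_queue"},
    "payroll_clerk": {"read_payroll", "update_payroll"},
    "hr_manager": {"read_payroll", "read_hr_records", "update_hr_records"},
}

def is_authorised(role, permission):
    """Deny by default: access is granted only if explicitly listed for the role."""
    return permission in ROLE_PERMISSIONS.get(role, set())
```

The key design choice is the default: an unknown role, or an unlisted permission, gets nothing. Access has to be granted deliberately, never inherited by accident.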

Number 2: Phishing attacks

Phishing is when attackers send scam emails or text messages containing links to malicious websites. Often they try to trick users into revealing sensitive information (such as login credentials) or transferring money.

Any size of organisation is a potential target for phishing attacks. A mass campaign could indiscriminately target thousands of inboxes, or an attack could specifically target your company or an individual employee.

Attacks are becoming increasingly sophisticated, and scam messages are made to look very realistic. Sometimes they will know who you do business with, and change just one letter in an email address, so you think it’s from an organisation you know.

Mitigating phishing attack risks

Here are some of the ways you can reduce the risk of falling victim to a phishing attack.

Training and awareness to help employees identify spoof emails and texts
Setting up DMARC (Domain-based Message Authentication, Reporting and Conformance) to prevent bad actors spoofing your website domain

Also see NCSC phishing guidance
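On DMARC: a domain publishes its policy as a DNS TXT record (at `_dmarc.yourdomain`), and the `p=` tag determines whether receiving servers actually quarantine or reject spoofed mail, or merely report it. As an illustration of what that record contains, here’s a small sketch that parses one; the example record and addresses are hypothetical:

```python
def parse_dmarc(record):
    """Parse a DMARC TXT record (as published at _dmarc.example.com)
    into a dict of tag -> value, e.g. {'v': 'DMARC1', 'p': 'quarantine'}."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

def is_enforcing(record):
    """True only if the policy quarantines or rejects spoofed mail,
    rather than merely monitoring it (p=none)."""
    return parse_dmarc(record).get("p") in ("quarantine", "reject")

# Hypothetical example record:
record = "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"
```

A point worth checking in practice: many organisations set up DMARC with `p=none` for monitoring and never move to an enforcing policy, which offers little protection against spoofing of their domain.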

Number 1: Email errors

Yup, the top cause of data breaches is still email: messages sent to the wrong recipient(s), or CC accidentally used for multiple recipients (thereby revealing their details to all recipients). A breach of this nature can be embarrassing and/or have serious consequences. To give an example:

The Central YMCA sent emails to individuals participating in a programme for people living with HIV. The CC field was used by accident, thereby revealing the email addresses to all recipients. People on the list could be identified or potentially identified from their email addresses and it could be inferred they were likely to be living with HIV.

Mitigating email breach risks

Here are some of the ways you can try and prevent email errors occurring:

Don’t broadcast to multiple people using BCC (it is too easy to make a mistake); instead use alternative, more secure bulk email solutions.
Set rules to provide alerts to warn employees when they use the CC field.
Turn off the auto-complete function to prevent the system suggesting recipients’ email addresses.
Set a delay, to allow time for errors to be corrected before the email is sent.
Make sure staff are trained about security measures when sending bulk communications
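One of the “more secure bulk email solutions” mentioned above simply means never putting multiple recipients on one message. As a minimal sketch (the addresses are hypothetical, and a real deployment would use your mail platform rather than building messages by hand), this builds one message per recipient so no one ever sees anyone else’s address:

```python
from email.message import EmailMessage

def build_individual_messages(sender, subject, body, recipients):
    """Build one message per recipient, rather than a single message
    that CCs (or BCCs) everyone - so a slip of the CC field can never
    expose the full recipient list."""
    messages = []
    for recipient in recipients:
        msg = EmailMessage()
        msg["From"] = sender
        msg["To"] = recipient          # exactly one visible recipient
        msg["Subject"] = subject
        msg.set_content(body)
        messages.append(msg)
    return messages
```

Each message could then be handed to your outgoing mail server individually. The Central YMCA case above is exactly the failure mode this structure makes impossible.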

One of the biggest weapons in the data protection arsenal is training and awareness. We recently worked with a client who was using an excellent cyber-security training module, which staff had to complete not once, but twice a year. However, training on its own is unlikely to be enough. Regular reminders and updates are needed too. Near-misses and high-profile cases in the media can be used to get the message through.

Here’s a real-life example of a genuine disaster, one I would definitely share. You can just imagine how this happened. The Police Service of Northern Ireland (PSNI) experienced a horrendous, life-changing data breach entirely of its own making. A hidden worksheet in a spreadsheet disclosed in response to a Freedom of Information request revealed the personal details of their entire workforce, including their job descriptions and places of work. It was assumed the list subsequently fell into the hands of paramilitary organisations, leading to an enormously disruptive and expensive personal security review. ICO PSNI fine

The PSNI case also illustrates how some of the worst data protection hazards are those we set for ourselves. Not a big stone ball or poison darts. Simply a human error on a spreadsheet, an error adequate in-house procedures failed to prevent or identify.

How many such hazards are spread across your organisation?

ICO fine for Police Service of Northern Ireland

October 2024

What went wrong and what can we learn from this data breach?

You may recall the awful data breach last summer by the Police Service of Northern Ireland (PSNI). The personal details of its entire workforce (9,483 officers and staff) were accidentally exposed in response to a Freedom of Information request. The dreadful mistake left many fearing for their safety with an assumption the information shared got into the hands of dissident republicans.

This was a simple mistake involving a spreadsheet, which ALL organisations should take heed of.

The ICO has announced a £750,000 fine and says simple-to-implement procedures could have prevented this serious breach. If the ICO had not applied its discretionary approach for the public sector, the fine would otherwise have been £5.6 million. But in assessing the level of the fine, the current financial position of the PSNI and a desire not to divert public money from where it’s needed, were taken into account. A commercial organisation would have faced a much heftier financial penalty.

What went wrong?

The PSNI received two Freedom of Information requests in August 2023 from the same person. These came via WhatDoTheyKnow (WDTK), a platform which helps people submit requests and publishes responses. The requests were for information about the number of officers at each rank and number of staff at each grade, and some other details.

This information was downloaded in the form of an Excel file from the PSNI’s HR system and included personal data relating to all employees. During the analysis, multiple other worksheets were created within the same file. Once completed all visible worksheets were deleted.

But when the file was subsequently uploaded to the WDTK website, it emerged a hidden worksheet remained containing personal details. This had gone unnoticed, despite quality assurance. More detail is available in the ICO Penalty Notice.

In this case the evidence of the distress and harm caused by this data breach was evident. The ICO has published some of the comments from police officers affected, including: “How has this impacted on me? I don’t sleep at night. I continually get up through the night when I hear a noise outside to check that everything is ok. I have spent over £1000 installing modern CCTV and lighting around my home, because of the exposure.”

In announcing the fine, John Edwards, UK Information Commissioner said: “I cannot think of a clearer example to prove how critical it is to keep personal information safe… Let this be a lesson learned for all organisations. Check, challenge and change your disclosure procedures to ensure you protect people’s personal information.”

What lessons can we learn?

While this is a particularly serious case, the ICO says mistakes when disclosing information via spreadsheets are nothing new. Public authorities in particular are being urged to put robust measures in place to keep personal information safe and reduce the risk of human error. The regulator has published a useful checklist for any disclosures made using Excel:

Delete hidden columns, rows and worksheets that are not pertinent to the request
Remove any linked data from pivot tables, charts and formulas which are not part of the request
Remove all personal data and special category data which is not necessary to fulfil the request
Remove any metadata
Make sure the file size is as you’d expect for the volume of data being disclosed
Convert files to CSV

More information is available in an ICO Advisory Note

Crucially, organisations need to make sure all staff involved in the disclosure process have been given appropriate training. It’s too easy to point the finger at individuals for making mistakes, when it’s often a lack of robust procedures, training and final ‘pre-send’ checks which are ultimately to blame.

Data Protection Officers Myth Buster

September 2024

Why we don't ALL need a DPO!

Most small organisations, and many medium-sized businesses, don’t have to appoint a Data Protection Officer. This is only a mandatory requirement under GDPR, and its British spin-off UK GDPR, if your organisation’s activities meet certain criteria.

However, this doesn’t mean you can’t voluntarily choose to appoint a DPO. That said, it’s worth bearing in mind the role of a Data Protection Officer is clearly defined in law. EU/UK GDPR sets out the position of a DPO, the specific tasks they’re responsible for, and how the organisation has a duty to support the DPO to fulfil their responsibilities.

The DPO Confusion!

I believe GDPR (perhaps inadvertently, through media coverage and elsewhere) created a degree of confusion about who needed a DPO and what the role actually entails. It led many businesses to voluntarily appoint one, thinking they really should. It led clients to include ‘do you have a DPO?’ in their due diligence questionnaires. Suppliers to think, ‘oh we better have one.’

Some organisations understood the DPO requirements, others perhaps less so. Many will have correctly informed the ICO (or relevant EU data protection authority) who their DPO is, others won’t.

Some DPOs will be striving to fulfil their designated tasks, others won’t have the resources to do this, some may be blissfully unaware of the legal obligations the role carries with it.

When is it mandatory to have a DPO?

The law tells us you NEED to appoint a DPO if you’re a Controller or a Processor and any of the following apply:

  • you’re a public authority or body (except for courts acting in their judicial capacity); or
  • your core activities require large-scale, regular and systematic monitoring of individuals (for example, online behaviour tracking); or
  • your core activities consist of large-scale processing of special categories of data or data relating to criminal convictions and offences.

This raises questions about what’s meant by ‘large-scale’ and what happens if you are found not to have appointed a DPO when you should have.  The truth is many smaller businesses and not-for-profits don’t have to have one.

When it comes to interpreting ‘large-scale’, the European Data Protection Board Guidelines on Data Protection Officers provide some useful examples.

What are your options if you don’t fall under mandatory requirements?

The ICO tells us all organisations need to have ‘sufficient staff and resources to meet the organisation’s obligations under the GDPR’. So, if you assess that you don’t fall under the mandatory requirement, you have a choice:

  • voluntarily appoint a DPO, or
  • have a team or individual responsible for overseeing data protection, in a proportionate way based on the size of your organisation and the nature of the personal data you handle.

What is the ‘position’ of the DPO?

If you appoint a DPO, EU/UK GDPR tells us they must:

  • report directly to the highest level of management
  • be given the independence and autonomy to perform their tasks
  • be given sufficient resources to be able to perform their tasks
  • be an expert in data protection
  • be involved, in a timely manner, in all issues relating to data protection.

In short, not just anybody can be your DPO. They can be an internal or external appointment. In some cases a single DPO can be appointed to represent several organisations. They can perform other tasks, but there shouldn’t be a conflict of interests. For example, a Head of Marketing also being the DPO might be an obvious conflict.

A DPO must also be easily accessible to individuals, employees and the ICO. Their contact details should be published (e.g. in your privacy notice – by the way, this doesn’t have to be their name) and the ICO should be informed who they are.

What tasks should a DPO fulfil?

The DPO role has a formal set of accountabilities and duties, laid down within the GDPR.

  • Duty to inform and advise the organisation and its employees about their obligations under GDPR and other data protection laws. This includes laws in other jurisdictions which are relevant to the organisation’s operations.
  • Duty to monitor the organisation’s compliance with the GDPR and other data protection laws. This includes ensuring suitable data protection policies are in place, training staff (or overseeing this), managing data protection activities, conducting internal reviews and audits, and raising awareness of data protection issues and concerns so they can be tackled effectively.
  • Duty to advise on, and to monitor data protection impact assessments (DPIAs).
  • Duty to be the first point of contact for individuals whose data is processed, and for liaison with the ICO.

It’s also worth noting that if you don’t follow the advice of your DPO, you should document why you didn’t act on their recommendations. A DPO also cannot be dismissed or penalised for performing their duties.

Solving the GDPR puzzle

September 2024

Winston Churchill famously described Russian foreign policy as, ‘a riddle wrapped in a mystery inside an enigma.’

I’m sure those entrusted with data protection for their organisation may harbour similar thoughts about GDPR! Especially small-to-medium sized businesses and start-ups.

As a piece of legislation, UK GDPR has lots of moving parts. As a consultant dedicated to helping organisations understand data protection, here’s my round up of things we at DPN find most commonly misconstrued.

UK GDPR & Data Protection Act 2018

The UK GDPR and the Data Protection Act 2018 are not the same thing.

UK GDPR came into effect in January 2021 and largely mirrors its EU namesake. Post-Brexit, the UK flavour of GDPR was created to make it fit for purpose in a UK-specific context. For example, removing all the bits which referenced ‘member state law’.

The Data Protection Act 2018 supplements UK GDPR. For example, it provides more detailed provisions in relation to special category data, child consent, the public interest lawful basis and individual privacy rights exemptions.

The DPA 2018 also includes distinct provisions for processing by law enforcement and intelligence services.

The Privacy and Electronic Communications Regulations (PECR)

It’s PECR not UK GDPR which sets out the rules for direct marketing by electronic means, and for cookies and similar technologies.

PECR has been around since 2003, and is derived from the EU ePrivacy Directive 2002. In 2011 there was a significant update to this piece of legislation with the so-called ‘cookie law’.

UK GDPR and PECR sit alongside each other. Organisations need to comply with both when personal data is collected and used for electronic marketing purposes, or collected and used via the deployment of cookies and similar technologies. UK GDPR, marketing & cookies

There’s further interplay, for example, when consent is required under PECR, the consent collected needs to meet the UK GDPR standard for valid consent. This means, to give one example, the required consent for non-essential cookies must be ‘freely given, specific, informed and unambiguous’ and must be given by a ‘clear affirmative action by the data subject’. Getting consent right

Controller and processor

UK GDPR tells us a controller means ‘the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data’.

For example, a sole trader, a charity, a limited company, a PLC or a local authority can be a controller. An individual within an organisation such as a CEO or Data Protection Officer (more on DPOs in a bit) is not a controller – a point some companies get wrong in their privacy notice and internal data protection policies.

A controller decides how personal data is collected and used, and the organisation’s senior management is accountable. Furthermore the controller decides which service providers (aka ‘suppliers’ / ‘vendors’) to use. Which brings me onto….

A processor – which means ‘a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller’.

Processors will routinely be companies which provide a service, and in providing this service handle their clients’ data. The key is the processor won’t use this client data for their own business purposes.

To give some common examples of processors: outsourced payroll providers, external cloud services, marketing platforms, communications providers, website hosts, IT support services, software and application providers, and many more.

Some organisations which primarily act as a processor (service provider) may also act as a controller for certain activities. For example, to handle their own employees’ personal data. Controller or Processor – what are we?

Controller, processor and ‘sub processor’ contracts

A key change ushered in by GDPR was the concept of processor liability flowing right down the data supply chain. The law decrees there must be a contractual agreement between a controller and a processor, and gives very specific requirements for what this should cover. These are often found in a Data Processing Agreement (DPA), which may be an appendix or addendum to an existing or new contract.

The law aims to make sure individuals’ rights are protected at all times as data flows down and back up the supply chain. As well as a contract between a controller and processor, the processor should have similar contractual terms flowing down to other processors they engage to deliver their services – commonly known as sub-processors. For example, the obligation to keep the controller’s personal data secure at all times. A point which can often get overlooked. Supplier contracts

International data transfers include granting ‘access to’ personal data

(aka ‘restricted transfers’ or ‘cross border transfers’)

An international data transfer refers to the act of sending or transferring personal data from one country to another. Crucially this includes when an organisation makes personal data available or accessible to another entity (‘third party’) located in another country. In other words, the personal data can be accessed from overseas.

To give a couple of examples:

⚑  Your UK-based organisation engages a website hosting service based in the United States, which also provides support services. Employees of this service provider can access your customer data on the back end of your website.

⚑ Your UK-based organisation provides a payroll service to clients, to provide this service you use a sub-contractor based in India. The sub-contractor can view your clients’ employee payment records.

In both of the above situations an international data transfer is taking place, and the law tells us specific safeguards are necessary. These rules exist because in the above two cases, customers and employees risk losing control of their personal data when it is ‘transferred’ outside the UK.

For more detail see our International Data Transfers Guide and the ICO International Data Transfer Guidance

Consent should not be your default lawful basis

(aka ‘legal grounds’)

Under UK GDPR there are six lawful bases for processing personal data. No single lawful basis is ’better’ or more important than the others, and you must determine your lawful basis for each processing activity. Pick whichever one of the six is most appropriate to the activity.

Sometimes consent will be the most appropriate basis to rely on, but certainly not always and consent should only be used when you can give people a genuine choice. Quick guide to lawful bases

A privacy notice is simply a notification, not something people have to agree to

(aka ‘privacy policy’)

People have a fundamental right to be informed and one of the main ways organisations can meet this is by publishing a privacy notice. All businesses need an external facing privacy notice if they’re collecting and handling people’s personal information. And despite a common misconception, this doesn’t just relate to data gathered via a website.

A privacy notice is a notification about ALL the different ways in which you’ll handle people’s personal details (your processing of ‘personal data’). It’s a method of providing necessary and legally mandated information. Although often still referred to as a ‘privacy policy’ it isn’t really a policy (it’s a notification only) and isn’t something people should have to confirm they agree to. Privacy Notices Quick Guide & ICO Right to be Informed Guidance

Not every organisation must have a Data Protection Officer

Many small organisations, and many medium-sized businesses, don’t fall under the mandatory requirement to appoint a DPO. It’s only mandatory if your activities meet certain criteria:

✓ you’re a public authority or body (except for courts acting in their judicial capacity); or
✓ your core activities require large-scale, regular and systematic monitoring of individuals (for example, online behaviour tracking); or
✓ your core activities consist of large-scale processing of special categories of data or data relating to criminal convictions and offences.

It can sometimes be difficult to assess whether your organisation falls under the mandatory requirement or not. And of course it’s perfectly acceptable to voluntarily appoint one – a good DPO can be a huge benefit. But if you don’t appoint a DPO, you’ll still need someone (or a team) with responsibility for data protection.

It is worth bearing in mind the role of a Data Protection Officer is clearly defined in law. UK GDPR sets out the position of a DPO, specific tasks they’re responsible for, and how the organisation has a duty to support the DPO to fulfil their responsibilities. DPO Myth Buster

Not all Personal Data Breaches need to be reported

You’ve accidentally sent an email to the wrong person. This included limited personal information about someone else. You’ve apologised. The person you accidentally sent it to is a trusted person and has confirmed it’s been deleted. It’s unlikely this type of minor breach needs to be reported to the ICO.

When a personal data breach has occurred (or is suspected), it’s important to quickly establish the likelihood and severity of risk and potential harms to those affected. You only need to report a breach to the ICO if you assess it represents a risk to them. It can prove invaluable to have a clear methodology for assessing the risk posed. Data Breach Guide

The right of access (aka DSAR or SAR) is not a right to documentation

People have the right to submit a request to a controller asking for a copy of their personal data – a Data Subject Access Request. They can ask for ALL the personal data you hold about them. But this doesn’t mean the organisation is obliged to provide complete documents just because the individual’s name is referenced at some point. The same applies to emails. Requesters are not entitled to receive the full content of every email their name or email address appears in (unless all of the email content is personal data relating to them). DSAR Guide

Sensitive vs special category data

Certain types of personal data require higher levels of protection. Under the previous DPA 1998 the term ‘sensitive data’ was used, but under GDPR the revised term for this is ‘special categories of personal data’ commonly referred to as Special Category Data.

This includes (but isn’t limited to) racial or ethnic origin, biometrics, political opinions, sexual orientation and data concerning health or sex life. This doesn’t mean other types of data aren’t ‘sensitive’, and shouldn’t be handled securely – such as bank details, national insurance numbers, date of birth and so on.

It can be helpful to remember the root of special category data lies in human rights and data protection principles which emerged in Europe after World War Two – a war in which individuals were persecuted for their ethnic background, religious beliefs or indeed sexual orientation. Understanding and handling special category data

I’m going to finish off with another, but very different, quote. As Douglas Adams wrote in The Hitchhiker’s Guide to the Galaxy, ‘DON’T PANIC!’ There’s plenty of help available (this article, for starters 😉 ) and the ICO has published plenty of guidance, including a dedicated SME Hub.