Data Subject Access Requests – what are people entitled to?

February 2025

I’m often asked what’s in scope when responding to the Right of Access – aka Data Subject Access Requests (DSARs/SARs). What are organisations obliged to provide, and what can they legitimately exclude? I’ve taken a look at some questions which routinely come up. But first, a quick summary of what the law says…

The Right of Access is a fundamental right under data protection legislation in the UK and EU. There are similar rights in other jurisdictions, but I’m focusing here on the right under UK GDPR and the Data Protection Act (DPA 2018).

The law gives people the right to receive a copy of their personal data, along with certain supplementary information, from any organisation acting as a controller. Controller or processor – what are we?

Personal data is any information which could directly or indirectly identify the requester. To give some examples, this could include images, voice and video recordings, demographic information, profiles, order history, marketing preferences, HR records, performance reviews, opinions expressed about the requester, other personal identifiers … and the list goes on.

Now, on to the FAQs…

Q: Do we need to provide information the requester already has, or which is obvious to them?

The short answer is yes. Based on UK case law, organisations can’t refuse to disclose information on the grounds that the personal data is already known to the individual (Ittihadieh v 5-11 Cheyne Gardens, 2017). However, it wouldn’t need to be included if the person has made it clear they don’t want this information. You can always ask them.

Q: Are they entitled to full documents?

It isn’t a right to documentation. Just because someone’s name appears in a report, spreadsheet, meeting notes or any other document doesn’t mean they’re entitled to the whole document, if the rest doesn’t relate to them. It may sometimes prove easier and relevant to provide full documents, but you would be justified in not doing so. You can extract the necessary information, or redact the irrelevant information. But remember, what you provide must be meaningful and have context.

Q: Are they entitled to the full content of email correspondence?

Linked to the question above, people are only entitled to a copy of their personal data. So just because their email address or email signature appears in an email (or email chain) doesn’t make the whole email their personal data. For example, routine business-as-usual emails, where the content is solely about business-related matters, will not be the individual’s personal data. It can be really helpful to explain this from the start.

Q: Are handwritten notes in scope?

Personal data which is not part (or intended to be part) of a structured filing system is not in scope. For example, handwritten notes in a personal notepad, where there’s no intention to formally file them, would not need to be included. However, if, for example, employees write notes in ‘day books’ which are intended to be kept as a record of conversations, these would be in scope.

Q: How much effort is required?

Organisations are expected to make all reasonable efforts to search, identify and retrieve all the personal data being requested. The ICO would expect systems to be well-designed and maintained so information can be efficiently located (including carrying out searches) and extracted. The right of access is not new. It was around long before GDPR came into force in 2018, so organisations would be expected to be well prepared to handle requests.

Q: Can we refuse to comply with a request?

Sometimes it may seem obvious the requester has an ulterior motive for submitting a DSAR. In general, an individual’s motives shouldn’t affect their right to obtain a copy of their personal data, or the organisation’s duty to respond. Organisations can, however, refuse to comply with a request, either partially or fully, where they judge it to be manifestly unfounded or manifestly excessive.

A request might be considered manifestly unfounded if, for example, the individual…

has no real intention of exercising their right
offers to withdraw their request in return for some kind of benefit
explicitly states they want to cause disruption
makes unsubstantiated accusations or allegations
is targeting a specific employee due to a grudge
sends regular and targeted requests as part of a concerted campaign

A request might be considered manifestly excessive if it’s clearly or obviously unreasonable or would involve disproportionate effort. In assessing whether it would involve disproportionate effort, you should consider the following factors:

the nature of the requested information;
the context of the request, and the relationship between you and the individual;
whether a refusal to provide the information or even acknowledge if you hold it may cause substantive damage to the individual;
your available resources;
whether the request largely repeats previous requests and a reasonable interval hasn’t elapsed; or
whether it overlaps with other requests (although if it relates to a completely separate set of information it is unlikely to be excessive).

If you rely on either of these grounds, be sure to document your decision, the rationale behind it and explain this to the individual.

To give an example, quite a few years ago I worked on a request from a disgruntled former employee who, among everything else, asked for all CCTV footage of them. The business operated CCTV which captured employees as they entered and exited the main office. We asked the individual if there were specific dates and times they were interested in. They responded by simply reiterating the request for all CCTV footage. Understandably, I think, we judged this to be a manifestly excessive request, requiring disproportionate effort, and that the individual would not suffer any damage by not receiving this footage.

Q: What can be excluded or redacted?

Once all the information relating to the individual has been retrieved, the data collated often includes information which doesn’t need to be disclosed. There may be justifiable grounds for excluding information or redacting documents, emails, video recordings and so on.

Information relating to others: the person making the request has a right to receive a copy of their personal data; they’re not entitled to personal data about other people. The DPA 2018 confirms you do not need to include certain information if doing so means disclosing information which identifies someone else, unless that other person has given their consent or it’s reasonable to disclose without their consent.

Confidential information: A duty of confidence may arise when another individual has genuinely shared ‘confidential’ information with the expectation it remains confidential. Confidentiality cannot be automatically assumed and needs to be assessed on a case-by-case basis. Other information which may also be considered confidential includes, but is not limited to: trade secrets, information made confidential under another law, internal costs or commercial rates, intellectual property and information covered by a non-disclosure agreement.

Other exemptions: The DPA 2018 provides a number of further exemptions which may apply depending on the nature of your business and the context of the specific request. These don’t always apply in the same way. Sometimes you’ll be obliged to rely on an exemption (for example, where disclosure would break another law); other times it will be a choice. Commonly used exemptions include: legal professional privilege, crime and taxation, management information, research and statistics, confidential references and journalism.

The ICO says exemptions should not be routinely relied upon or applied in a blanket fashion. And remember, you may be required to demonstrate how an exemption applies and your rationale for relying on it. The ICO has published guidance on exemptions and how they apply.

These are just some of the questions I get asked and I’m afraid to say there are plenty more. Responding to DSARs can be very time-consuming, with nuanced considerations, and can feel like a minefield if you don’t receive many requests or receive your first one out of the blue. Our DSAR Guide provides more information about how to prepare for and fulfil requests. Also see the ICO’s detailed Right of Access Guidance.

Why record keeping is the cornerstone of data protection

January 2025

Records of Processing Activities

No one ever wrote a thriller about record keeping. Denzel, Keanu, Keira and Brad are not required on set. But here’s why we should give it due attention.

Put simply, without adequate records it’s difficult to demonstrate compliance with data protection legislation (GDPR and UK GDPR). Records are core to meeting the accountability principle, i.e. being ready and able to demonstrate evidence of compliance.

Let’s step back for a moment. Each organisation needs to know what personal data they hold, where it’s located and what purposes it’s being used for. Only then can you be sure what you’re using it for is fair and lawful, and gain confidence you’re meeting other GDPR obligations.

To put it another way, how confident is your organisation in answering the following questions?

  • Do we know what personal data we hold, its sensitivity and all the systems it’s sitting on – including data shared with third parties?
  • Do we know all purposes for processing?
  • Have we determined an appropriate lawful basis for each purpose? And are we meeting the specific requirements for that basis?
  • When handling special category data, have we also identified a special category condition?
  • Have we confirmed how long we need to keep the data for each purpose?

All of the above feed into transparency requirements, and what we tell people in our privacy notices.

In my opinion, you can’t answer these questions with confidence unless you map your organisation’s use of personal data and maintain a central record. This may be in the form of a Record of Processing Activities (RoPA).

Okay, so the absence of data protection records might only come to light if your organisation is subject to regulatory scrutiny. But not putting this cornerstone in place could result in gaps and risks being overlooked – which could potentially materialise into a serious infringement.

In my view, a RoPA is a sensible and valuable asset for most organisations. I fully appreciate creating and maintaining a RoPA can feel like a Herculean task, especially if resources are overstretched. That’s why we often recommend taking a proportionate and achievable approach, focussing on special category data use and higher risk activities first. Then build on this foundation when you can.

RoPA requirements under GDPR & UK GDPR

The requirements apply to both controllers and processors and include keeping records covering:

  • the categories of personal data held
  • the purposes of processing
  • any data sharing
  • details of transfers to third countries, including a record of the transfer mechanism safeguards in place;
  • retention periods
  • the technical and organisational measures used to protect the data

and more…
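To make this concrete, here’s a minimal, purely illustrative sketch of what a single RoPA entry covering the items above might capture, written as a Python data structure. The field names are my own shorthand, not an official template; a real record will need more detail (for example controller, representative and DPO contact details), and most organisations will simply use a spreadsheet or a dedicated tool.

```python
# Illustrative sketch only: one RoPA entry modelled as a Python dataclass.
# Field names are the author's shorthand for the Article 30 items listed above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class RopaEntry:
    business_function: str                 # e.g. "Human Resources"
    processing_activity: str               # e.g. "Payroll administration"
    categories_of_data: List[str]          # categories of personal data held
    purposes: List[str]                    # why the data is processed
    data_sharing: List[str]                # recipients, including processors
    third_country_transfers: List[str] = field(default_factory=list)
    transfer_safeguards: str = ""          # e.g. "UK Addendum to EU SCCs"
    retention_period: str = ""             # how long the data is kept
    security_measures: List[str] = field(default_factory=list)

# Example (hypothetical) entry
payroll = RopaEntry(
    business_function="Human Resources",
    processing_activity="Payroll administration",
    categories_of_data=["name", "bank details", "salary"],
    purposes=["pay employees", "meet tax reporting obligations"],
    data_sharing=["outsourced payroll provider"],
    retention_period="6 years after employment ends",
    security_measures=["role-based access controls", "encryption at rest"],
)
```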

Do you employ fewer than 250 people?

If so, record keeping requirements may be less stringent. But you’ll still be required to maintain a RoPA if:

  • your processing of personal data is not occasional
  • your processing is likely to result in risk to the rights and freedoms of individuals
  • you process special category data (e.g. health data, ethnicity, trade union membership, biometrics and more)
  • you process personal data relating to criminal convictions and offences.

You can read more about the requirements in the ICO’s records of processing guidance.

Benefits of Record Keeping (RoPA)

Here are just some of the benefits you can get from your RoPA.

1. Understanding the breadth and sensitivity of your data processing.

2. Visibility of where data protection risks lie. This will help establish priorities and focus efforts to tackle key risks.

3. Confidence your activities are lawful and meet specific regulatory requirements.

4. Tackling over-retention of data – a common challenge. By establishing your purposes for processing personal data, you can determine how long you need to keep that data. Then you can take practical steps to delete any data you no longer need.

5. Transparency – An up-to-date RoPA feeds into your privacy notice, making sure the information you provide accurately reflects what you are really doing.

6. Data breaches – Your RoPA should be the ‘go to’ place if you suffer a data breach. It can help you to quickly identify what personal data may have been exposed and how sensitive it is, which processors might be involved and so on. This helps you make a rapid risk assessment (within the 72-hour reporting window) and take positive decisions to mitigate risks and protect individuals.

7. Supply chain – Keeping a record of your suppliers (‘processors’) is a key aspect of supplier management along with due diligence, contractual requirements and international data transfers.

8. Privacy rights – If you receive a Data Subject Access Request, your records can help to locate and access the specific data required to fulfil the request. If you receive an erasure request, you can quickly check your lawful basis for processing and see if the right applies, and efficiently locate what systems the data needs to be deleted from.

Tips to get started

Here are a few very quick tips on how to commence a RoPA project or breathe new life into an outdated spreadsheet you last looked at in 2018!

Who?

No DPO or data protection team can create and maintain these records on their own – they need support from others. Enlist the support of your Senior Leadership Team, as you’ll need them to back you and drive this forward.

Confirm who is, or should be, accountable for business activities which use personal data within all your key business functions – the data owners. For example, Human Resources (employment & recruitment activities), Sales & Marketing (customer/client activities), Procurement (suppliers), Finance, and so on. Data owners are usually best placed to tell you what data they hold and what it’s currently used for, so get them onside.

What?

Make sure you’re capturing all the right information. The detail of what needs to be recorded differs slightly depending on whether you act as a controller or processor (or indeed both). If you need to check, take a look at the ICO guidance on documentation.

When?

There’s always some new system, new activity and/or change of supplier, isn’t there? You should aim to update your records whenever you identify new processing or changes to existing processing – including identifying when you need to carry out a Data Protection Impact Assessment or Legitimate Interests Assessment. Good stakeholder relations can really help with this.

In conclusion, record keeping might not win many Oscars, but it really is the cornerstone of data protection compliance. Adequate records, even if not massively detailed, can be really beneficial in so many ways, not just if the ICO (or another Data Protection Authority) comes calling.

Controller or processor? What are we?

January 2025

The importance of establishing if an organisation is acting as a processor or controller

On paper the definitions of controller and processor under GDPR (& UK GDPR) may seem straightforward, but deciding whether you’re acting as a controller, joint controller or processor can sometimes be a contentious area. Many a debate has been had between DPOs and lawyers when trying to classify the relationship between different parties.

It’s not unusual for it to be automatically assumed all suppliers providing a service are acting as processors, but this isn’t always the case. Sometimes joint controllership, or separate distinct controllers, is more appropriate. Or perhaps a company is simply providing a service, and is not processing the client’s personal data (other than minimal contact details for a couple of employees).

It’s worth noting service providers (aka suppliers or vendors) will often act as both – controller for some processing tasks and processor for others. For example, most will be a controller for at least their own employee records, and often for their own marketing activities too.

What GDPR says about controllers and processors

The GDPR tells us a controller means ‘the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data’.

A processor means ‘a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller’.

How to decide if we’re a controller or processor

There are some questions you can ask to help reach a conclusion:

Do we decide how and what personal data is collected?
Are we responsible for deciding the purposes for which the personal data is used?
Do we use personal data received from a client/partner for our own business purposes?
Do we decide the lawful basis for the processing tasks we are carrying out?
Are we responsible for making sure people are informed about the processing? (Is it our privacy notice people should see?)

If you’re answering ‘yes’ to some or all of these questions, it’s highly likely you’re a controller.

The ICO makes it clear it doesn’t matter if a contract describes you as a processor; “organisations that determine the purposes and means of processing will be controllers regardless of how they are described in any contract about processing services”.

A processor only processes a controller’s personal data on the controller’s behalf and, crucially, doesn’t use this data for its own business purposes. While a processor may make its own day-to-day operational decisions, it should only process the data in line with the controller’s instructions, unless required to do otherwise by law.

Sometimes overlooked is the fact that even if only a handful of a service provider’s employees have access to a controller’s personal data, the service provider is still ‘processing’ that data and will be a processor.

Why it’s important to confirm your status

Controllers have a higher level of accountability. They are obliged to comply with all data protection principles, such as ensuring the lawfulness of processing, being transparent (e.g. privacy notices), fulfilling privacy rights requests and so on.

Processors do have a number of direct obligations, such as being required to implement appropriate technical and organisational measures to protect personal data. A processor is also responsible for ensuring the compliance of any sub-processors it uses to fulfil its services to a controller. In fact, the processor remains liable to the controller for its sub-processors.

The ICO issued a £3m fine to a software company in March 2025 for failing to implement sufficient measures, which you can read about here.

Data processing agreements

There’s a requirement to have an appropriate agreement in place between a controller and a processor. Article 28 of the EU/UK GDPR sets out specific requirements for what must be included in the contractual terms.

Such terms are often covered in a Data Processing Agreement/Addendum, but sometimes will be covered in a specific section on data protection within the main contract. (If there’s no DPA, no addendum and no section on data protection that’s a massive red flag!)

Often overlooked is the need to have clear documented instructions from the controller. It can be helpful to have these as an annex to the main contract (or master services agreement), so they can be updated if the processing changes. We’ve written more about the detail of what needs to be covered in contractual terms here. Another area which can get forgotten is sub-processors and international data transfers.

There are times where you’re looking to engage the services of a household name, a well-known and widely used processor. This sometimes leads to limited or no flexibility to negotiate contractual terms. In such cases, it pays to check the terms and, if necessary, take a risk-based view on whether you wish to proceed or not.

Before even looking at the terms, due diligence on prospective processors is a ‘must do’ for controllers, while taking an approach proportionate to the level of risk the outsourced processing poses. And for their part processors need to be prepared to prove their data protection and information security credentials.

Court of Appeal rejects appeal against ICO fine

December 2024

The very first fine the ICO issued under the GDPR was back in 2019. It was issued to a pharmacy for storing unlocked boxes containing sensitive medical information in the yard behind its offices. More than five years later, the fine has yet to be paid.

The initial penalty notice was for £275,000 against Doorstep Dispensaree, a pharmacy in Edgware, North London. The company appealed, arguing the ICO’s actions were disproportionate and failed to take into consideration the firm’s financial hardship. It also argued less personal information was affected than originally thought – 67,000 documents were involved, rather than the 500,000 cited in the original enforcement notice. Furthermore, the pharmacy claimed its backyard storage area was largely secure from public access.

The fine was subsequently reduced to £92,000.

As an aside, I’d suggest this is still a huge number of records to store in unlocked boxes. The data concerned involved customers’ names, addresses, dates of birth, NHS numbers, medical information and prescriptions.

This wasn’t the end of it. Doorstep Dispensaree raised a subsequent appeal, arguing the judge in the previous appeal had failed to recognise that the burden of proof lay with the ICO, and that undue weight had been given to the ICO’s reasons for imposing the penalty.

In a decision welcomed by the ICO, the Court of Appeal has now dismissed this appeal. It ruled the burden of proof should lie with the appellant, Doorstep Dispensaree, and subsequent tribunals and appeals aren’t required to ignore original monetary penalty notices when making decisions.

Responding to the news, Information Commissioner John Edwards said, “I welcome the Court of Appeal’s judgment in this case as it provides clarity for future appeals. We defended our position robustly and are pleased that the court has agreed with our findings.”

The ICO has been much criticised for its lack of enforcement action under GDPR. It’s issued multiple fines under the Privacy and Electronic Communications Regulations (PECR), but fewer under GDPR (now UK GDPR). This may be due to the fact violating the PECR rules can be more clearcut. While much of the criticism may be fair, I believe this case demonstrates the legal hurdles the Regulator can face when taking enforcement action. However, the more cases we get, the more case law we’ll have for UK GDPR.

2024’s Data Protection Milestones

December 2024

Each year sees significant developments across the data protection landscape. I’ve asked industry insiders for their ONE most significant data protection or ePrivacy related milestone of 2024. Interestingly, everyone offered a different take. And all of these milestones will remain significant well into 2025.

UK GENERAL ELECTION AND DATA BILLS

Chris Combemale, Chair of the Data and Marketing Association

The most significant event for me was the General Election. For three years the DMA worked hard with the former government to ensure key reforms were included in the DPDI Bill, including certainty around legitimate interest as a lawful basis for direct marketing. At the time the election was called, DPDI was in the final stages of passage in the House of Lords. The DMA campaigned throughout the election to persuade the new government to pick up the mantle, including a joint letter to all political parties from the DMA, Tech UK and other members of the Business Advisory Group which I chaired. Our efforts paid off and the Data (Use and Access) Bill is now at Committee Stage in the House of Lords. DUA brings forward the best parts of DPDI while dropping the most controversial reforms, salvaging years of work and creating a better Bill that will transform public services, contribute to growth in the economy and maintain high levels of data protection.

Simon Blanchard, Data Protection Consultant, DPN Associates

The DUA Bill includes plans for Smart Data Schemes which allow consumers and businesses to safely share personal information with regulated and authorised third parties, for example, to generate personalised market comparisons. There are plans to create a framework for trusted identity verification services which could simplify processes like starting a new job, renting a home, as well as registering births and deaths. For me it’s significant there are now no plans to dilute accountability obligations under UK GDPR (e.g. remove the Data Protection Officer role and no changes to DPIA and RoPA requirements). DUA will give a statutory footing for many commonly used practices regarding Data Subject Access Requests. Certain legitimate interests will become ‘recognised’, such as national security, safeguarding and emergency response. The Bill’s progress is definitely one to watch in 2025. Updated DPN Legitimate Interests Guidance v3

DOORS OPENED TO EU PRIVACY ‘CLASS ACTIONS’

Fedelma Good, Data Protection and ePrivacy Consultant

Top of my list was definitely going to be the news Australia had introduced a law banning social media use for under 16s, not least because of all the attendant concerns that have been expressed that it will actually backfire, driving teenagers to the dark web, or making them feel more isolated. Well, at least this was top of my list right up until the announcement on 3rd December that the privacy rights group noyb had been approved in Austria and Ireland – but with validity throughout the EU – as a so-called ‘Qualified Entity’ to bring collective redress actions in courts throughout the European Union. I would really love to have a crystal ball to be able to see if, a few years from now, we will view the comment from Max Schrems, chair of noyb, that “So far, collective redress is not really on the radar of many – but it has the potential to be a game changer” as the understatement of the decade.

AI & DATA PROTECTION COMPLIANCE

Steve Wood, Consultant and Researcher, Privacy X and former Deputy Commissioner, ICO

In 2024 our community has dug deeper into the key implications of AI for data protection compliance. We’ve seen a range of consultations from data protection regulators globally, addressing issues such as whether large language models are classed as personal data, when legitimate interests can apply as a lawful basis, how data subjects’ rights apply to AI models and what safeguards can mitigate DP risks. Given the pivotal role the EU GDPR plays in global data protection governance, the key event for me will come right at the end of the year, just before 23 December (some Xmas holiday reading!), when the EDPB will release their GDPR Article 64(2) Opinion on AI models, requested by the Irish Data Protection Authority. The Opinion will provide a significant regulatory framing for the approach companies need to take to AI governance for the coming years, noting the breadth of application of the GDPR compared to the focus of the EU AI Act on high-risk systems.

GLOBAL ADOPTION OF DATA PROTECTION PRINCIPLES

Robert Bond, Senior Counsel, Privacy Partnership Law

The one most significant data protection event in 2024 for me was the number of countries around the world who were passing and updating their data protection law significantly influenced by the GDPR. From Kenya to Sri Lanka, from Australia to Saudi Arabia and from China to many States in the USA, the similarities around data protection principles, data subject rights and data transfer restrictions are considerable. Whilst these global developments may not apply to smaller organisations, in the case of multinationals, the ROI for all the hard work invested in complying with the GDPR is that complying with data protection laws in other parts of the world is getting somewhat easier.

UNLAWFUL INTERNATIONAL DATA TRANSFERS

Eduardo Ustaran, Partner Hogan Lovells International LLP

An issue which has returned as a top priority for regulators is cross-border data transfers. Due to geopolitical tensions, the resulting increase in surveillance and the populist appeal of data localisation, the legal restrictions on international data transfers have attracted implacable scrutiny and enforcement. A worrying concern in this area is that there seems to be no room for a balanced assessment of the risk in practice, as the mere possibility of access to data by law enforcement or intelligence agencies is leading regulators to conclude that such transfers are unlawful. This regulatory line of thinking poses a real test for everyone seeking to apply a pragmatic, risk-based approach to legitimising global data flows.

CASE LAW & THE DEFINITION OF ‘PROCESSING’

Claire Robson, Governance Director, Chartered Insurance Institute

An interesting development in case law came in the decision of the Court of Appeal in Farley v Paymaster (trading as Equiniti), a case about infringement of data protection rights through postal misdirection. Over 450 current and former police officers took action against their pension administrator after statements were sent to out-of-date addresses. The High Court dismissed many of the claims, stating there was not enough evidence to show the post (pension benefit statements) had been seen by a third party, so no processing had occurred. The Court of Appeal overturned this, granting permission for claimants to appeal. It felt there was a prospect of success in claiming processing had taken place through the extraction of the information from the database and its electronic transfer to the paper document, along with the mistaken address, and that it was not necessary to rely on a third party reading the statement. An interesting one for Data Controllers to watch: how this develops and what it means for the definition of, and limits to, ‘processing’.

LACK OF ICO ENFORCEMENT

Emma Butler, Data Protection Consultant, Creative Privacy

For me, sadly, the most significant event of 2024 has been the decline of data protection enforcement. Yes, we have seen fines for marketing breaches and some enforcement notices, but there has been a long list of serious compliance breaches with significant impacts on people that have only received a reprimand. This leads me to wonder how bad it has to get before there is serious enforcement action to change behaviours. I have seen a corresponding lessening of the importance of compliance among organisations in terms of increased risk appetites for non-compliance, and feeling they can ‘get away with’ practices because ‘everyone else is doing it’ and they see no consequences from the ICO. I have also noticed a decrease in DPO / senior roles and more combining of the DP role with other functions, as well as low salaries for the roles that exist. Not a vintage year.

REJECT ALL COOKIES

For my part, a significant change this year has been the ‘reject all’ button springing up on so many UK websites, giving people a clear option to reject all non-essential cookies. (Albeit this is certainly not universal and I’m not sure clicking ‘reject all’ always works in practice.) This change followed an ICO warning late in 2023 to the operators of some of the country’s most popular websites, demanding compliance with the cookie rules. Particularly focused on advertising/targeting cookies, website operators were told they had to make it as easy to reject all as it is to accept all. We then saw some websites moving to the controversial ‘consent or pay’ model, which gives users a choice: 1) pay for an ad-free service, 2) consent to cookies, or 3) walk away. I’ll be watching closely for the ICO’s hotly awaited views on the legitimacy of this approach. I’m also pleased it looks like the DUA Bill will pave the way for first-party website analytics cookies to be permitted without consent.

As you can see, from the DUA Bill to AI, global privacy laws to data transfers and the real possibility of EU ‘class actions’, these milestones are likely to keep the industry busy well into 2025 and beyond. And we’ll continue to keep you updated of the most significant developments as they happen.

Meeting prospective clients’ due diligence demands

December 2024

Proving your data protection and information security credentials

Many businesses provide a service to other businesses, and once the pitch is done and you’re getting closer to signing that vital and lucrative contract, there can be a hurdle to overcome. Namely, meeting the client’s due diligence and supplier set up requirements.

For bigger well-known service providers this can be a breeze, but often small-to-medium sized organisations can find themselves grappling to prove their credentials. Requests can sometimes feel exasperatingly detailed, irrelevant or over-zealous.

Once you’ve got through the questions about sustainability, environmental impact, modern slavery, diversity, equality and inclusion, there will often be the need to answer questions about your approach to data protection and information security.

This will almost certainly be the case where your company’s services involve handling your prospective client’s personal data on their behalf. To use data protection terminology, the client is the ‘controller’ and your organisation will act as their ‘processor’.

It’s important this relationship is clear, as there are specific contractual requirements for controller-to-processor relationships under the EU and UK GDPR. Both parties need to meet their obligations. Are we a controller or processor?

So how can you get ahead of the game and be well-prepared? I’ve put together some key questions you may need to cover off. Some of these points will need to be included in any Controller-Processor Data Processing Agreement.

1. Do you have a Data Protection Officer?

Not all businesses need to appoint a DPO (despite most questionnaires expecting you to). If you don’t have a DPO, you may need to explain who in the organisation is responsible for data protection, and may need to be ready to justify why you don’t need a DPO. DPO Myth Buster

2. Do you have a dedicated Information Security team?

As well as being able to provide details of where responsibility for information security rests within your organisation, you’re also likely to be required to provide details of the security measures and controls you have in place to protect client data. This could for example be restricted access controls, use of encryption or pseudonymisation, back-ups, and so on. You may be asked if you have any form of security certification or accreditation.

Note: For contractual terms, such as a Data Processing Agreement/Addendum it’s likely you’ll need to include a summary of your security measures.

3. What data protection related policies do you have?

The most common requirement is being able to demonstrate you have a Data Protection Policy. Namely an internal policy which sets out data protection requirements, and your expectations and standards for your staff. A client could ask to see a copy of this. They might also ask if you have more detailed policies or procedures covering specific areas such as data retention, individual privacy rights and so on.

4. Where will your processing of client personal data take place?

Many clients will be looking to understand if an international data transfer (what’s known as a restricted transfer) will be taking place. Whether this is happening will be dependent on your client’s location and your own location – including the locations of any servers you’ll process client data on.

The client may want to confirm there are necessary ‘safeguards’ in place for any restricted transfers, to ensure such transfers meet legal requirements. Examples of these include an adequacy decision, Standard Contractual Clauses (with the UK Addendum if relevant) or a UK International Data Transfer Agreement. They may also ask you about Transfer Impact Assessments. International Data Transfers Guide

5. Do you sub-contract services to third-parties?

You need to be prepared to share details of any third-party companies you use to provide your services which involve handling, or having access to, your client’s personal data. These are often referred to as ‘sub-processors’. They’re also likely to ask you to confirm which country these sub-processors are based in (i.e. the geographical location where the ‘processing’ takes place).

Note: International data transfers and working with sub-processors are key elements of the GDPR mandated contractual terms between a controller and processor.

6. What procedures do you have in place for handling a personal data breach?

You may be asked if you’ve suffered a data breach in recent years, and to provide details of your procedures for handling a data breach. We’d recommend all businesses have a data breach plan/procedure/playbook.

If you’re acting as a processor for your client, you’ll need to inform them ‘without undue delay’ (often within 24 or 48 hours of becoming aware of the breach). Plus be ready to provide them with all relevant information about the incident rapidly, so they can assess their own data risks and report it to the relevant Data Protection Authority (such as the Information Commissioner’s Office) if appropriate.

7. Do you have a disaster recovery plan and backups?

The GDPR doesn’t detail specific requirements around resilience and disaster recovery – this will depend on the nature and sensitivity of the processing. But if you suffer a data breach (particularly a ransomware attack) you’ll want to make sure your systems have integrity and are fully operational again very quickly after the event. Your clients will expect this if their data could be affected, so expect to be asked tricky questions.

8. Do you have a Record of Processing Activities?

You may be asked to confirm you have a Record of Processing Activities and might be asked more detailed questions about your record keeping.

9. Procedures for handling client individual privacy rights requests

If you are a processor, handling personal data on behalf of your client, it won’t be your responsibility to respond to privacy rights requests (such as Data Subject Access Requests or erasure requests). However, you may need to assist your client in fulfilling requests relating to the client data you hold. And if you receive a request relating to client data, this must be swiftly sent on to the client. They may ask for evidence of a robust process for doing this.

10. Privacy information

Don’t forget your Privacy Notice (aka Privacy Policy). Before a prospective client works with you, they may look at your website and take a peek at the privacy information you provide. If this is off the mark and fails to meet key legal requirements, it could be a warning sign for them that you don’t take your data protection obligations seriously. Privacy Notices Quick Guide

The above is by no means an exhaustive list but should help you to be prepared for some of the key areas you may be questioned about.

At DPN, we often suggest processors prepare a factsheet or FAQ in advance of receiving these due diligence questionnaires. This can really help put your business on the front foot and demonstrate to your clients you’re on the ball for both data protection and information security. Crucially it speeds up the decision-making and onboarding process, as by being well prepared you no longer have to scrabble around at the last minute. So you can start work for your new client more quickly.

Using AI tools for recruitment

November 2024

How to comply with GDPR

AI tools offer dynamic, efficient solutions for streamlining recruitment processes. AI is capable of speedily identifying and sourcing potential candidates, summarising their CVs and scoring their suitability for the role.

What’s not to like?

Nonetheless, these processes must be fair and lawful. Is there a potential for bias and/or inaccurate outputs? How else will AI providers use jobseekers’ personal details? What data protection compliance considerations are baked into the AI’s architecture?

The Information Commissioner’s Office (ICO) is calling on AI providers and recruiters to do more to make sure AI tools don’t adversely impact on applicants. People could be unfairly excluded from potential jobs and/or have their privacy compromised. Why undo the good work HR professionals undertake to satisfy legal requirements and best practice by using questionable technology?

The ICO recently ran a consensual audit of several developers and providers of AI recruitment tools. Some of the findings included:

Excessive personal data being collected
Data being used for incompatible purposes
A lack of transparency for jobseekers about how AI uses their details

The AI Tools in Recruitment Audit Report provides several hundred recommendations. The unambiguous message is that using AI in the recruitment process shouldn’t be taken lightly. Of course, this doesn’t mean recruiters shouldn’t embrace new technologies, but it does mean sensible checks and balances are required. Here’s a summary of key ICO recommendations, with some additional information and thoughts.

10 key steps for recruiters looking to engage AI providers

1. Data Protection Impact Assessment (DPIA)

DPIAs are mandatory under GDPR where a type of processing is likely to result in high risk. The ICO says ‘processing involving the use of innovative technologies, or the novel application of existing technologies (including AI)’ is an example of processing they would consider likely to result in a high risk.

Using AI tools for recruitment purposes squarely meets these criteria. A DPIA will help you to better understand, address and mitigate any potential privacy risks or harms to people. It should help you to ask the right questions of the AI provider. It’s likely your DPIA will need to be agile; revisited and updated as the processing and its potential impacts evolve.

ICO DPIA recommendations for recruiters:

Complete a DPIA before commencing processing that is likely to result in a high risk to people’s rights and freedoms, such as procuring an AI recruitment tool or other innovative technology.
Ensure DPIAs are comprehensive and detailed, including:
– the scope and purpose of the processing;
– a clear explanation of relationships and data flows between each party;
– how processing will comply with UK GDPR principles; and
– consideration of alternative approaches.
Assess the risks to people’s rights and freedoms clearly in a DPIA, and identify and implement measures to mitigate each risk.
Follow a clear DPIA process that reflects the recommendations above.

2. Lawful basis for processing

Recruiting organisations need to identify a lawful basis for this processing activity. You need to choose the most appropriate of the six lawful bases, such as consent or legitimate interests.

To rely on legitimate interests you will need to:
1. Identify a legitimate interest
2. Assess the necessity
3. Balance your organisation’s interests with the interests, rights and freedoms of individuals.

This is known as the ‘3-stage test’. We’d highly recommend you conduct and document a Legitimate Interests Assessment. Our recently updated Legitimate Interests Guidance includes an LIA template (in Excel). Your DPIA can be referenced in this assessment.

3. Special category data condition

If you will be processing special category data, such as health information or Diversity, Equity and Inclusion data (DE&I), alongside a lawful basis you’ll need to meet a specific special category condition (i.e. an Article 9 condition under UK GDPR).

It’s worth noting some AI providers may infer people’s characteristics from candidate profiles rather than collecting them directly. This can include predicting gender and ethnicity. This type of information, even if inferred, will be special category data. It also raises questions about ‘invisible’ processing (i.e. processing the individual is not aware of) and a lack of transparency. The ICO recommends not using inferred information in this way.

4. Controller, processor or joint controller

Both recruiters and AI providers have a responsibility for data protection compliance. It should be clear who is the controller or processor of the personal information. Is the AI provider a controller, joint-controller or processor? The ICO recommends this relationship is carefully scrutinised and clearly recorded in a contract with the AI provider.

If the provider is acting as a processor, the ICO says ‘explicit and comprehensive instructions must be provided for them to follow’. The regulator says this should include establishing how you’ll make sure the provider is complying with these instructions. As a controller your organisation should be able to direct the means and purpose of the processing and tailor it to your requirements. If not, the AI provider is likely to be a controller or joint-controller.

5. Data minimisation

One of the core data protection principles is data minimisation. We should only collect and use personal information which is necessary for our purpose(s). The ICO’s audit found some AI tools collected far more personal information than necessary and retained it indefinitely to build large databases of potential candidates without their knowledge. What might make perfect sense to AI or the programmers creating such technology might not be compliant with data protection law!

Recruiters need to make sure the AI tools they use collect only the minimum amount of personal information required to achieve the defined purpose(s). (These purposes should be clearly set out in your DPIA and, where relevant, your LIA.)

There is also an obligation to make sure the personal details candidates are providing are not used for other incompatible purposes. Remember, if the AI provider is retaining data and using this information for its own purposes, it will not be a processor.

6. Information security and integrity

As part of the procurement process, recruiters need to undertake meaningful due diligence. This means asking the AI provider for evidence that appropriate technical and organisational controls are in place. These technical and organisational controls should also be documented in the contract. The ICO recommends regular compliance checks are undertaken while the contract is in place, to make sure effective controls remain in force.

7. Fairness and mitigating bias risks

Recruiters need to be confident the outputs from AI tools are accurate, fair and unbiased. The ICO’s audit of AI recruitment providers found evidence some tools were not processing personal information fairly. For example, in some cases they allowed recruiters to filter out candidates with protected characteristics. (Protected characteristics include: age, disability, race, ethnic or national origin, religion or belief, sex and sexual orientation.) This should be a red flag.

You should seek clear assurances from the AI provider that they have mitigated bias, asking to see any relevant documentation. The ICO has published guidance on this: How do we ensure fairness in AI?

8. Transparency

Are candidates aware an AI tool will be used to process their personal details? Clear privacy information needs to be provided to jobseekers which explains how and why the AI tool is being used. The ICO says this should extend to explaining the ‘logic involved in making predictions or producing outputs which may affect people’. Candidates should also be told how they can challenge any automated decisions made by the tool.

The regulator recommends producing a privacy notice specifically for candidates on your AI platform which covers relevant UK GDPR requirements.

9. Human involvement in decision-making

There are strict rules under GDPR for automated decision-making (including profiling). Automated decision-making is the process of making a decision by automated means without any human involvement. A recruitment process wouldn’t be considered solely automated if someone (i.e. a human in the recruitment team) weighs up and interprets the result of an automated decision before applying it to the individual.

There needs to be meaningful human involvement in the process to prevent solely automated decisions being made about candidates. The ICO recommendations for recruiters include:

Ensure that recruiting managers do not use AI outputs (particularly ‘fit’ or suitability scores) to make automated recruitment decisions, where AI tools are not designed for this purpose.
Offer a simple way for candidates to object to or challenge automated decisions, where AI tools make automated decisions.

10. Data Retention

Another core data protection principle is ‘storage limitation’. This means not keeping personal data for longer than necessary for the purpose(s) it was collected for. It’s important to assess how long the data inputted into and generated by AI tools will be kept. Information about retention periods should be included in the relevant privacy information provided to job applicants (e.g. in an Applicant Privacy Notice provided on your AI platform).

The ICO says data retention periods should be detailed in contracts, including how long each category of personal information is kept and why. Plus what action the AI provider must take at the end of the retention period.

Summary

The ICO acknowledges the benefits of AI and doesn’t want to stand in the way of those seeking to use AI driven solutions. It does, however, ask recruiters to consider the technology’s compatibility with data protection law.

AI is a complex area for many and it’s easy to see how unintended misuse of personal data, or unfairness and bias in candidate selection could ‘slip through the cracks’ in the digital pavement. HR  professionals and recruiters can avoid problems later down the line by addressing these as Day One issues when considering AI.

Fairness and respect for candidate privacy are central principles of HR best practice and necessary for data protection compliance. Applying these to new technological opportunities shouldn’t come as a surprise. Including your data protection team in the planning stage can help to mitigate and possibly eliminate some risks. A win-win which would leave organisations more confident in reaping the benefits AI offers.

DPN Legitimate Interests Guidance and LIA Template (v 3.0)

Published in November 2024, this third version of our established Legitimate Interests Guidance aims to help organisations assess whether they can rely on legitimate interests for a range of processing activities – routine or more complex, such as those involving the use of AI. First published in 2017, this updated version includes an improved LIA template (in Excel) to use when conducting your own Legitimate Interests Assessments.

Legitimate Interests Guidance from the Data Protection Network

Many thanks to PrivacyX Consulting and Privacy Partnership Law for working with us on this latest version. We’d also like to thank the original Legitimate Interests Working Group of 2017/2018, comprising representatives from a wide range of companies and institutions, who collaborated to produce previous versions.