AI: Risk and Regulation

February 2025

Artificial Intelligence is an epoch-defining opportunity – the biggest game-changer since the Internet. Governments, businesses and other organisations fear losing out unless they embrace innovation and change. Yet the benefits of AI carry with them ethical challenges. There’s the risk of considerable harm to individuals, wider society and the environment. Organisations also have to navigate risks: reputational, commercial and regulatory.

To regulate, or not to regulate AI?

The AI regulatory landscape’s far from settled – in fact, it’s a new frontier. On the one hand, the first phase of the EU AI Act – the world’s first comprehensive AI regulation – has come into effect. On the other, President Trump has ‘ripped up’ Joe Biden’s AI Executive Order of 2023. The new US administration wants to remove barriers it claims stifle innovation. All previous US policies, directives, regulations and orders in relation to AI are under review. The focus is on making sure America is a global leader in AI technology.

In the UK, an EU-style regulation looks unlikely. For the time being, the Government supports a ‘principles-based framework’ for sector-specific regulators to interpret and apply. Specific legislation for those developing the most powerful AI models looks the most likely direction of travel.

John Edwards, the UK Information Commissioner, penned a letter to the Prime Minister (in response to a request from Government to key regulators to set out how they’ll support economic growth) in which he says: “regulatory uncertainty risks being a barrier to businesses investing and adopting transformative technology”. The Commissioner says his office will “develop rules for those developing and using AI products, to make it easier to innovate and invest responsibly”. Interestingly, the Commissioner supports the idea of a statutory Code of Practice for AI, saying this would give regulatory certainty to businesses wanting to invest in AI in the UK.

AI regulation has supporters and critics in equal measure. The EU’s strict approach has led to fears Europe will languish behind the rest of the world. Others argue it’s crucial to enforce an ethical and responsible approach to AI – in the absence of regulation, the argument goes, AI could be more malevolent than benign.

The divisions were crystal clear at a high-level AI Summit in Paris on 11 February, as the US and UK refused to sign President Macron’s declaration calling for open and ethical AI.

Could the UK find a sweet spot, positioning its approach between its cautious European neighbours on one side and the ‘Wild West’ on the other?

EU AI Act – first phase now applicable

The AI Act entered into force in August 2024 and is coming into effect in stages. On 2nd February, rules in relation to AI literacy requirements, the definition of an AI system and a limited number of prohibited AI use cases – those the EU determines pose an unacceptable risk – came into effect.

Like GDPR, the AI Act has extra-territorial scope, meaning it applies to organisations based outside the EU where they place AI products on the market or put them into service in the EU, and/or where outputs produced by AI applications are used by people within the EU. We’ve already seen how EU regulation has led to organisations like Meta and Google excluding the EU from use of their new AI products.

In brief, the prohibited uses under the AI Act are:

Facial recognition – the use of AI systems which create or expand facial recognition databases through the untargeted scraping of images from the internet or CCTV footage.

Social scoring – AI systems which evaluate and score people on their behaviour or characteristics, where this might lead to detrimental or unfavourable treatment in an unrelated context, or where it could lead to detrimental or unfavourable treatment which is unjustified or disproportionate.

Predictive criminal risk assessments based on profiling.

Subliminal manipulation or other deceptive techniques which distort people’s behaviour and cause them to take decisions they wouldn’t have otherwise taken which are likely to cause significant harm.

Exploitation of vulnerabilities – such as someone’s age, disability or social/economic disadvantage.

Inferring emotions in the workplace or educational setting.

Biometric categorisation which infers special category data.

Real-time remote biometric identification for law enforcement purposes.

The European Commission has published guidelines alongside these prohibited practices coming into effect: Guidelines on Prohibited Practices and Guidelines on Definition of AI System.

EU AI Act – what comes next?

The rules are complex and organisations which fall within the scope of the AI Act will need to comply with tiered requirements dependent on risk. At a very top level, these involve:

For AI systems classified as high-risk there will be core requirements, such as mandatory Fundamental Rights Impact Assessments (FRIA), registration on a public EU database, data governance and transparency requirements, human oversight and more.
General-purpose AI (GPAI) systems, and the GPAI models they are based on, will be required to adhere to transparency requirements, including technical documentation, compliance with EU copyright law and detailed summaries about content used for training AI systems.
For Generative AI applications, people will have to be informed when they are interacting with AI, for example a Chatbot.

It’s worth bearing in mind an AI system could, for example, be both high-risk and GPAI.

Managing AI use

While compliance will be a key factor for many organisations, protecting the organisation’s reputation may be an even bigger concern. So, how do we ensure AI is used in an efficient, ethical and responsible way?

Organisations already utilising AI are likely to have embedded robust governance, enabling smart investment and innovation to take place within a clear framework to mitigate potential pitfalls. For others, here are some points to consider:

Senior leadership oversight
Establish your organisation’s approach to AI; your strategy and risk-appetite.

Key stakeholders
Identify key individuals and/or departments likely to play a role in governing how AI is developed, customised and/or used.

Roles and responsibilities
Determine who is responsible and accountable for each AI system.

Knowledge of AI use
Understand and record what AI systems are already in use across the business, and why.

Policies and procedures
Develop appropriate policies and procedures, or update existing policies so people understand internal standards and relevant regulatory requirements.

Training, awareness and AI literacy
Provide appropriate training and consider if this should be role specific. Remember, a requirement is already in effect under the EU AI Act for providers and deployers of AI systems to make sure their staff have sufficient levels of AI literacy.

Risk assessments
Develop a clear process for assessing and mitigating potential AI risks. While a Data Protection Impact Assessment (DPIA) may be required, this is unlikely to be sufficient on its own.

Supplier management
Embed appropriate due diligence processes when looking to adopt (and indeed customise) third-party AI SaaS solutions.

AI security risks

Appropriate security measures are of critical importance. Vulnerabilities in AI models can be exploited, input data can be manipulated, malicious attacks can target training datasets, and unauthorised parties may access sensitive, personal and/or confidential data. Data can also be leaked via third-party AI solutions. We need to be mindful, too, of how online criminals exploit AI to create ever more sophisticated and advanced malware, or to automate phishing attacks. On this point, the UK Government has recently published a voluntary AI cyber security code of practice.

AI is here. It’s genuinely transformative and far-reaching; organisations unable or unwilling to embrace change – and properly manage the risks – will be left behind. To take the fullest advantage of AI’s possibilities, agile and effective governance is key.

Data protection and employment records

February 2025

How to manage personal data relating to employees

Data protection compliance efforts are often focused on commercial or public-facing aspects of an organisation’s activities: making sure core data protection principles and requirements are met when collecting and handling the data of customers, members, supporters, students, patients, and so on. However, the personal data held relating to employees and job applicants doesn’t always get the same level of attention.

Handling employees’ personal information is an essential part of running a business, and organisations need to be aware and mindful of their obligations under the UK GDPR and Data Protection Act 2018. As well as, of course, obligations under employment law, health and safety law, and any other relevant legislation or sector specific standards.

A personal data breach could affect employee records. Employees can raise complaints about an organisation’s employment activities and employees (or former employees) can raise Data Subject Access Requests which can sometimes be complex to respond to. All of which can expose gaps in compliance with data protection laws. In some organisations employee records may represent the highest privacy risk.

Employee records are likely to include special category data and more sensitive information such as:

DE&I information (such as information relating to race, ethnicity, religion, gender, age, sexual orientation, etc)
disabilities and/or medical conditions
health and safety records
absence and sickness records
performance reviews and development plans
disciplinary and grievance records
occupational health referrals
financial information required for payroll

Alongside the core HR records, employees may be present on other records – such as CCTV, any tracking of computer / internet use, and so on. All of which need careful consideration from a data protection standpoint. Also see monitoring employees.

In my experience, while the security of employee records may often be taken into consideration, other core data protection principles might sometimes be overlooked, such as:

Lawfulness

It’s necessary to have a lawful basis for each processing activity. Many activities may be necessary to meet a legal obligation or may be covered under the contract of employment with the individual. However, the contract may not cover every activity for which an organisation uses employee data. Clearly determine where legal obligation or contract is appropriate for any given activity, and confirm any activities where you may instead need to rely on another lawful basis, such as legitimate interests or consent.

Special category data

To handle medical information, trade union membership and diversity, equity and inclusion (DE&I) activities, and any other uses of special category data, it’s necessary to determine a lawful basis, plus a separate condition for processing under Article 9. Handling special category data

Data minimisation

The principle of data minimisation requires employers to limit the personal information they hold about their employees to what is necessary for their activities, and not to hold additional personal information ‘just in case’ they might need it.

Data retention

Employees’ data should not be kept longer than necessary. There are statutory retention requirements for employment records in the UK (and many other jurisdictions), which set out how long they must be kept. But these laws may not cover all the purposes you may have for employment data. Once you set retention periods, they need to be implemented in practice, i.e. regularly review the data you hold for specific purposes and securely destroy records you no longer need. These may be electronic records on IT systems or perhaps physical HR records languishing in boxes in a storeroom! You may wish to refer to our Data Retention Guidance.
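To make ‘implemented in practice’ concrete, here’s a minimal sketch in Python of an automated retention review. The record types and retention periods are entirely hypothetical – your own retention schedule and the relevant statutory requirements should drive the real values:

from datetime import date, timedelta

# Illustrative retention periods in years per record type. The real values
# must come from statutory requirements and your own retention schedule.
RETENTION_YEARS = {
    "payroll": 6,
    "sickness_absence": 3,
    "unsuccessful_applicant": 1,
}

def records_due_for_deletion(records, today=None):
    """Flag records whose retention period has expired.

    Each record is assumed to be a dict like:
    {"id": "123", "type": "payroll", "closed_on": date(2018, 4, 30)}
    where 'closed_on' is when the retention clock starts
    (e.g. the employee's leaving date).
    """
    today = today or date.today()
    due = []
    for record in records:
        years = RETENTION_YEARS.get(record["type"])
        if years is None:
            continue  # no period defined - flag for manual review instead
        # Approximate year arithmetic; a real implementation would use
        # calendar-aware date handling.
        expiry = record["closed_on"] + timedelta(days=365 * years)
        if today >= expiry:
            due.append(record)
    return due

# Example: a payroll record closed in 2018 is now overdue for secure deletion.
overdue = records_due_for_deletion(
    [{"id": "123", "type": "payroll", "closed_on": date(2018, 4, 30)}]
)

Even a periodic manual review against a spreadsheet achieves the same goal – the point is that retention periods exist on paper and in practice.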

Transparency

Employees are entitled to know the ways in which their employer uses their personal data, the lawful bases, the retention periods and so on. The requirements for privacy notices must be applied to employees, just like external audiences. This necessary privacy information may be provided in an Employee Privacy Notice or via an Employee Handbook.

Risk assessments

Data Protection Impact Assessments are mandatory in certain circumstances. In other cases they might be helpful to conduct. Organisations mustn’t overlook DPIA requirements in relation to employee activities. For example, any monitoring of employees which might be considered intrusive or the use of biometric data for identification purposes.

Record keeping

Appropriate measures need to be in place to make sure employee records are being handled lawfully, fairly and transparently, and in line with other core data protection principles. It’s difficult to do this without mapping employee data and maintaining clear records of the purposes you are using it for, the lawful bases, special category conditions and so on, i.e. your Record of Processing Activities (RoPA). The absence of adequate records will make creating a comprehensive privacy notice rather challenging.

Training

Whilst we’re on the topic of employees, let’s also give a mention to training. All employees handling personal data should receive appropriate information security and data protection training. It’s likely those in HR / People teams handling employee data on a daily basis will benefit from specialist training beyond the generic online training modules aimed at all staff.

To help you navigate data protection obligations the ICO has published new guidance on handling employee records, which provides more detail on what the law requires and regulatory expectations.

Finally, don’t forget data protection compliance efforts need to extend beyond employees to job applicants, contractors, volunteers and others who perform work-related duties for the organisation.

Data Subject Access Requests – what are people entitled to?

February 2025

I’m often asked what’s in scope when responding to the Right of Access – aka Data Subject Access Requests (DSAR/SAR). What are organisations obliged to provide, and what can they legitimately exclude? I’ve taken a look at some questions which routinely come up. But first, a quick summary of what the law says…

The Right of Access is a fundamental right under data protection legislation in the UK and EU. There are similar rights in other jurisdictions, but I’m focusing here on the right under UK GDPR and the Data Protection Act (DPA 2018).

The law gives people the right to receive a copy of their personal data, and other supplementary information, from any organisation acting as a controller. Controller or processor – what are we?

Personal data is any information which could directly or indirectly identify the requestee. To give some examples, this could include images, voice and video recordings, demographic information, profiles, order history, marketing preferences, HR records, performance reviews, opinions expressed about the requestee, other personal identifiers … and the list goes on.

Now, on to the FAQs…

Q: Do we need to provide information the requestee already has, or is obvious to them?

The short answer is yes. Based on UK case law, organisations can’t refuse to disclose information on the grounds the personal data is already known to the individual. (Case: Ittihadieh v 5-11 Cheyne Gardens, 2017). However, it wouldn’t need to be included if the person has made it clear they don’t want this information. You can always ask them.

Q: Are they entitled to full documents?

It isn’t a right to documentation. Just because someone’s name appears in a report, spreadsheet, meeting notes or any other document doesn’t mean they’re entitled to the whole document, if the rest doesn’t relate to them. It may sometimes prove easier to provide full documents, but you would be justified in not doing so. You can extract the necessary information, or redact the irrelevant information. But remember, what you provide must be meaningful and have context.

Q: Are they entitled to the full content of email correspondence?

Linked to the question above, people are only entitled to a copy of their personal data. So just because their email address or email signature appears in an email (or email chain) doesn’t make this their personal data. For example, routine business as usual emails, where the content is solely about business related matters will not be the individual’s personal data. It can be really helpful to explain this from the start.

Q: Are handwritten notes in scope?

Personal data which is not part of (or intended to be part of) a structured filing system is not in scope. For example, handwritten notes in a personal notepad, where there’s no intention to formally file these notes, would not need to be included. However, if employees write notes in ‘day books’ which are intended to be kept as a record of conversations, these would be in scope.

Q: How much effort is required?

Organisations are expected to make all reasonable efforts to search, identify and retrieve all the personal data being requested. The ICO would expect systems to be well-designed and maintained so information can be efficiently located (including carrying out searches) and extracted. The right of access is not new. It was around long before GDPR came into force in 2018, so organisations would be expected to be well prepared to handle requests.
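As a purely illustrative sketch – the system names and identifier fields below are hypothetical – documenting a search plan across your systems is one way to coordinate and evidence those reasonable efforts:

# Hypothetical inventory of systems and the identifier fields each one
# can be searched by. A real search would use each system's own query
# tools (HR database, CRM, email archive, and so on).
DATA_SOURCES = {
    "hr_system": ["name", "email", "employee_id"],
    "crm": ["name", "email"],
    "email_archive": ["email"],
}

def build_search_plan(identifiers):
    """Return (source, field, value) search tasks for a DSAR.

    'identifiers' maps identifier types to the requestee's known values,
    e.g. {"name": "J Smith", "email": "j.smith@example.com"}.
    Keeping the plan and its results also documents the searches made.
    """
    return [
        (source, field, identifiers[field])
        for source, fields in DATA_SOURCES.items()
        for field in fields
        if field in identifiers
    ]

plan = build_search_plan({"name": "J Smith", "email": "j.smith@example.com"})
# e.g. [('hr_system', 'name', 'J Smith'), ('hr_system', 'email', ...), ...]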

Q: Can we refuse to comply with a request?

Sometimes it may seem obvious the requestee has an ulterior motive for submitting a DSAR. In general, an individual’s motives shouldn’t affect their right to obtain a copy of their personal data, or the organisation’s duty to respond. Organisations can however refuse to comply with a request, either partially or fully, where they judge it to be manifestly unfounded or manifestly excessive.

A request might be considered manifestly unfounded if, for example, the individual…

has no real intention of exercising their right
offers to withdraw their request in return for some kind of benefit
explicitly states they want to cause disruption
makes unsubstantiated accusations or allegations
is targeting a specific employee due to a grudge
sends regular and targeted requests as part of a concerted campaign

A request might be considered manifestly excessive if it’s clearly or obviously unreasonable or would involve disproportionate effort. In assessing whether it would involve disproportionate effort, you should consider the following factors:

the nature of the requested information;
the context of the request, and the relationship between you and the individual;
whether a refusal to provide the information or even acknowledge if you hold it may cause substantive damage to the individual;
your available resources;
whether the request largely repeats previous requests and a reasonable interval hasn’t elapsed; or
whether it overlaps with other requests (although if it relates to a completely separate set of information it is unlikely to be excessive).

If you rely on either of these grounds, be sure to document your decision, the rationale behind it and explain this to the individual.

To give an example, quite a few years ago I worked on a request from a disgruntled former employee where, among everything else, they asked for all CCTV footage of them. The business operated CCTV which captured employees as they entered and exited the main office. We asked the individual if there were specific dates and times they were interested in. They responded by just reiterating the request for all CCTV footage. I think understandably we judged this to be a manifestly excessive request, requiring disproportionate effort, and that it would cause no damage to the individual not to receive this.

Q: What can be excluded or redacted?

Once all the information relating to the individual has been retrieved, the data collated often includes information which doesn’t need to be disclosed. There may be justifiable grounds for excluding information or redacting documents, emails, video recordings and so on.

Information relating to others: the person making the request has a right to receive a copy of their personal data; they’re not entitled to personal data about other people. The DPA 2018 confirms you do not need to include certain information if it means disclosing information which identifies someone else, unless the other person has given their consent or it’s reasonable to disclose without the other person’s consent.

Confidential information: a duty of confidence may arise when another individual has genuinely shared ‘confidential’ information with the expectation it remains confidential. Confidentiality cannot be automatically assumed and needs to be assessed on a case-by-case basis. Other information which may also be considered confidential includes, but is not limited to: trade secrets, information made confidential under another law, internal costs or commercial rates, intellectual property and information covered by a non-disclosure agreement.

Other exemptions: the DPA 2018 provides a number of further exemptions which may apply depending on the nature of your business and the context of the specific request. These don’t always apply in the same way. Sometimes you might be obliged to rely on an exemption (i.e. where disclosure would break another law), other times it will be a choice. Commonly used exemptions include: legal professional privilege, crime and taxation, management information, research and statistics, confidential references and journalism.

The ICO says exemptions should not be routinely relied upon or applied in a blanket fashion. And remember, you may be required to demonstrate how an exemption applies and your rationale for relying on it. The ICO has published guidance on exemptions and how they apply.

These are just some of the questions I get asked and I’m afraid to say there are plenty more. Responding to DSARs can be very time-consuming, with nuanced considerations, and can feel like a minefield if you don’t receive many requests or receive your first one out of the blue. Our DSAR Guide provides more information about how to prepare for and fulfil requests. Also see the ICO’s detailed Right of Access Guidance.

Why record keeping is the cornerstone of data protection

January 2025

Records of Processing Activities

No one ever wrote a thriller about record keeping. Denzel, Keanu, Keira and Brad are not required on set. But here’s why we should give it due attention.

Put simply, without adequate records it’s difficult to demonstrate compliance with data protection legislation (GDPR and UK GDPR). Records are core to meeting the accountability principle, i.e. being ready and able to demonstrate evidence of compliance.

Let’s step back for a moment. Each organisation needs to know what personal data they hold, where it’s located and what purposes it’s being used for. Only then can you be sure what you’re using it for is fair and lawful, and gain confidence you’re meeting other GDPR obligations.

To put it another way, how confident is your organisation in answering the following questions?

  • Do we know what personal data we hold, its sensitivity and all the systems it’s sitting on – including data shared with third parties?
  • Do we know all purposes for processing?
  • Have we determined an appropriate lawful basis for each purpose? And are we meeting the specific requirements for that basis?
  • When handling special category data, have we also identified a special category condition?
  • Have we confirmed how long we need to keep the data for each purpose?

All of the above feed into transparency requirements, and what we tell people in our privacy notices.

In my opinion, you can’t answer these questions with confidence unless you map your organisation’s use of personal data and maintain a central record. This may be in the form of a Record of Processing Activities (RoPA).

Okay, so the absence of data protection records might only come to light if your organisation is subject to regulatory scrutiny. But not putting this cornerstone in place could result in gaps and risks being overlooked – which could potentially materialise into a serious infringement.

In my view, a RoPA is a sensible and valuable asset for most organisations. I fully appreciate creating and maintaining a RoPA can feel like a Herculean task, especially if resources are overstretched. That’s why we often recommend taking a proportionate and achievable approach, focussing on special category data use and higher risk activities first. Then build on this foundation when you can.

RoPA requirements under GDPR & UK GDPR

The requirements apply to both controllers and processors and include keeping records covering:

  • the categories of personal data held
  • the purposes of processing
  • any data sharing
  • details of transfers to third countries, including a record of the transfer mechanism safeguards in place
  • retention periods
  • the technical and organisational measures used to protect the data

and more…
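To illustrate what one entry might look like in practice – the field names here are my own, not a compliance template – a RoPA record could be modelled along these lines:

from dataclasses import dataclass

@dataclass
class RopaEntry:
    """One processing activity in a Record of Processing Activities.

    Fields are illustrative; Article 30 UK GDPR sets out the minimum
    content required of controllers and processors.
    """
    activity: str                    # e.g. "Payroll"
    purpose: str                     # why the data is processed
    data_categories: list            # categories of personal data held
    lawful_basis: str                # e.g. "Legal obligation"
    special_category_condition: str  # Article 9 condition, if any
    recipients: list                 # any data sharing
    third_country_transfers: list    # destinations and transfer safeguards
    retention_period: str            # how long, and why
    security_measures: list          # technical and organisational measures

payroll = RopaEntry(
    activity="Payroll",
    purpose="Paying staff and meeting tax obligations",
    data_categories=["contact details", "bank details", "salary"],
    lawful_basis="Legal obligation",
    special_category_condition="",
    recipients=["payroll bureau (processor)"],
    third_country_transfers=[],
    retention_period="6 years after employment ends",
    security_measures=["role-based access", "encryption at rest"],
)

A spreadsheet with these columns achieves the same thing – the structure matters far more than the tooling.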

Do you employ fewer than 250 people?

If so, record keeping requirements may be less stringent. But you’ll still be required to maintain a RoPA if:

  • your processing of personal data is not occasional
  • your processing is likely to result in risk to the rights and freedoms of individuals
  • you process special category data (e.g. health data, ethnicity, trade union membership, biometrics and more)
  • you process personal data relating to criminal convictions and offences.

You can read more about the requirements in the ICO’s records of processing guidance.

Benefits of Record Keeping (RoPA)

Here are just some of the benefits you can get from your RoPA.

1. Understanding the breadth and sensitivity of your data processing.

2. Visibility of where data protection risks lie. This will help establish priorities and focus efforts to tackle key risks.

3. Confidence your activities are lawful and meet specific regulatory requirements.

4. Tackle over retention of data – it’s a common challenge. By establishing your purposes for processing personal data, you can determine how long you need to keep that data. Then you can take practical steps to delete any data you no longer need.

5. Transparency – An up-to-date RoPA feeds into your privacy notice, making sure the information you provide accurately reflects what you are really doing.

6. Data breaches – Your RoPA should be the ‘go to’ place if you suffer a data breach. It can help you to quickly identify what personal data may have been exposed and how sensitive the data is, which processors might be involved and so on – helping you to make a rapid risk assessment (within the 72-hour reporting window) and make positive decisions to mitigate risks to protect individuals.

7. Supply chain – Keeping a record of your suppliers (‘processors’) is a key aspect of supplier management along with due diligence, contractual requirements and international data transfers.

8. Privacy rights – If you receive a Data Subject Access Request, your records can help to locate and access the specific data required to fulfil the request. If you receive an erasure request, you can quickly check your lawful basis for processing and see if the right applies, and efficiently locate what systems the data needs to be deleted from.

Tips to get started

Here are a few very quick tips on how to commence a RoPA project or breathe new life into an outdated spreadsheet you last looked at in 2018!

Who?

No DPO or data protection team can create and maintain these records on their own – they need support from others. Enlist the support of your Senior Leadership Team, as you’ll need them to back you and drive this forward.

Confirm who is, or should be, accountable for business activities which use personal data within all your key business functions – the data owners. For example, Human Resources (employment & recruitment activities), Sales & Marketing (customer/client activities), Procurement (suppliers), Finance, and so on. Data owners are usually best placed to tell you what data they hold and what it’s currently used for, so get them onside.

What?

Make sure you’re capturing all the right information. The detail of what needs to be recorded is slightly different if you act as a controller or processor (or indeed both). If you need to check, take a look at the ICO guidance on documentation.

When?

There’s always some new system, new activity and/or change of supplier, isn’t there? You should aim to update your records whenever you identify new processing or changes to existing processing – including identifying when you need to carry out a Data Protection Impact Assessment or Legitimate Interests Assessment. Good stakeholder relations can really help with this.

In conclusion, record keeping might not win many Oscars, but it really is the cornerstone of data protection compliance. Adequate records, even if not massively detailed, can be really beneficial in so many ways, not just if the ICO (or another Data Protection Authority) comes calling.

Court of Appeal rejects appeal against ICO fine

December 2024

The very first fine the ICO issued under the GDPR was back in 2019. It was issued to a pharmacy for storing unlocked boxes containing sensitive medical information in the yard behind its offices. More than five years later, the fine has yet to be paid.

The initial penalty notice was for £275,000 against Doorstep Dispensaree, a pharmacy in Edgware, North London. The company appealed, arguing the ICO’s actions were disproportionate and failed to take into consideration the firm’s financial hardship. It also argued less personal information was affected than originally thought. 67,000 documents were involved, rather than the 500,000 the original enforcement notice cited. Furthermore, the pharmacy claimed their backyard storage area was largely secure from public access.

The fine was subsequently reduced to £92,000.

As an aside, I’d suggest this is still a huge number of records to store in unlocked boxes. The data concerned involved customers’ names, addresses, dates of birth, NHS numbers, medical information and prescriptions.

This wasn’t the end of it. Doorstep Dispensaree raised a subsequent appeal, arguing the judge in the previous appeal failed to recognise the burden of proof lay with the ICO, and that undue weight had been given to the ICO’s original penalty notice and its reasons for setting the penalty.

In a decision welcomed by the ICO, the Court of Appeal has now dismissed this appeal. It ruled the burden of proof should lie with the appellant, Doorstep Dispensaree, and subsequent tribunals and appeals aren’t required to ignore original monetary penalty notices when making decisions.

Responding to the news, Information Commissioner John Edwards said, “I welcome the Court of Appeal’s judgment in this case as it provides clarity for future appeals. We defended our position robustly and are pleased that the court has agreed with our findings.”

The ICO has been much criticised for its lack of enforcement action under GDPR. It’s issued multiple fines under the Privacy and Electronic Communications Regulations (PECR), but fewer under GDPR (now UK GDPR). This may be because violations of the PECR rules can be more clear-cut. While much of the criticism may be fair, I believe this case demonstrates the legal hurdles the Regulator can face when taking enforcement action. However, the more cases we get, the more case law we’ll have for UK GDPR.

2024’s Data Protection Milestones

December 2024

Each year sees significant developments across the data protection landscape. I’ve asked industry insiders for their ONE most significant data protection or ePrivacy related milestone of 2024. Interestingly, everyone offered a different take. And all of these milestones will remain significant well into 2025.

UK GENERAL ELECTION AND DATA BILLS

Chris Combemale, Chair of the Data and Marketing Association

The most significant event for me was the General Election. For three years the DMA worked hard with the former government to ensure key reforms were included in the DPDI Bill, including certainty around legitimate interest as a lawful basis for direct marketing. At the time the election was called, DPDI was in the final stages of passage in the House of Lords. The DMA campaigned throughout the election to persuade the new government to pick up the mantle, including a joint letter to all political parties from the DMA, Tech UK and other members of the Business Advisory Group which I chaired. Our efforts paid off and the Data (Use and Access) Bill is now at Committee Stage in the House of Lords. DUA brings forward the best parts of DPDI while dropping the most controversial reforms, salvaging years of work and creating a better Bill that will transform public services, contribute to growth in the economy and maintain high levels of data protection.

Simon Blanchard, Data Protection Consultant, DPN Associates

The DUA Bill includes plans for Smart Data Schemes which allow consumers and businesses to safely share personal information with regulated and authorised third parties, for example, to generate personalised market comparisons. There are plans to create a framework for trusted identity verification services which could simplify processes like starting a new job, renting a home, as well as registering births and deaths. For me it’s significant there are now no plans to dilute accountability obligations under UK GDPR (e.g. remove the Data Protection Officer role and no changes to DPIA and RoPA requirements). DUA will give a statutory footing for many commonly used practices regarding Data Subject Access Requests. Certain legitimate interests will become ‘recognised’, such as national security, safeguarding and emergency response. The Bill’s progress is definitely one to watch in 2025. Updated DPN Legitimate Interests Guidance v3

DOORS OPENED TO EU PRIVACY ‘CLASS ACTIONS’

Fedelma Good, Data Protection and ePrivacy Consultant

Top of my list was definitely going to be the news Australia had introduced a law banning social media use for under 16s, not least because of all the attendant concerns that it will actually backfire, driving teenagers to the dark web or making them feel more isolated. Well, at least this was top of my list right up until the announcement on 3rd December that the privacy rights group noyb had been approved in Austria and Ireland – but with validity throughout the EU – as a so-called ‘Qualified Entity’ able to bring collective redress actions in courts throughout the European Union. I would really love to have a crystal ball to see whether, a few years from now, Max Schrems, chair of noyb, will have delivered the understatement of the decade with his comment: “So far, collective redress is not really on the radar of many – but it has the potential to be a game changer.”

AI & DATA PROTECTION COMPLIANCE

Steve Wood, Consultant and Researcher, Privacy X and former Deputy Commissioner, ICO

In 2024 our community has dug deeper into the key implications of AI for data protection compliance. We’ve seen a range of consultations from data protection regulators globally, addressing issues such as whether large language models are classed as personal data, when legitimate interests can apply as a lawful basis, how data subjects’ rights apply to AI models and what safeguards can mitigate DP risks. Given the pivotal role the EU GDPR plays in global data protection governance, the key event for me will come right at the end of the year, just before 23 December (some Xmas holiday reading!), when the EDPB will release its GDPR Article 64(2) Opinion on AI models, requested by the Irish Data Protection Authority. The Opinion will provide a significant regulatory framing for the approach companies need to take to AI governance for the coming years, noting the breadth of application of the GDPR compared to the focus of the EU AI Act on high-risk systems.

GLOBAL ADOPTION OF DATA PROTECTION PRINCIPLES

Robert Bond, Senior Counsel, Privacy Partnership Law

The one most significant data protection event in 2024 for me was the number of countries around the world passing and updating data protection laws significantly influenced by the GDPR. From Kenya to Sri Lanka, from Australia to Saudi Arabia and from China to many states in the USA, the similarities around data protection principles, data subject rights and data transfer restrictions are considerable. Whilst these global developments may not apply to smaller organisations, in the case of multinationals the ROI for all the hard work invested in complying with the GDPR is that complying with data protection laws in other parts of the world is getting somewhat easier.

UNLAWFUL INTERNATIONAL DATA TRANSFERS

Eduardo Ustaran, Partner Hogan Lovells International LLP

An issue which has returned as a top priority for regulators is cross-border data transfers. Due to geopolitical tensions, the resulting increase in surveillance and the populist appeal of data localisation, the legal restrictions on international data transfers have attracted implacable scrutiny and enforcement. A worrying concern in this area is that there seems to be no room for a balanced assessment of the risk in practice, as the mere possibility of access to data by law enforcement or intelligence agencies is leading regulators to conclude that such transfers are unlawful. This regulatory line of thinking poses a real test for everyone seeking to apply a pragmatic, risk-based approach to legitimising global data flows.

CASE LAW & THE DEFINITION OF ‘PROCESSING’

Claire Robson, Governance Director, Chartered Insurance Institute

An interesting development in case law came in the decision of the Court of Appeal in Farley v Paymaster (trading as Equiniti), a case about infringement of data protection rights through postal misdirection. Over 450 current and former police officers took action against their pension administrator after statements were sent to out-of-date addresses. The High Court dismissed many of the claims, stating there was not enough evidence to show the post (pension benefits statements) had been seen by a third party, so no processing had occurred. The Court of Appeal overturned this, granting permission for claimants to appeal. It felt there was a prospect of success in claiming processing had taken place through the extraction of the information from the database and the electronic transfer of data to the paper document, along with the mistaken address, without needing to rely on a third party reading the statement. An interesting one for Data Controllers to watch, in how this develops and what it means for the definition of, and limits to, ‘processing’.

LACK OF ICO ENFORCEMENT

Emma Butler, Data Protection Consultant, Creative Privacy

For me, sadly, the most significant event of 2024 has been the decline of data protection enforcement. Yes, we have seen fines for marketing breaches and some enforcement notices, but there has been a long list of serious compliance breaches with significant impacts on people that have only received a reprimand. This leads me to wonder how bad it has to get before there is serious enforcement action to change behaviours. I have seen a corresponding lessening of the importance of compliance among organisations in terms of increased risk appetites for non-compliance, and feeling they can ‘get away with’ practices because ‘everyone else is doing it’ and they see no consequences from the ICO. I have also noticed a decrease in DPO / senior roles and more combining of the DP role with other functions, as well as low salaries for the roles that exist. Not a vintage year.

REJECT ALL COOKIES

For my part, a significant change this year has been the ‘reject all’ button springing up on so many UK websites, giving people a clear option to reject all non-essential cookies. (Albeit this is certainly not universal, and I’m not sure clicking ‘reject all’ always works in practice.) This change followed an ICO warning late in 2023 to the operators of some of the country’s most popular websites, demanding compliance with the cookie rules. Particularly focused on advertising/targeting cookies, website operators were told they had to make it as easy to reject all as it is to accept all. We then saw some websites moving to the controversial consent or pay model, which gives users a choice: 1) pay for an ad-free service, 2) consent to cookies, or 3) walk away. I’ll be watching closely for the ICO’s hotly awaited views on the legitimacy of this approach. I’m also pleased it looks like the DUA Bill will pave the way for first-party website analytics cookies to be permitted without consent.

As you can see, from the DUA Bill to AI, global privacy laws to data transfers and the real possibility of EU ‘class actions’, these milestones are likely to keep the industry busy well into 2025 and beyond. And we’ll continue to keep you updated on the most significant developments as they happen.

Using AI tools for recruitment

November 2024

How to comply with GDPR

AI tools offer dynamic, efficient solutions for streamlining recruitment processes. AI is capable of speedily identifying and sourcing potential candidates, summarising their CVs and scoring their suitability for the role.

What’s not to like?

Nonetheless, these processes must be fair and lawful. Is there a potential for bias and/or inaccurate outputs? How else will AI providers use jobseekers’ personal details? What data protection compliance considerations are baked into the AI’s architecture?

The Information Commissioner’s Office (ICO) is calling on AI providers and recruiters to do more to make sure AI tools don’t adversely impact on applicants. People could be unfairly excluded from potential jobs and/or have their privacy compromised. Why undo the good work HR professionals undertake to satisfy legal requirements and best practice by using questionable technology?

The ICO recently ran a consensual audit of several developers and providers of AI recruitment tools. Some of the findings included:

Excessive personal data being collected
Data being used for incompatible purposes
A lack of transparency for jobseekers about how AI uses their details

The AI Tools in Recruitment Audit Report provides several hundred recommendations. The unambiguous message is that using AI in recruitment processes shouldn’t be taken lightly. Of course, this doesn’t mean recruiters shouldn’t embrace new technologies, but it does mean sensible checks and balances are required. Here’s a summary of key ICO recommendations, with some additional information and thoughts.

10 key steps for recruiters looking to engage AI providers

1. Data Protection Impact Assessment (DPIA)

DPIAs are mandatory under GDPR where a type of processing is likely to result in high risk. The ICO says ‘processing involving the use of innovative technologies, or the novel application of existing technologies (including AI)’ is an example of processing they would consider likely to result in a high risk.

Using AI tools for recruitment purposes squarely meets these criteria. A DPIA will help you to better understand, address and mitigate any potential privacy risks or harms to people. It should help you to ask the right questions of the AI provider. It’s likely your DPIA will need to be agile: revisited and updated as the processing and its potential impacts evolve.

ICO DPIA recommendations for recruiters:

Complete a DPIA before commencing processing that is likely to result in a high risk to people’s rights and freedoms, such as procuring an AI recruitment tool or other innovative technology.
Ensure DPIAs are comprehensive and detailed, including:
– the scope and purpose of the processing;
– a clear explanation of relationships and data flows between each party;
– how processing will comply with UK GDPR principles; and
– consideration of alternative approaches.
Assess the risks to people’s rights and freedoms clearly in a DPIA, and identify and implement measures to mitigate each risk.
Follow a clear DPIA process that reflects the recommendations above.

2. Lawful basis for processing

When recruiting, organisations need to identify a lawful basis for this processing activity. You need to choose the most appropriate of the six lawful bases, such as consent or legitimate interests.

To rely on legitimate interests you will need to:
1. Identify a legitimate interest
2. Assess the necessity
3. Balance your organisation’s interests with the interests, rights and freedoms of individuals.

This is known as the ‘3-stage test’. We’d highly recommend you conduct and document a Legitimate Interests Assessment. Our recently updated Legitimate Interests Guidance includes an LIA template (in Excel). Your DPIA can be referenced in this assessment.

3. Special category data condition

If you will be processing special category data, such as health information or Diversity, Equity and Inclusion data (DE&I), alongside a lawful basis you’ll need to meet a specific special category condition (i.e. an Article 9 condition under UK GDPR).

It’s worth noting some AI providers may infer people’s characteristics from candidate profiles rather than collecting them directly. This can include predicting gender and ethnicity. This type of information, even if inferred, will be special category data. It also raises questions about ‘invisible’ processing (i.e. processing the individual is not aware of) and a lack of transparency. The ICO recommends not using inferred information in this way.

4. Controller, processor or joint controller

Both recruiters and AI providers have a responsibility for data protection compliance. It should be clear who is the controller or processor of the personal information. Is the AI provider a controller, joint-controller or processor? The ICO recommends this relationship is carefully scrutinised and clearly recorded in a contract with the AI provider.

If the provider is acting as a processor, the ICO says ‘explicit and comprehensive instructions must be provided for them to follow’. The regulator says this should include establishing how you’ll make sure the provider is complying with these instructions. As a controller, your organisation should be able to determine the purposes and means of the processing and tailor it to your requirements. If not, the AI provider is likely to be a controller or joint controller.

5. Data minimisation

One of the core data protection principles is data minimisation. We should only collect and use personal information which is necessary for our purpose(s). The ICO’s audit found some AI tools collected far more personal information than necessary and retained it indefinitely to build large databases of potential candidates without their knowledge. What might make perfect sense to AI or the programmers creating such technology might not be compliant with data protection law!

Recruiters need to make sure the AI tools they use only collect the minimum amount of personal information required to achieve your purpose(s). (A purpose/purposes which should be clearly defined in your DPIA and, where relevant, your LIA).

There is also an obligation to make sure the personal details candidates are providing are not used for other incompatible purposes. Remember, if the AI provider is retaining data and using this information for its own purposes, it will not be a processor.
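As a simple illustration of the principle – the field names below are hypothetical – an allow-list approach keeps what reaches an AI tool to the minimum needed for the defined purpose:

# Hypothetical allow-list of the candidate fields the defined recruitment
# purpose actually requires. Anything else is dropped before the data is
# passed to the AI tool.
ALLOWED_FIELDS = {"name", "email", "cv_text", "role_applied_for"}

def minimise_candidate_record(record):
    """Strip a candidate record down to the allow-listed fields."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

candidate = {
    "name": "A N Other",
    "email": "a.other@example.com",
    "cv_text": "...",
    "role_applied_for": "Data Analyst",
    "date_of_birth": "1990-01-01",   # not necessary for screening - dropped
    "social_media_profile": "...",   # not necessary for screening - dropped
}
minimised = minimise_candidate_record(candidate)
# Only name, email, cv_text and role_applied_for remain.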

6. Information security and integrity

As part of the procurement process, recruiters need to undertake meaningful due diligence. This means asking the AI provider for evidence that appropriate technical and organisational controls are in place. These technical and organisational controls should also be documented in the contract. The ICO recommends regular compliance checks are undertaken while the contract is in place, to make sure effective controls remain in place.

7. Fairness and mitigating bias risks

Recruiters need to be confident the outputs from AI tools are accurate, fair and unbiased. The ICO’s audit of AI recruitment providers found evidence tools were not processing personal information fairly. For example, in some cases they allowed recruiters to filter out candidates with protected characteristics. (Protected characteristics include: age, disability, race, ethnic or national origin, religion or belief, sex and sexual orientation). This should be a red flag.

You should seek clear assurances from the AI provider that they have mitigated bias, asking to see any relevant documentation. The ICO has published guidance on this: How do we ensure fairness in AI?

8. Transparency

Are candidates aware an AI tool will be used to process their personal details? Clear privacy information needs to be provided to jobseekers which explains how and why the AI tool is being used. The ICO says this should extend to explaining the ‘logic involved in making predictions or producing outputs which may affect people’. Candidates should also be told how they can challenge any automated decisions made by the tool.

The regulator recommends producing a privacy notice specifically for candidates on your AI platform which covers relevant UK GDPR requirements.

9. Human involvement in decision-making

There are strict rules under GDPR for automated decision-making (including profiling). Automated decision-making is the process of making a decision by automated means without any human involvement. A recruitment process wouldn’t be considered solely automated if someone (i.e. a human in the recruitment team) weighs up and interprets the result of an automated decision before applying it to the individual.

There needs to be meaningful human involvement in the process to prevent solely automated decisions being made about candidates (see the sketch after this list). The ICO recommendations for recruiters include:

Ensure that recruiting managers do not use AI outputs (particularly ‘fit’ or suitability scores) to make automated recruitment decisions, where AI tools are not designed for this purpose.
Offer a simple way for candidates to object to or challenge automated decisions, where AI tools make automated decisions.
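Here’s a rough sketch of what meaningful human involvement can look like in practice – a hypothetical structure, not the ICO’s specification – where the AI score is advisory and a named human records the final decision:

from dataclasses import dataclass

@dataclass
class AiScreeningResult:
    candidate_id: str
    suitability_score: float   # advisory output from the AI tool
    rationale: str             # explanation surfaced to the reviewer

@dataclass
class RecruitmentDecision:
    candidate_id: str
    progress: bool
    reviewer: str              # named human accountable for the decision
    reviewer_notes: str        # evidence the output was genuinely weighed up

def record_decision(result, reviewer, progress, notes):
    """Apply a decision only once a named reviewer has interpreted the AI
    output - the score alone never decides the outcome."""
    if not notes.strip():
        raise ValueError("Reviewer notes are required: the AI output must "
                         "be interpreted, not rubber-stamped.")
    return RecruitmentDecision(result.candidate_id, progress, reviewer, notes)

decision = record_decision(
    AiScreeningResult("c-42", 0.81, "Strong match on required skills"),
    reviewer="J Bloggs",
    progress=True,
    notes="Score consistent with CV; relevant project experience confirmed.",
)

Requiring the reviewer’s reasoning to be captured, not just their name, is what makes the involvement meaningful rather than a rubber stamp.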

10. Data Retention

Another core data protection principle is ‘storage limitation’. This means not keeping personal data for longer than necessary for the purpose(s) it was collected for. It’s important to assess how long the data inputted and generated from AI tools will be kept for. Information about retention periods should be provided in relevant privacy information provided to job applicants (e.g. in an Applicant Privacy Notice provided on your AI platform).

The ICO says data retention periods should be detailed in contracts, including how long each category of personal information is kept and why. Plus what action the AI provider must take at the end of the retention period.

Summary

The ICO acknowledges the benefits of AI and doesn’t want to stand in the way of those seeking to use AI driven solutions. It does, however, ask recruiters to consider the technology’s compatibility with data protection law.

AI is a complex area for many, and it’s easy to see how unintended misuse of personal data, or unfairness and bias in candidate selection, could ‘slip through the cracks’ in the digital pavement. HR professionals and recruiters can avoid problems later down the line by addressing these as Day One issues when considering AI.

Fairness and respect for candidate privacy are central principles of HR best practice and necessary for data protection compliance. Applying these to new technological opportunities shouldn’t come as a surprise. Including your data protection team in the planning stage can help to mitigate and possibly eliminate some risks. A win-win which would leave organisations more confident in reaping the benefits AI offers.

DPN Legitimate Interests Guidance and LIA Template (v 3.0)

Published in November 2024 this third version of our established Legitimate Interests Guidance aims to help organisations assess whether they can rely on legitimate interests for a range of processing activities. Routine or more complex activities, such as those involving the use of AI. First published in 2017, this updated version includes an improved LIA template (in Excel) to use when conducting your own Legitimate Interests Assessments.

Legitimate Interests Guidance from the Data Protection Network

Many thanks to PrivacyX Consulting and Privacy Partnership Law for working with us on this latest version. We’d also like to thank the original Legitimate Interests Working Group of 2017/2018, comprising representatives from a wide range of companies and institutions, who collaborated to produce previous versions.