Data Protection and what the Labour Government should do

July 2024

What should Keir Starmer’s team do about data protection?

After the Conservative Party’s crushing defeat on July 4th, we now have a Labour administration. When the General Election was called, the Data Protection and Digital Information Bill was progressing through Parliament. Although many thought it might just pass before an Election, the decision by Rishi Sunak to gamble everything on an early election led to the Bill’s abandonment.

The Bill itself was controversial, proposing a mixed bag of changes to data protection and ePrivacy laws. Views within the industry were, it is fair to say, divided.

I’ve asked industry insiders the question: what should the new Government do with UK GDPR, the Privacy and Electronic Communications Regulations (PECR), and AI? Here’s what they say.

Steve Wood, Founder & Director, PrivacyX Consulting and former UK Deputy Information Commissioner

“The New Government should firstly take a step back to consider its approach to public engagement on data and AI, particularly with civil society. As they seek to use AI to transform the public sector, a planned and long-term approach to meaningful transparency and engagement is vital. There are good foundations to build on for AI policy and the new Government should look at options to put AI principles on a statutory footing and what additional oversight and coordination is needed to make them effective.

There is scope for a focused AI and Data Bill, learning the lessons of the complexity and confusion in the DPDI Bill and what will really improve the outcomes of the data protection regime – for people and organisations. Changes to GDPR that should remain on the table include the new Information Commission reforms, the data protection test for international transfers and an exemption for analytics cookies.”

You can read more of Steve’s thoughts in his Substack blog – A Digital Policy Memo for the Minister’s Red Box

Chris Combemale, CEO, Data & Marketing Association (DMA)

“The DMA continues to believe that reforming the data protection regime in the UK is fundamental to driving growth, innovation, and wealth creation in the country. Doing so would be a strong sign of the new Government’s commitment to the industry and business.  Amongst the most important reforms for DMA members are:

1. Reforms that establish greater certainty for the use of legitimate interests as a lawful basis, particularly for attracting and retaining new customers
2. Reforms that clarify how data can be better used to support scientific research and technology development
3. Reforms that reduce bureaucracy for small business
4. Reforms that enable Smart Data schemes to be introduced in appropriate sectors
5. Reforms that reduce the consent requirements for non-intrusive cookies
6. Reforms that update the law to enable beneficial use of automated decision-making, like AI, while maintaining strong safeguards

These reforms are consistent with the Labour Policy Forum position and indeed were supported by Labour during scrutiny of the former government’s DPDI Bill. The DMA will work closely with the incoming government to ensure these reforms become law.”

Read Chris’ Open Letter to all political parties

Robert Bond, Senior Counsel, Privacy Partnership Law and Chair, DPN Advisory Group

“The new Government needs to ensure that any changes it makes to our data protection regime do not harm our “adequacy” with the EU. However, I would welcome a review of the reliance on Legitimate Interest as a lawful ground for processing to bolster this useful ground. I would like to see a review of PECR and a proactive focus on practical AI legislation.”

Gerald Coppin, Deputy Group Data Protection Officer, Springer Nature

“I feel a Labour government should work on an international effort to harmonise data privacy laws across major jurisdictions; this could make it easier for businesses to manage regulatory requirements. They could recommend or mandate techniques like differential privacy, federated learning, and synthetic data generation to enable AI development without compromising individual privacy, as well as expanding regulatory sandboxes that allow companies to test innovative AI applications in a controlled environment, while ensuring privacy safeguards are in place. A reduction in paperwork to prove compliance with the different laws would be MOST welcome!!”

Debbie Evans, Managing Director, FTI Consulting

“I want to be optimistic about change; however, it’s not going to be without challenge. Whilst I’m not proclaiming any particular political persuasion, my personal hope is that individual rights are given more visibility. Businesses will consequently need to take compliance more seriously as laws strengthen.”

Eduardo Ustaran, Partner, Hogan Lovells

“My view is that the new UK Government should aim to realise the opportunity to place the UK as a global leader in these areas. The UK is in an ideal sweet spot because it is close enough to the EU’s policy objectives of providing the highest levels of protection for personal data and human rights in the face of today’s AI revolution, but also understands the crucial importance of technological innovation for growth and prosperity. That combination is particularly attractive for responsible global businesses to model their regulatory compliance strategies for privacy, cybersecurity and AI. This is a crucial issue for the UK Government to get right and support its primary goal of growing the economy.”

Charles Ping, Managing Director, Europe, Winterberry Group

“Labour has a big task ahead and, by its own admission, limited resources. So, using the eco-friendly mantra of reduce, reuse and recycle, they should apply all three to evolving our data protection legislation. Reduce the time wasted devising new policy objectives in this area when there was cross-party consensus on the currently lifeless Data Protection and Digital Information Bill. Reuse, because the Bill is pretty much “oven ready”, if that phrase hasn’t been rendered entirely valueless by a previous administration.

Recycle the old bill and ensure an expedited path through the corridors and meeting rooms of Westminster. I can’t see a new administration (or country) wanting a traditional summer recess, so this legislation should have time to whistle through and start making a difference.”

Eleonor Duhs, Partner and Head of Data & Privacy, Wells Bates LLP

“I think the new Labour Government, as a priority, should deal with the uncertainty created by the Retained EU Law (Revocation and Reform) Act 2023 (“REULA”) about how to interpret the UK’s data protection frameworks. REULA has turned the statute book on its head, with domestic law (whenever enacted) taking precedence over any law that was previously EU law (including UK GDPR). An example of the unintended consequences of this is in the area of exemptions from data subject rights. The Open Rights case (brought before REULA came into force) required the government to provide EU-standard protections for migrants when exercising data subject rights. But because of the reversal of the relationship between the UK GDPR and the Data Protection Act 2018, every other group in society now has a lower standard of protection for their data subject rights, compared with migrants.

This outcome was clearly not anticipated. In order to ensure data protection standards in the UK remain high, the new Labour government should bring forward legislation. It could either use the powers in REULA to reintroduce deleted principles in order to bring clarity and legal certainty. Alternatively, the best course of action may be to bring forward primary legislation to ensure that the UK statute book is stabilised. Powers to update our data protection frameworks should also be considered, to ensure they remain current and track accepted EU and international standards. This would support growth and avoid the risk of losing the UK’s data adequacy decision, which is due to be reviewed next year.”

You can read more from Eleonor on the REULA here

While I appreciate reforming data protection law may not prove a high priority for the new Starmer Government, to offer my tuppence: if Labour does nothing else, I’d urge them to revise PECR. It’s desperately out of date; it was first introduced over 20 years ago and then updated back in 2009 with the ‘cookie law’. The world has moved on. There were some proposed changes to PECR under the DPDI Bill which I favoured, in particular a change allowing not-for-profits to take advantage of the so-called soft opt-in exemption to consent for marketing emails / texts. This is currently only available in a commercial context, which I feel is unfair. As others have mentioned, I’d also like to see a revision of the consent rules for website analytics cookies.

Understanding and handling Special Category Data

July 2024

Why is it special and what does data protection law tell us we need to do?

There is a distinct subset of personal data which is awarded ‘special’ protection under data protection law. This subset includes information on the basis of which people have been persecuted, treated unfairly or discriminated against in the past, and still could be. These special categories of personal data are considered higher risk, and organisations are legally obliged to meet additional requirements when they collect and use it.

Employees need to be aware that special category data should only be collected and used with due consideration. Sometimes there will be a clear and obvious purpose for collecting this type of information, such as a travel firm needing health information from customers, or an event organiser requesting accessibility requirements to facilitate people’s attendance. In other situations it will be more nuanced.

What’s special category data?

Special Categories of Personal Data under UK GDPR (and its EU equivalent) are commonly referred to as special category data, and are defined as personal data revealing:

  • Racial or ethnic origin e.g. diversity and inclusion data
  • Political opinions
  • Religious or philosophical beliefs
  • Trade union membership

The definition also covers:

  • Genetic data
  • Biometric data (where this is used for identification purposes)
  • Data concerning health e.g. medical records, sickness records, accessibility requirements and so on.
  • Data concerning a person’s sex life or their sexual orientation e.g. diversity and inclusion data

Inferring special category data

Sometimes your teams might not realise they’re collecting and using special category data, but they might well be.

If you have inferred or made any assumptions based on what you know about someone – for example, that they’re likely to hold certain political opinions, or likely to suffer from a certain health condition – it’s likely you are handling special category data.

There was an interesting ICO investigation into an online retailer which found it was targeting customers who’d bought certain products, assuming from this they were likely to be arthritis sufferers. This assumption meant the retailer was judged to be processing special category data.

If you collect information about dietary requirements these could reveal religious beliefs, for example halal and kosher. It’s also worth noting in 2020 a judge ruled that ethical veganism qualifies as a philosophical belief under the Equality Act 2010.

Other ‘sensitive’ data

There’s sometimes confusion surrounding what might be considered ‘sensitive’ data and what constitutes special category data. I hear people say: “Why is financial data not considered as sensitive as health data or ethnic origin?” Of course, people’s financial details are sensitive, and organisations still need to make sure they’ve got appropriate measures in place to protect such information and keep it secure. However, UK GDPR (and the EU GDPR) sets out specific requirements for special category data which don’t directly apply to financial data.

To understand why, it’s worth noting special protection for data such as ethnicity, racial origin, religious beliefs and sexual orientation was born in the 1950s, under the European Convention on Human Rights, after Europe had witnessed people being persecuted and killed.

Special Category Data Requirements

In a similar way to all personal data, any handling of special category data must be lawful, fair and transparent. Organisations need to make sure their collection and use complies with all the core data protection principles and requirements of UK GDPR. For example:

  • Do you have a clear purpose and reason for collecting/using special category data?
  • Have you identified a lawful basis? For example:
    • Is this data necessary in order for you to fulfil a contract you have with the individual?
    • Are you legally obliged to hold this data?
    • Should you be seeking their consent?
    • Or is there another appropriate lawful basis?  Quick Guide to Lawful Bases.
  • Have you told people what their special category data will be used for? What does your Privacy Notice tell people? Have people seen your Privacy Notice?
  • Can you minimise the amount of special category data you are collecting?
  • Have you decided how long this data will be kept for?
  • How will you make sure this data is not used for another different purpose?
  • What security measures will you put in place? e.g. can you limit who has access to this data?

What makes special category data unique is that it’s considered higher risk than other types of data, and that, alongside a lawful basis, you’re also required to choose a special category condition.

Other key considerations and requirements

Risk Assessments

Confirm whether you need to conduct a Data Protection Impact Assessment for your planned activities using special category data. DPIAs are mandatory for any type of processing which is likely to be high risk. This means a DPIA is more likely to be needed when handling special category data. That’s not to say it will always be essential; it will depend on the necessity, nature, scale and your purpose for using this data.

Special Category Condition

Alongside a lawful basis, there’s an additional requirement to consider your purpose(s) for processing this data and to select a special category condition. These conditions are set out in Article 9, UK GDPR.

(a) Explicit consent
(b) Employment, social security and social protection (if authorised by law)
(c) Vital interests
(d) Not-for-profit bodies
(e) Made public by the data subject
(f) Legal claims or judicial acts
(g) Reasons of substantial public interest (with a basis in law)
(h) Health or social care (with a basis in law)
(i) Public health (with a basis in law)
(j) Archiving, research and statistics (with a basis in law)

Associated condition in UK Law

Five of the above conditions are solely set out in Article 9. The others require specific authorisation or a basis in law, and you’ll need to meet additional conditions set out in the Data Protection Act 2018.

If you are relying on any of the following you also need to meet the associated condition in UK law. This is set out in Part 1, Schedule 1 of the DPA 2018.

  • Employment, social security and social protection
  • Health or social care
  • Public health
  • Archiving, research and statistics.

If you are relying on the substantial public interest condition you also need to meet one of 23 specific substantial public interest conditions set out in Part 2 of Schedule 1 of the DPA 2018.

The ICO tells us for some of these conditions, the substantial public interest element is built in. For others, you need to be able to demonstrate that your specific processing is ‘necessary for reasons of substantial public interest’, on a case-by-case basis. The regulator says we can’t have a vague public interest argument; we must be able to ‘make specific arguments about the concrete wider benefits’ of what we are doing.

Appropriate Policy Document (APD)

Almost all of the substantial public interest conditions, plus the condition for processing employment, social security and social protection data, require you to have an APD in place. The ICO Special Category Guidance includes a template appropriate policy document.

Privacy Notice

A privacy notice should explain your purposes for processing and the lawful basis being relied on in order to collect and use people’s personal data, including any special category data. Remember, if you’ve received special category data from a third party, this should be transparent and people should be provided with your privacy notice.

Data breach reporting

You only have to report a breach to the ICO if it is likely to result in a risk to the rights and freedoms of individuals – for example if, left unaddressed, it is likely to have a significant detrimental effect on them. Special category data is considered higher risk data, and therefore if a breach involves data of this nature, it is more likely to reach the bar for reporting. It is also more likely to reach the threshold of needing to notify those affected.

In summary, training and raising awareness are crucial to make sure employees understand what special category data is, how it might be inferred, and to know that collecting and using this type of data must be done with care.

Why the Tory app data breach could happen to anyone

June 2024

Shakespeare wrote (I hope I remembered this correctly from ‘A’ level English), ‘When sorrows come, they come not single spies but in battalions.’ He could’ve been writing about the UK Conservative Party which, let’s be honest, hasn’t been having a great time recently.

The Telegraph is reporting the party suffered its second data breach in a month. An error with an app led to the personal information of leading Conservative politicians – some in high government office – being available to all app users.

Launched in April, the ‘Share2Win’ app was designed as a quick and easy way for activists to share party content online. However, a design fault meant users could sign up to the app using just an email address. Then, in just a few clicks, they were able to access the names, postcodes and telephone numbers of all other registrants.

This follows another recent Tory Party email blunder in May, where all recipients could see each other’s details. Email data breaches.

In the heat of a General Election, some might put these errors down to ‘yet more Tory incompetence’. I’d say, to quote another famous piece of writing, ‘He that is without sin among you, let him first cast a stone’! There are plenty of examples where other organisations have failed to take appropriate steps to make sure privacy and security are baked into their app’s architecture. And this lack of oversight extends beyond apps to webforms, online portals and more. It’s depressingly common, and easily avoided.

In April, a Housing Association was reprimanded by the ICO after launching an online customer portal which allowed users to access documents (revealing personal data) they shouldn’t have been able to see. These related to, of all things, anti-social behaviour. In March the ICO issued a reprimand to the London Mayor’s Office after users of a webform could click on a button and see every other query submitted. And the list goes on. This isn’t a party political issue. It’s a lack of due process and carelessness issue.

It’s easy to see how it happens, especially (such as in a snap election) when there’s a genuine sense of urgency. Some bright spark has a great idea, senior management love it, and demand it’s implemented pronto! Make it happen! Be agile! Be disruptive! (etc).

But there’s a sound reason why the concept of data protection by design and by default is embedded into data protection legislation, and it’s really not that difficult to understand. As the name suggests, data protection by design means baking data protection into business practices from the outset; considering the core data protection principles such as data minimisation and purpose limitation as well as integrity & confidentiality. Crucially, it means not taking short-cuts when it comes to security measures.

GDPR may have its critics, but this element is just common sense. Something most people would get onboard with. A clear and approved procedure for new systems, services and products which covers data protection and security is not a ‘nice to have’ – it’s a ‘must have’. This can go a long way to protect individuals and mitigate the risk of unwelcome headlines further down the line, when an avoidable breach puts your customers’, clients’ or employees’ data at risk.

Should we conduct a DPIA?

A clear procedure can also alert those involved to when a Data Protection Impact Assessment is required. A DPIA is mandatory in certain circumstances where activities are higher risk, but even when not strictly required it’s a handy tool for picking up on any data protection risks and agreeing measures to mitigate them from Day One of your project. Many organisations would also want to make sure there’s oversight by their Information Security or IT team, in the form of an Information Security Assessment for any new applications.

Developers, the IT team and anyone else involved need to be armed with the information they need to make sound decisions. Data protection and information security teams need to work together to develop apps (or other new developments) which aren’t going to become a leaky bucket. Building this in from the start actually saves time too.

In all of this, don’t forget your suppliers. If you want to outsource the development of an app to a third-party supplier, you need to check their credentials and make sure you have the necessary controller-to-processor contractual arrangements and assessment procedures in place – especially if, once the app goes live, the developer’s team still has access to the personal data it collects. Are your contractors subbing work to other third-party subcontractors? Do they work overseas? Will these subcontractors have access to personal data?

The good news? There’s good practice out there. I remember a data protection review DPN conducted a few years back. One of the areas we looked at was an app our client had developed for students to use. It was a pleasure to see how the app had been built with data protection and security at its heart. We couldn’t fault the team who designed it – and as such the client didn’t compromise their students, face litigation, look foolish or get summoned to see the Information Commissioner!

In conclusion? Yes, be fast. Innovate! Just remember to build your data protection strategy into the project from Day One.

DSAR ruling and other people’s data

June 2024

High Court judgement in Harrison vs Cameron case

A recent High Court ruling concerning a Data Subject Access Request reveals some interesting points relating to how organisations comply with people’s right to know the identity of the recipients of their personal data, and how organisations apply the ‘third-party exemption’.

The right of access gives people the right to receive a copy of their own personal data; it doesn’t give them the right to receive personal data relating to others. However, other people’s details are often intertwined with the data retrieved.

In this particular case, the focus was on other people the requester’s data had been shared with, and whether the requester had the right to know the identity of these recipients.

The ‘third party exemption’ frequently comes up for debate when handling DSARs and this case sheds light on how this exemption should be applied.

In the ruling the Judge found that it’s necessary to apply a ‘balancing test’ when considering the third-party exemption. It was also acknowledged that the controller is the ‘primary decision maker’ when assessing whether it is reasonable or not to disclose personal data relating to others, and has a ‘wide margin of discretion’ in this decision.

Here’s some background to two of the key points of law in this case:

What’s the third-party exemption?

The third-party exemption is set out in the UK Data Protection Act 2018 and says organisations (controllers) do not have to comply with a DSAR, if in doing so this would mean disclosing information which identifies another individual. Organisations can disclose such information if the third party has given their consent, or if it’s reasonable to disclose without their consent.

What about the recipients of personal data?

Along with the right to receive a copy of their personal data, when an individual submits a DSAR they are also entitled to receive other supplementary information. This includes details of any ‘recipients’ or ‘categories of recipients’ the organisation has, or will, disclose their personal data to.

The Harrison vs Cameron case

Mr Harrison, Chief Executive of a real estate investment company, was covertly recorded making threats to Mr Cameron, the owner of a gardening business. Here’s a summary of what happened next:

  • Mr Cameron shared the recording with some of his employees, members of his family and friends.
  • Mr Cameron sent the recording to twelve people in total, and it was then shared on to a further three people.
  • Mr Harrison claimed the recordings had been shared more widely and damaged his business.
  • Mr Harrison submitted a DSAR to Mr Cameron in a personal capacity (I’ll come back to this) and submitted similar requests to others, including employees at the gardening business. He demanded to know the identity of the people who’d received the recording.
  • Mr Cameron and others declined his request, and the case ended up in the High Court.

The Court decided Mr Cameron was not himself a controller of Mr Harrison’s data, and that he’d made the recordings in his capacity as a director of the gardening company. Therefore the company, not Mr Cameron, was the controller and responsible for fulfilling the request.

According to the judge, a person’s rights extend to being provided with details of the specific recipients of their personal data, including the names of individuals who’ve received their data. The rationale behind this is to enable the individual to check the lawfulness of how their personal data is being handled. This is a potentially worrying development, as organisations may have previously viewed this as a choice: either provide the names of specific recipients, or provide just the categories of recipient. This ruling makes it clear this is the requester’s choice, not the controller’s decision.

However, in this case the judge found the gardening company could rely on the third-party exemption and not disclose the identity of the recipients. Why? None of the fifteen recipients consented to their names being disclosed to Mr Harrison, due in part to concerns this may expose them to abusive and threatening behaviour. Due to these safety concerns the judge ruled it would not be reasonable to disclose people’s names, without their consent.

Ultimately this ruling makes it clear it is the controller’s decision to make: is it reasonable or not to disclose information which identifies other people?

Third-party balancing test

The ICO’s Right of Access guidance provides helpful pointers on how to conduct a balancing test when considering the third-party exemption. There isn’t a blanket rule; a balanced decision is required on whether it’s appropriate in the circumstances to disclose information relating to others, or withhold it.

1. Can you redact or not provide?

Consider if it’s possible to comply with the request without revealing information that relates to, and identifies, another individual. For example, can this third-party information be redacted, or can you separate out the requester’s personal data?

Sometimes, even redacting other people’s names doesn’t render them unidentifiable. There may be situations where you can reasonably assume the requester will be able to work out whose name has been redacted.

2. Can you seek consent?

If you can get the consent of another individual to disclose their details, it’s problem solved. I’ve been involved in cases where the consent of other employees has been sought in employee-related requests and they’ve given it.

However, you’re not obliged to seek consent and it may not be appropriate to do so. You might not have contact details for the third-party, you might not want to share information with them, or let them know a particular individual has submitted a DSAR.

3. Reasonable to disclose without consent?

Where the information about other individuals is fairly innocuous and you can’t identify any negative impact on them, you may choose to disclose the information without consent. In assessing whether this is reasonable to do, you need to take account of:

  • the type of information you intend to disclose
  • whether it was possible to seek consent or not
  • whether consent was declined
  • any duty of confidentiality

Any potential repercussions for the third party, if their data is disclosed (or they are identifiable from what you provide), can be considered. As this case shows, concerns for a person’s safety can be justification for applying the third-party exemption.

I’ve worked on many cases where this has been debated, situations where redaction wouldn’t render the third-party unidentifiable and it wasn’t appropriate to seek consent. The context is crucial, sometimes it has been reasonable to disclose, other times we had justified concerns and chose to withhold.

It’s important to be clear with the requester about what you are giving them in your response to their DSAR. If you rely on the third-party exemption, you should tell them, and explain why. I’d also highly recommend documenting your decision-making just in case it’s challenged.

Data Sharing Checklist

June 2024

Controller to Controller Data Sharing

Data protection law doesn’t stop us sharing personal data with other organisations, but does place on us a requirement to do so lawfully, transparently and in line with other key data protection principles.

Organisations often need to share personal data with other parties. This could be reciprocal, one-way, a regular activity, ad-hoc or a one off.

Quick Data Sharing Checklist

Here’s a quick list of questions to get you started on how to share personal data compliantly.

(The focus here is on sharing data with other controllers. There are separate considerations when sharing data with processors, such as suppliers and service providers).

1. Is it necessary?

It may be possible to achieve your objective without sharing personal data at all, or perhaps the data could be anonymised.

2. Do we need to conduct a risk assessment?

Check if what you’re planning to do falls under the mandatory requirement to complete a Data Protection Impact Assessment. Depending on the nature and sensitivity of the data it might be a good idea to conduct one anyway. Quick DPIA Guide.

3. Do people know their data is being shared?

Transparency is key, so it’s important to make sure people know their personal details are being shared. Would they reasonably expect their personal data to be shared in this way?

4. Is it lawful?

To be lawful we need a lawful basis and we need to meet the relevant conditions of the basis we’ve chosen. For example, if we’re relying on consent, is it specific, informed and an unambiguous indication of the person’s wishes? If we’re relying on legitimate interests, have we balanced our interests against those of the people whose data we’re sharing? Quick guide to lawful bases.

5. Can we reduce the amount of data being shared?

Check what data the other organisation actually needs; you may not need to share a whole dataset – a sub-set may suffice.

6. Is it secure?

Agree appropriate security measures to protect the personal data, both when it’s shared and at rest. This includes security measures where the other organisation is being given access to your systems. Are controls in place to make sure only those who need access, have access?

7. Can people still exercise their privacy rights?

Both parties should be clear about their responsibilities to fulfil privacy rights, and it should be easy for people to exercise them.

8. How long will the personal data be kept for?

Consider if it’s appropriate to have specific arrangements in place for the shared data to be destroyed after a certain period of time.

9. Is the data being shared with an organisation overseas?

If the personal data is being shared with a business located outside the UK, it will be necessary to consider the international data transfer rules.

10. Do we need a data sharing agreement?

UK GDPR does not specify a legal requirement to have an agreement in place when data is shared between organisations acting as controllers. However, the UK ICO considers it ‘good practice’, as an agreement can set out what happens to the data at each stage, and agreed standards, roles and responsibilities. ICO Data Sharing Agreement guidance.

Other data sharing considerations 

Are we planning to share children’s data?

Proceed with care if you are sharing children’s data. You need to carefully assess how to protect children from the outset, and will need a compelling reason to share data relating to under 18s. This is likely to be a clear case of ‘conduct a DPIA’!

Is the other organisation using data for a ‘compatible purpose’?

Consider the original purpose the data was collected for, and whether the organisation you’re sharing it with will use it for a similar purpose. It’s worth noting the UK Department for Education came a cropper for sharing data for incompatible purposes.

Is data being shared as part of a merger or acquisition?

If data is being shared as part of a merger or acquisition, the people the data relates to should be made aware this is happening. You’d want to be clear the data should be used for a similar purpose. Robust due diligence is a must, and perhaps a DPIA to assess and mitigate any risks.

Is it an emergency situation?

We’ve all heard the tales about people being scared they’ll be breaching data protection rules if they share personal data with paramedics, doctors or others in emergency situations. The ICO is clear on this point: in an emergency you should go ahead and share data as is necessary and proportionate.

The ICO has a Data Sharing Code of Practice, full of useful information about how the Regulator would expect organisations to approach this.

What would you change about GDPR?

June 2024

Any regrets about the demise of the UK Data Protection and Digital Information Bill?

Data reform in the UK is dead, well at least for the time being, and possibly permanently. The announcement of a 4th July General Election means the DPDI Bill has been dropped.

The Bill was controversial. Some feared it would weaken data protection laws in the UK and risked the European Commission overturning the much valued ‘adequacy’ decision for the UK. Others welcomed a more flexible, business-friendly approach. Some saw it as mixed bag of good, bad and indifferent ideas, including changes seemingly made for the sake of demonstrating change.

The text of GDPR was finalised eight years ago. Its spin-off, the UK GDPR, is pretty much the same as its EU counterpart, and there are those in both the UK and EU who feel it may be time to update and refresh the legislation.

Here are some thoughts from data protection practitioners on nuggets in the DPDI Bill they wished had been passed, or an aspect of GDPR they would change if they could.

DPDI regrets

Fedelma Good, Data Protection and ePrivacy Specialist

Putting aside all the hours spent reading and assessing all the proposed changes, my biggest regret is that with the demise of the DPDI we will lose the harmonisation of language between the GDPR and the Privacy and Electronic Communications Regulations (PECR) as well as some of the common-sense changes which were being proposed in relation to analytic cookies. It’s sad too, to see that charities will not get the promised access to soft opt-in for their fund-raising activities. Additionally, I feel for the ICO where a huge amount of effort must have already been put into preparing for the proposed changes to their operating model.

Simon Blanchard, Data Protection Network Associates

I liked the concept of ‘recognised’ legitimate interests, where there would be an exemption from the requirement to conduct a Legitimate Interests Assessment in certain situations where there is a clear and compelling benefit – such as national security, public security, defence, emergencies, preventing crime and safeguarding.

Sachiko Scheuing, European Privacy Officer, Acxiom

The Bill proposed giving legal certainty to legitimate interest as a legal ground for the use of data for marketing purposes, by bringing the existing Recital 47 into the main articles. This would have been a welcome move.

Philippa Donn, Data Protection Network Associates

I supported the ‘vexatious and excessive requests’ DPDI proposal – allowing organisations to assess if a DSAR was intended to cause distress, made in bad faith or was an abuse of power. In my experience on occasion this right is exploited. If I’m allowed to dream? I’d advocate for leeway around the time organisations are given to respond to requests – at least a ‘pause the clock’ for bank holidays and Christmas! I think urgency is good, but making busy organisations rush a request is bad.

Ideas for data protection reform

Robert Bond, Senior Counsel, Privacy Partnerships Law

I would change Article 8 of the GDPR to make the protection of children and their personal data applicable to all controllers and not just those that supply information society services. Article 8 only impacts information society service providers in relation to the obtaining of consent of a child, but I feel the provision of any services to a child require a greater degree of compliance. The ICO’s Children’s Code is valuable, and more controllers need to be focused on the protection of the fundamental rights of the child.

Dominic Batchelor, Head of IP & Privacy, Royal Mail Group

I would update the types of data afforded special protection to reflect modern sensibilities better. Many people would be surprised that data revealing trade union membership, or veganism (if viewed as a philosophical belief), are more tightly regulated than financial data, and that specific parental oversight applies to children’s consent to processing for online services but not necessarily any processing of their data (and that even this control doesn’t apply over the age of 13).

Emma Butler, Creative Privacy

I would take the controller-processor obligations and accountability principle and merge them to create an accountability obligation on all organisations to achieve certain outcomes: the principles, risk assessment, rights, security, transfers and DP by design. All parties in a chain would be legally obliged to understand and determine (and put in a contract) who is doing what with what data, who has which obligations, and who has what liability to whom. Organisations could make arrangements based on facts rather than be shoehorned into a definition based on a legal fiction.

Claire Robson, Governance Director, Chartered Insurance Institute

I would like to see the reintroduction of the term “data controllers in common”. In practice, I found this to be a helpful description which differentiated those circumstances where two organisations held shared data but needed to retain independence of their processing. Without this distinction, I have found myself in many a complex conversation explaining why we are not entering into a joint data controller relationship!

Redouane Serroukh, Head of Information Governance and Risk / DPO, NHS Hertfordshire and West Essex ICB

I’d welcome clarity on the wording surrounding the right of access. Specifically, on its apparent purpose (‘to be aware of, and verify, the lawfulness of processing’, recital 63) and the ability to refuse a request if it is deemed to be ‘manifestly unfounded or excessive’, art 12(5). Why? Currently there is no requirement for a data subject to provide a reason or motive to make a Subject Access Request, which makes it difficult for a data controller to confidently challenge a request or use the provisions above. While some guidance/interpretation exists, there appears to be a regulatory gap in the wording.

Mark Roebuck, Prove Privacy

The current regulation is not effective enough to ensure that the regulators are consistent in their approach to sanctions. For example, it is widely discussed on professional social media that the UK’s ICO is ineffective in applying sanctions to UK organisations compared with other EU regulators. Article 63 provides for a ‘consistency mechanism’ but is itself only one paragraph long and provides no binding commitment on regulators to align enforcement.

So there you go! Some ideas from the coalface should data reform ever rear its head again, either in the UK or EU.

Tackling AI and data protection

Raising staff awareness of data protection risks from their use of AI

The growth of AI continues at a tremendous rate. Its use in the workplace has plenty of benefits, including streamlining processes, automating repetitive tasks, and helping employees to do their jobs ‘better’ and more effectively.

While many people are jumping in with both feet, others have growing concerns about the implications for individuals and their personal data. There are also very real concerns surrounding intellectual property and commercially sensitive information which may be being ‘leaked’ out of the business through AI applications.

As employees increasingly bring AI into the workplace, the risks grow. A recent Microsoft and LinkedIn report found all generations of workers are bringing their own AI tools to work – from 73% of Boomers through to 85% of Gen Z. The report found many are hiding their use of AI tools from their employers, possibly fearing their jobs may be at risk.

Generative AI is a key focus for data protection authorities. The ICO has recently concluded a year-long investigation into Snap Inc’s launch of the ‘My AI’ chatbot, following concerns data protection risks had not been adequately assessed. The regulator is warning all organisations developing or using generative AI that they must consider data protection from the outset, before bringing products to the market or using them in the workplace.

In this article I’ve taken a look at how generative AI works, the main concerns, what employers can do to try and mitigate the risks and, most importantly, how to control the use of AI in the workplace.

Generative AI and Large Language Models

Generative artificial intelligence relates to algorithms, such as ChatGPT, which can be used to create new content like text, images, video, audio, code and so on. Recent breakthroughs in generative AI have huge potential to impact our whole approach to content creation.

ChatGPT for instance relies on a type of machine learning called Large Language Models (LLMs). LLMs are usually VERY large deep-neural-networks, trained on giant datasets such as published webpages. Recent technology advances have enabled LLMs to become much faster and more accurate.

What are the main AI concerns?

With increased capabilities and the growth in adoption of AI come existing and emergent risks. We are at a tipping point, where governments and industry alike are keen to realise the benefits to drive growth. The public too are inspired to try out AI models for themselves.

There’s an obvious risk of jobs being displaced, as certain tasks carried out by humans are replaced by AI technologies. Concerns recognised in the technical report accompanying GPT-4 include:

  • Generating inaccurate information
  • Harmful advice or buggy code
  • The proliferation of weapons
  • Risks to privacy and cyber security

Others fear the risks posed when training models using content which could be inaccurate, toxic or biased – not to mention illegally sourced!

The full scope and impact of these new technologies is not yet known, and new risks continue to emerge. But there are some questions that need to be answered sooner rather than later, such as:

  • What kinds of problems are these models best capable of solving?
  • What datasets should (and should not) be used to create and train generative AI models?
  • What approaches and controls are required to protect the privacy of individuals?
  • What are the main data protection concerns?

AI data inputs

The datasets used to train generative AI systems are likely to contain personal data that might not have been lawfully obtained. In many AI models, the data used may have been obtained by “scraping” (the automated gathering of data online), which often violates privacy principles.

Certain information may have been used without consideration of intellectual property rights, where the owners have not been approached nor given their consent for use.

The Italian Data Protection Authority (Garante) blocked ChatGPT, citing its illegal collection of data and the absence of systems to verify the age of minors. Some observers have pointed out these concerns are broadly similar to why Clearview AI received an enforcement notice.

AI data outputs

AI not only ingests personal data, but may also generate it. Algorithms can produce new data that may unexpectedly expose personal details, leaving individuals with limited control over their data.

There are many other concerns such as transparency, algorithmic bias and inaccurate predictions and the risk of discrimination. Fundamentally, there are concerns that appropriate accountability for AI is often lacking.

Key considerations for organisations looking to adopt AI

We need to understand what people across the business are already doing with AI, or planning to do. Get clarity about any personal data they are using; particularly any sensitive or special category data. Make sure they are aware of the potential risks and know what questions to ask, rather than dive straight in.

We suggest you start by talking to business leaders and their teams to identify emerging uses of AI across your business. It’s a good idea to carry out a Data Protection Impact Assessment (DPIA) to assess privacy risks and identify proportionate privacy measures.

Rather than adopting huge ‘off-the-shelf’ generative AI models like Chat GPT (and what may come next), businesses may consider adopting smaller, more specialised AI models trained on the most relevant, compliantly gathered datasets.

Do we need an AI Policy for employees?

To make sure AI is being used responsibly in your organisation, it’s crucial employees are provided with clear guidance on considerations and expected behaviour when using AI tools. A robust AI Policy can go some way to mitigate risks, such as those relating to inaccurate or harmful outputs, data protection, intellectual property, commercially sensitive information and so on. Here are some pointers for areas to cover in an AI Policy:

1. Your approach to AI: Does your company permit, limit or ban the use of AI in the workplace? What tasks is it permitted to be used for? What tasks must it never be used for?

2. Internal procedures and rules: Set out clear steps employees must follow. Be clear where the red lines are and who they should contact if they have questions or concerns, or if they need specialist support.

3. AI risks: Clearly explain the risks. You are likely to want to prohibit employees from using sensitive data of a personal, commercial or confidential nature.

4. Review of AI-generated work: Humans should review all AI-generated outputs, as these may be inaccurate or completely wrong. Human review should be baked into your procedures. Also, will you hold employees accountable for errors in their AI-generated work?

5. List of permitted AI tools/platforms

Regularly update and circulate the policy to take account of developments.

In all of this, organisations need to be mindful of emerging AI regulations around the globe, and in particular the jurisdictions in which your organisation operates.

Differing regulatory approaches

EU – The EU has adopted the world’s first Artificial Intelligence Act. It’s taking a ‘harm and risk’ approach which bans ‘unacceptable’ use of artificial intelligence and introduces specific rules for AI systems proportionate to the risk they pose. It imposes extensive requirements on those developing and deploying high-risk AI systems, yet is lighter touch for low risk/low harm AI applications.

Some have questioned whether existing data protection and privacy laws are appropriate for addressing AI risk. We should be mindful AI can increase privacy challenges and add new complexities to them. IAPP EU AI Cheat Sheet

UK – Despite calls for targeted AI regulation, the UK has no EU-equivalent legislation and currently looks unlikely to get any in the foreseeable future. The current Tory Government says it’s keen not to rush in and legislate on AI, fearing specific rules introduced too swiftly could quickly become outdated or ineffective. For the time being the UK is sticking to a non-statutory, principles-based approach, focusing on the following:

  • Safety, security, and robustness;
  • Appropriate transparency and explainability;
  • Fairness;
  • Accountability and governance; and
  • Contestability and redress.

Key regulators such as the Information Commissioner’s Office (ICO), the Financial Conduct Authority (FCA) and others are being asked to take the lead. Alongside this, a new advisory service, the AI and Digital Hub, has been launched.

There’s a recognition advanced General Purpose AI may require binding rules. The government’s approach is set out in its response to the consultation on last year’s AI Regulation White Paper. ICO guidance can be found here: Guidance on AI and data protection. Also see Regulating AI: The ICO’s strategic approach April 2024

US – In the US a number of AI guidelines and frameworks have been published. The National AI Research and Development Strategic Plan was updated in 2023. This stresses a co-ordinated approach to international collaboration in AI research.

As for the rest of the world, the IAPP has helpfully published a Global AI Legislation Tracker 

Wherever you operate, it is vital data protection professionals seek to understand how their organisations are planning to use AI, now and in the future. Evaluate how the models work and assess any data protection and privacy risks before adopting them.

Access controls: Protecting your systems and data

Is your data properly protected?

Do existing staff or former employees have access to personal data they shouldn’t have access to?  Keeping your business’ IT estate and personal data safe and secure is vital.  One of the key ways to achieve this is by having robust access controls.

Failure to make sure you have appropriate measures and controls to protect your network and the personal data on it could lead to a data breach. This could have very serious consequences for your customers and staff, and the business’ reputation and finances.

How things can go wrong

  • Recently a former management trainee at a car rental company was found guilty and fined for illegally obtaining customer records. Accessing this data fell outside his role at the time.
  • In 2023 a former 111 call centre advisor was found guilty and fined for illegally accessing the medical records of a child and his family.
  • In 2022 a former staff advisor for an NHS Foundation was found guilty of accessing patient records without a valid reason.

Anecdotally, we know of cases of former employees being found to be using their previous employer’s personal data once they have moved onto a new role.

The ability to access and either deliberately or accidentally misuse data is a common risk for all organisations. Add to this the risk of more employees and contractors working remotely, and it’s clear we need to take control of who has access to what.

High-level check list

1. Apply the ‘Principle of Least Privilege’

There’s a useful security principle, known as ‘the principle of least privilege’ (PoLP).  This sets a rule that employees should have only the minimum access rights needed to perform their job functions.

Think of it in the same way as the ‘minimisation’ principle within GDPR.  You grant the minimum access necessary for each user to meet the specific set of tasks their role requires, with the specific datasets they need.

By adopting this principle, you reduce the risk of employees accumulating more access rights than they need over time. You’ll need to periodically check to make sure they still need the access rights they have. For example, when someone changes role, their access needs may also change.

If your access controls haven’t been reviewed for a long time, adopting PoLP can give you a great starting point to tighten up security.

2. Identity and Access Management

IAM is a broad term for the policy, processes and technology you use to administer employee access to your IT resources.

IAM technology can join it all up – a single place where your business users can be authenticated when they sign into the network and be granted specific access to the selected IT resources, datasets and functions they need for their role.  One IAM example you may have heard of is Microsoft’s Active Directory.

3. Role-based access

Your business might have several departments and various levels of responsibility within them.  Most employees won’t need access to all areas.

Many businesses adopt a framework in which employees can be identified by their job role and level, so they can be given access rights which meet the needs of the type of job they do.
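
To illustrate the idea, here’s a minimal sketch (in Python, with purely hypothetical role names, permissions and users – not a recommendation of any particular product) of how a role-based model applies the principle of least privilege: every access check is made against the smallest set of permissions mapped to the user’s role.

```python
# Minimal sketch of role-based access control applying least privilege.
# Role names, permissions and users are purely illustrative.

ROLE_PERMISSIONS = {
    "hr_advisor":      {"read_employee_records"},
    "payroll_officer": {"read_employee_records", "read_payroll_data"},
    "marketing_exec":  {"read_customer_contacts"},
}

USER_ROLES = {
    "asmith": "hr_advisor",
    "bjones": "marketing_exec",
}

def can_access(username: str, permission: str) -> bool:
    """Allow access only if the user's role explicitly grants the permission."""
    role = USER_ROLES.get(username)
    if role is None:                      # unknown or deprovisioned user
        return False
    return permission in ROLE_PERMISSIONS.get(role, set())

# A marketing executive cannot read payroll data...
assert can_access("bjones", "read_payroll_data") is False
# ...but an HR advisor can read employee records.
assert can_access("asmith", "read_employee_records") is True
```

The useful property is that when someone changes role (or leaves), updating or removing their single entry in the user-to-role mapping keeps access aligned with the principle of least privilege, rather than hunting down permissions granted ad-hoc over time.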

4. Security layers

Striking the right balance between usability and security is not easy.   It’s important to consider the sensitivity of different data and the risks if that data was breached.  You can take a proportionate approach to setting your security controls.

For example personal data, financial data, special category or other sensitive personal data, commercially sensitive data (and so on) will need a greater level of security than most other data.

Technologies can help you apply proportionate levels of security.  Implementing security technologies at the appropriate levels can give greater protection to certain systems & data which demand a high level of security (i.e. strictly-controlled access), while allowing non-confidential or non-sensitive information to be accessed quickly by a wider audience.

5. Using biometrics

How do you access your laptop or phone? Many of us use our fingerprint or facial recognition, which gives a high level of security using our own biometric data. But some say, for all their convenience benefits, they are not as secure as a complex password!

But then, how many of us really use complex passwords? Perhaps you use an app to generate and store complex passwords for you.  Sadly lots of people use words, names or memorable dates within their passwords. Security is only going to be as good as your weakest link.

6. Multi-factor authentication (MFA)

Multi-factor authentication has become a business standard in many situations, to prevent fraudulent use of stolen passwords or PINs.

But do make sure it’s set up effectively. I’ve seen some examples where MFA has to be activated by the user themselves. So if they fail to activate it, there’s little point having it.  I’ve heard about data breaches happening following ineffective implementation of MFA, so do be vigilant.
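
To illustrate the point, here’s a hedged sketch (again in Python, with hypothetical names – not any particular vendor’s API) showing the difference between MFA that is merely available and MFA that is enforced: the check sits in the sign-in flow itself, so a user who never activated a second factor is pushed into enrolment rather than quietly allowed in with just a password.

```python
# Illustrative sketch: enforce MFA at sign-in rather than leaving it as an
# optional setting the user may never activate. Names are hypothetical.
from dataclasses import dataclass

@dataclass
class User:
    username: str
    password_ok: bool    # outcome of the password check (stubbed for this sketch)
    mfa_enrolled: bool   # has the user registered a second factor?

def sign_in(user: User, otp_valid: bool) -> str:
    if not user.password_ok:
        return "denied"
    if not user.mfa_enrolled:
        # Don't silently allow single-factor access: force enrolment instead.
        return "redirect_to_mfa_enrolment"
    return "granted" if otp_valid else "denied"

# A user who never got round to activating MFA is sent to enrolment,
# not let in on a password alone.
print(sign_in(User("asmith", password_ok=True, mfa_enrolled=False), otp_valid=False))
```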

There are an array of measures which can be adopted. This is just a taster, which I hope you found useful – stay safe and secure!