Data protection by design and default – what does that mean really?

January 2024

It’s an obligation in GDPR, but it can still be confusing to get to the heart of what ‘data protection by design and default’ really means, and what you need to do as a result.

It’s often used as a proxy for ‘do a DPIA’ but it’s so much more than that. I’ve seen it described as building or baking in data protection from the start. That’s more accurate but still leaves the question “yes but what does that actually mean?”

What does GDPR say?

It helps to go back to the source – not just Article 25(1) and (2) GDPR but also Recital 78. These tell you what the legislators were concerned about, namely implementing the principles. They specifically call out the following.

  • Data minimisation
  • Purpose limitation
  • Retention
  • Wide disclosure / accessibility of the data
  • Transparency
  • Making sure individuals know what’s going on
  • Allowing organisations to set up and improve security measures.

Both the article and the recital mention pseudonymisation as one way to implement the data minimisation principle. It’s an example though, not a mandatory requirement.

The end of recital 78 seems to be directed at those producing software and systems, as well as public tender processes. My take on this is that the legislators were keen to make sure that organisations needing to comply with GDPR didn’t fall at the first hurdle due to a lack of thought for data protection compliance in systems and software. Their expectation is clear for those designing these things for organisations, and I think their hope was that it would lead to software and systems producers having to raise their game to stay competitive.

How to put data protection by design and default into practice

To put this obligation into practice, people often jump to PETs (privacy-enhancing technologies, not Charlie the golden retriever). There have been loads of studies, reports and articles on a range of specific technical measures – too many to go into here. In reality, and in my experience, these more sophisticated methods and third-party software solutions tend to only be available to those with deep pockets and an internal tech team.

The reality for many organisations is to embed DP compliance into processes and policies. When I work with clients on this, I tend to start by looking at how things get done in their organisation.

  • How do you go from “I’ve had a great idea” to “it’s gone live today”?
  • Where are the gatekeepers and / or the decision makers for ideas and projects?
  • How do you manage and record the different steps, phases or tasks needed?
  • How do you identify who needs to be involved, and then involve them?

Once you understand the process and who’s involved, look at what already exists. Many organisations use certain tools to project manage, to assign tasks to people, to report and manage bugs or other issues and so on. Many organisations already have a form or process to suggest ideas, ask for approvals, and to get items on the to do list.

I have found an effective approach is to build on what is already there. No-one will thank you for introducing a new 4-page form on data protection stuff. No-one will fill it in. Where are the points in the processes, or on the forms, or in the tool template to add the DP stuff? Can you add in questions, checklists, templates, approval points and the like?

Examples of data protection by design and default solutions

Speaking of checklists and templates, one of the most effective DP by design ‘solutions’ was where the organisation created a ‘standard build template’ that incorporated key DP principles. So anytime anything needed building or expanding, the build requirements already included things like specific security measures, elements of data minimisation and controls for the end users.
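To make that concrete, here's a purely illustrative sketch of what a standard build template might capture – in practice this would live in project documentation or build tooling rather than code, and the categories and settings below are my own invented examples, not any real organisation's:

```python
import copy

# Hypothetical 'standard build template': invented baseline requirements
# every new build starts from, so DP controls are present by default.
BUILD_TEMPLATE = {
    "security": ["encryption at rest", "access on a need-to-know basis"],
    "data_minimisation": "justify every data field collected",
    "user_controls": {
        "marketing_opt_in": False,     # off by default
        "analytics_tracking": False,   # off by default
        "can_export_own_data": True,
        "can_delete_account": True,
    },
}

def new_build(control_overrides=None):
    """Every project starts from the template; deviations are explicit."""
    build = copy.deepcopy(BUILD_TEMPLATE)
    if control_overrides:
        build["user_controls"].update(control_overrides)
    return build

# A new build inherits the privacy-protective defaults:
b = new_build()
assert b["user_controls"]["marketing_opt_in"] is False
```

The point is the defaults: anything privacy-intrusive starts switched off, and teams have to consciously justify deviating from the template.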

With another organisation there was a two-part solution. One part was for those developing the idea and deciding on the data collection. This involved adapting their project documentation to include key elements from a DPIA and a way to check each data field requested was actually necessary, as well as identify where it might be sensitive or need greater care. The other part was a checklist for those implementing the new thing that set out some core requirements such as the ability for the end user to take certain actions with their data, and what settings or controls had to be off by default. This approach also encouraged better communication between the two groups, as they worked together on which solutions would best implement the core requirements.

Think of the people

Getting to a practical implementation of DP by design and default involves taking a people-centred approach. Both in terms of collaborating with the relevant people internally, as well as thinking about impacts on and risks to the individuals whose personal data you are doing things with.

You also need to consider your organisation’s position on DP compliance. The law might want everyone to do the gold standard, but that’s not the reality we live in. How far you go in implementing DP by design and what your measures look like will be influenced by things like your organisation’s risk appetite, their approach to risk management, any relevant vision or principles they have, or any frameworks or internal committees in place covering compliance, ethics, and so on.

Your colleagues can also be a great asset. Your business development people will have information on what corporate customers are asking for, your end user support people will have intel on what the users complain about. How far do these overlap and where do they conflict?

For example, I have seen the same transparency issue in multiple organisations where both corporate customers and end users want to understand how the product or software works. The corporate customers need to do compliance diligence, and the end users want to know if they can trust the organisation with their data. Producing something for both audiences on how it all works not only helps implement the transparency point of article 25, it also checks if different parts of the organisation have the same understanding of how it works, and flags up discrepancies.

Sometimes, doing something because it leads to a good consumer experience or higher levels of satisfaction gets you to the right place, and can be an easier sell than labelling it a ‘DP compliance requirement’.

Some organisations have the resources to do user research and surveys, and the results of these can be very useful to the DP people. If you can work with and understand the objectives and the pain points of colleagues (such as those managing infrastructure, information security, compliance, risk management, customer support and even the executive team), you’ll be in a good place to see where you can slide in your DP stuff, and piggyback on other initiatives.

This is one area of GDPR that takes an outcome-based approach. And the joy is that it’s scalable and adaptable to each situation, and to each organisation’s resources and capabilities. So by focusing on the required outcomes, and putting people at the centre, achieving data protection by design and default can be a lot easier than it first appears.

Quick Guide to UK GDPR, Marketing and Cookies

January 2024

How UK GDPR and PECR go hand-in-hand

Most have heard of GDPR. However, data protection law existed way before this new kid arrived on the block in 2018. And let’s not forget in the UK, GDPR has an equally important cousin called PECR.

The UK’s Privacy and Electronic Communications Regulations (PECR) have been around since 2003 before the days of smartphones and apps. Organisations need to consider both UK GDPR and PECR when it comes to marketing and cookies.

Why marketers need to pay attention

There are more fines issued by the Information Commissioner’s Office (ICO) for falling foul of the PECR marketing rules than there are under UK GDPR. Under UK data reform plans, the maximum the Regulator can fine under PECR could increase substantially to around £17 million. Currently the maximum fine under PECR is £500k. So it’s worth taking notice.

This is a quick overview, and we’d encourage you to check the ICO’s detailed marketing guidance and cookie guidance.

What’s the difference between UK GDPR and PECR?

In a nutshell…

UK GDPR

✓ Tells us how we should handle personal data – information which could directly or indirectly identify someone.
✓ Sets out requirements organisations need to meet and their obligations.
✓ Provides us with seven core data protection principles which need to be considered whenever we handle personal data for any purpose, including marketing.
✓ Defines the legal standard for consent, which is relevant for direct marketing
✓ Gives people privacy rights, including an absolute right to object to direct marketing.

One of the principles is that processing of personal data must be lawful, fair and transparent. This includes making sure we have a lawful basis for our activities.

PECR

✓ Sets out specific rules for marketing to UK citizens, for example by email, text message or live telemarketing calls.
✓ Sets out specific rules when using cookies and similar technologies (such as scripts, tracking pixels and plugins).

PECR is derived from an EU directive, and EU countries have their own equivalent regulations which, whilst covering similar areas, may have different requirements when marketing to their citizens.

We’ve written about the specific rules for email marketing and telemarketing here:
UK email marketing rules
UK telemarketing rules
The ‘soft opt-in’ – are you getting it right

How do UK GDPR and PECR work together?

Direct marketing

Marketers need to consider the core principles of UK GDPR when handling people’s personal information. Furthermore, they need to have a lawful basis for each data activity. Of the six lawful bases, two are appropriate for direct marketing activities: Consent and Legitimate Interests.

Consent: PECR tells us, for certain electronic marketing activity, we have to get people’s prior consent. UK GDPR tells us the standards we need to meet for this consent to be valid. Consent – Getting it right

Legitimate interests: If the types of marketing we conduct don’t require consent under PECR, we may choose to request consent anyway, or we could rely on legitimate interests. For example, marketing to business contacts rather than consumers.

Under GDPR, we need to be sure to balance our legitimate interests with the rights and interests of the people whose personal information we are using – i.e. the people we want to market to. ICO Legitimate Interests Guidance 

What about cookies?

PECR requires opt-in consent for most cookies or similar tech, regardless of whether they collect personal data or not. And we’re told this consent must meet the UK GDPR standards.

In simple terms, the rules are:

✓ Notify your website/app users about your use of cookies or similar technologies and provide adequate transparent information about the purposes they are used for.
✓ Consent is required for the use of cookies, except for a narrow exemption for those which are ‘strictly necessary’ (also known as ‘essential’ cookies).
✓ Users need to be able to give or decline consent before the cookies are dropped on their device and should be given options to manage their consents at any time (e.g. opt-out after initially giving consent).
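The rules above boil down to a simple decision, sketched here purely for illustration in Python – the cookie categories and the consent store are invented for the example, not any real consent platform’s API:

```python
# Illustrative sketch of the consent-gating rules above.
# The category names here are assumptions, not a standard taxonomy.
STRICTLY_NECESSARY = {"session", "csrf_token", "load_balancing"}

def may_set_cookie(category, user_consents):
    """A cookie may be set if it is strictly necessary, or if the
    user has given prior opt-in consent for its category."""
    if category in STRICTLY_NECESSARY:
        return True
    # Default to False: no recorded opt-in means no cookie.
    return user_consents.get(category, False)

# Before any consent choice is made, only essential cookies pass:
assert may_set_cookie("session", {}) is True
assert may_set_cookie("analytics", {}) is False

# After an opt-in to analytics (withdrawal must remain possible):
consents = {"analytics": True, "advertising": False}
assert may_set_cookie("analytics", consents) is True
assert may_set_cookie("advertising", consents) is False
```

Note the default: with no recorded choice, nothing non-essential fires – which is exactly the ‘before the cookies are dropped’ requirement.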

Our data, tech and the app-ocalypse

January 2024

In 2013, after Edward Snowden leaked thousands of secret files, the Kremlin’s security bureau did something interesting. They swapped computers for manual typewriters. Russian spooks reasoned hard copies were easier to protect than digital files. Furthermore, hackers might be able to infiltrate sensitive systems, but the old-school art of safe-cracking? It seemed to have fallen by the wayside.

As I get older, I’m beginning to think the Kremlin might have been onto something. Why?

Maybe it’s a generational issue. I’m Gen ‘X’. I grew up without mobile phones or the internet, but became familiar with the technology as it developed from the 1990s onwards. I enjoy technology. I respect it. I’m also, however, sceptical in a way many of my Millennial and Gen ‘Z’ colleagues may not be.

For me it boils down to two concerns – trust and over-reliance. Given how there’s now an app for everything, I have to ask – is the App-ocalypse Nigh? What happens to the increasingly personal and intrusive levels of personal data entered into these ‘everything apps’?

Just because data’s aggregated into zeros and ones, it doesn’t mean it’s ‘tidy’. In fact, I suspect too many digital ‘data warehouses’ resemble the hoarder’s houses you might have seen on daytime TV, with stuff scattered everywhere.

It’s not just apps – the endless requirement to populate online forms is relentless. Now I hear more ‘frictionless facial recognition’ is planned at airports in the UK and elsewhere. And it’s making me uneasy. Technology is wonderful for creating efficiencies and streamlining processes. In my world alone, I see how clever privacy technology solutions ease the burden of data protection compliance.

But is technology always wonderful? Why am I uneasy?

An example – I needed to renew my driving licence. I went on to the Government website and duly entered a great deal of sensitive data. This included my passport number, my mother’s maiden name, my date of birth, my home address and my National Insurance number. This started me thinking… ‘How secure is this platform? What are the Government really doing to prevent my data falling into malicious hands?’

At the other end of the scale, I needed to reschedule a beautician’s appointment (much needed after eating my body weight in chocolate and cheese over Christmas). My call was met by a recorded message. I duly pressed ‘2’ to cancel/change an appointment. I was then informed I must (yes, they did say must) download the app to cancel/change appointments. A look at the app’s privacy information didn’t fill me with confidence, so I rang again, selecting ‘3’ for all other enquiries. After ten minutes of listening to promotions about fantastic rejuvenating treatments, I gave up. What if I prefer not to be forced to register and share my personal details via your app? I’m getting a face treatment, not applying for a pilot’s licence!

At this point, a shout out to the Kennel Club’s customer service. I took out their insurance for my puppy this year. They’re great. I’ve had to call twice, and each time a prompt pick-up from a lovely human. Somewhat of a rarity these days.

I recently read EasyPark Group, the owner of brands like RingGo and Park Mobile, were hacked. Yes, like many others I have RingGo. I was forced to download the app to use a station car park – there was no choice. I also have other parking apps. Oh the joys of standing in the rain in a car park trying to download yet another parking app. Handing over my data to yet another company. Will these companies protect it? What security measures and controls do they have? Did they conduct a DPIA? Was it outsourced to an app developer, possibly outside the UK/EU? Did they do any due diligence?

As well as my fears around data, I also worry for the significant minority disenfranchised by the widescale embrace of what my colleague Simon calls the ‘Mobilical Cord’. It’s so very true – I’m unable to properly function without my smartphone implanted in my paw. I use it to access the internet, my emails, messages, banking and so on. It’s also a crucial part of our company security – to authenticate I am really me.

The 2021 UK Census showed 90% of households had a home computer. 93% had access to a mobile phone. I suspect it’s higher now, but it’s still not everyone. As of 2023, according to research by Statista, 98% of 16-24 year olds have a smartphone. However, this drops to 80% for the over-65s. The less tech-savvy, and particularly the elderly, are being left behind. My mother is 84. I got her a smartphone, but she hates it and doesn’t understand it. Apps? An enigma. She’s also terrified of online scams, knowing how the elderly are disproportionately targeted.

So, now we also face the prospect of passport-free travel. UK Border Force is set to trial e-gate schemes similar to those rolled out in Dubai and Australia. These negate the need to show a passport, instead using facial recognition technology (FRT).

Phil Douglas, the Director General of Border Force has said “I’d like to see a world of completely frictionless borders where you don’t really need a passport. The technology already exists to support that.” He added: “In the future, you won’t need a passport – you’ll just need biometrics.”

According to the Times the biometric details of British and Irish travellers are already held after being collected in the passport application process. What does Phil Douglas feel about our personal biometrics being potentially harvested by countries with dodgy human rights records?

Too many people will shrug – an end to lengthy queues? Yes please. But who controls my facial map? How will it be used? Will it be shared? How will it be kept secure? Facial recognition tech also raises issues of bias in algorithms, and the potential for mistakes, with serious consequences.

I suspect, one day, there’ll be the kind of disaster one sees in movies, where the internet collapses for a significant period. What then? I also wonder if, eventually, ambulance-chasers will identify companies using apps to disproportionately harvest data – and playing fast and loose with the safeguards set up to protect us. Will this become the next big Payment Protection Insurance (PPI) style business opportunity?

What I do know is businesses who put all their eggs in one basket without contingencies, or fail to anticipate risk, are those likeliest to suffer when the app-ocalypse (however it manifests itself) is nigh!

Now, did I mention AI…?

The three foundations of good data governance

January 2024

People, processes and technologies

Creating a clear data governance strategy is crucial to making sure data is handled in line with your organisation’s aims and industry best practice.

Data governance is often thought of as the management process by which an organisation protects its data assets and ensures compliance with data laws, such as GDPR. But it’s far broader than compliance. It’s a holistic approach to data and should have people at its very heart. People with defined roles, responsibilities, processes and technologies which help them make sure data (not just personal data) is properly looked after and wisely used throughout its lifecycle.

How sophisticated your organisation’s approach needs to be will depend on the nature and size of your business, the sensitivity of the data you hold, the relationships you have with business partners, and customer or client expectations.

Benefits of good data governance

There are many benefits this activity can bring, including:

  • Minimising risks to the business, your employees, customers and suppliers
  • Giving your people clarity around expected behaviours and best practices
  • Embedding compliance requirements

A strong data governance approach can also help an organisation to make the most of their data assets, improve customer experience and benefits, and leverage competitive advantage.

Data governance – where to start?

There are three foundational elements which underpin successful data governance – People, Processes and Technologies.


People

Engaging with stakeholders across the organisation to establish and embed key roles and responsibilities for data governance.

Many organisations look to establish a ‘Data Ownership Model’ which recognises data governance is an organisational responsibility which requires close collaboration across different roles and levels, including the delegation of specific responsibilities for data activities.

Here are some examples of roles you may wish to consider:

  • Data strategy lead – such as Chief Data Officer / Chief Digital Officer
  • Data protection lead – such as Data Protection Officer (DPO), if you have one
  • Information security lead – such as Chief Information Security Officer (CISO) or Chief Technology Officer
  • Information asset owners (or data owners) – leaders of business functions / teams which collect and/or use personal data for particular purposes. Such as HR, Marketing & Sales, Finance, Operations, and so on.
  • Data specialists – heavy users of complex datasets, such as data analysts and data scientists.
  • System owners – the people who manage the key systems which hold personal data, such as IT managers.

Processes

Think about all the processes, policies, operating procedures and specialist training provided to guide your employees and contractors, enabling them to handle data in line with your business expectations – as well as to comply with the law.

Without these in place and regularly updated, your people can’t possibly act in the ways you want and expect them to.

In my experience, success comes from keeping these items concise, and as relevant and engaging as possible. They can easily be forgotten or put in the ‘maybe later’ pile…  a little time and effort can really pay dividends!

Technologies

The technologies which underpin all data activities across the data lifecycle – for example, your HR, marketing & CRM, accounting and other operational systems you use regularly. Data governance requires those responsible for adopting technologies to make sure standards and procedures are in place which ensure appropriate:

  • Accessibility and availability standards
  • Data accuracy, integrity and quality management
  • Privacy and security

Looking at privacy technology in particular, the solutions available have really progressed in recent years in terms of both their capability and ease of use. They give DPOs and others with an interest in data protection clear visibility of where the risks lie, help to prioritise those risks, and point to relevant solutions. They can also help provide clear visibility and oversight to the senior leadership team.

The ‘Accountability Principle’

Data governance goes hand in hand with accountability – one of the core principles under GDPR. This requires organisations to be ready to demonstrate the measures and controls they have to protect personal data and in particular, show HOW they comply with the other data protection principles.

Appropriate measures, controls and records need to be in place to evidence accountability. For example, a Supervisory Authority (such as the ICO) may expect organisations to have:

  • Data protection programme, with clear data ownership & governance and regular reporting up to business leaders
  • Training and policies to guide staff
  • Records of data mapping exercises and processing reviews, such as an Information Asset Register and Record of Processing Activities
  • Risk assessments, such as Data Protection Impact Assessments and Legitimate Interests Assessments
  • Procedures for handling of individual privacy rights and data breaches
  • Contracts in place between organisations which include the relevant data protection clauses, including arrangements for restricted international data transfers
  • Data sharing agreements

Ready to get started?

If you’re keen to reap the benefits of improved compliance and reduced risk to the business, the first and crucial step is getting buy-in from senior leadership and a commitment from key stakeholders, so I’d suggest you kick-off by seeking their support.

Data Protection Impact Assessments for Agile projects

November 2023

How to assess risks when a project has multiple phases

Agile methodology is a project management framework comprising several dynamic phases, known as ‘sprints’. Many organisations use Agile for software & technology development projects, which often involve the processing of personal data.

From a data protection perspective, Agile (and indeed any other multi-stage project) presents some challenges. The full scope of data processing is often unclear at the start of a project. The team are focussed on sprint one, then sprint two, and so on. So how do you get Privacy by Design embedded into an Agile project?

Conducting a Data Protection Impact Assessment (DPIA) is a legal requirement under data protection law for certain projects. Even when a DPIA is not mandatory it’s a good idea to consider the privacy impacts of any new processing.

Looking at a project through a privacy lens at an early stage can act as a ‘warning light’, highlighting potential risks before they materialise and when measures can still be easily put in place to reduce the risks.

If your organisation uses Agile, it’s likely you’ll need to adapt your DPIA process to work for Agile projects. Understand the overall objectives and direction of travel to get a handle on how data use will evolve and what risks might be involved.

Working together to overcome challenges

It’s important all areas of the business collaborate to make sure projects can proceed at pace, without unnecessary delays. Compliance requirements must be built into Agile plans alongside other business requirements – just as ‘Privacy by Design’ intended.

Those with data protection responsibilities need project management teams to engage with them at an early stage, to explore the likely scope of processing and start to identify any potential privacy risks, while there’s still time to influence solution design.

This isn’t always easy. Given the fluid nature of Agile, which is its great strength, there is often very limited documentation available for review to aid Compliance assessments.

Privacy questions often can’t be answered at the start – there may be many unknowns. So it’s key to agree what types of data will be used, for what purposes, and when more information will be available for the DPIA – crucially, before designs are finalised. Timings for assessment need to be aligned to the appropriate sprints.

As many companies have found, embedding privacy awareness into the company culture is a big challenge, and ensuring Data Protection by Design is a key consideration for tech teams at the outset is an ongoing task.

Example: data warehouse

Organisations with legacy data systems might want to build a data warehouse / data lake to bring disparate data silos together under one roof, gain new insights and drive new activity. It’s important to assess any privacy impacts this new processing creates.

Using Agile, new capabilities may be created over several development phases. So it’s important to conduct an initial assessment at the start, but stay close to the project as it evolves and be ready to collaborate again, in line with sprint timings – before data is transferred or before new solutions are created.

Top tips for ‘Agile’ DPIAs

Here are my top tips for a fluid DPIA process:

1. DPIA training & guidance – make sure relevant teams, especially IT, Development and Procurement, all know what a DPIA is (in simple layman’s terms) and why it’s important. They need to recognise the benefits of including privacy in scope from the start (i.e. ‘by Design’).

2. Initial screening – develop a quick-fire set of questions for the business owner or project lead, which will give the key information you need, such as

  • the likely personal data being used
  • any special category data, children’s data or vulnerable people’s data
  • the purposes of processing
  • security measures… and so on

Once it has been identified there is personal data involved you can start assessing the potential risks, if any. As odd as this may sound, it is not uncommon for tech teams to be unsure at the beginning of a project if personal data (as defined under GDPR to include personal identifiers) will in fact be involved.
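As a purely hypothetical sketch, a quick-fire screening like this could feed a simple triage – the question names and the risk logic below are my own assumptions, not a prescribed methodology:

```python
# Hypothetical quick-fire DPIA screening: yes/no answers from the
# project lead feed a simple triage on what assessment comes next.
SCREENING_QUESTIONS = [
    "personal_data_involved",
    "special_category_data",
    "childrens_or_vulnerable_data",
    "new_or_intrusive_technology",
    "large_scale_processing",
]

def triage(answers):
    """Return a next step based on the screening answers (a dict of
    question name -> bool; missing answers are treated as 'no')."""
    if not answers.get("personal_data_involved", False):
        return "no DPIA needed - revisit if scope changes"
    high_risk = [q for q in SCREENING_QUESTIONS[1:] if answers.get(q)]
    if high_risk:
        return "full DPIA likely required: " + ", ".join(high_risk)
    return "proceed with DPIA 'lite' questions"

# No personal data identified yet - park it, but revisit each sprint:
assert triage({"personal_data_involved": False}).startswith("no DPIA")
# Personal data but no high-risk flags - the 'lite' route:
assert triage({"personal_data_involved": True}) == \
    "proceed with DPIA 'lite' questions"
```

Because answers change between sprints, the same triage can simply be re-run at each agreed checkpoint.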

3. DPIA ‘Lite’ – if there are potential risks, develop a series of questions to evaluate compliance against the core data protection principles of the GDPR.

The Agile environment can prove challenging but also rewarding. Adopting a flexible DPIA process which works in harmony with Agile is a positive step forward for innovative companies, allowing your business to develop new solutions while protecting individuals from data protection risks, as well as protecting your business from any possible reputational damage.

UK telemarketing rules

November 2023

How to avoid falling foul of the rules for marketing calls

Hardly a month goes by without the UK’s Information Commissioner’s Office (ICO) fining another company for breaking the telemarketing rules under the Privacy and Electronic Communications Regulations (PECR).

I’m sure all of us have been on the receiving end of a dodgy call. The favoured ‘have you recently been involved in an accident?’ springs to mind.

Tackling nuisance calls is clearly a key priority for the Regulator, so how do bona fide businesses avoid being tarred with the same brush as the rogue operators?

6-point telemarketing guide

1. Service vs marketing calls

The definition of direct marketing covers any advertising or promotional material directed at particular individuals. Routine customer service calls don’t count as direct marketing.

But if you’re treating a call as a service call (and not applying the marketing rules under PECR) you need to be careful the script / call guide and what your call handlers say in practice doesn’t stray into the realms of trying to get customers to buy extra products, services or to upgrade or renew contracts.

A Trade Union was fined in 2021 for not screening numbers against the TPS. The Union didn’t believe its calls were direct marketing, but the ICO judged they were. Just because you believe you’re acting in good faith doesn’t mean you are. Marketing messages and service messages

2. Consent or Legitimate Interests?

Telephone numbers which can directly or indirectly identify an individual are personal data and fall under the scope of UK GDPR. For example, when using someone’s personal or work mobile, direct line business number or home landline you’ll need to comply with both UK GDPR and PECR.

You’ll need to decide whether to rely on consent or legitimate interests as your lawful basis under UK GDPR to make telemarketing calls to people. In brief:

  • Consent: make sure this meets the requirement to be a specific, informed, unambiguous indication of someone’s wishes made with a positive action (e.g. an opt-in). Keep records of consent (including, if relevant, the script used) and make sure withdrawing consent is as easy as giving it. Consent – getting it right
  • Legitimate Interests: conduct a Legitimate Interests Assessment (LIA), keep a record of this assessment and be sure to provide people with a way to opt-out of future calls. Legitimate interests – is it legit? 

3. Live marketing calls to individuals

Below are the key rules to follow:

  • Don’t make marketing calls to anyone who’s told you they don’t want to hear from you. Keep a suppression file of all objections to telemarketing, and screen your campaigns against this internal ‘do not call list’.
  • Don’t make marketing calls to anyone registered with the Telephone Preference Service, unless you’ve collected consent to call them.
  • Say who’s calling – i.e. clearly state the name of your organisation.
  • Always display your number (or an alternative contact number).
  • Provide an address or freephone contact number if asked.
  • Make it easy to opt-out of further calls.
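To illustrate the screening step in the rules above, here’s a hypothetical sketch – in practice TPS data comes from the licensed file and numbers would need proper normalisation, and the function and set names here are invented for the example:

```python
# Illustrative campaign screening: drop anyone on the internal
# do-not-call list, and anyone TPS-registered without consent.
def screen_campaign(numbers, suppression_list, tps, consented):
    """Return only the numbers that may lawfully be called."""
    callable_numbers = []
    for n in numbers:
        if n in suppression_list:
            continue  # they've objected directly - never call
        if n in tps and n not in consented:
            continue  # TPS-registered and no consent held
        callable_numbers.append(n)
    return callable_numbers

# 07700 900xxx is the Ofcom-reserved fictional number range.
campaign = ["07700900001", "07700900002", "07700900003"]
ok = screen_campaign(
    campaign,
    suppression_list={"07700900001"},  # objected to us directly
    tps={"07700900002"},               # TPS-registered, no consent
    consented=set(),
)
assert ok == ["07700900003"]
```

For business numbers the same screening would simply run against the CTPS set as well as the TPS.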

4. Remember sector specific rules

Stricter rules apply if you’re making calls about claims management or pension schemes. For claims management services you must have consent. For calls about pension schemes, you must have consent unless:

  • You are a trustee or manager of a pension scheme; or
  • You are a firm authorised by the Financial Conduct Authority; or
  • Your relationship with the individual meets strict criteria.

5. Automated calls

When using automated dialling systems which play a recorded message, the rules are very strict:

  • You must have specific consent from individuals indicating they’re happy to receive automated calls;
  • Calls must include your organisation’s name and contact address or freephone number; and
  • You must display your number (or an alternative contact number).

In practice, these consent rules make genuinely compliant automated marketing calls very difficult.

6. Marketing/sales calls to business numbers

The rules under the UK’s PECR for calling businesses are broadly the same as for calling individuals.

  • You can call any business that has specifically consented to your calls. Or, and most commonly…
  • You can make live calls to any business number which is not registered with the TPS or the Corporate Telephone Preference Service (CTPS). But only if they haven’t objected to your calls and you’re not calling about claims management services.

Screening against both the TPS and the CTPS is necessary (if you don’t have consent) because sole traders and some partnerships may have registered with the TPS rather than the CTPS.
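That dual check can be expressed as a short sketch. The register contents below are invented placeholders; real screening runs against licensed, regularly refreshed TPS/CTPS data.

```python
def b2b_callable(number, tps, ctps, consented, objected):
    """Return True if a live B2B marketing call may be made to this number."""
    if number in objected:        # they've asked you to stop: absolute bar
        return False
    if number in consented:       # specific consent overrides the registers
        return True
    # No consent: the number must be clear of BOTH registers, because a
    # sole trader may appear on the TPS rather than the CTPS.
    return number not in (tps | ctps)

# A sole trader registered on the TPS is off-limits even though a
# CTPS-only check would have passed.
ok = b2b_callable("01632 960100", tps={"01632 960100"}, ctps=set(),
                  consented=set(), objected=set())
# ok is False
```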

Applicable laws for telemarketing

PECR gives us the rules for telemarketing calls in the UK and the ICO has published telemarketing guidance. As well as complying with PECR you should comply with UK GDPR for your handling of personal data.

The rules differ in other countries, so check local laws if your telemarketing extends to calling people in other territories. Many countries have a ‘do not call’ register similar to the Telephone Preference Service.

There are also specific rules under PECR for email marketing messages, see UK email marketing rules.

3 steps to decide your data retention periods

November 2023

How to start tackling data retention

Both UK and EU data protection law require organisations not to keep personal data any longer than necessary for the purpose(s) the data is processed for. Sounds simple, doesn’t it?

In practice, it’s one of the most challenging areas of the law to comply with. How do businesses decide on justifiable retention periods? How do they implement retention periods in practice? And, crucially, what are the risks if they get it wrong?

In our experience, many businesses are holding onto unnecessary personal data. So when deciding how long personal data should be kept, it’s helpful to work through the following key steps.

1. Does the law tell us how long to retain certain records?

Sometimes there will be a legal or statutory requirement to retain personal data for certain purposes. This is the easy bit, as you can use this to set retention periods for certain categories of data.

For example, your business may be subject to laws relating to employment and finance which set specific retention periods for data processed for those purposes.

There may also be a duty to preserve documents for disclosure in legal proceedings that may have started or may be started in future.

2. Are there industry standards, guidelines or known good practice?

In regulated sectors such as finance, health and manufacturing there may be agreed industry standards or agreed professional practices which recommend and/or can justify retention periods. Working to best practice and precedent makes things much easier.

3. What about… everything else?

Okay, so for a certain dataset, and the purposes you use that data for, you’ve established there are no statutory requirements. Maybe no industry standards apply either. What do you do now?

You’ll need to assess what’s necessary, proportionate and reasonable to retain. By its very nature, this is subjective; cases will often turn on their own merits. Ideally, you’ll want to be able to justify retention periods for different datasets.

Here are some of the questions you can ask to try and reach a defensible decision.

  • What are the business drivers for retention?
  • Does the product lifecycle have an effect on retention?
  • Does your approach to pricing have an effect on retention?
  • Can it be evidenced certain data is legitimately needed for a certain amount of time?
  • Do you need to keep personal data to handle queries or complaints?
  • How damaging would it be to the business to delete certain data?

To give an example, I know of a retailer which took the step of carrying out research into how often their customers purchased their products. Due to the sturdy nature of their products, the research clearly showed for many customers there was a gap of 3-4 years between purchases. This analysis was used as justification for retaining customer details for postal marketing longer than perhaps another company might.

What are the risks?

Businesses expose themselves to a number of risks if they keep personal data for longer than necessary, or indeed don’t keep it long enough.

Information security risks

The impact of a data breach could be significantly worse, with a larger volume of records and more people affected. Enforcement action could also be more severe if it becomes clear personal data has been kept with no justifiable reason, i.e. a regulator might deem that older data was unlawfully held. It could also increase the likelihood of complaints from individuals asking why their data was kept for so long.

I once received an email from a major UK brand informing me that my data had been involved in a data breach. My first thought was how on earth does this company still have information about me? I couldn’t remember when I’d last bought anything from them.

Legal risks

Where there’s a statutory requirement for personal data to be retained for a specific period, there’s clearly a risk if records aren’t kept for the statutory period.

Contractual risks

Certain personal data may need to be kept to meet contractual terms; for example to provide a service or warranties. Not keeping certain data long enough may lead to an inability to respond to complaints, litigation or regulatory enforcement.

Customer expectations

Customers expect organisations to be able to respond to their needs. For example, answering queries or responding to complaints. Data about them therefore needs to be kept long enough to meet customers’ reasonable expectations. However, once a reasonable period has elapsed a customer may not expect you to be continuing to hold their details.

All these risks could also result in reputational damage for an organisation which fails to meet its legal obligations, contractual obligations or its customers’ expectations.

We’d recommend all businesses have a straightforward retention policy and keep a retention schedule. Admittedly these are only the first steps. Actually implementing retention, and deleting data when it comes to the end of its retention period, can be the biggest challenge. We’d suggest you review your data at least annually and cleanse it.
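A periodic cleanse might be sketched along the following lines. The categories and periods here are invented examples, not recommendations: a real schedule must reflect your own statutory, contractual and business requirements, and the “clock start” event (last purchase, contract end, etc.) needs deciding per category.

```python
from datetime import date, timedelta

# Hypothetical retention schedule: category -> retention period.
RETENTION_SCHEDULE = {
    "payroll": timedelta(days=6 * 365),
    "marketing_contact": timedelta(days=3 * 365),
    "recruitment_unsuccessful": timedelta(days=180),
}

def records_due_for_deletion(records, today=None):
    """Yield records whose retention period has expired.

    Each record is a dict with a 'category' and a 'last_activity' date
    (the event the retention clock runs from, e.g. last purchase).
    """
    today = today or date.today()
    for rec in records:
        period = RETENTION_SCHEDULE.get(rec["category"])
        if period and rec["last_activity"] + period < today:
            yield rec

due = list(records_due_for_deletion(
    [{"category": "marketing_contact", "last_activity": date(2019, 1, 1)},
     {"category": "payroll", "last_activity": date(2022, 6, 1)}],
    today=date(2023, 11, 1),
))
# Only the 2019 marketing contact is past its 3-year period.
```

Flagging rather than deleting outright is deliberate: records due for deletion may still need a legal-hold check (e.g. a duty to preserve documents for litigation) before anything is destroyed.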

Using the old adage ‘you can only eat an elephant one bite at a time’, we’d advise focusing on the biggest risk areas. What data represents the biggest risk if you keep it too long?

Our detailed Data Retention Guide is full of further tips, case studies and sample retention schedules.

Legitimate interests: is it legit?

November 2023

5-point legitimate interests checklist

“Legitimate interests is the most flexible lawful basis for processing,
but you cannot assume it will always be the most appropriate.”
UK Information Commissioner’s Office

Legitimate interests is used as a ‘go-to’ lawful basis for a host of business activities: analysis, administration, fraud prevention, network security, prospecting, marketing segmentation and personalisation… the list goes on.

But, just because we could do something with people’s personal data, doesn’t mean we should. The lack of another lawful basis as a ‘good fit’ doesn’t mean we should simply choose legitimate interests and decree it legit!

UK and EU GDPR require organisations to balance their own legitimate interests against the interests, rights and freedoms of the people whose data is used for a particular activity. Those interests can be commercial, but the balancing test must still be carried out.

Legitimate interests checklist

Here’s a quick reminder of the elements to consider when relying on legitimate interests as your lawful basis.

1. Reasonable expectations

Are you handling people’s personal data in a way they would reasonably expect? If not, do you have a very strong justification?

Judging reasonable expectations is objective. Legitimate interests is more likely to apply where you have a relevant and appropriate relationship with the people whose data you’re using; for example, they’re employees, clients or existing customers. Other factors which play a part are how long ago you collected the data, where you sourced it from, and whether you’re using new technology or using data in a way people might not expect.

2. Assessment

Have you conducted a Legitimate Interests Assessment (LIA)? This 3-part assessment should cover:

  • Identifying a legitimate interest
  • Demonstrating the processing is necessary for your organisation to achieve its objectives
  • Balancing your interests against individual interests, rights and freedoms

Where a case for relying on legitimate interests is clear cut, this needn’t be a complex assessment, but alarm bells should start ringing if what you’re planning to do…

  • isn’t really necessary
  • could be achieved in another less intrusive way
  • would be unexpected or unreasonable
  • may cause harm or distress to those whose data is involved
  • means people are unable to exercise their privacy rights

3. Transparency

Are you open about what you’re doing? Have you fulfilled people’s right to be informed about how their personal data’s being used?

It’s a legal requirement to tell people which processing activities you rely on legitimate interests for. This should be explained in a privacy notice clearly brought to people’s attention. Typically, a privacy notice is linked from forms where you collect personal data, from your website footer and from the footer of your emails.

4. Right to object

Can you provide people with a clear opportunity to object? If not, can you justify not doing so? For example, you probably wouldn’t give people the opportunity to object to necessary fraud or security checks.

5. Risk assessment?

Does what you want to do involve children’s data? Special category data (such as health data or biometrics)? Monitoring people on a large scale? Innovative technology like AI?

For any higher risk activities, it’s likely you’ll need to conduct a Data Protection Impact Assessment in addition to an LIA.

Legitimate interests and marketing

Direct marketing may be a legitimate interest, to paraphrase GDPR Recital 47, but organisations still need to balance their commercial interests and make sure their marketing doesn’t infringe on the rights and freedoms of individuals.

Crucially, legitimate interests can only be used if consent is not a requirement under eprivacy rules, such as the UK’s Privacy and Electronic Communications Regulations (PECR).

Clearly, it’s difficult to argue direct marketing is in people’s interests, so the ICO recommends focusing on the following factors when conducting a legitimate interest assessment:

  • Would people expect you to use their details for marketing?
  • Would unwanted marketing messages cause a nuisance?
  • Could the method and frequency of communications have a negative impact on more vulnerable people? In simple language, could you be accused of being overly pushy or aggressive?

Most importantly, everyone has an absolute right to object to direct marketing. The ICO says it’s more difficult to pass the balancing test if you don’t give people a clear option to opt out when you collect their details or, if the data wasn’t collected directly from them, in your first communication.

Ultimately, to genuinely rely on legitimate interests for any purpose, we should be up front and honest about what we’re doing, make sure it’s reasonable and give people the chance to say no, unless we have a strong case for doing otherwise.