Understanding and handling Special Category Data

July 2024

Why is it special and what does data protection law tell us we need to do?

There is a distinct subset of personal data which is awarded ‘special’ protection under data protection law. This subset covers information which people have been persecuted for, or suffered unfair treatment or discrimination because of, in the past, and still could be. These special categories of personal data are considered higher risk, and organisations are legally obliged to meet additional requirements when they collect and use this data.

Employees need to be aware that special category data should only be collected and used with due consideration. Sometimes there will be a clear and obvious purpose for collecting this type of information, such as a travel firm needing health information from customers, or an event organiser requesting accessibility requirements to facilitate people’s attendance. In other situations it will be more nuanced.

What’s special category data?

Special categories of personal data under UK GDPR (and its EU equivalent) are commonly referred to as special category data, and are defined as personal data revealing:

  • Racial or ethnic origin, e.g. diversity and inclusion data
  • Political opinions
  • Religious or philosophical beliefs
  • Trade union membership

The definition also covers:

  • Genetic data
  • Biometric data (where this is used for identification purposes)
  • Data concerning health, e.g. medical records, sickness records, accessibility requirements and so on
  • Data concerning a person’s sex life or their sexual orientation, e.g. diversity and inclusion data

Inferring special category data

Sometimes your teams might not realise they’re collecting and using special category data, but they might well be.

It’s likely that if you have inferred or assumed something based on what you know about someone, for example that they’re likely to hold certain political opinions, or to suffer from a certain health condition, you will be handling special category data.

There was an interesting ICO investigation into an online retailer which found it was targeting customers who’d bought certain products, assuming from this they were likely to be arthritis sufferers. This assumption meant the retailer was judged to be processing special category data.

If you collect information about dietary requirements these could reveal religious beliefs, for example halal and kosher. It’s also worth noting in 2020 a judge ruled that ethical veganism qualifies as a philosophical belief under the Equality Act 2010.

Other ‘sensitive’ data

There’s sometimes confusion surrounding what might be considered ‘sensitive’ data and what constitutes special category data. I hear people ask, ‘Why is financial data not considered as sensitive as health data or ethnic origin?’ Of course, people’s financial details are sensitive, and organisations still need to make sure they’ve got appropriate measures in place to protect such information and keep it secure. However, UK GDPR (and the EU GDPR) sets out specific requirements for special category data which don’t directly apply to financial data.

To understand why, it’s worth noting special protection for data such as ethnicity, racial origin, religious beliefs and sexual orientation was born in the 1950s, under the European Convention on Human Rights, after Europe had witnessed people being persecuted and killed.

Special Category Data Requirements

In a similar way to all personal data, any handling of special category data must be lawful, fair and transparent. Organisations need to make sure their collection and use complies with all the core data protection principles and requirements of UK GDPR. For example:

  • Do you have a clear purpose and reason for collecting/using special category data?
  • Have you identified a lawful basis? For example:
    • Is this data necessary in order for you to fulfil a contract you have with the individual?
    • Are you legally obliged to hold this data?
    • Should you be seeking their consent?
    • Or is there another appropriate lawful basis? See our Quick Guide to Lawful Bases.
  • Have you told people what their special category data will be used for? What does your Privacy Notice tell people? Have people seen your Privacy Notice?
  • Can you minimise the amount of special category data you are collecting?
  • Have you decided how long this data will be kept for?
  • How will you make sure this data is not used for another different purpose?
  • What security measures will you put in place? e.g. can you limit who has access to this data?

What makes special category data unique is that it’s considered higher risk than other types of data, and that you’re also required to choose a special category condition.

Other key considerations and requirements

Risk Assessments

Confirm whether you need to conduct a Data Protection Impact Assessment for your planned activities using special category data. DPIAs are mandatory for any type of processing which is likely to be high risk. This means a DPIA is more likely to be needed when handling special category data. That’s not to say it will always be essential; it will depend on the necessity, nature, scale and purpose of your use of this data.

Special Category Condition

Alongside a lawful basis, there’s an additional requirement to consider your purpose(s) for processing this data and to select a special category condition. These conditions are set out in Article 9, UK GDPR.

(a) Explicit consent
(b) Employment, social security and social protection (if authorised by law)
(c) Vital interests
(d) Not-for-profit bodies
(e) Made public by the data subject
(f) Legal claims or judicial acts
(g) Reasons of substantial public interest (with a basis in law)
(h) Health or social care (with a basis in law)
(i) Public health (with a basis in law)
(j) Archiving, research and statistics (with a basis in law)

Associated condition in UK Law

Five of the above conditions are solely set out in Article 9. The others require specific authorisation or a basis in law, and you’ll need to meet additional conditions set out in the Data Protection Act 2018.

If you are relying on any of the following you also need to meet the associated condition in UK law. This is set out in Part 1, Schedule 1 of the DPA 2018.

  • Employment, social security and social protection
  • Health or social care
  • Public health
  • Archiving, research and statistics.

If you are relying on the substantial public interest condition you also need to meet one of 23 specific substantial public interest conditions set out in Part 2 of Schedule 1 of the DPA 2018.

The ICO tells us for some of these conditions, the substantial public interest element is built in. For others, you need to be able to demonstrate that your specific processing is ‘necessary for reasons of substantial public interest’, on a case-by-case basis. The regulator says we can’t have a vague public interest argument, we must be able to ‘make specific arguments about the concrete wide benefits’ of what we are doing.

Appropriate Policy Document (APD)

Almost all of the substantial public interest conditions, plus the condition for processing employment, social security and social protection data, require you to have an APD in place. The ICO Special Category Guidance includes a template appropriate policy document.

Privacy Notice

A privacy notice should explain your purposes for processing and the lawful basis being relied on in order to collect and use people’s personal data, including any special category data. Remember, if you’ve received special category data from a third party, this should be transparent and people should be provided with your privacy notice.

Data breach reporting

You only have to report a breach to the ICO if it is likely to result in a risk to the rights and freedoms of individuals, and if left unaddressed the breach is likely to have a significant detrimental effect on individuals. Special category data is considered higher risk data, and therefore if a breach involves data of this nature, it is more likely to reach the bar for reporting. It is also more likely to reach the threshold of needing to notify those affected.

In summary, training and raising awareness are crucial to make sure employees understand what special category data is, how it might be inferred, and to know that collecting and using this type of data must be done with care.

Why the Tory app data breach could happen to anyone

June 2024

Shakespeare wrote (I hope I remembered this correctly from ‘A’ level English), ‘When sorrows come, they come not single spies but in battalions.’ He could’ve been writing about the UK Conservative Party which, let’s be honest, hasn’t been having a great time recently.

The Telegraph is reporting the party suffered its second data breach in a month. An error with an app led to the personal information of leading Conservative politicians – some in high government office – being available to all app users.

Launched in April, the ‘Share2Win’ app was designed as a quick and easy way for activists to share party content online. However, a design fault meant users could sign up to the app using just an email address. Then, in just a few clicks, they were able to access the names, postcodes and telephone numbers of all other registrants.

This follows another recent Tory Party email blunder in May, where all recipients could see each other’s details. See our article on email data breaches.

In the heat of a General Election, some might put these errors down to ‘yet more Tory incompetence’. I’d say, to quote another famous piece of writing, ‘He that is without sin among you, let him first cast a stone’! There are plenty of examples where other organisations have failed to take appropriate steps to make sure privacy and security are baked into their app’s architecture. And this lack of oversight extends beyond apps to webforms, online portals and more. It’s depressingly common, and easily avoided.

In April, a housing association was reprimanded by the ICO after launching an online customer portal which allowed users to access documents (revealing personal data) they shouldn’t have been able to see. These related to, of all things, anti-social behaviour. In March the ICO issued a reprimand to the London Mayor’s Office after users of a webform could click a button and see every other query submitted. And the list goes on. This isn’t a party political issue. It’s a lack of due process and carelessness issue.

It’s easy to see how it happens, especially (such as in a snap election) when there’s a genuine sense of urgency. Some bright spark has a great idea, senior management love it, and demand it’s implemented pronto! Make it happen! Be agile! Be disruptive! (etc).

But there’s a sound reason why the concept of data protection by design and by default is embedded into data protection legislation, and it’s really not that difficult to understand. As the name suggests, data protection by design means baking data protection into business practices from the outset; considering the core data protection principles such as data minimisation and purpose limitation as well as integrity & confidentiality. Crucially, it means not taking short-cuts when it comes to security measures.

GDPR may have its critics, but this element is just common sense. It’s something most people would get on board with. A clear and approved procedure for new systems, services and products which covers data protection and security is not a ‘nice to have’ – it’s a ‘must have’. It can go a long way to protect individuals and mitigate the risk of unwelcome headlines further down the line, when an avoidable breach puts your customers’, clients’ or employees’ data at risk.

Should we conduct a DPIA?

A clear procedure can also alert those involved to when a Data Protection Impact Assessment is required. A DPIA is mandatory in certain circumstances where activities are higher risk, but even when not strictly required it’s a handy tool for picking up on any data protection risks and agreeing measures to mitigate them from Day One of your project. Many organisations would also want to make sure there’s oversight by their Information Security or IT team, in the form of an Information Security Assessment for any new applications.

Developers, the IT team and anyone else involved need to be armed with the information they need to make sound decisions. Data protection and information security teams need to work together to develop apps (or other new developments) which aren’t going to become a leaky bucket. Building this in from the start actually saves time too.

In all of this, don’t forget your suppliers. If you want to outsource the development of an app to a third-party supplier, you need to check their credentials and make sure you have necessary controller-to-processor contractual arrangements and assessment procedures in place – especially if once the app goes live, the developer’s team still has access to the personal data it collects. Are your contractors subbing work to other third party subcontractors? Do they work overseas? Will these subcontractors have access to personal data?

The good news? There’s good practice out there. I remember a data protection review DPN conducted a few years back. One of the areas we looked at was an app our client had developed for students to use. It was a pleasure to see how the app had been built with data protection and security at its heart. We couldn’t fault the team who designed it – and as a result the client didn’t compromise their students, face litigation, look foolish or get summoned to see the Information Commissioner!

In conclusion? Yes, be fast. Innovate! Just remember to build your data protection strategy into the project from Day One.

How to manage employees’ WhatsApp use

WhatsApp is a great communication tool. Millions use it for chatting with friends, vitally important stuff like sharing cat/dog memes and organising our daily lives. However, what about using messaging apps in a work context? It certainly raises some challenges and data protection concerns.

Inappropriate use of messaging apps can, and has, resulted in serious consequences for both employees and employers. WhatsApp is an excellent example of how technology can blur our private and professional lives. It’s easy to see how it happens – it’s just so darn convenient. Not to mention virtually free.

There have been a number of high-profile cases where WhatsApp messages have led to reputational damage, as well as individuals and organisations being penalised. From police officers and firefighters sending racist, sexist and homophobic content in ‘private’ groups, to politicians and civil servants failing to retain or surrender WhatsApp messages to public inquiries. Aggrieved employees have won damages in tribunal cases for being excluded from work-related group chats. Then there was the famous case of former Health Secretary, Matt Hancock, who handed over thousands of sensitive political messages to a journalist he was working with on his autobiography!

This smorgasbord of drama comes before data protection even enters the picture. 26 members of staff at NHS Lanarkshire used a WhatsApp group on multiple occasions to share patient data; names, phone numbers, addresses, images, videos and screenshots were shared, including sensitive clinical information. Police officers were caught sharing crime scene images. And so on.

These are egregious examples. In others, however, Gen Z can be cut some slack. They live in an era of fast-moving technology and take instant messaging for granted.

The risks are evident. Employers might have limited control over employees setting up their own WhatsApp groups, which are routinely private and set up on personal mobiles. But left unchecked? They can lead to the sharing of offensive content, confidential or commercially sensitive information, or can be the cause of a personal data breach.

Furthermore, employers have no control over how messages are then shared to any number of recipients beyond the organisation. In fact, employers might not know a group exists until a problem arises. In the wrong hands, messaging apps can be like the world’s leakiest chain email.

Mitigating the risks

In light of the risks, an outright ban on the use of WhatsApp for work-related matters may seem like a good idea, but in practice in many organisations this is unlikely to be enforceable. So what can employers do to mitigate the risks?

The answer probably lies in raising awareness, educating staff and setting clear boundaries. Clear policy guidelines on the use of messaging apps such as WhatsApp can help to prevent something nasty flaring up. In much the same way as you would tell people what is deemed acceptable use for email and internet use in the workplace, you can extend this to WhatsApp. Policy guidelines can clearly set out:

📌 what’s acceptable and unacceptable content

📌 don’t share sensitive company information

📌 don’t share personal information relating to customers, business partners, colleagues and so on

📌 don’t share images of people, especially children or vulnerable people

📌 don’t use WhatsApp to harass or bully other employees

📌 don’t deliberately exclude people from a work-related group chat without a good reason.

📌 the risks & consequences of inappropriate use for those involved

Your policy guidelines can distinguish between different types of group. For example, making it clear a WhatsApp group set up to arrange after-work socialising, be it a sports team or going for drinks, is either work-sanctioned or it isn’t. If it isn’t, the responsibility for the content of the chat lies with the users of that group. A fair, transparent policy is unlikely to be criticised if applied consistently and fairly.

Guidelines can be created with clear examples and case studies which resonate with your staff. There’s no shortage of examples out there – several police officers in the example above were sent to prison. Regularly remind people and consider including an ‘acceptable use of WhatsApp’ input during team training.

Should line managers, as part of their duties, be asked to act as moderators or gatekeepers for such groups? Should the DPO be asked to dip sample them? It might work for some organisations.

You can send a clear warning to staff that a breach of the policy is likely to lead to disciplinary action. You can also warn them that WhatsApp messages can be (and have been!) used in evidence in legal disputes and civil litigation. They might think what they are doing is private, but it might turn out not to be.

Given its huge popularity, there’s little doubt WhatsApp (or similar apps) will continue to be widely used as a simple and cost-effective way of communicating with people in the workplace. But, as with any form of communication, the key is to remain clear, open and transparent about the rules of use to make sure the rights of employees and the data your organisation handles remains protected.

Managing data deletion, destruction and anonymisation

How to keep what you need and get rid of what you don't

Clearing out personal data your business no longer needs is a really simple concept, but in practice it can be rather tricky to achieve! It throws up key considerations, such as whether to anonymise, or how to make sure data is deleted or securely destroyed. Let’s take a look at the key considerations and how to implement a robust plan.

Data retention requirements and risks

Data protection law stipulates organisations must only keep personal data as long as necessary and only for the purposes they have specified. There are risks associated with both keeping personal data too long, or not keeping it long enough. These risks include, but are not limited to:

  • causing the impact of a personal data breach to be significantly worse – i.e. it involves personal data which an organisation has no justification for keeping. Regulatory enforcement action could be more severe and the damage to an organisation’s reputation worse. This also raises the risk of class actions or individual compensation claims.
  • falling foul of relevant laws by failing to keep records for legally-defined periods.
  • an inability to respond to complaints, litigation or regulatory enforcement for failing to keep data necessary to meet contractual or commercial terms.

Data retention policy and schedule

To manage this legal obligation successfully, you’ll need to start with an up-to-date data retention policy and schedule. These should clearly identify which types of personal data your business processes, for what purposes, how long each should typically be kept and under what circumstances you might need to hold it for longer.
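In practice, a retention schedule can start as something as simple as a lookup from record type to retention period. Here’s a minimal sketch in Python; the record categories and periods are purely illustrative assumptions, not legal advice:

```python
from datetime import date, timedelta

# Hypothetical retention schedule: record type -> retention period in days.
# These categories and periods are illustrative only, not legal advice.
RETENTION_SCHEDULE = {
    "recruitment_records": 180,     # e.g. 6 months after an unsuccessful application
    "payroll_records": 6 * 365,     # e.g. 6 years for tax purposes
    "marketing_consents": 2 * 365,  # e.g. 2 years after last contact
}

def retention_due_date(record_type: str, created: date) -> date:
    """The date a record of this type falls due for retention review."""
    return created + timedelta(days=RETENTION_SCHEDULE[record_type])

def is_due_for_review(record_type: str, created: date, today: date) -> bool:
    """True once the agreed retention period has been reached."""
    return today >= retention_due_date(record_type, created)
```

A real schedule would also record the purpose for holding each record type and the circumstances which might justify holding it for longer.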

If your data retention policy or schedule is lacking, first focus on making sure these are brought up to scratch. Our Data Retention Guidance has some useful templates.

5 Key steps when the retention period is reached

When an agreed retention period is reached (as per your retention schedule), we’d recommend taking the following steps:

  1. Identify the relevant records which have reached their retention period
  2. Notify the relevant business owner to confirm the data is no longer needed
  3. Consider any changes in circumstances which may require longer retention of the data
  4. Make a decision on what happens to the data
  5. Document the decision and keep evidence of the action
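The steps above lend themselves to a simple audit record. Here’s a sketch, assuming hypothetical field names and outcomes; the point is that logging each decision gives you the documented evidence trail step 5 asks for:

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class Outcome(Enum):
    DELETE = "delete"        # usually the default option
    ANONYMISE = "anonymise"  # keep non-identifying data for analysis
    DESTROY = "destroy"      # securely destroy physical records
    RETAIN = "retain"        # changed circumstances, e.g. a legal hold

@dataclass
class RetentionDecision:
    record_id: str    # the record identified as due (step 1)
    owner: str        # business owner consulted (steps 2-3)
    outcome: Outcome  # the decision made (step 4)
    reason: str       # e.g. "retention period reached, no legal hold"
    decided_on: date  # when the action was taken

# The decision log itself is the documented evidence (step 5).
decision_log: list[RetentionDecision] = []

def record_decision(decision: RetentionDecision) -> None:
    decision_log.append(decision)
```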

Making the right decision when the retention period is reached

There are different approaches an organisation can take when the data retention period is reached, such as:

  • Delete it – usually the default option
  • Anonymise it
  • Securely destroy it – for physical records, such as HR files

Deletion of records might seem the obvious choice, and it’s often the best one too, but take care how you delete data. Sometimes deleting whole records can affect key processes on your systems such as reporting, algorithms and other programs. Check with your IT colleagues first.

Anonymisation

Most organisations want to extract increasing information and value from their digital assets. In some situations, it can be helpful to remove any personal identifiers so you can keep the data that remains after the retention period has been reached. For example:

  • You might want to continue to provide management information or historical analysis, which you can do in an anonymised form. This is quite common
  • If you have data of historic marketing campaign responders, you may wish to keep certain non-personal campaign data in an anonymised form for reporting or analytical purposes, such as response volumes by segment, phasing of responses, and so on
  • If you hold records of job applicants you may wish to keep certain demographics (such as gender or diversity information) in an anonymised form. This might support your equal opportunities endeavours

To be clear, anonymisation is the process of removing ALL information which could be used to identify a living person, so the data that remains can no longer be attributed back to any unique individuals.

Once these personal identifiers are deleted, data protection laws do not apply to the anonymised information that remains, so you may continue to hold it. But you have to make sure it is truly anonymised.

The ICO stresses you should be careful when attempting to anonymise information. For the information to be truly anonymised, you must not be able to re-identify individuals.  If at any point reasonably available means could be used to re-identify the individuals, the data will not have been effectively anonymised, but will have merely been pseudonymised. This means it should still be treated as personal data.

Whilst pseudonymising data does reduce the risks to data subjects, in the context of retention it is not sufficient for personal data you no longer need to keep.
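The difference can be shown in a few lines of Python. The record and field names below are hypothetical; the point is that dropping identifiers removes them entirely, whereas hashing them merely pseudonymises the data, since anyone who holds or can rebuild the mapping could re-identify the individual:

```python
import hashlib

# Hypothetical record from a marketing campaign dataset.
record = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "segment": "A",
    "responded": True,
}

IDENTIFIERS = {"name", "email"}

def anonymise(rec: dict) -> dict:
    """Drop direct identifiers entirely; what remains can't be linked back."""
    return {k: v for k, v in rec.items() if k not in IDENTIFIERS}

def pseudonymise(rec: dict) -> dict:
    """Replace identifiers with a hash. This is still personal data: anyone
    holding (or able to rebuild) the mapping can re-identify the person."""
    out = dict(rec)
    for k in IDENTIFIERS & rec.keys():
        out[k] = hashlib.sha256(rec[k].encode()).hexdigest()
    return out
```

Note that even after dropping direct identifiers, the remaining fields can act as quasi-identifiers in small datasets, so whether data is truly anonymised still needs to be assessed case by case.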

How to manage deletion

There are software methods of deleting data, which may involve removing whole records from a dataset or overwriting them, for example using zeros and ones to overwrite the personal identifiers in the data.

Once the personal identifiers are overwritten, that data will be rendered unrecoverable, and therefore it’s no longer classed as personal data.
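As a simplified illustration, here’s how a file’s contents might be overwritten in place before removal. This is a sketch under the assumption of a straightforward filesystem; on SSDs, journalling filesystems and backed-up storage, overwriting alone does not guarantee the data is unrecoverable:

```python
import os

def overwrite_and_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file's contents in place before unlinking it.
    Simplified sketch: real-world secure deletion depends on the
    storage medium and any backup copies of the data."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(b"\x00" * size)  # overwrite with zeros
            f.flush()
            os.fsync(f.fileno())     # push the overwrite to disk
    os.remove(path)
```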

This deletion process should include backup copies of data. Whilst personal data may be instantly deleted from live systems, personal data may still remain within the backup environment, until it is overwritten.

If the backup data cannot be immediately overwritten it must be put ‘beyond use’, i.e. you must make sure the data is not used for any other purpose and is simply held on your systems until it’s replaced, in line with an established schedule.

Examples of where data may be put ‘beyond use’ are:

  • When information should have been deleted but has not yet been overwritten
  • Where information should have been deleted but it is not possible to delete this information without also deleting other information held in the same batch

The ICO (for example) will be satisfied that information is ‘beyond use’ if the data controller:

  • is not able, or will not attempt, to use the personal data to inform any decision about any individual or in a way that affects them;
  • does not give any other organisation access to the personal data;
  • has in place appropriate technical and organisational security; and
  • commits to permanently deleting the information if, or when, this becomes possible.

Destruction of physical records

Destruction is the final action for about 95% of most organisations’ physical records. Physical destruction may include shredding, pulping or burning paper records.

Destruction is likely to be the best course of action for physical records when the organisation no longer needs to keep the data, and when it does not need to hold data in an anonymised format.

Controllers are accountable for the way personal data is processed and consequently, the disposal decision should be documented in a disposal schedule.

Many organisations use other organisations to manage their disposal or destruction of physical records. There are benefits of using third parties, such as reducing in-house storage costs.

Remember, third parties providing this kind of service will be regarded as a data processor, therefore you’ll need to make sure an appropriate contract is in place which includes the usual data protection clauses.

Destruction may be carried out remotely following an agreed process. For instance, a processor might provide regular notifications of batches due to be destroyed in line with documented retention periods.

Don’t forget unstructured data!

Retention periods will also apply to unstructured data which contains personal identifiers. The most common examples are electronic communications records such as emails, instant messages, call recordings and so on.

As you can imagine, unstructured data records present some real challenges. You’ll need to be able to review the records to find any personal data stored there, so it can be deleted in line with your retention schedules, or for an erasure request.

Depending on the size of your organisation, you may need to use specialist software tools to perform content analysis of unstructured data.
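To illustrate the kind of content analysis such tools perform, even a crude scan can flag likely identifiers in free text for human review. The two patterns below are deliberately simple assumptions; specialist software detects far more (names, addresses, context and so on):

```python
import re

# Deliberately simple, illustrative patterns; real tools go much further.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
UK_PHONE_RE = re.compile(r"(?:\+44\s?\d{4}|\b0\d{4})\s?\d{3}\s?\d{3}\b")

def scan_for_identifiers(text: str) -> dict[str, list[str]]:
    """Flag likely personal identifiers in free text for human review."""
    return {
        "emails": EMAIL_RE.findall(text),
        "phones": UK_PHONE_RE.findall(text),
    }
```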

In summary, whilst data retention as a concept appears straightforward, it does require some planning, clearly assigned responsibilities for implementing retention periods, and the technical means to do so effectively.

The three foundations of good data governance

January 2024

People, processes and technologies

Creating a clear data governance strategy is crucial to making sure data is handled in line with your organisation’s aims and industry best practice.

Data governance is often thought of as the management process by which an organisation protects its data assets and ensures compliance with data laws, such as GDPR. But it’s far broader than compliance. It’s a holistic approach to data and should have people at its very heart. People with defined roles, responsibilities, processes and technologies which help them make sure data (not just personal data) is properly looked after and wisely used throughout its lifecycle.

How sophisticated your organisation’s approach needs to be will depend on the nature and size of your business, the sensitivity of the data you hold, the relationships you have with business partners, and customer or client expectations.

Benefits of good data governance

There are many benefits this activity can bring, including:

  • Minimising risks to the business, your employees, customers and suppliers
  • Giving your people clarity around expected behaviours and best practices
  • Embedding compliance requirements

A strong data governance approach can also help an organisation to make the most of their data assets, improve customer experience and benefits, and leverage competitive advantage.

Data governance – where to start?

There are three foundational elements which underpin successful data governance – People, Processes and Technologies.


People

Engaging with stakeholders across the organisation to establish and embed key roles and responsibilities for data governance.

Many organisations look to establish a ‘Data Ownership Model’ which recognises data governance is an organisational responsibility which requires close collaboration across different roles and levels, including the delegation of specific responsibilities for data activities.

Here are some examples of roles you may wish to consider:

  • Data strategy lead – such as Chief Data Officer / Chief Digital Officer
  • Data protection lead – such as Data Protection Officer (DPO), if you have one
  • Information security lead – such as Chief Information Security Officer (CISO) or Chief Technology Officer
  • Information asset owners (or data owners) – leaders of business functions / teams which collect and/or use personal data for particular purposes. Such as HR, Marketing & Sales, Finance, Operations, and so on.
  • Data specialists – heavy users of complex datasets, such as data analysts and data scientists.
  • System owners – the people who manage the key systems which hold personal data, such as IT managers.

Processes

Think about all the processes, policies, operating procedures and specialist training provided to guide your employees and contractors, enabling them to handle data in line with your business expectations as well as to comply with the law.

Without these in place and regularly updated, your people can’t possibly act in the ways you want and expect them to.

In my experience, success comes from keeping these items concise, and as relevant and engaging as possible. They can easily be forgotten or put in the ‘maybe later’ pile…  a little time and effort can really pay dividends!

Technologies

The technologies which underpin all data activities across the data lifecycle. For example, your HR, marketing & CRM, accounting and other operational systems you use regularly. Data governance requires those responsible for adopting technologies to make sure appropriate standards and procedures are in place covering:

  • Accessibility and availability standards
  • Data accuracy, integrity and quality management
  • Privacy and security

Looking at privacy technology in particular, the solutions available have really progressed in recent years in terms of both their capability and ease of use. They give DPOs and others with an interest in data protection clear visibility of where the risks lie, help to prioritise those risks, and point to relevant solutions. They can also give the senior leadership team clear visibility and oversight.

The ‘Accountability Principle’

Data governance goes hand in hand with accountability – one of the core principles under GDPR. This requires organisations to be ready to demonstrate the measures and controls they have to protect personal data and in particular, show HOW they comply with the other data protection principles.

Appropriate measures, controls and records need to be in place to evidence accountability. For example, a Supervisory Authority (such as the ICO) may expect organisations to have:

  • A data protection programme, with clear data ownership & governance and regular reporting up to business leaders
  • Training and policies to guide staff
  • Records of data mapping exercises and processing reviews, such as an Information Asset Register and Record of Processing Activities
  • Risk assessments, such as Data Protection Impact Assessments and Legitimate Interests Assessments
  • Procedures for handling of individual privacy rights and data breaches
  • Contracts in place between organisations which include the relevant data protection clauses, including arrangements for restricted international data transfers
  • Data sharing agreements

Ready to get started?

If you’re keen to reap the benefits of improved compliance and reduced risk to the business, the first and crucial step is getting buy-in from senior leadership and a commitment from key stakeholders, so I’d suggest you kick-off by seeking their support.

Data Protection Policies – what do businesses need?

September 2023

Under EU and UK data protection law businesses need to make sure they have ‘appropriate technical and organisational measures’ in place to protect personal data. Organisational measures include making sure staff receive adequate data protection training and guidance about how they should handle personal data.

In my experience, people are keen to ‘do the right thing’ with personal data, but are sometimes unsure how to go about it.

This is where well-crafted policies can really help, sitting alongside and integrated with employee training. Unfortunately, people often have a negative view of policies. Long-winded policies, full of impenetrable jargon which regurgitates the law, can turn people off.

A vanilla, one-size-fits-all approach has little value… but there’s a much better way. A well-written, easy-to-read, concise policy can communicate ‘what good looks like’ for your business and explain how your people should behave to deliver good practice.

Yes, you absolutely need to take into account what the law says. A policy should identify key risk areas, but crucially it should also tell your people how they should act to meet your company standards – which include legal compliance.

Don’t shy away from stressing the benefits for your business of acting responsibly. Focus on the needs of your business sector and the unique nature of your business’s processing.

Make policies relevant to your workforce and how your business operates. Even better, if you can, tie the launch of improved data policies in with data protection training which shares the main themes from the policies – this can really bring them to life, improve awareness and reinforce positive behaviours.

What data protection related policies are needed?

First decide which policies you actually need and how they should fit together. My favoured approach is to have just two ‘parent’ data policies, a Data Protection Policy and an Information Security Policy, then link out to ‘child’ policies or procedures which sit below them.

You might consider a third parent policy, such as Acceptable Use, but personally I prefer information about acceptable use to be included within the Data Protection and Information Security policies, so people don’t have to search around.

Here’s a typical Policy Framework, showing the two ‘parent’ policies and examples of possible ‘child’ policies or procedures below.

The range of policies you’ll need will vary from business to business. A small company with a handful of employees, processing relatively low-risk data, won’t need a raft of policies.

Many micro or small businesses may just focus on having a Data Protection Policy (which covers the data lifecycle from creation through to retention) and an Information Security Policy. Alongside these you’ll definitely need a clear procedure for handling data breaches and individual privacy rights.

How to write helpful, practical data protection policies

As mentioned, too often policy documents are littered with legalese and jargon. Sometimes it feels like a policy has to be formal and massively detailed. Not true. People shouldn’t need a lot of specialist knowledge to understand your policies, particularly those aimed at ALL staff. Straightforward instructions are more likely to be read, which means more people are likely to follow them.

Take a look at the way your policies are written. Are they a bit dry? If they could do with freshening up, here are some simple do’s and don’ts to consider:

Do’s

  • use everyday words in place of jargon
  • explain any necessary terminology in plain English
  • break up blocks of text with headings, lists and tables
  • highlight key messages you want to get across
  • include useful tips
  • give useful examples tailored to your business
  • rope in your Comms or L&D team to help simplify things (or anyone who’s good with words)
  • cut out detail by linking to other related policies, guidelines, procedures
  • ask for feedback – how often do people use them? Do they find them helpful? What would make them better?

Don’ts

  • avoid complex language / legalese
  • avoid ‘insider’ jargon – why say ‘data subject’ if you could say people, individuals, customers, patients etc?
  • avoid cut-and-paste definitions from GDPR text – where you use data protection terms, such as controller, processor, third-party, anonymisation, automated decision-making explain what these mean in layman’s terms
  • avoid information overload

Of course, balance is important. While overly complex policies will gather dust, we need to include enough useful and important information to get key messages across – without talking down to people or patronising them.

Of course, we also need to make sure people are aware of relevant policies and can easily lay their hands on them.

How to communicate data protection policies

I’d recommend you host policies on your Intranet, if you have one, and create them in the form of web pages rather than PDFs. It’s good practice to include hyperlinks to and from topic-specific guidance notes, so people can easily navigate to find more about a specific topic. This helps you to keep the parent policies short and concise – easy to digest.

When you carry out data protection training, remind people where to find related policies. In fact throughout the year use near-misses, news stories and other events to reinforce key messages and point to your policies.

Well-crafted easy to digest data protection related policies will go a long way to guide staff on how you expect them to handle and keep personal data secure in their day-to-day roles. But as always proportionality is key, a smaller business handling fairly insensitive data wouldn’t be expected to have multiple policies.

Data breaches – human or a catalogue of errors?

August 2023

Why systems fail

The recent spate of serious data breaches, not least the awful case involving the Police Service of Northern Ireland (PSNI), left me wondering: who’s really to blame? We’re used to hearing about human error, but is it too easy to point the finger?

Is it really the fault of the person who pressed the send button? An old adage comes to mind, ‘success has a thousand fathers, failure is an orphan.’

Of course, people make mistakes. Training, technology and procedures can easily fail if ignored, either wilfully or otherwise. Yes, people are part of the equation. But that’s what it is. An equation. There are usually other factors at play.

In the PSNI case – one involving safety-critical data – I would argue that any system allowing such unredacted material to enter an FOIA environment in the first place was flawed.

Nobody is immune from human error. About nine years ago, on my second day in a new compliance role, I left my rucksack on the train. Doh! Luckily, there was no personal data relating to my new employer inside. I lost my workplace starter pack and had to cancel my debit card. I recall the sinking feeling as my new boss said, ‘well, that’s a bit embarrassing for someone in your job’. It was. But I knew it could have been so much worse.

Approximately 80% of data breaches are classified by the Information Commissioner’s Office as being caused by human error. Common mistakes include:

  • Email containing personal data sent to the wrong recipients
  • Forwarding attachments containing personal data in error
  • Failing to notice hidden tabs or lines in spreadsheets which contain personal data (this is one of the causes cited in the PSNI case)
  • Sensitive mail going to the wrong postal address (yes, a properly old-fashioned dead wood data breach!)
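The hidden-tab risk in particular can be checked for automatically before a file is released. As a minimal sketch (assuming the modern .xlsx format, which is a ZIP archive whose xl/workbook.xml part records each sheet’s visibility), a release process could flag any workbook containing hidden or ‘very hidden’ sheets for manual review:

```python
import re
import zipfile

def find_hidden_sheets(xlsx_file):
    """Return the names of worksheets marked hidden or veryHidden.

    An .xlsx file is a ZIP archive; sheet visibility is recorded in the
    xl/workbook.xml part as state="hidden" or state="veryHidden".
    Accepts a file path or a file-like object.
    """
    with zipfile.ZipFile(xlsx_file) as zf:
        workbook_xml = zf.read("xl/workbook.xml").decode("utf-8")

    hidden = []
    for match in re.finditer(r"<sheet\b[^>]*>", workbook_xml):
        tag = match.group(0)
        if 'state="hidden"' in tag or 'state="veryHidden"' in tag:
            name = re.search(r'name="([^"]*)"', tag)
            if name:
                hidden.append(name.group(1))
    return hidden
```

Note this only catches hidden sheets, not hidden rows or columns within a visible sheet – one reason flattening a response to plain CSV, plus human review, remains good practice.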

However, sometimes I hear about human error breaches and don’t think ‘how did someone accidentally do that?’ Instead, I wonder…

  • Why didn’t anyone spot the inherent risk of having ALL those records in an unprotected spreadsheet in the first place?
  • Why wasn’t there a system in place to prevent people being able to forget to blind copy email recipients?
  • Is anyone reviewing responses to Data Subject Access Requests or FOI requests? What level of supervision / QA exists in that organisation?
  • Why is it acceptable for someone to take confidential papers out of their office?

I could go on.
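Some of these gaps are cheap to close. The blind-copy failure, for instance, could be caught by a pre-send check – the sketch below is illustrative only (the threshold and how it hooks into your mail system are assumptions, not a recipe):

```python
def bcc_warning(to_recipients, cc_recipients, threshold=10):
    """Return a warning string if a bulk mailing would expose recipients.

    A large number of addresses in To/Cc (rather than Bcc) is a classic
    cause of data breaches. The threshold is arbitrary and would need
    tuning for your organisation.
    """
    visible = len(to_recipients) + len(cc_recipients)
    if visible > threshold:
        return (f"{visible} visible recipients – consider moving them to Bcc, "
                "or use a mailing tool which addresses each person individually.")
    return None
```

A check like this doesn’t stop anyone working; it simply makes the risky path require a deliberate second click.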

Technical and Organisational Measures (TOMs)

Rather than human error, should we be blaming a lack of appropriate technical and organisational measures (TOMs) to protect personal data? A fundamental data protection requirement.

We all know robust procedures and security measures can mitigate the risk of human error. A simple example – I know employees who receive an alert if they’re about to send an attachment containing personal data without a password.
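A check like that alert is straightforward to build. One rough heuristic (a sketch under stated assumptions, not a definitive implementation): modern Office files such as .xlsx and .docx are ZIP archives when unprotected, but are wrapped in an OLE compound container when password-encrypted, so the file’s magic bytes give a quick signal. Note that legacy formats such as .xls and .doc are OLE containers even without a password, so the check only makes sense for modern Office extensions:

```python
# First eight bytes of an OLE compound file container,
# used by password-encrypted modern Office documents.
OLE_MAGIC = b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1"

def looks_password_protected(path):
    """Rough check for modern Office attachments (.xlsx, .docx, .pptx).

    Returns False for a plain ZIP (unencrypted), True for an OLE
    container (likely password-encrypted), or None for anything else,
    which should be flagged for manual review.
    """
    with open(path, "rb") as f:
        header = f.read(8)
    if header.startswith(b"PK"):
        return False  # plain ZIP archive => no password applied
    if header == OLE_MAGIC:
        return True   # OLE container => likely encrypted
    return None       # unknown format; review by hand
```

Wired into a send hook, a False result could trigger exactly the kind of ‘are you sure?’ prompt described above.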

Alongside this, data protection training is a must, but it should never be a ‘tick box’ exercise. It shouldn’t be a case of annual online training module completed; no further action required! We need to make sure training is relevant and effective and delivers key learning points and messages. Training should be reinforced with regular awareness campaigns. Using mistakes (big or small) as case studies is a good way to keep people alert to the risks. This is another reason why post-event investigation is so important as a lesson-learning exercise.

Rather than being a liability, if we arm people with enough knowledge they can become our greatest asset in preventing data breaches.

Chatting with my husband about this, he mentioned a boss once asking him to provide some highly sensitive information on a spreadsheet. Despite the seniority and insistence of the individual, my husband refused. He offered an alternative solution, with protecting people’s data at heart. Armed with enough knowledge, he knew what he had been asked to do was foolhardy.

Lessons from previous breaches

It’s too early to say precisely what led to these recent breaches:

  • The Police Service of Northern Ireland releasing a spreadsheet containing the details of 10,000 police officers and other staff in response to a Freedom of Information request
  • Norfolk and Suffolk Police accidentally releasing details of victims and witnesses of crime
  • A Scottish genealogy website revealing thousands of adopted children’s names.

However, we can learn from previous breaches and the findings of previous ICO investigations.

You may recall the case of Heathrow Airport’s lost unencrypted memory stick. Although ostensibly a case of human error, the ICO established the Airport failed not only ‘to ensure that the personal data held on its network was properly secured’, but also to provide sufficient training in relation to data protection and information security. The person blamed for the breach was unaware the memory stick should have been encrypted in the first place.

Then there was the Cabinet Office breach in which people’s home addresses were published publicly in the New Year’s Honours list. The person who published the list must’ve had a nightmare when they realised what had happened. But the ICO findings revealed a new IT system was rushed in and set up incorrectly. The procedure given for people to follow was incorrect. A tight deadline meant short-cuts were taken. The Cabinet Office was found to have been complacent.

The lesson here? Data breaches aren’t always solely the fault of the person pressing the ‘send’ button. Too often, systems and procedures have already failed. Data protection is a mindset. A culture. Not an add-on. As the PSNI has sadly discovered, in the most awful of circumstances.

The impact breaches can have on employees, customers, victims of crime, patients and so on, can be devastating. Just the knowledge that their data is ‘out there’ can cause distress and worry.

Data protection law doesn’t spell out exactly what businesses must do. To know where data protection risks lie, we need to know what personal data we have across the business and what it’s being used for. Risks need to be assessed and managed, and the measures put in place need to be proportionate to the risk.

Data Protection Impact Assessments Guide

July 2023

A quick guide to managing DPIAs

This short guide to Data Protection Impact Assessments covers what a DPIA is and when it’s mandatory to conduct one under UK GDPR and EU GDPR. It also includes helpful tips on how to manage the process.

DPIAs not only help to protect people’s data, they also help to protect the business.