Are you collecting more data than you need?

Five good reasons to apply the data minimisation principle

How often when completing an online form, or downloading a new app, do you think, “why do they need this information?”

I often do. I get frustrated when I can’t fathom out why certain fields are mandatory, like phone number or date of birth. Okay, so I work in data protection. I’m highly tuned to being affronted by this stuff, but I doubt I’m alone.

Sometimes we’re forced to grit our teeth and soldier on (standing in the rain, desperately trying to download yet another parking app, forced to hand over our vital details).

But in other situations we can choose not to engage with companies because they ask for too much of our personal information, or immediately delete an app for the same reason. Alternatively, we may be tempted to provide bogus details, where we can’t see any reasonable purpose for the request (or suspect a phone number will purely be used to badger us).

Faced with yet another data-hungry form this week, I began thinking (again) about the benefits of minimising the personal information collected.

Yes, it’s a core data protection principle under GDPR / UK GDPR: organisations are legally required to collect personal data which is adequate, relevant and limited to what’s necessary for the purpose(s) it’s being used for. But it’s also a sound approach for other reasons…

Here are five more reasons for streamlining data collection…

1. Build trust

If people think you’re collecting more information than necessary, they may be sceptical, not trust you, and decide to disengage. People are more likely to put their trust in organisations who collect data responsibly.

2. Reduce data breach risks

Minimising personal data mitigates the severity of any impact if you suffer a data breach. This could not only reduce the risk for those affected but lessen the negative impact on your organisation. It could even be the difference between a reportable breach and one that’s unlikely to pose a risk. A data breach of purely names and email addresses won’t routinely be as serious as a breach which also includes telephone numbers, dates of birth, postal addresses and so on.

3. Improve accuracy

Data minimisation can improve the quality of your data, reducing the risk of holding outdated and inaccurate information. This in turn helps to meet another data protection principle: personal data must be accurate and kept up to date.

4. Prevent other uses

If you collect more personal details than you need, you’re leaving the door open to employees (perhaps unwittingly) deciding to use them for other, unintended or unauthorised purposes, or a purpose you haven’t been transparent about, which may lead to complaints or regulatory action. And yes, this helps to meet another principle: purpose limitation.

5. Save time on privacy rights requests

Minimising the data you hold can make the process of handling privacy rights requests more efficient. For example, there’s less data to sift through when responding to a DSAR, and less data to erase. It also saves awkward questions like, “why do you have this information?”

These points all apply more broadly than simply to information collected via online forms or apps. The principle of data minimisation applies to all the personal data an organisation collects, uses and stores. But as a starter for ten, why not streamline those data collection forms? They’re a window into your attitude to people’s information, and the first thing your potential customers see.
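To make the form-streamlining point concrete, here’s a minimal sketch of a sign-up handler that only requires the fields genuinely needed, and discards anything surplus. The field names and the rule set are hypothetical examples, not a standard:

```python
# Illustrative sketch: a sign-up validator that collects only what's needed.
# Field names ("email", "name", "phone") are hypothetical examples.

REQUIRED_FIELDS = {"email", "name"}   # necessary to provide the service
OPTIONAL_FIELDS = {"phone"}           # only stored if the user chooses to share it
# Deliberately absent: date_of_birth, postal_address - not needed for sign-up.

def minimise_signup(form: dict) -> dict:
    """Keep only necessary fields; reject submissions missing required ones."""
    missing = REQUIRED_FIELDS - form.keys()
    if missing:
        raise ValueError(f"Missing required fields: {sorted(missing)}")
    allowed = REQUIRED_FIELDS | OPTIONAL_FIELDS
    # Drop anything we never asked for, and empty values - don't store surplus data.
    return {k: v for k, v in form.items() if k in allowed and v}

record = minimise_signup({"email": "a@example.com", "name": "Ana",
                          "phone": "", "date_of_birth": "1990-01-01"})
# record keeps only email and name: the empty phone and unneeded DOB are dropped
```

The design choice worth noticing is the allow-list: fields not on it never reach storage, so the default is minimisation rather than collection.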

GDPR RoPA simplification

Will EU proposals to change Records of Processing Activities requirements have an impact in practice?

As GDPR passes its 7th birthday, there’s been a flutter of excited commentary about European plans to make changes to the ground-breaking data protection law. In particular, potential amendments aimed at easing the compliance burden on small to medium-sized businesses.

So far, it’s fair to say the proposed changes from the European Commission are far from earth-shattering (albeit there could be more in the pipeline). A key proposal relates to Article 30, Records of Processing Activities. The obligation to keep a RoPA would no longer apply to organisations with fewer than 750 employees provided their processing activities are unlikely to pose a ‘high risk‘ to the rights and freedoms of individuals.

The proposal also clarifies the processing of special category data for purposes related to employment, social security and social protection would not, on their own, trigger the requirement to maintain Article 30 records.

For comparison, the existing exception only applies to organisations with fewer than 250 employees, unless the processing carried out:

Is likely to result in a risk to the rights and freedoms of data subjects,
Is not occasional, or
Includes special category data or personal data relating to criminal convictions and offences.

What impact might this RoPA change have?

As many organisations process special category data (even if just for their employees), and processing activities are often routine, not occasional, the current exception for smaller companies is limited in scope. The proposed wider exemption would clearly apply to far more organisations.

I can absolutely see why the Commission has homed in on RoPA requirements, as in my experience many organisations struggle to maintain an up-to-date RoPA, or don’t have one at all. But how helpful could this change actually be?

In practice, organisations subject to GDPR will still need to assess whether their processing activities involve ‘high risk’ to individuals. To do this they will need to weigh up their purpose(s) for processing, their lawful basis, how long they keep personal data, who it is shared with, whether any international data transfers are involved, what security measures are in place and so on.

It seems a bit of a catch-22 – a RoPA is a great way of capturing this vital information and clearly ascertaining where risk might occur. Alongside this, organisations will still need to meet transparency requirements and the right to be informed. And, yes, you guessed it, an accurate RoPA is a very helpful ‘checklist’ for making sure a privacy notice is complete.

We’ve written more about the benefits of a RoPA here.

Importantly, if this proposed change goes ahead, it won’t apply to organisations which fall under the scope of UK GDPR (unless the UK Govt decides to adopt a similar change).

Notably, fairly significant changes to UK GDPR’s accountability requirements were on the cards under the previous Conservative Government’s data reform bill. However, seen as too controversial, these were swiftly dropped from the new Labour Government’s Data (Use and Access) Bill (DUA) after the election.

It’s possible the UK could regret not being more ambitious in the DUA Bill; there’s an obvious irony given oft-heard criticisms of EU overregulation – here’s a case where the EU’s easing of certain requirements could leave UK organisations with more onerous rules.

DPIAs: how to get organisational buy-in

March 2025

Data Protection Impact Assessments (DPIAs) can get a bad rap. Project managers, team leaders and others may not understand them, complain they’re too onerous to complete or say they ‘slow things down’. The result – data protection risks may not be identified or mitigated. Assessments may get overlooked, conducted in a less than thorough way, or get started but remain incomplete.

To banish the negative vibes we need to shout about the benefits of DPIAs. Make sure relevant teams know what they are, when and how to conduct them, and most importantly make sure the process is clearly explained and straightforward to follow.

When used well in the right situations, they can be one of the most useful tools in your organisation’s data protection toolkit. It can’t be stressed enough – DPIAs help to identify, assess and tackle risks before they see the light of day. They help you protect the rights and interests of your customers and employees, protect your business reputation, meet your GDPR accountability obligations and demonstrate how you comply with data protection laws.

Let’s take a look at how we breathe new life into the DPIA process. But first a quick recap on what the law requires…

When DPIAs are mandatory

Sometimes there’s no choice and a DPIA is a ‘must do’. Under GDPR/UK GDPR it is mandatory to conduct a DPIA when projects are likely to represent a ‘high risk’ to those whose personal data is involved. The law gives us three examples:

Large scale use of special category data
Systematic and extensive profiling with significant effect
Public monitoring on a large scale

The above activities are far from routine, so thankfully the UK’s Information Commissioner’s Office (ICO) and other European Data Protection Authorities have published their own lists of processing ‘likely to result in high risk’. For example, the ICO sets out the following:

1. Using innovative technologies or the novel application of existing technologies (including AI).

2. Any decisions which could lead to denial of service: processing which makes decisions about an individual’s access to a product, service, opportunity or benefit, which is based to any extent on automated decision-making (including profiling) or involves processing special category data.

3. Large-scale profiling of individuals.

4. Any processing of biometric data, where this is used for identification purposes.

5. Any processing of genetic data (unless by an individual GP or health professional for the provision of health care directly to the person concerned).

6. Combining, comparing or matching personal data gathered from multiple sources.

7. Any invisible processing – this is where personal data is not collected directly from individuals, and they are not aware of how it’s being used (i.e. the effort of providing privacy information to individuals would be disproportionate).

8. Tracking individuals’ geolocation or behaviour.

9. Targeting children or other vulnerable individuals.

10. Risk of physical harm – where a personal data breach could jeopardise the physical health or safety of individuals.

For more detail please see the ICO DPIA Guidance.

How to assess ‘high risk’

DPIAs aren’t required for every new activity or change of activity, and insisting teams undertake them too often can turn them into a needless box-ticking exercise and feed into a general air of malaise.

Judgement calls need to be made to assess ‘high risk’ and ‘large scale’, and a method is needed for deciding where the threshold falls. This will differ depending on sector, the nature of the data handled, organisational risk appetite and so on. Regulated sectors, such as financial services and telecoms, have more to think about and may adopt a cautious approach. Also, bear in mind a DPIA can be a helpful risk assessment exercise even when a project doesn’t fall under the mandatory requirements.

Adopt a screening process

In my experience, embedding a straightforward screening questionnaire is a great way to effectively sift through change projects and decide which need a more detailed assessment and which don’t. You can either ask teams to complete the questionnaire, or set aside 30 minutes to lead them through the screening. Then the DPO or data protection lead can make the call. A screening process may include questions such as:

What does the project/activity hope to achieve?
What personal information is involved?
Does this include more sensitive data (like financial details) or special category data?
Where did we source the data from?
Does the activity involve children’s data or others who would be considered vulnerable?
Will data be shared with other organisations?
Could what we’re doing be considered innovative or cutting edge?
Are we using personal details for a new purpose?

This is not an exhaustive list; there are other pertinent questions to ask, but try not to make it too long.
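The screening step above can be sketched in code. This is purely illustrative: the trigger questions and the “any yes answer routes to a full assessment” rule are assumptions about how one organisation might set its threshold, not a prescribed method – the DPO still makes the final call.

```python
# Hypothetical DPIA screening sketch: yes/no answers to trigger questions
# decide whether a fuller assessment is recommended.

SCREENING_QUESTIONS = [
    "Does it involve sensitive or special category data?",
    "Does it involve children's data or other vulnerable individuals?",
    "Will data be shared with other organisations?",
    "Could the activity be considered innovative or cutting edge?",
    "Are we using personal details for a new purpose?",
]

def needs_full_dpia(answers: list[bool]) -> bool:
    """Recommend a full DPIA if any trigger question is answered 'yes'.
    This only routes the decision - it doesn't replace the DPO's judgement."""
    return any(answers)

# A single 'yes' (e.g. children's data is involved) is enough to escalate.
print(needs_full_dpia([False, True, False, False, False]))  # True
```

Capturing the questions as data rather than prose also makes it easy to record each project’s answers alongside the decision, which helps demonstrate accountability later.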

Engage with your teams

First rule of DPIA Club is… we MUST talk about it!

Build relationships with the people who ‘do new stuff’ with your data. The people who run development projects and the key stakeholders – such as heads of the main functions which process personal data across your business, e.g. Marketing, Operations, HR, etc. If you have a Procurement team, then target them too.

Ask what projects they have on the horizon which could affect the way personal data is used. The aim is to make them aware of DPIA requirements and ask them to give you an early ‘heads up’ if they are looking to onboard a new service provider or use data for an innovative new project.

Let them know tech projects and system migrations almost always involve some form of personal data processing. They should be mindful of the potential for this to lead to privacy risks.

If they think about data protection from the outset, it will save valuable time and money in the long run, and spare unwelcome hiccups down the line. Give them examples of how things have gone wrong or could go wrong.

You could raise awareness across the business using your intranet, email reminders, posters, drop-in clinics … whatever it takes to get the message across. ‘Training’ sessions with key stakeholders can also really help to enhance their risk assessment skills.

Use a good DPIA template

In my opinion, too many businesses use complex, jargon-filled DPIA templates. They ask questions in ‘GDPR-talk’ which people find hard to grasp and answer, and they often don’t really help people to identify what privacy risks actually look like.

Take a look at your DPIA template with fresh eyes. If you don’t like it, use a better one, or adapt it to fit your business’s ways of working.

Be prepared for Agile working

Many development projects use Agile methodology, breaking projects into smaller, manageable cycles called sprints. These allow teams to adapt quickly to changes and deliver incremental gains faster. This means adapting your assessment approach: you won’t get all the answers you need at the start. Stay close to the project as it evolves and be ready to update your DPIA in line with scheduled sprints.

I hope this has given you some ideas for how to engage your colleagues and freshen up the DPIA process. Dispelling the myth DPIAs are a waste of time, too complex or too onerous is a fight worth winning.

Online services face scrutiny over use of children’s data

March 2025

The importance of compliance with the UK Children’s Code

Social media platforms. Content streaming services. Online gaming. Apps. Any online services popular with children carry an inherent privacy risk.

Along with growing concerns over protecting them from harmful content, there’s an increasing focus on children’s privacy. Beyond the companies you’d expect to be impacted, it’s worth remembering these issues can affect a growing number of other organisations.

We know children are being exposed to inappropriate and harmful content. Some content is illegal, like child sexual abuse images or content promoting terrorism, but other material can still cause harm – such as content promoting eating disorders, content which is inappropriate for the age of the children viewing it, or content which is overly influential.

The Information Commissioner’s Office (ICO) recently launched an investigation into TikTok, amid concerns at how the platform uses children’s data – specifically how their data is used to deliver content into their feeds. The regulator is also investigating the image sharing website Imgur and the social media platform Reddit, in relation to their use of children’s data and their age verification practices.

These investigations are part of wider interventions into how social media and video sharing platforms use information gathered about children. The ICO says it’s determined to continue its drive to make sure companies change their approach to children’s online privacy, in line with the Children’s Code, which came into force in 2021.

This all serves as a timely reminder of the need to comply with this Code.

What is the Children’s Code?

The Children’s Code (aka ‘Age-Appropriate Design Code’) is a statutory code of practice aimed at protecting children’s privacy online. It sets out how to approach age-appropriate design and gives fifteen standards organisations are expected to meet. These are not necessarily technical standards, but more principles and required privacy features.

Who does the Children’s Code apply to?

A wide range of online services are within scope, including apps, search engines, online games, online marketplaces, connected toys and devices, news and educational sites, online messaging services and much more.

If children are ‘likely’ to access your online service(s), even if they are not your target audience, the code applies. For example, free services, small businesses, not-for-profits and educational sites are all in scope. Companies need to ask themselves – is a child likely to use our product or service online?

What ages does it apply to?

The code adopts the definition of a child under the UN Convention on the Rights of the Child (UNCRC) which is anyone under the age of 18. This means it applies to online services likely to be accessed by older children aged 16 and 17, not just young children. (This shouldn’t be confused with the age of consent for a child, which for online services is 13 in the UK).

Who does the Children’s Code not apply to?

Some public authority services are out of scope – for example, an online public service which is not provided on a commercial basis, or a police force whose online service processes personal data for law enforcement purposes. Preventative or counselling services are also not in scope, such as websites or apps which specifically provide online counselling or other preventative services to children. However, more general health, fitness and wellbeing apps are in scope.

How do you assess ‘likely to be accessed’ by a child?

Crucially, the Code covers services which may not be specifically ‘aimed or targeted’ at children, but are ‘likely’ to be accessed by them. Each provider will need to assess this, and the Code provides some questions to help you:

Is the possibility of children using your service more probable than not?
Is the nature and content of the service appealing to children even if not intended for them? (Remember this includes older children aged 16 and 17)
Do you have measures in place to prevent children gaining access to an adult only service?

This assessment may not always be clear cut, but it’s worth noting the Code states:

If you decide that your service is not likely to be accessed by children and that you are therefore not going to implement the code then you should document and support your reasons for your decision. You may wish to refer to market research, current evidence on user behaviour, the user base of similar or existing services and service types and testing of access restriction measures.

If you initially judge that the service is not likely to be accessed by children, but evidence later emerges that a significant number of children are in fact accessing your service, you will need to conform to the standards in this code or review your access restrictions if you do not think it is appropriate for children to use your service.

This means there’s a clear expectation online services may need to have evidence if they decide they do not need to conform with the Code.

The 15 standards of the Children’s Code

The Code is extremely detailed. Here’s a summary of the salient points:

1. Best interests of the child – consider the needs of children using your service and how best to support those needs. The best interests of the child should be a primary consideration when designing and developing an online service.

2. Data Protection Impact Assessments (DPIAs) – these should be carried out to identify and minimise any data protection risks to children.

3. Age-appropriate application – assess the age range of your audience. Remember, the needs of children of different ages should be central to design and development. Make sure children are given an appropriate level of protection about how their information is used.

4. Transparency – UK GDPR specifically states privacy information must be easy for children to find and easy for them to understand. The Code says you should consider bite-sized ‘just in time’ notices when collecting children’s data.

5. Detrimental use of data – don’t use children’s information in a way which would be detrimental to children’s physical or mental health and well-being.

6. Policies and community standards – if children provide you with their personal information when they join or access your service, you must uphold your own published T&Cs, policies and community standards. Children should be able to expect the service to operate in the way you say it will and for you to do what you say you are doing.

7. Default settings – privacy settings for children must be set to ‘high’ by default, unless you can demonstrate a compelling reason for taking a different stance. It’s not enough to allow children to activate high privacy settings, you need to provide these by default.

8. Data minimisation – only collect and keep the minimum amount of data about children that’s necessary for you to provide your service.

9. Data sharing – do not share children’s data unless you can demonstrate a compelling reason to do so. The word ‘compelling’ is significant here: it sets the bar for justifying any sharing very high.

10. Geolocation – any geolocation privacy settings should be switched off by default. Children should have to actively change the default setting to allow their geolocation to be used (unless you can demonstrate a compelling reason for this to be switched on).

11. Parental controls – make it clear to children if parental controls are in place and if they are being tracked or monitored by their parents.

12. Profiling – options which use profiling should be switched off by default (again, unless you can demonstrate a compelling reason for these to be switched on). This doesn’t mean profiling is banned, but if it is conducted, measures should be in place to protect children from harmful effects. If profiling is part of the service you provide, you need to be sure this is completely necessary.

13. Nudge techniques – don’t use techniques which lead or encourage children to activate options which mean they give you more personal information, or turn off privacy protections.

14. Connected toys and devices – you need to conform with the code if you sell toys or devices (such as talking teddy bears, fitness bands or home-hub interactive speakers) which collect personal data and transmit via a network connection. If your electronic toys or devices do not connect to the internet and only store data in the device itself, the code does not apply.

15. Online tools – it must be easy for children to exercise their privacy rights and report concerns. The right to erasure is particularly relevant when it comes to children. Mechanisms used to help children exercise their rights should be age appropriate, prominent and easy to use.

The ICO investigations into TikTok, Imgur and Reddit should be seen as a direction of travel for us all. Going forward, they signal how the regulator intends to treat compliance around children’s online privacy, welfare and safeguarding.

If your service uses children’s data, even tangentially, it’s worth re-examining the Code and considering how it might impact your business.

Controller or processor? What are we?

January 2025

The importance of establishing if an organisation is acting as a processor or controller

On paper the definitions of controller and processor under GDPR (& UK GDPR) may seem straightforward, but deciding whether you’re acting as a controller, joint controller or processor can sometimes be a contentious area. Many a debate has been had between DPOs and lawyers when trying to classify the relationship between different parties.

It’s not unusual for it to be automatically assumed all suppliers providing a service are acting as processors, but this isn’t always the case. Sometimes joint controllership, or separate distinct controllers, is more appropriate. Or perhaps a company is simply providing a service, and is not processing the client’s personal data (other than minimal contact details for a couple of employees).

It’s worth noting service providers (aka suppliers or vendors) will often act as both controller and processor, for different processing tasks. For example, most will be a controller for at least their own employee records, and often for their own marketing activities too.

What GDPR says about controllers and processors

The GDPR tells us a controller means ‘the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data’.

A processor means ‘a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller’.

How to decide if we’re a controller or processor

There are some questions you can ask to help reach a conclusion:

Do we decide how and what personal data is collected?
Are we responsible for deciding the purposes for which the personal data is used?
Do we use personal data received from a client/partner for our own business purposes?
Do we decide the lawful basis for the processing tasks we are carrying out?
Are we responsible for making sure people are informed about the processing? (Is it our privacy notice people should see?)

If you’re answering ‘yes’, to some or all of these questions, it’s highly likely you’re a controller.
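The questions above lend themselves to a simple tally. The sketch below is illustrative only: the “majority of yes answers suggests controller” threshold is an assumption for the example, and in reality this is a legal judgement based on contracts and actual practice, not a score.

```python
# Hypothetical controller/processor prompt: tallies 'yes' answers to the
# indicator questions. A prompt for discussion, not a legal determination.

CONTROLLER_INDICATORS = [
    "Do we decide how and what personal data is collected?",
    "Do we decide the purposes the personal data is used for?",
    "Do we use the data for our own business purposes?",
    "Do we decide the lawful basis for the processing?",
    "Is it our privacy notice people should see?",
]

def likely_controller(answers: list[bool]) -> bool:
    """A majority of 'yes' answers suggests controller status.
    The threshold is an illustrative assumption - review with legal advice."""
    return sum(answers) > len(answers) // 2
```

Note this mirrors the ICO’s point that the label in a contract doesn’t decide the matter: the answers describe what you actually do with the data, and that is what determines your status.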

The ICO makes it clear it doesn’t matter if a contract describes you as a processor; “organisations that determine the purposes and means of processing will be controllers regardless of how they are described in any contract about processing services”.

A processor only processes a controller’s personal data on the controller’s behalf and, crucially, doesn’t use this data for its own business purposes. While a processor may make its own day-to-day operational decisions, it should only process the data in line with the controller’s instructions, unless required to do otherwise by law.

Sometimes overlooked is the fact that even if only a handful of a service provider’s employees have access to a controller’s personal data, the service provider is still ‘processing’ that data, and will be a processor.

Why it’s important to confirm your status

Controllers have a higher level of accountability. They are obliged to comply with all data protection principles, such as ensuring the lawfulness of processing, being transparent (e.g. privacy notices), fulfilling privacy rights requests and so on.

Processors do have a number of direct obligations, such as being required to implement appropriate technical and organisational measures to protect personal data. A processor is also responsible for ensuring the compliance of any sub-processors it uses to fulfil its services to a controller; indeed, the processor remains liable for its sub-processors.

The ICO issued a £3m fine to a software company in March 2025 for failing to implement sufficient measures, which you can read about here.

Data processing agreements

There’s a requirement to have an appropriate agreement in place between a controller and a processor. Article 28 of EU / UK GDPR sets out specific requirements for what must be included in the contractual terms.

Such terms are often covered in a Data Processing Agreement/Addendum, but sometimes will be covered in a specific section on data protection within the main contract. (If there’s no DPA, no addendum and no section on data protection that’s a massive red flag!)

Often overlooked is the need to have clear documented instructions from the controller. It can be helpful to have these as an annex to the main contract (or master services agreement), so they can be updated if the processing changes. We’ve written more about the detail of what needs to be covered in contractual terms here. Another area which can get forgotten is sub-processors and international data transfers.

There are times where you’re looking to engage the services of a household name, a well-known and widely used processor. This sometimes leads to limited or no flexibility to negotiate contractual terms. In such cases, it pays to check the terms and, if necessary, take a risk-based view on whether you wish to proceed or not.

Before even looking at the terms, due diligence on prospective processors is a ‘must do’ for controllers, while taking an approach proportionate to the level of risk the outsourced processing poses. And for their part processors need to be prepared to prove their data protection and information security credentials.

Why the Tory app data breach could happen to anyone

June 2024

Shakespeare wrote (I hope I remembered this correctly from ‘A’ level English), ‘When sorrows come, they come not single spies but in battalions.’ He could’ve been writing about the UK Conservative Party which, let’s be honest, hasn’t been having a great time recently.

The Telegraph is reporting the party suffered its second data breach in a month. An error with an app led to the personal information of leading Conservative politicians – some in high government office – being available to all app users.

Launched in April, the ‘Share2Win’ app was designed as a quick and easy way for activists to share party content online. However, a design fault meant users could sign up to the app using just an email address. Then, in just a few clicks, they were able to access the names, postcodes and telephone numbers of all other registrants.

This follows another recent Tory Party email blunder in May, where all recipients could see each other’s details – a classic email data breach.

In the heat of a General Election, some might put these errors down to ‘yet more Tory incompetence’. I’d say, to quote another famous piece of writing, ‘He that is without sin among you, let him first cast a stone’! There are plenty of examples where other organisations have failed to take appropriate steps to make sure privacy and security are baked into their app’s architecture. And this lack of oversight extends beyond apps to webforms, online portals and more. It’s depressingly common, and easily avoided.

In April, a Housing Association was reprimanded by the ICO after launching an online customer portal which allowed users to access documents (revealing personal data) they shouldn’t have been able to see. These related to, of all things, anti-social behaviour. In March the ICO issued a reprimand to the London Mayor’s Office after users of a webform could click a button and see every other query submitted. And the list goes on. This isn’t a party political issue. It’s a lack of due process and carelessness.

It’s easy to see how it happens, especially (such as in a snap election) when there’s a genuine sense of urgency. Some bright spark has a great idea, senior management love it, and demand it’s implemented pronto! Make it happen! Be agile! Be disruptive! (etc).

But there’s a sound reason why the concept of data protection by design and by default is embedded into data protection legislation, and it’s really not that difficult to understand. As the name suggests, data protection by design means baking data protection into business practices from the outset: considering the core data protection principles, such as data minimisation and purpose limitation, as well as integrity and confidentiality. Crucially, it means not taking short-cuts when it comes to security measures.

GDPR may have its critics, but this element is just common sense. Something most people would get on board with. A clear and approved procedure for new systems, services and products which covers data protection and security is not a ‘nice to have’ – it’s a ‘must have’. This can go a long way to protect individuals and mitigate the risk of unwelcome headlines further down the line, when an avoidable breach puts your customers’, clients’ or employees’ data at risk.

Should we conduct a DPIA?

A clear procedure can also alert those involved to when a Data Protection Impact Assessment is required. A DPIA is mandatory in certain circumstances where activities are higher risk, but even when not strictly required it’s a handy tool for picking up on any data protection risks and agreeing measures to mitigate them from Day One of your project. Many organisations would also want to make sure there’s oversight by their Information Security or IT team, in the form of an Information Security Assessment for any new applications.

Developers, the IT team and anyone else involved need to be armed with the information they need to make sound decisions. Data protection and information security teams need to work together to develop apps (or other new developments) which aren’t going to become a leaky bucket. Building this in from the start actually saves time too.

In all of this, don’t forget your suppliers. If you want to outsource the development of an app to a third-party supplier, you need to check their credentials and make sure you have the necessary controller-to-processor contractual arrangements and assessment procedures in place – especially if, once the app goes live, the developer’s team still has access to the personal data it collects. Are your contractors subbing work to other third-party subcontractors? Do they work overseas? Will these subcontractors have access to personal data?

The good news? There’s good practice out there. I remember a data protection review DPN conducted a few years back. One of the areas we looked at was an app our client had developed for students to use. It was a pleasure to see how the app had been built with data protection and security at its heart. We couldn’t fault the team who designed it – and as such the client didn’t compromise their students, face litigation, look foolish or get summoned to see the Information Commissioner!

In conclusion? Yes, be fast. Innovate! Just remember to build your data protection strategy into the project from Day One.

Guide to identifying and managing data protection risks

March 2024

Data protection risks come in all shapes, sizes and potential severities. We need to be able to identify the risks associated with our use of personal data, manage them and where necessary put appropriate measures in place to tackle them.

How can we make sure good risk management practices are embedded in our organisation? In this short guide we cover the key areas to focus on to make sure you’re alert to, and aware of, risks.

1. Assign roles and responsibilities

Organisations can’t begin to identify and tackle data risks without clear roles and responsibilities covering personal data. Our people need to know who is accountable and responsible for the personal data we hold and the processing we carry out.

Many organisations apply a ‘three lines of defence’ (3LoD) model for risk management. This model is not only used for data protection, but is also effective for handling many other types of risk a business may face.

  • 1st line: Leaders of the business functions that process data are appointed as ‘Information Asset Owners’ and ‘own’ the risks from their function’s data processing activities.
  • 2nd line: Specialists such as the DPO, CISO and Legal Counsel support and advise the 1st line, helping them understand their obligations under data protection laws, so they can make well-informed decisions about how best to tackle any privacy risks. They provide clear procedures for the 1st line to follow.
  • 3rd line: An internal or external audit function provides independent assurance.

3 lines of defence for data protection

For example, risk owners, acting under advice from a Data Protection Officer or Chief Privacy Officer, must make sure appropriate technical and organisational measures are in place to protect the personal data they’re accountable for.

In this model, the second line of defence should never become risk owners. Their role is to provide advice and support to the first line risk owners. They should try to remain independent and not actually make decisions on behalf of their first line colleagues.

2. Decide if you should appoint a DPO

Under the GDPR, a Data Protection Officer’s job is to inform their organisation about its data protection obligations and advise the organisation on risks relating to its processing of personal data.

The law tells us you need to appoint a DPO if your organisation is a Controller or Processor and one or more of the following applies:

  • you are a public authority or body (except for courts acting in their judicial capacity); or
  • your core activities require large-scale, regular and systematic monitoring of individuals (for example, online behaviour tracking); or
  • your core activities consist of large-scale processing of special categories of data or data relating to criminal convictions and offences.

In reality, most small organisations are unlikely to fall under the current UK or EU GDPR requirements to appoint a DPO. In fact, many medium-sized businesses won’t necessarily need a DPO either. Find out more in our DPO myth buster.

3. Conduct data mapping & record keeping

Mapping your data and creating a Record of Processing Activities (RoPA) is widely seen as the best foundation for any successful privacy programme. After all, how can you properly look after people’s data if you don’t have a good handle on what personal data you hold, where it’s located, what purposes it’s used for and how it’s secured?

Even smaller organisations, which may benefit from an exemption from creating a full RoPA, still have basic record keeping responsibilities which should not be overlooked and could still prove very useful. Also see Why is data mapping so crucial?

4. Identify processing risks

Under data protection laws, identifying and mitigating risks to individuals (e.g. employees, customers, patients, clients etc) is paramount.

Risks could materialise in the event of a data breach, failure to fulfil individual privacy rights (such as a Data Subject Access Request), complaints, regulatory scrutiny, compensation demands or even class actions.

We should recognise our service and technology providers, who may handle personal data on our behalf, could be a risk area. For example, they might suffer a data breach and our data could be affected, or they might not adhere to contractual requirements.

It’s also good to be mindful of commercial and reputational risks, which can arise from an organisation’s use of personal or non-personal data.

International data transfers are another area where due diligence is required to make sure these transfers are lawful – and, where they’re not, to recognise that this represents a risk.

Data-driven marketing activities could also be a concern, if these activities are not fully compliant with ePrivacy rules – such as the UK’s Privacy and Electronic Communications Regulations (known as PECR). Even a single complaint to the ICO could result in a business facing a PECR fine and the subsequent reputational damage. GDPR, marketing & cookies guide

Data protection practitioners share tips on identifying and assessing risks

5. Risk assessments

In the world of data protection, we have grown used to, or even grown tired of, the requirement to carry out a Data Protection Impact Assessment (DPIA) – or a Privacy Impact Assessment (PIA) as it is called in some jurisdictions.

Build in a process of assessing whether projects would benefit from a DPIA, or legally require one. DPIAs are a great way to pinpoint risks and mitigate them early on, before they become a bigger problem.

The value of risk assessments in the world of data protection compliance and Quick Guide to DPIAs

6. Issues arising from poor governance or lack of data ownership

In the real world, the three lines of defence model can come under strain. Sometimes those who should take responsibility as risk owners can have slippery shoulders and refuse to take on the risks.

Some processing doesn’t seem to sit conveniently with any one person or team. Things can fall through the cracks when nobody takes responsibility for making key decisions. On these occasions a DPO might come under pressure to take risk ownership themselves. But should they push back?

Strictly speaking, DPOs shouldn’t ‘own’ data risks; their role is to inform and advise risk owners. GDPR tells us: “data protection officers, whether or not they are an employee of the controller, should be in a position to perform their duties and tasks in an independent manner” (Recital 97).

The ICO, in line with European (EDPB) guidelines, says: “…the DPO cannot hold a position within your organisation that leads him or her to determine the purposes and the means of the processing of personal data. At the same time, the DPO shouldn’t be expected to manage competing objectives that could result in data protection taking a secondary role to business interests.”

So, if the DPO takes ownership of an area of risk, and plays a part in deciding what measures and controls should be put in place, could they be considered to be ‘determining the means of the processing’? This could create a conflict of interest, when their role requires them to act independently.

Ultimately, accountability rests with the organisation. It’s the organisation which uses the data, collects the data and runs with it. Not the DPO.

7. Maintain an up-to-date risk register

When you identify a new risk it should be logged and tracked on your Data Risk Register. The ICO expects organisations to “identify and manage information risks in an appropriate risk register, which includes clear links between corporate and departmental risk registers and the risk assessment of information assets”.

To do this you’ll need to integrate any outcomes from risk assessments (such as DPIAs) into your project plans, update your risk register(s) and keep these registers under continual review by the DPO or responsible individuals.

Workplace use of facial recognition and fingerprint scanning

February 2024

Just because you can use biometric data, doesn’t mean you should

The use of biometric data is escalating, and recent enforcement action by the UK Information Commissioner’s Office (ICO) concerning its use for workplace monitoring is worth taking note of. We share 12 key considerations if you’re considering using facial recognition, fingerprint scanning or other biometric systems.

In a personal context, many use fingerprint or iris scans to open their smartphones or laptops. In the world of banking, facial recognition, voice recognition, fingerprint scans and retina recognition have become commonplace for authentication and security purposes. The UK Border Force is set to trial passport-free travel, using facial recognition technology. And increasingly, organisations are using biometrics for security or employee monitoring purposes.

Any decision to use biometric systems shouldn’t be taken lightly. If biometric data is being used to identify people, it falls under the definition of Special Category Data under UK GDPR. This means there are specific considerations and requirements which need to be met.

What is biometric data?

Biometric data is special category data whenever you process it for the purpose of uniquely identifying an individual. To quote the ICO:

Personal information is biometric data if it:

  • relates to someone’s physical, physiological or behavioural characteristics (e.g. the way someone types, a person’s voice, fingerprints, or face);
  • has been processed using specific technologies (e.g. an audio recording of someone talking is analysed with specific software to detect qualities like tone, pitch, accents and inflections); and
  • can uniquely identify (recognise) the person it relates to.

Not all biometric data is classified as ‘special category’ data but it is when you use it, or intend to use it, to uniquely identify someone. It will also be special category data if, for example, you use it to infer other special category data; such as someone’s racial/ethnic origin or information about people’s health.

Special category data requirements

There are key legal requirements under data protection law when processing special category data. In summary, these comprise:

  • Conduct a Data Protection Impact Assessment
  • Identify a lawful basis under Article 6 of GDPR.
  • Identify a separate condition for processing under Article 9. There are ten different conditions to choose from.
  • Your lawful basis and special category condition do not need to be linked.
  • Five of the special category conditions require additional safeguards under the UK’s Data Protection Act 2018 (DPA 2018).
  • In many cases you’ll also need an Appropriate Policy Document in place.

Also see the ICO Special Category Data Guidance.

ICO enforcement action on biometric data use in the workplace

The Regulator has ordered Serco Leisure and a number of associated community leisure trusts to stop using Facial Recognition Technology (FRT) and fingerprint scanning to monitor workers’ attendance. They’ve also ordered the destruction of all biometric data which is not legally required to be retained.

The ICO’s investigation found the biometric data of more than 2,000 employees at 38 leisure centres was being unlawfully processed for the purpose of attendance checks and subsequent payment.

Serco Leisure was unable to demonstrate why it was necessary or proportionate to use FRT and fingerprint scanning for this purpose. The ICO noted there are less intrusive means available, such as ID cards and fobs. Serco Leisure said these methods were open to abuse by employees, but no evidence was produced to support this claim.

Crucially, employees were not proactively offered an alternative to having their faces and fingers scanned. It was presented to employees as a requirement in order to get paid.

Serco Leisure conducted a Data Protection Impact Assessment and a Legitimate Interests Assessment, but these fell short when subject to ICO scrutiny.

Lawful basis

Serco Leisure identified their lawful bases as contractual necessity and legitimate interests. However, the Regulator found the following:

1) While recording attendance times may be necessary to fulfil obligations under employment contracts, it doesn’t follow that the processing of biometric data is necessary to achieve this.

2) Legitimate interests will not apply if a controller can reasonably achieve the same results in another less intrusive way.

Special category condition

Initially Serco Leisure had not identified a condition before implementing biometric systems. It then chose the relevant condition as being for employment, social security and social protection, citing Section 9 of the Working Time Regulations 1998 and the Employment Rights Act 1996.

The ICO found the special category condition chosen did not cover processing to purely meet contractual employment rights or obligations. Serco Leisure also failed to produce a required Appropriate Policy Document.

Read more about this ICO enforcement action.

12 key steps when considering using biometric data

If you’re considering using biometric systems which will be used to uniquely identify individuals for any purpose, we’d highly recommend taking the following steps:

1. DPIA: Carry out a Data Protection Impact Assessment.

2. Due diligence: Conduct robust due diligence of any provider of biometric systems.

3. Lawful basis: Identify a lawful basis for processing and make sure you meet the requirements of this lawful basis.

4. Special category condition: Identify an appropriate Article 9 condition for processing special category biometric data. The ICO says explicit consent is likely to be most appropriate, but other conditions may apply depending on your circumstances.

5. APD: Produce an Appropriate Policy Document where required under DPA 2018.

6. Accuracy: Make sure biometric systems are sufficiently accurate for your purpose. Test and mitigate for biases. For example, bias and inequality may be caused by a lack of diverse data, bugs and inconsistencies in biometric systems.

7. Safeguards: Consider what safeguards will be necessary to mitigate the risk of discrimination, false acceptance and rejection rates.

8. Transparency: Consider how you will be open and upfront about your use of biometric systems. How will you explain this in a clear, concise, and easy to access way? If you are relying on consent, you’ll need to clearly tell people what they’re consenting to, and consent will need to be freely given. Consent: Getting it Right

9. Privacy rights: Assess how people’s rights will apply, and have processes in place to recognise and respond to individual privacy rights requests.

10. Security: Assess what security measures will be needed by your own organisation and by any biometric system provider.

11. Data retention: Assess how long you will need to keep the biometric data. Have robust procedures in place for deleting it when no longer required.

12. Documentation: Keep evidence of everything!

More detail can be found in the ICO Biometric Data Guidance.