Top 10 Data Protection Tips for SMEs

January 2023

Is it onerous for SMEs to become compliant?

One of the stated aims of the UK Government’s Data Protection and Digital Information Bill is to support small businesses and remove unnecessary bureaucracy. 

As context, there are 5.6m businesses in the UK, of which SMEs (fewer than 250 employees) represent 99% of the total. According to IAPP research, approximately 32,000 organisations in the UK have a registered DPO. It’s right, therefore, to focus on SMEs. 

But how onerous is small business data protection now? Arguably, the answer is, not as onerous as you might think. We’ve created a top 10 checklist for start-ups and small businesses to help you decide what you should be concerned with: 

1.     Do I need to worry about data protection regulation? 

Yes. Pretty much any business processing personal data for commercial purposes needs to worry about data protection. (It does not apply to purely ‘personal or household activity’.) That said, the law and regulatory advice focus on taking a ‘proportionate’ approach. There’s no one-size-fits-all answer; it will depend on the risk appetite of your organisation. 

2.     Do I need a DPO?

Probably not. If the answer to these three questions is no, you don’t need a DPO…

  • Are you a public authority or body?
  • Do your core business activities require regular and systematic monitoring of individuals on a large scale?
  • Do your core business activities involve processing on a large scale ‘special category data’, or criminal convictions or offences data?

Even if you don’t need a DPO, it’s wise to nominate someone in your organisation as a data protection lead. This does not need to be a full-time role. Alternatively, you can outsource this activity to a person or company who can provide the support on a part-time basis. 

3.     Do I need a RoPA (Record of Processing Activity)?

Maybe. There’s no escaping the fact that RoPAs are challenging documents to complete and can absorb a huge amount of time. Companies with more than 250 employees must always keep a RoPA – that’s just under 8,000 businesses in the UK.

If you have fewer than 250 employees, you don’t need a RoPA if all of the following apply:

  • Processing does not pose a risk to the rights and freedoms of the data subject
  • No special category data is being processed
  • Processing is only occasional

The debate starts when you consider what constitutes a ‘risk to the rights and freedoms of the data subject’. It’s worth considering the type of data you handle, rather than the volumes, to help you decide whether to complete a RoPA. As a start-up, you may not need a RoPA as defined in the legislation. However, keeping a record of what information is processed, for what purpose and under what lawful basis is a good idea even if the ICO RoPA form is not. 
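Such a lightweight record can live in a spreadsheet, or even a few lines of code. Here’s a purely illustrative sketch of a ‘RoPA-lite’ (the field names and example activities are our own suggestions, not the ICO’s template):

```python
# Illustrative "RoPA-lite": a simple record of processing activities.
# Field names are our own suggestion, not the ICO's RoPA template.
from dataclasses import dataclass, asdict

@dataclass
class ProcessingRecord:
    activity: str          # what you do with the data
    purpose: str           # why you do it
    lawful_basis: str      # e.g. consent, contract, legitimate interests
    data_categories: str   # what personal data is involved
    retention: str         # how long you keep it

records = [
    ProcessingRecord("Email newsletter", "Marketing", "consent",
                     "name, email address", "until unsubscribe"),
    ProcessingRecord("Payroll", "Pay employees", "contract",
                     "name, bank details, salary", "6 years after employment ends"),
]

# A flat table like this exports straight to a spreadsheet row per activity.
for r in records:
    print(asdict(r))
```

The point isn’t the code: it’s that one row per processing activity, answering ‘what, why and on what basis’, is often all a start-up needs to get started.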

There are changes afoot with regards to the RoPA under UK data reform plans, but a record of your activities may still be necessary, just not as currently prescribed.

4.     Do I need to register with the ICO?

Almost certainly YES. The ICO asks all businesses that process personal data to pay the Data Protection Fee, which is used to fund the ICO and its activities. This isn’t onerous. In fact, most small businesses will only have to pay £40 (or £35 with a direct debit). And that’s before you’ve considered whether you’re exempt – not-for-profit status is one possible exemption. 

5.     Do I need a privacy notice (policy)?

Yes. A privacy notice is a foundational piece of your data protection work. Any organisation which processes personal data needs to set out what data they are processing and how they are processing it as well as the data subject’s rights. The ICO’s checklist provides very clear guidance for what must be in a notice and what might be in a notice.

6.     How about a cookie notice?

Yes again. If you have a website, assume you need a cookie notice. Even if all you’re doing is using cookies to manage the performance of your website, a cookie notice is required. This does not need to cost money. You can get free software from the major privacy software providers, with simple step-by-step setup guides. There is really no excuse not to have a cookie notice. 

7.     What about accountability?

Yes, but make it proportionate. In a nutshell, accountability means ‘evidencing your activities’. Keep a record of what you do, why you’re doing it and your decision-making. It also means making sure you have appropriate technical and organisational measures in place to protect personal data. Have staff been adequately trained in data protection? Do we have clear guidelines and/or policies to help them? 

8.     What about Individual Rights? 

Yes. Every individual has clear rights and, irrespective of the size of your organisation, you need to fulfil these requests. 

These rights include right of access, the right to rectification, the right to erasure, the right to restrict processing, the right to data portability, the right to object and the right not to be subject to a decision based solely on automated processing.

Not all of these might apply to a small business but it’s important to decide how to recognise and respond to these requests from individuals. 

9.    Don’t forget information security

Yes. Cyber Essentials was designed for SMEs. Arguably it’s the absolute minimum for any business. It does cost money but not a lot. Gaining the Cyber Essentials certification (if self-certified) costs £300. The five technical controls are: 

  • Boundary firewalls and internet gateways.
  • Secure configuration.
  • Access control.
  • Malware protection.
  • Patch management.

10.  What about International Data Transfers? 

Hopefully no! If you and your suppliers are only operating in the UK and Europe, stop reading now. However, if any data is exported to a third country (such as the USA, South Africa or India), there’s no escaping the fact that international data transfers can be painful to work through. 

When the EU-US Privacy Shield was invalidated in 2020, it caused significant problems for data transfers between the US and the EU/UK. At the time, Max Schrems’ advice was to only work with companies based in the UK or Europe which are not exporting data to third countries. However, this isn’t always possible – just consider how many people use Google, Microsoft or Mailchimp. 

Many, if not most, businesses will have dealings with these three, and the reality is you must either accept they’re not going to change anything for you, or choose not to use them. 

Conclusion

Many small and start-up businesses can get ready relatively quickly. The trick for small business data protection is to review your arrangements on a regular basis and be aware if any more complicated processing emerges. For instance, anything involving automated processing, special category data, AI or children’s data carries significant risk and should be treated with care. 

There’s more helpful information available on the ICO’s Small Business Hub.

Takeaways from Meta’s huge fine

January 2023

Digital advertising faces significant changes in the wake of the latest fine to be levelled at Mark Zuckerberg’s Meta.

The Data Protection Commission (DPC) for Ireland has fined Meta (Meta Platforms Ireland Limited) a huge €390 million. It ruled that Meta’s reliance on contract terms as the lawful basis for personalised advertising on both Facebook and Instagram is invalid.

On top of the fine, the DPC has given Meta three months to comply with its interpretation of the EU GDPR.

What does this mean for social media advertising?

Behavioural advertising on the Facebook and Instagram platforms is targeted using user-profile information, based on people’s online activity and other details they share with the platform. This helps advertisers to target individuals based on location, hobbies, interests and other behaviours.

This latest ruling calls into question whether social media platforms must seek their users’ prior opt-in consent for behavioural advertising, rather than rely on the contractual terms people sign up for to use the platforms.

If social platforms switch to opt-in consent, users will inevitably gain far more control over the adverts they see. But on the flipside, the number of individuals available for targeting by advertisers is likely to decline. This would have a big impact on the marketing mix for many companies.

What’s behind the Meta ruling?

The DPC’s investigation stretches back to complaints originally raised on the very first day the EU GDPR came into force, in May 2018. From the get-go, it was argued Facebook’s (now Meta’s) processing for personalised advertising was unlawful.

Significantly, prior to May 2018, Facebook Ireland updated both Facebook’s and Instagram’s Terms of Service. ‘Implicit consent’ had previously been used for behavioural advertising, but with consent being much more onerous to achieve under GDPR, there was a switch to relying on contract as the new lawful basis for the ads.

Users were asked to click ‘I accept’ to the updated Terms of Service and, in doing so, by default accepted behavioural advertising as part of the service package. The platforms simply would not be accessible if users declined to do so. The Irish DPC has now rejected the validity of the contract as a valid lawful basis for behavioural advertising.

This ruling follows a lot of uncertainty and disagreement between the DPC, other EU regulators and the European Data Protection Board (EDPB), over the use of contract as a legal basis for this type of advertising.

The Chair of EDPB, Andrea Jelinek: ‘The EDPB binding decisions clarify that Meta unlawfully processed personal data for behavioural advertising. Such advertising is not necessary for the performance of an alleged contract with Facebook and Instagram users. These decisions may also have an important impact on other platforms that have behavioural ads at the centre of their business model.’

This latest ruling represents a U-turn by the DPC, who have now stated their decisions ‘reflect the EDPB’s binding determinations.’

This is not Meta’s only fine. In September 2022, the DPC fined Meta €405 million for allowing minors to operate business accounts on Instagram, and there have been others. Unsurprisingly Meta plans to appeal both of the DPC’s decisions.

Key takeaways for digital advertising

  1. The burning question is: ‘Can I still run ads on Facebook and Instagram?’ Technically yes – the ruling applies to Meta, not its advertisers. Meta, for its part, said: ‘These decisions do not prevent personalised advertising on our platform.’ However, using these platforms is not without potential risks.
  2. Data protection by design is paramount for digital advertisers. There’s a regulatory expectation that the interests, rights, and freedoms of individuals are respected. Platforms need to evidence these considerations have been taken into account.
  3. Users must be given a real choice. They must be given the ability to receive adverts without tracking, profiling, or targeting based on personal data. They must be given meaningful control and the platforms must be able to demonstrate there is user choice through the data lifecycle.
  4. Accountability is key – there should be genuine transparency around how and why personal data is processed and who is responsible for that processing.

Max Schrems, privacy activist and honorary chair of Noyb: ‘People now need to be asked if they want their data to be used for ads or not. They must have a ‘yes or no’ option and can change their mind at any time. The decision also ensures a level playing field with other advertisers that also need to get opt-in consent.’

Estelle Masse, Global Data Protection Lead at Access Now, said the decisions are ‘hugely significant’ for online companies which rely on targeted ad revenues. She said they should look at whether the way they deliver ads online is ‘legal and sustainable.’

Data Governance Quick Guide

January 2023

Taking control of our data

In essence, data governance is a framework of management practices which makes sure data is used properly, in line with our organisational aims, the law and best practice.

Think of it as embedding Data Protection by Design and by Default across the organisation. It means business objectives can be met without taking unnecessary risks with data. Data governance helps us to:

  • protect the business and those whose data we process: customers, employees, etc.
  • reduce our organisational risk profile
  • educate our people, by providing policy & guidance on how to use data in safe and appropriate ways
  • build in an ethical approach
  • build our reputation, customer trust and enhance the value of our data assets
  • support our teams’ innovation with use of data.

The 6 data governance steps

Building a robust data governance framework – from the data protection consultancy DPN

1. Data discovery

It’s vital to identify the data assets held across the business, understanding how personal data is being gathered, stored, used and shared. It can be helpful to map where the data is located on systems, and document it.

Most medium to large businesses will need to do this anyway to create and maintain an Information Asset Register (IAR) and Records of Processing Activity (RoPA).

2. Policies & standards

If our people don’t know how we expect them to behave when handling other people’s data, we can’t expect them to make a good job of it. Are your policies and procedures all up to scratch? Having a straightforward, easy-to-understand and practical Data Protection Policy is a good place to start (alongside relevant training). The importance of well-crafted, easy-to-use policies shouldn’t be underestimated.

3. Stakeholder accountability

We need to identify key stakeholders within the business. These are likely to be the heads of key functions, such as HR, Operations, Sales & Marketing, and so on.

It’s good to establish data roles and responsibilities, so people are clear what aspects they and others are responsible for. Who has the authority to make decisions about certain data?

4. Risk assessment process

Businesses should have risk assessment procedures to discover, assess and prioritise data risks, and to take action to mitigate them. A governance programme helps teams to identify both existing and emerging risks, so they can be efficiently assessed and mitigated.

Think of data like a balance sheet: it has great potential to create value, but also carries risks and liabilities.

The aim of a data governance programme is to protect both the business and those whose data we process from harm which may arise. For example, things like inaccurate data, unlawful or unfair processing or using people’s data in ways they would not expect or want.
For certain projects it will be necessary to conduct a Data Protection Impact Assessment (DPIA).

5. Technical and organisational measures (TOMs)

Once privacy risks have been identified, we need to consider what measures could be put in place to tackle them. You may choose to mitigate them internally with new procedures or security measures, or perhaps work with a third party to adopt technical or operational measures.

Organisational measures include making sure there’s good awareness about data protection across the business, and employees receive appropriate training.

6. Executive oversight

Risks should be reported up the line to make sure the senior leadership team has proper oversight and the opportunity to take appropriate action. If your organisation has a Data Protection Officer (DPO), this reporting will be part of the formal accountabilities of their role. But remember, not all businesses need to have a DPO.

Overcoming cultural challenges

Data protection and privacy professionals face a cultural challenge to win hearts and minds. I have sometimes heard legal or privacy teams described as ‘the department of no’. That’s not how we want to be seen!

Smart businesses are realising the value of taking privacy seriously. We should help our business colleagues to balance the needs of commercial and operational functions with legal & ethical requirements.

We shouldn’t just explain what the law requires. We must go further and help our colleagues to find practical solutions. Collaboration and mutual understanding are essential ingredients for successful data governance.

Data Protection Basics: The 7 data protection principles

November 2022

Understanding the key principles of data protection

Let’s get back to basics. There are seven core principles which form the foundation of data protection law. Understanding and applying these principles is the cornerstone for good practice and key to complying with UK / EU GDPR.

Here’s our quick guide to the data protection principles.

1. Lawfulness, fairness and transparency

This principle covers 3 key areas.

a) Lawfulness – We must identify an appropriate ‘lawful basis’ for collecting and using personal data. In fact, we need to decide on a lawful basis for each task we use personal data for, and make sure we fulfil the specific conditions for that lawful basis. There are 6 lawful bases to choose from.

We need to take special care and look to meet additional requirements when using what’s termed ‘special category’ data or data which relates to minors or vulnerable people.

We should also be sure not to do anything which is likely to contravene any other laws.

b) Fairness – We must only use people’s data in ways that are fair. Don’t process data in a way which might be unexpected, discriminatory or misleading. This means evaluating any adverse effects on individuals.

c) Transparency – We must be clear, open and honest with people about how we use their personal information. Tell people what we’re going to do with their personal information. Routinely this is achieved by providing relevant privacy information at the point data is collected, and by publishing a complete and up to date privacy notice and making this easy to find. Transparency requirements apply right from the start, when we collect or receive people’s data.

2. Purpose limitation

This is all about only using personal details in the ways we told people they’d be used for. We must be clear about what our purposes for processing are and specify them in the privacy information we provide to individuals.

Sometimes we might want to use personal data for a new purpose. We may have a clear legal obligation to do it, but if not we should check the new purpose is compatible with the original purpose(s) we had for that data. If not, then we may need to secure the individual’s consent before going ahead.

Remember, if we surprise people, they’ll be more likely to complain.

3. Data minimisation

We must make sure the personal data we collect and use is:

  • Adequate – necessary for our stated purposes. Only collect the data we really need. Don’t collect and keep certain personal information ‘just in case’ it might be useful in future.
  • Relevant – relevant to that purpose; and
  • Limited to what is necessary – don’t use more data than we need for each specific purpose.

4. Accuracy

We should take ‘all reasonable steps’ to make sure the personal data we gather and hold is accurate, up-to-date and not misleading.

It’s good practice to use data validation tools when data is captured or re-used. For example, validate email addresses are in the right format, or verify postal addresses when these are captured online.
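As an illustration, a basic email format check might look like the sketch below. This is a minimal, hypothetical example: it only catches obvious syntax errors and cannot confirm the mailbox actually exists.

```python
import re

# Very simple email *format* check: non-empty local part, one "@",
# and a dot somewhere in the domain. This catches obvious typos only;
# it does not verify that the address is real or deliverable.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def looks_like_email(address: str) -> bool:
    """Return True if the string is plausibly an email address."""
    return bool(EMAIL_RE.match(address))

print(looks_like_email("jane@example.com"))   # True
print(looks_like_email("jane@example"))       # False: no dot in the domain
```

Validation at the point of capture is cheap; chasing down bad contact data later (or sending someone else’s information to a mistyped address) is not.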

If we identify any of the personal information we hold is incorrect or misleading, we should take steps to correct or delete it promptly.

Data accuracy can decline over time. For example, people change their email address, move house, get married or divorced, their needs and interests change. And of course some people on your database may pass away. So we need to consider ways to keep our data updated and cleansed.

Perhaps find ways to give people the opportunity to check and update their personal details?

5. Storage limitation

Don’t be a hoarder! We must not keep personal data longer than necessary for the purposes we have specified.

Certain records need to be kept for a statutory length of time, such as employment data. But not all data processing has a statutory period. Where the retention period is not set by law, the organisation must set an appropriate data retention period for each purpose, which it can justify.

The ICO would expect us to have a data retention policy in place, with a schedule which states the standard retention period for each processing task. This is a key step to making sure you can comply with this principle.

When the data is no longer necessary, we must destroy or anonymise it, unless there’s a compelling reason for us to keep it for longer. For example, when legal hold applies. For more information see our Data Retention Guidance.
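A retention schedule can be as simple as a table of purposes and periods, which can then be checked programmatically. Here’s a minimal sketch; the purposes and periods are hypothetical examples, so substitute periods your organisation can actually justify.

```python
from datetime import date, timedelta

# Hypothetical retention schedule: days to keep data, per purpose.
# Real periods must be set and justified by your organisation.
RETENTION_DAYS = {
    "marketing": 365 * 2,   # e.g. two years after last contact
    "payroll": 365 * 6,     # e.g. six years, a common statutory period
}

def due_for_deletion(purpose, last_used, today=None):
    """Return True if a record has passed its retention period."""
    today = today or date.today()
    return today - last_used > timedelta(days=RETENTION_DAYS[purpose])

print(due_for_deletion("marketing", date(2019, 1, 1), today=date(2023, 1, 1)))  # True
print(due_for_deletion("payroll", date(2019, 1, 1), today=date(2023, 1, 1)))    # False
```

Even a periodic manual review against a schedule like this is far better than keeping everything indefinitely ‘just in case’.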

6. Security

This is the ‘integrity and confidentiality’ principle of the GDPR – often known as the security principle. It requires us to make sure we have appropriate security measures in place to protect the personal data we hold.

UK / EU GDPR talks about ‘appropriate technical and organisational measures’ (known as TOMs). These include things like physical and technical security measures, conducting information security risk analyses, and having information security policies & standards in place to guide our staff.

Our approach to security should be proportionate to the risks involved. The ICO advises us to consider available technology and the costs of implementation when deciding what measures to take.

Some of the basics include transferring data securely, storing it securely, restricting access to only those who need it and authenticating approved data users.

Cyber Essentials or Cyber Essentials Plus can be a helpful assurance framework for carrying out a review of your data security arrangements.

Controllers should consider information security standards when appointing and managing relationships with processors, i.e. service providers handling personal data on your behalf to provide their services. Are your processors handling the data you control securely? Carry out appropriate due diligence to make sure.

7. Accountability

The accountability principle makes organisations responsible for complying with the UK / EU GDPR and says they must be able to evidence how they comply with the above principles.

This requires data governance across the organisation. Think of accountability as a collective responsibility, flowing from the Executive team and down through to the teams that process personal data.

To demonstrate how we comply, we need to have records in place. For many organisations this will include a Record of Processing Activities (RoPA).

The ICO provides a useful ‘Accountability Framework’ we can use to benchmark performance against their expectations.

In summary, identify the lawful bases you’re relying on and be fair and be open about what you do. Minimise the data you collect and make sure it remains accurate over time. Always keep it secure and don’t keep it for longer than you need it. Take care if you want to use personal data for a new purpose. Keep records and be ready to justify your approach.  The ICO has published more detailed guidance on the seven principles.

Is your data use compatible with what you collected it for?

November 2022

Have the ways you use people's data strayed too far from the original purpose(s)?

An ICO reprimand issued to a Government department serves as a welcome reminder to be careful about what we’re using data for, who we’re sharing it with, and what they might use it for.

Is what we’re doing transparent, fair and reasonable? Are the tasks we now use data for still in line with what we originally collected it for?

In this public sector case, the ICO has chosen not to issue a fine, but rather a warning with a requirement to implement specific measures. Commercial businesses are unlikely to face the same leniency.

What went wrong?

The Department for Education (DfE) received a reprimand from the ICO after it came to light a database containing the learning records of up to 28 million children had been used to check whether people who opened online gambling accounts were aged 18 or over.

The ICO investigation criticised the DfE for failing to protect young people’s data from unauthorised processing by third parties, whose purposes were found to be incompatible with the original purposes the data was collected for.

The DfE has overall responsibility for the Learning Records Service (LRS) database, which provides a record of pupils’ qualifications for education providers to access. Its main purpose is to enable schools, colleges, higher education and other education providers to verify data for educational purposes – such as checking the academic qualifications of potential students, or whether they are eligible for funding. The LRS is only supposed to be used for education purposes.

But the DfE also allowed Trust Systems Software UK Ltd (trading as Trustopia), an employment screening firm, access to the LRS. Trustopia in turn offered its services commercially to other companies, including GB Group, which used it to help betting companies screen new online gambling customers to confirm they were 18 or over.

Trustopia had access to the LRS database from September 2018 to January 2020, carrying out searches involving 22,000 learners.

This incident followed an audit of the DfE’s data activities by the ICO in 2020, which also found the DfE broke data protection laws in how it handled pupil data.

What were the failings?

The ICO found against the DfE in two respects:

  1. It failed in its obligations (as data controller) to use and share children’s data fairly, lawfully and transparently. Individuals were unaware of what was happening and could not object or withdraw from the processing. DfE failed to have appropriate oversight to protect against unauthorised processing of personal data held on the LRS database.
  2. It was also found to have failed to ensure confidentiality by failing to prevent unauthorised access to children’s data. The DfE’s lack of oversight and appropriate controls to protect the data enabled it to be used for other purposes, which were not compatible with the provision of educational services.

In its reprimand the ICO set out clear measures the DfE needs to take to improve its data protection practices and make sure children’s learning records are properly protected.

Since the incident, the DfE has confirmed they have permanently removed Trustopia’s access to the data. In fact, they have removed access from 2,600 organisations.

A spokesperson for the DfE said the department takes the security of the data it holds “extremely seriously” and confirmed it will publish a full response to the ICO by the end of 2022 giving “detailed progress in respect of all the actions identified”.

Why wasn’t there a massive fine?

In keeping with the ICO’s Regulatory Action Policy, they considered issuing a fine of £10 million. This is the amount considered to be ‘effective, proportionate and dissuasive’. However, the Information Commissioner has chosen not to issue a fine in this case, in line with its revised approach to public sector enforcement, announced in June 2022.

Some may find this surprising, so let’s dig deeper. John Edwards, UK Information Commissioner, said:

“No-one needs persuading that a database of pupils’ learning records being used to help gambling companies is unacceptable. Our investigation found that the processes put in place by the Department for Education were woeful. Data was being misused, and the Department was unaware there was even a problem until a national newspaper informed them.

“We all have an absolute right to expect that our central government departments treat the data they hold on us with the utmost respect and security. Even more so when it comes to the information of 28 million children.

“This was a serious breach of the law, and one that would have warranted a £10 million fine in this specific case. I have taken the decision not to issue that fine, as any money paid in fines is returned to government, and so the impact would have been minimal. But that should not detract from how serious the errors we have highlighted were, nor how urgently they needed addressing by the Department for Education.”

So government departments can break the law and not be fined?

Well on the face of it, in the case of data protection, yes! Mr Edwards has confirmed the ICO are trialling a new approach to public sector enforcement which will see more public reprimands without fines, in all but the most serious cases.

In return, the ICO has received a commitment from the Cabinet Office and DCMS to create a cross-Whitehall senior leadership group, to encourage compliance with high data protection standards.

Hmmm… how do we feel about this?

I totally understand issuing a fine to the DfE is, ultimately, a fine against public funds for education. Which means our children could potentially be the ones who would suffer if a hefty fine was imposed. Nobody wins here.

But on the flipside, could this approach significantly weaken the deterrent? Will public sector employees feel motivated enough to take appropriate steps to comply with data protection laws when there’s little risk of being fined?

After all, the private sector will continue to be fined as appropriate when they’re found to have violated the data laws.

What do you think? We’d love to hear your thoughts at info@dpnetwork.org.uk

A timely reminder?

This case serves as a helpful reminder that we need to take care that the personal data we collect and hold as an organisation is not used for purposes which are incompatible with the original purposes.

Due diligence is especially important when the data is shared with other organisations, who might use it for their own purposes.

We must always be clear and transparent about how we use people’s data so they have an opportunity to exercise their right to object, and indeed any other privacy rights.

Ask yourself this key question: ‘Is your data use compatible with what you collected it for?’

What keeps a DPO awake at night?

November 2022

A scary collection of Data Protection Officer nightmares

For DPOs, the stuff of nightmares doesn’t involve monsters, falling off a cliff or being naked in a job interview. In fact, that’s small beer compared to their true nightmares: Data Transfer Impact Assessments and people in snazzy ICO enforcement jackets knocking on the office door.

No, being a DPO isn’t for the faint-hearted. It’s a perilous existence where hardy souls must navigate a hostile wilderness of data protection hazards.

It’s an ever-changing wilderness, too. Just when you’ve frightened away one data protection predator, another pops up from nowhere to take its place. And remember, this must be achieved in a ruthless economic climate where every penny counts.

So, what’s the really scary stuff? The scariest of the slithery data protection monsters hiding in the semi-opened cupboard? I asked a few friendly Data Protection Officers: ‘What keeps you awake at night?’

Seven chilling privacy nightmares

1. Fear of the unknown – DPO, education sector

Being worried about what staff in my organisation are doing with personal data that I know nothing about (which they know they probably shouldn’t be doing). Another big nightmare at the moment is trying to unravel the intricacies of IDTAs and SCCs for both the UK and EU whilst factoring in other international data protection regimes that my organisation is subject to by virtue of their extra territorial scope – I see you, China! And a general worry that I’m going to miss, and therefore not mitigate, a risk. The pressure of being seen as the person with all the answers and ultimately the one responsible (or who will be blamed) if anything goes wrong is not the stuff of dreams.

2. The recurring nightmare of data flowing overseas – Director of Privacy, financial sector

What keeps me awake at night? Mapping international data flows. What sends me to sleep… counting DTIAs!

3. Drowning in a sea of paperwork – DPO, publishing sector

Keeping track of changing processing activities in a large organisation without blocking progress through over-administration. Plus ensuring appropriate documentation of the growing share of online Data Processing Agreements concluded with large suppliers (like pre-signed downloadable SCCs from Google, Meta …).

4. Encircled by continually moving parts – DPO, charity sector

Facing our third legislative change in 5 years and the on/off nature of what that may be. The ability to keep on top of the “what, when, how, and why” of the technical changes – horizon scanning versus meeting current needs and the complexities of planning to implement uncertain changes with limited resources. All whilst maintaining consistency and expertise in the advice and guidance so staff make appropriate decisions in the here and now. A productive, pragmatic, commercially minded, problem solving attitude to data protection is enough to keep anyone awake at night, without factoring in constantly moving legislative goalposts.

5. Hounded by familiar, but angry faces – DPO, hospitality sector

Employee-related Data Subject Access Requests. We’re not a big business, we don’t get many DSARs, and we don’t have the fancy technology. But lay-offs this year have led to a persistent stream of DSARs. As soon as one is nearly cleared, another one drops (it’s as if they’re planning it!). Despite support from HR, the requests are ultimately my responsibility to handle. I don’t have a team to support me, nor on-tap internal legal support. Sometimes there is no assuaging people, and yes, we have heard from the ICO after someone complained to them about our response. I often press send, and lie awake praying we didn’t disclose something we shouldn’t have, or miss something we should have.

Not all DPOs lie awake at night. In fact, some hit the hay and are out like a light. But what are their daytime nightmares made of?

6. Being held to ransom – Matthew Kay, DPO, Metro Bank

When I was first asked to write this, my opening thought was that, being quite a deep sleeper, it takes quite a lot to keep me awake! Quickly realising this wasn’t what DPN was after, I came to the conclusion that the data protection challenges I’m currently worrying about centre on two things. First is the enhanced threat resulting from the war in Ukraine, and ensuring appropriate technical measures are in place to see off any potential cyber-attacks; second is closely monitoring the perceived increase in insider threat to organisations resulting from the cost of living crisis.

7. Encircled by ICO enforcement jackets – Michael Bond, Group DPO, News UK

As a father of two young boys, not much keeps me up at night beyond about 8.30pm! But as I settle under the covers and wait for sleep, let me envisage my worst nightmare instead. It’s a quiet Friday afternoon and with one eye on the clock, a phone call comes in:

“Hello, it’s the ICO. Did you know that large volumes of personal data originating from your brands are now publicly available online for all to see?”

… a long pause. The case officer goes on:

“Yes, the data looks to be a mix of hundreds of thousands of customer profiles, as well as what appears to be employee personnel files”.

As a bead of cold sweat rolls down my neck, the ICO case officer asks me:

“Why haven’t you notified us about this incident? It’s very serious, as I am sure you’re aware and we’re going to have to take immediate action; enforcement officers are on their way…”

I wake, startled. Phew. Don’t worry, just a dream… *the phone rings – caller ID – Wilmslow*

Yikes!

I’ll leave you with one final, spine-chilling thought. A new type of cosmic privacy horror. I’ve heard rumours a social media platform, one with a controversial new proprietor, could have a potential vacancy for a new…

…Data Protection Officer.

Are we conducting too many DPIAs – or not enough?

October 2022

How to decide when to conduct Data Protection Impact Assessments

Make no mistake, Data Protection Impact Assessments (DPIAs) are a really useful risk management tool. They help organisations to identify likely data protection risks before they materialise, so corrective action can be taken – protecting your customers, staff and the interests of the business.

DPIAs are a key element of the GDPR’s focus on accountability and Data Protection by Design.

It’s not easy working out when a DPIA is necessary, or when it might be useful, even if not strictly required by law. Businesses need to be in control of their exposure to risk, but don’t want to burden their teams with unnecessary work. So it falls to privacy professionals to use their judgement in what can be a delicate balancing act.

Lack of clarity around when DPIAs are genuinely needed could lead some businesses to carry out far more DPIAs than necessary – whilst others may carry out too few.

When are DPIAs required?

We should check if a DPIA is required during the planning stage of new projects, or when changes are being planned to existing activity. Where needed, DPIAs must be conducted BEFORE the new processing begins.

DPIAs are considered legally necessary when the processing of personal data is likely to involve a ‘high risk’ to the rights and freedoms of individuals.

What does ‘high risk’ look like?

Which types of activity might fall into ‘high risk’ isn’t always clear. Fortunately the ICO has given examples of processing likely to result in high risk to help you make this call. Regulated sectors, such as financial services and telecoms, have specific regulatory risks to consider too.

Give consideration to the scope, types of data used and the manner of processing. It’s wise to also take account of any protective measures already in place. Where the nature, scope, context and purposes of processing are very similar to another activity for which a DPIA has already been carried out, you may not need to conduct another.

Three key steps for a robust DPIA screening process

1. Engage your key teams

In larger organisations, building good relationships with key teams such as Procurement, IT, Project Management, Legal and Information Security can really help. They might hear about projects involving personal data before you do. Make sure they’re aware when a DPIA may be required. This means they’ll be more likely to ‘raise a hand’ and let you know when a project which might require a DPIA comes across their desk.

In smaller businesses there may still be others who can help ‘raise a hand’ and let you know about relevant projects. Work out who those people are.

2. Confirm the business’s appetite for risk

Is your organisation the sort which only wants DPIAs to be carried out when strictly required by law? Or perhaps you want a greater level of oversight? You might choose to carry out DPIAs as your standard risk assessment methodology for any significant project involving personal data – even if it might appear to involve lower levels of risk to individuals.

Logic says you’ll never be 100% sure unless you carry out an assessment and DPIAs are a tried and tested way to give you oversight and confidence. But this approach requires more time, resources and commitment from the business. You need to strike the right balance for your organisation.

3. Adopt a DPIA screening process

If you don’t currently use a screening process, you really should consider adopting one. It’s a quick and methodical way to identify if a project does or does not require a DPIA.

You can use a short set of standard questions, which can be provided for stakeholders to complete and return, or discussed in a call. This way, the question ‘Is a DPIA needed or not?’ can be answered rapidly and with confidence.

Personally I prefer to arrange a short call with the stakeholders, using my screening questionnaire as a prompt to guide the discussion.

Don’t forget to keep a record of your decisions – including when you decide a DPIA isn’t necessary.

Try not to burden colleagues with unnecessary assessments for every project where there really is minimal risk; that approach is unlikely to be well received. Raise awareness and have a built-in DPIA screening process to make sure you catch the projects which really do warrant a deeper dive.
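To illustrate, a screening checklist like the one described can be very lightweight. The sketch below is a hypothetical example only – the questions and the ‘any yes means DPIA’ rule are illustrative assumptions, not ICO criteria, and real screening questions should reflect the ICO’s published examples of high-risk processing:

```python
# Illustrative DPIA screening sketch (hypothetical questions, not ICO criteria).
# Any "yes" answer flags the project for a full DPIA; every decision is recorded.
from datetime import date

SCREENING_QUESTIONS = [
    "Does the project use new or innovative technology?",
    "Will special category or criminal offence data be processed?",
    "Will individuals be tracked, monitored or profiled?",
    "Is processing on a large scale, or of vulnerable groups?",
]

def screen_project(project: str, answers: list[bool]) -> dict:
    """Return a screening record; a DPIA is required if any answer is 'yes'."""
    return {
        "project": project,
        "date": date.today().isoformat(),
        "answers": dict(zip(SCREENING_QUESTIONS, answers)),
        "dpia_required": any(answers),
    }

# Example: a project involving profiling is flagged for a full DPIA.
decision = screen_project("New CRM rollout", [False, False, True, False])
print(decision["dpia_required"])  # True
```

Keeping the returned record (whatever the outcome) also gives you the audit trail of decisions mentioned above.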

Is your marketing profiling lawful, fair and transparent?

October 2022

ICO fines catalogue retailer £1.35 million for ‘invisible processing’

Many companies want to know their customers better. This is not a bad thing. Information gathered about people is regularly used for a variety of activities including improving products and services, personalisation or making sure marketing campaigns are better targeted.

However, the significant fine dished out to catalogue retailer Easylife highlights why companies need to be transparent about what they do, have a robust lawful basis, be careful about making assumptions about people and take special care with special category data.

It also shows how profiling is not limited to the realms of online tracking and the adtech ecosystem, it can be a simpler activity.

What did the catalogue retailer do?

Easylife had what were termed ‘trigger products’ in its Health Club catalogue. If a customer purchased a certain product, it triggered a marketing call to the individual to try and sell other related products. This was done using a third-party call centre.

Using previous transactions to tailor future marketing is not an unusual marketing tactic, often referred to as ‘NBA – Next Best Action’. The key in this case is Easylife inferred customers were likely to have certain health conditions based on their purchase of trigger products.

For example, if a customer bought a product which could be associated with arthritis, this triggered a telemarketing call to try and sell other products popular with arthritis sufferers – such as glucosamine and bio-magnetic joint patches.

Data relating to medical conditions, whether provided by the individual or inferred from other data, is classified as special category data under data protection law and handling this type of data requires special conditions to be met.

The ICO’s ruling

To summarise the ICO’s enforcement notice, Easylife was found to have failed to:

  • have a valid lawful basis for processing
  • meet the need to have an additional condition for processing special category data
  • be transparent about its profiling of customers

It was found to have conducted ‘invisible processing’ of 145,000 customers.

There were no complaints raised about this activity; it only came to light due to a separate ICO investigation into contraventions of the telemarketing rules. The ICO says it wasn’t surprised no one had complained, as people just wouldn’t have been aware this profiling was happening, due to the lack of transparency.

It just goes to show ICO fines don’t always arise as a result of individuals raising complaints.

Key findings

Easylife argued it was just processing transactional data. The ICO ruled that when this transactional data was used to influence its telemarketing decisions, it constituted profiling.

The ICO said while data on customer purchases constituted personal data, when this was used to make inferences about health conditions, this became the processing of special category data. The ICO said this was regardless of the statistical confidence Easylife had in the profiling it had conducted.

Easylife claimed it was relying on the lawful basis of Legitimate Interests. However, the Legitimate Interests Assessment (LIA) the company provided to the ICO during its investigation actually related to a previous activity, in which health related data wasn’t used.

When processing special category data organisations need to make sure they not only have a lawful basis, but also comply with Article 9 of UK GDPR.

The ICO advised the appropriate basis for handling this special category data was the explicit consent of customers. In other words, legitimate interests was not an appropriate basis to use.

Easylife was found to have no lawful basis, nor a condition under Article 9.

It was ruled there was a lack of transparency; customers hadn’t been informed profiling was taking place. Easylife’s privacy notice was found to have a ‘small section’ which stated how personal data would be used. This included the following:

  • Keep you informed about the status of your orders and provide updates or information about associated products or additional products, services, or promotions that might be of interest to you.
  • Improve and develop the products or services we offer by analysing your information.

This was ruled inadequate and Easylife was found to have failed to give enough information about the purposes for processing and the lawful bases for processing.

The ICO’s enforcement notice points out it would have expected a Data Protection Impact Assessment to have been conducted for the profiling of special category data. This had not been done.

The Data Processing Agreement between Easylife and its processor, the third-party call centre, was also scrutinised. While it covered key requirements such as confidentiality, security, sub-contracting and termination, it failed to indicate the types of personal data being handled.

Commenting on the fine, John Edwards, UK Information Commissioner, said:

“Easylife was making assumptions about people’s medical condition based on their purchase history without their knowledge, and then peddled them a health product – that is not allowed.

The invisible use of people’s data meant that people could not understand how their data was being used and, ultimately, were not able to exercise their privacy and data protection rights. The lack of transparency, combined with the intrusive nature of the profiling, has resulted in a serious breach of people’s information rights.”

Alongside the £1.35 million fine, Easylife’s been fined a further £130,000 under PECR for making intrusive telemarketing calls to individuals registered on the Telephone Preference Service. Currently the maximum fine for contravening the marketing rules under PECR is £500,000, much lower than potential fines under DPA 2018/UK GDPR.

Update March 2023: The ICO announces reduction in GDPR fine from £1.35 million to £250,000.

6 key takeaways

1. If you are profiling your customers, try to make sure this is based on facts. Making the type of assumptions Easylife was making will always carry risks.

2. Be sure to be transparent about your activities. This doesn’t mean you have to use the precise term ‘profiling’ in your privacy notice, but the ways in which you use personal information should be clear.

3. Make sure you clearly state the lawful bases you rely upon in your privacy notice. It can be helpful and clear to link lawful bases to specific business activities.

4. If you’re processing special category data, collected directly or inferred from other data, make sure you can meet a condition under Article 9. For marketing activities the only option is explicit consent.

5. If you’re conducting profiling using special category data, carry out a DPIA.

6. Always remember the marketing rules under PECR for whatever marketing channel you’re using. For telemarketing, if you don’t have the consent of individuals, be sure to screen lists against the TPS.
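The TPS screening step in takeaway 6 amounts to suppressing registered numbers unless you hold consent. As a rough sketch only – the numbers and sets here are made up, and in practice you would screen against the licensed TPS file via your supplier’s service:

```python
# Illustrative sketch of screening a telemarketing list against the TPS.
# All numbers and sets below are invented for the example.
tps_registered = {"+441632960001", "+441632960003"}  # on the TPS register
consented = {"+441632960003"}  # individuals who have given consent to calls

call_list = ["+441632960001", "+441632960002", "+441632960003"]

def screen_calls(numbers, tps, consent):
    """Drop TPS-registered numbers unless the individual has consented."""
    return [n for n in numbers if n not in tps or n in consent]

safe_to_call = screen_calls(call_list, tps_registered, consented)
print(safe_to_call)  # ['+441632960002', '+441632960003']
```

The logic reflects the rule above: a TPS registration blocks the call unless consent overrides it.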