What keeps a DPO awake at night?

November 2022

A scary collection of Data Protection Officer nightmares

For DPOs the stuff of nightmares doesn’t involve monsters, falling off a cliff or being naked in a job interview. In fact, that’s small beer compared to their true nightmares: Data Transfer Impact Assessments and people in snazzy ICO enforcement jackets knocking on the office door.

No, being a DPO isn’t for the faint-hearted. It’s a perilous existence where hardy souls must navigate a hostile wilderness of data protection hazards.

It’s an ever-changing wilderness, too. Just when you’ve frightened away one data protection predator, another pops up from nowhere to take its place. And remember, this must be achieved in a ruthless economic climate where every penny counts.

So, what’s the really scary stuff? The scariest of the slithery data protection monsters hiding in the half-open cupboard? I asked a few friendly Data Protection Officers: ‘What keeps you awake at night?’

Seven chilling privacy nightmares

1. Fear of the unknown – DPO, education sector

Being worried about what staff in my organisation are doing with personal data that I know nothing about (which they know they probably shouldn’t be doing). Another big nightmare at the moment is trying to unravel the intricacies of IDTAs and SCCs for both the UK and EU whilst factoring in other international data protection regimes that my organisation is subject to by virtue of their extra-territorial scope – I see you, China! And a general worry that I’m going to miss, and therefore not mitigate, a risk. The pressure of being seen as the person with all the answers and ultimately the one responsible (or who will be blamed) if anything goes wrong is not the stuff of dreams.

2. The recurring nightmare of data flowing overseas – Director of Privacy, financial sector

What keeps me awake at night? Mapping international data flows. What sends me to sleep… counting DTIAs!

3. Drowning in a sea of paperwork – DPO, publishing sector

Keeping track of changing processing activities in a large organisation without blocking progress by over-administration. Plus ensuring appropriate documentation of the growing share of online Data Processing Agreements concluded with large suppliers (like pre-signed downloadable SCCs from Google, Meta …)

4. Encircled by continually moving parts – DPO, charity sector

Facing our third legislative change in 5 years and the on/off nature of what that may be. The ability to keep on top of the “what, when, how, and why” of the technical changes – horizon scanning versus meeting current needs and the complexities of planning to implement uncertain changes with limited resources. All whilst maintaining consistency and expertise in the advice and guidance so staff make appropriate decisions in the here and now. A productive, pragmatic, commercially minded, problem solving attitude to data protection is enough to keep anyone awake at night, without factoring in constantly moving legislative goalposts.

5. Hounded by familiar, but angry faces – DPO, hospitality sector

Employee-related Data Subject Access Requests. We’re not a big business, we don’t get many DSARs, and we don’t have the fancy technology. But lay-offs this year have led to a persistent stream of DSARs. As soon as one is nearly cleared, another one drops (it’s as if they’re planning it!). Despite support from HR, the requests are ultimately my responsibility to handle. I don’t have a team to support me, nor on-tap internal legal support. Sometimes there is no assuaging people, and yes, we have heard from the ICO after someone complained to them about our response. I often press send, and lie awake praying we didn’t disclose something we shouldn’t have, or miss something we should have.

Not all DPOs lie awake at night. In fact, some hit the hay and are out like a light. But what are their daytime nightmares made of?

6. Being held to ransom – Matthew Kay, DPO, Metro Bank

When I was first asked to write this my opening thought was that, being quite a deep sleeper, it takes a lot to keep me awake! Quickly realising this wasn’t what DPN was after, I came to the conclusion that the data protection challenges I’m currently worrying about centre around two things. First is the enhanced threat resulting from the war in Ukraine, and ensuring appropriate technical measures are in place to see off any potential cyber-attacks; second is closely monitoring the perceived increase in insider threats to organisations resulting from the cost of living crisis.

7. Encircled by ICO enforcement jackets – Michael Bond, Group DPO, News UK

As a father of two young boys, not much keeps me up at night beyond about 8.30pm! But as I settle under the covers and wait for sleep, let me envisage my worst nightmare instead. It’s a quiet Friday afternoon and with one eye on the clock, a phone call comes in:

“Hello, it’s the ICO. Did you know that large volumes of personal data originating from your brands are now publicly available online for all to see?”

… a long pause. The case officer goes on:

“Yes, the data looks to be a mix of hundreds of thousands of customer profiles, as well as what appears to be employee personnel files”.

As a bead of cold sweat rolls down my neck, the ICO case officer asks me:

“Why haven’t you notified us about this incident? It’s very serious, as I am sure you’re aware and we’re going to have to take immediate action; enforcement officers are on their way…”

I wake, startled. Phew. Don’t worry, just a dream… *the phone rings – caller ID – Wilmslow*

Yikes!

I’ll leave you with one final, spine-chilling thought. A new type of cosmic privacy horror. I’ve heard rumours a social media platform, one with a controversial new proprietor, could have a potential vacancy for a new…

…Data Protection Officer.

Are we conducting too many DPIAs – or not enough?

October 2022

How to decide when to conduct Data Protection Impact Assessments

Make no mistake, Data Protection Impact Assessments (DPIAs) are a really useful risk management tool. They help organisations to identify likely data protection risks before they materialise, so corrective action can be taken – protecting your customers, your staff and the interests of the business.

DPIAs are a key element of the GDPR’s focus on accountability and Data Protection by Design.

It’s not easy working out when a DPIA is necessary, or when it might be useful, even if not strictly required by law. Businesses need to be in control of their exposure to risk, but don’t want to burden their teams with unnecessary work. So it falls to privacy professionals to use their judgement in what can be a delicate balancing act.

Lack of clarity around when DPIAs are genuinely needed could lead some businesses to carry out far more DPIAs than necessary, whilst others may carry out too few.

When are DPIAs required?

We should check if a DPIA is required during the planning stage of new projects, or when changes are being planned to existing activity. Where needed, DPIAs must be conducted BEFORE the new processing begins.

DPIAs are considered legally necessary when the processing of personal data is likely to involve a ‘high risk’ to the rights and freedoms of individuals.

What does ‘high risk’ look like?

What types of activity might fall into ‘high risk’ isn’t always clear. Fortunately the ICO has given examples of processing likely to result in high risk to help you make this call. Regulated sectors, such as financial services and telecoms, have specific regulatory risks to consider too.

Give consideration to the scope, the types of data used and the manner of processing. It’s wise to also take account of any protective measures already in place. Where the nature, scope, context and purposes of processing are very similar to another activity for which a DPIA has already been carried out, you may not need to conduct another.

Three key steps for a robust DPIA screening process

1. Engage your key teams

In larger organisations, building good relationships with key teams such as Procurement, IT, Project Management, Legal and Information Security can really help. They might hear about projects involving personal data before you do. Make sure they’re aware when a DPIA may be required. This means they’ll be more likely to ‘raise a hand’ and let you know when a project which might require a DPIA comes across their desk.

In smaller businesses there may still be others who can help ‘raise a hand’ and let you know about relevant projects. Work out who those people are.

2. Confirm the business’s appetite for risk

Is your organisation the sort which only wants DPIAs to be carried out when strictly required by law? Or perhaps you want a greater level of oversight, choosing to carry out DPIAs as your standard risk assessment methodology for any significant project involving personal data – even if it might appear to involve lower levels of risk to individuals.

Logic says you’ll never be 100% sure unless you carry out an assessment and DPIAs are a tried and tested way to give you oversight and confidence. But this approach requires more time, resources and commitment from the business. You need to strike the right balance for your organisation.

3. Adopt a DPIA screening process

If you don’t currently use a screening process, you really should consider adopting one. It’s a quick and methodical way to identify if a project does or does not require a DPIA.

You can use a short set of standard questions, which can be provided to stakeholders to complete and return, or discussed in a call. This way, the question ‘Is a DPIA needed or not?’ can be answered rapidly and with confidence.

Personally I prefer to arrange a short call with the stakeholders, using my screening questionnaire as a prompt to guide the discussion.
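
To make this concrete, here’s a minimal sketch of how such a screening checklist might be captured in code. The questions and the decision rule are illustrative assumptions, not the ICO’s official criteria – adapt them to your own organisation’s risk appetite.

```python
# Illustrative DPIA screening checklist - the questions and decision rule are
# hypothetical examples, not official ICO criteria.

SCREENING_QUESTIONS = [
    "Does the project use new or innovative technology (e.g. AI, biometrics)?",
    "Will special category data (e.g. health, ethnicity) be processed?",
    "Will individuals be profiled, scored or subject to automated decisions?",
    "Will data be processed on a large scale, or will people be systematically monitored?",
    "Will personal data be transferred outside the UK/EEA?",
    "Does the processing involve children or other vulnerable individuals?",
]

def dpia_required(answers):
    """Return True if any screening question is answered 'yes' (True)."""
    return any(answers.get(question, False) for question in SCREENING_QUESTIONS)

# Example: answers gathered from stakeholders on a screening call
answers = {question: False for question in SCREENING_QUESTIONS}
answers["Will individuals be profiled, scored or subject to automated decisions?"] = True

if dpia_required(answers):
    print("At least one high-risk indicator ticked - carry out a full DPIA.")
else:
    print("No high-risk indicators - record the decision that no DPIA is needed.")
```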

Don’t forget to keep a record of your decisions – including when you decide a DPIA isn’t necessary!

Try not to burden colleagues with unnecessary assessments for every project where there really is minimal risk – that approach is unlikely to be well received. Raise awareness and have a built-in DPIA screening process to make sure you catch the projects which really do warrant a deeper dive.

 

Is your marketing profiling lawful, fair and transparent?

October 2022

ICO fines catalogue retailer £1.35 million for ‘invisible processing’

Many companies want to know their customers better. This is not a bad thing. Information gathered about people is regularly used for a variety of activities including improving products and services, personalisation or making sure marketing campaigns are better targeted.

However, the significant fine dished out to catalogue retailer Easylife highlights why companies need to be transparent about what they do, have a robust lawful basis, be careful about making assumptions about people and take special care with special category data.

It also shows how profiling is not limited to the realms of online tracking and the adtech ecosystem; it can be a far simpler activity.

What did the catalogue retailer do?

Easylife had what were termed ‘trigger products’ in its Health Club catalogue. If a customer purchased a certain product, it triggered a marketing call to the individual to try and sell other related products. This was done using a third-party call centre.

Using previous transactions to tailor future marketing is not an unusual marketing tactic, often referred to as ‘NBA – Next Best Action’. The key in this case is Easylife inferred customers were likely to have certain health conditions based on their purchase of trigger products.

For example, if a customer bought a product which could be associated with arthritis, this triggered a telemarketing call to try and sell other products popular with arthritis sufferers – such as glucosamine and bio-magnetic joint patches.
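
Purely as an illustration of the mechanism (the product names and conditions below are invented, not taken from the ICO’s enforcement notice), a ‘trigger product’ pipeline might look something like this in code – and it’s the inference step that turns ordinary transactional data into special category data:

```python
# Hypothetical illustration of 'trigger product' profiling (names invented).
# Mapping a purchase to an inferred health condition creates special category data,
# which needs an Article 9 condition - for marketing, explicit consent.

TRIGGER_PRODUCTS = {
    "jar opener": "arthritis",
    "blood pressure monitor": "hypertension",
}

UPSELL_BY_CONDITION = {
    "arthritis": ["glucosamine", "bio-magnetic joint patch"],
    "hypertension": ["low-sodium supplement"],
}

def plan_telemarketing_call(customer_id, purchased_product):
    """Return a call-list entry if the purchase is a trigger product, else None."""
    condition = TRIGGER_PRODUCTS.get(purchased_product)
    if condition is None:
        return None  # ordinary transactional data, no inference made
    # The inferred condition is special category data about the customer
    return {
        "customer": customer_id,
        "inferred_condition": condition,
        "products_to_pitch": UPSELL_BY_CONDITION[condition],
    }

print(plan_telemarketing_call("customer_123", "jar opener"))
```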

Data relating to medical conditions, whether provided by the individual or inferred from other data, is classified as special category data under data protection law and handling this type of data requires special conditions to be met.

The ICO’s ruling

To summarise the ICO’s enforcement notice, Easylife was found to have failed to:

  • have a valid lawful basis for processing
  • meet the need to have an additional condition for processing special category data
  • be transparent about its profiling of customers

It was found to have conducted ‘invisible processing’ of 145,000 customers.

There were no complaints raised about this activity; it only came to light due to a separate ICO investigation into contraventions of the telemarketing rules. The ICO says it wasn’t surprised no one had complained, as people just wouldn’t have been aware this profiling was happening, due to the lack of transparency.

It just goes to show ICO fines don’t always arise as a result of individuals raising complaints.

Key findings

Easylife argued it was just processing transactional data. The ICO ruled that when this transactional data was used to influence its telemarketing decisions, it constituted profiling.

The ICO said that while data on customer purchases constituted personal data, when it was used to make inferences about health conditions it became special category data. The ICO said this was regardless of the statistical confidence Easylife had in the profiling it had conducted.

Easylife claimed it was relying on the lawful basis of Legitimate Interests. However, the Legitimate Interests Assessment (LIA) the company provided to the ICO during its investigation actually related to a previous activity, in which health related data wasn’t used.

When processing special category data organisations need to make sure they not only have a lawful basis, but also comply with Article 9 of UK GDPR.

The ICO advised the appropriate basis for handling this special category data was the explicit consent of customers. In other words, legitimate interests was not an appropriate basis to use.

Easylife was found to have no lawful basis, nor a condition under Article 9.

It was ruled there was a lack of transparency; customers hadn’t been informed profiling was taking place. Easylife’s privacy notice was found to have a ‘small section’ which stated how personal data would be used. This included the following:

  • Keep you informed about the status of your orders and provide updates or information about associated products or additional products, services, or promotions that might be of interest to you.
  • Improve and develop the products or services we offer by analysing your information.

This was ruled inadequate, and Easylife was found to have failed to give enough information about its purposes and lawful bases for processing.

The ICO’s enforcement notice points out it would have expected a Data Protection Impact Assessment to have been conducted for the profiling of special category data. This had not been done.

The Data Processing Agreement between Easylife and its processor, the third-party call centre, was also scrutinised. While it covered key requirements such as confidentiality, security, sub-contracting and termination, it failed to indicate the types of personal data being handled.

Commenting on the fine, John Edwards, UK Information Commissioner, said:

“Easylife was making assumptions about people’s medical condition based on their purchase history without their knowledge, and then peddled them a health product – that is not allowed.

The invisible use of people’s data meant that people could not understand how their data was being used and, ultimately, were not able to exercise their privacy and data protection rights. The lack of transparency, combined with the intrusive nature of the profiling, has resulted in a serious breach of people’s information rights.”

Alongside the £1.35 million fine, Easylife’s been fined a further £130,000 under PECR for making intrusive telemarketing calls to individuals registered on the Telephone Preference Service. Currently the maximum fine for contravening the marketing rules under PECR is £500,000, much lower than potential fines under DPA 2018/UK GDPR.

Update March 2023: The ICO announces reduction in GDPR fine from £1.35 million to £250,000.

6 key takeaways

1. If you are profiling your customers, try to make sure this is based on facts. Making the type of assumptions Easylife was making will always carry risks.

2. Be transparent about your activities. This doesn’t mean you have to use the precise term ‘profiling’ in your privacy notice, but the ways in which you use personal information should be clear.

3. Make sure you clearly state the lawful bases you rely upon in your privacy notice. It can be helpful and clear to link lawful bases to specific business activities.

4. If you’re processing special category data, collected directly or inferred from other data, make sure you can meet a condition under Article 9. For marketing activities the only option is explicit consent.

5. If you’re conducting profiling using special category data, carry out a DPIA.

6. Always remember the marketing rules under PECR for whatever marketing channel you’re using. For telemarketing, if you don’t have the consent of individuals, be sure to screen lists against the TPS.

Is bias and discrimination in AI a problem?

September 2022

Artificial Intelligence - good governance will need to catch up with the technology

The AI landscape

We hear about the deployment and use of AI in many settings. The types and frequency of use are only going to increase. Major uses include:

  • Cybersecurity analysis to identify anomalies in IT structures
  • Automating repetitive maintenance tasks and guiding technical support teams
  • Ad tech to profile and segment audiences for advertising targeting and optimise advertising buying and placement
  • Reviewing job applications to identify the best-qualified candidates in HR
  • Research scientists looking for patterns in health to identify new cures for cancer
  • Predicting equipment failure in manufacturing
  • Detecting fraud in banking by analysing irregular patterns in transactions.
  • TV and movie recommendations for Netflix users
  • Inventory optimisation and demand forecasting in retail & transportation
  • Programming cars to self-drive

Overall, the different forms of AI will serve to improve our lives, but from a privacy point of view there is a danger that the governance around AI projects is lagging behind the evolving technology.

In that context, tucked away in its three-year plan, published in July, the ICO highlighted that AI-driven discrimination might become more of a concern. In particular, the ICO is planning to investigate concerns about the use of algorithms to sift recruitment applications.

Why recruitment applications?

AI is used widely in the recruitment industry. A Gartner report suggested that all recruitment agencies used it for some of their candidate sifting. The CEO of the ZipRecruiter website in the US is quoted as saying that three-quarters of submitted CVs are read by algorithms. There is plenty of scope for data misuse, hence the ICO’s interest.

The Amazon recruitment tool – an example of bias/discrimination

The ICO are justified in their concerns around recruitment AI. Famously, Amazon developed its own tool to sift through applications for developer roles. The model was based on 10 years of recruitment data for an employee pool that was largely male. As a result, the model discriminated against women and reinforced the gender imbalance by filtering out female applications.

What is AI?

AI can be defined as: 

“using a non-human system to learn from experience and imitate human intelligent behaviour”

The reality is that most “AI” applications are machine learning. That is, models are trained on past data and use the patterns they find to calculate outcomes for new cases. Pure AI is technology designed to simulate human behaviour. For simplicity, let’s call machine learning AI.
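
As a minimal sketch of what ‘trained on past data’ means in practice – the data below is synthetic and scikit-learn is assumed to be available – a model simply learns patterns from historic outcomes and applies them to new cases:

```python
# Minimal machine-learning sketch: a model learns patterns from past outcomes
# and uses them to score new cases. All data here is synthetic and illustrative.
from sklearn.linear_model import LogisticRegression

# Historic records: [years_experience, skills_test_score] and whether the hire succeeded
X_past = [[1, 55], [2, 60], [5, 80], [7, 85], [3, 65], [8, 90]]
y_past = [0, 0, 1, 1, 0, 1]  # 1 = judged a successful hire in the past

model = LogisticRegression().fit(X_past, y_past)

# The model scores a new applicant based purely on those historic patterns -
# including any bias baked into the past decisions it learned from.
new_applicant = [[4, 75]]
print("Prediction:", model.predict(new_applicant))
print("Probability of 'success':", model.predict_proba(new_applicant)[0][1])
```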

Decisions made using AI are either fully automated or involve a “human in the loop”. The latter can safeguard individuals against biased outcomes by providing a sense check of the model’s outputs.

In the context of data protection, it is becoming increasingly important that those impacted by AI decisions should be able to hold someone to account.

You might hear that all the information is in a “black box” and that how the algorithm works cannot be explained. This excuse isn’t good enough – it should be possible to explain how a model has been trained and risk assess that activity. 

How is AI used? 

AI can be used to make decisions:

1. A prediction – e.g. you will be good at a job

2. A recommendation – e.g. you will like this news article

3. A classification – e.g. this email is spam.

The benefits of AI

AI is generally a force for good:

1. It can automate a process and save time

2. It can optimise the efficiency of a process or function (often seen in factories or processing plants)

3. It can enhance the abilities of individuals – often by speeding up processes

Where do data protection and AI intersect?

An explanation of AI-assisted decisions is required when:

1. The process is fully automated, without any human involvement

2. It produces legal or similarly significant effects on an individual – e.g. not getting a job.

Individuals should expect an explanation from those accountable for an AI system. Anyone developing AI models using personal data should ensure that appropriate technical and organisational measures are in place to integrate safeguards into processing. 

What data is in scope?

  • Personal data used to train a model
  • Personal data used to test a model
  • On deployment, personal data used or created to make decisions about individuals

If no personal data is included in a model, AI is not in scope for data protection. 

How to approach an AI project?

 Any new AI processing with personal data would normally require a Data Protection Impact Assessment (DPIA). The DPIA is useful because it provides a vehicle for documenting the processing, identifying the privacy risks as well as identifying the measures or controls required to protect individuals. It is also an excellent means of socialising the understanding of AI processing across an organisation. 

Introducing a clear governance framework around any AI projects will increase project visibility and reduce the risks of bias and discrimination. 

Where does bias/discrimination creep in?

The Equality Act 2010 prohibits behaviour that discriminates against, harasses or victimises another person on the basis of any of these “protected characteristics”:

  • Age
  • Disability
  • Gender reassignment
  • Marriage and civil partnership
  • Pregnancy and maternity
  • Race
  • Religion or belief
  • Sex
  • Sexual orientation. 

When using an AI system, you need to ensure – and be able to show – that your decision-making process does not result in discrimination.

Our Top 10 Tips

  1. Ask how the algorithm has been trained – the “black box” excuse isn’t good enough
  2. Review the training inputs to identify possible bias with the use of historic data
  3. Test the outcomes of the model – this seems obvious, but it’s not done regularly enough (see the sketch after this list)
  4. Consider the extent to which the past will predict the future when training a model – recruitment models will have an inherent bias if only based on past successes
  5. Consider how to compensate for bias built into the training – a possible form of positive discrimination
  6. Have a person review the outcomes of the model if a decision is challenged, and give that person the authority to challenge those outcomes
  7. Incorporate your AI projects into your data protection governance structure
  8. Ensure that you’ve done a full DPIA identifying risks and mitigations
  9. Ensure that you’ve documented the processes and decisions to incorporate into your overall accountability framework
  10. Consider how you will address individual rights – can you easily identify where personal data has been used or has it been fully anonymised? 
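
To make tip 3 concrete, here’s a minimal sketch of one way to test model outcomes for bias: comparing selection rates across a protected characteristic. The data is synthetic, and the 80% threshold is the informal ‘four-fifths rule’ sometimes used as a screening heuristic, not a legal test.

```python
# Illustrative bias check: compare model selection rates across a protected characteristic.
# Synthetic data; the grouping and 80% threshold are assumptions to adapt to your context.
from collections import defaultdict

# Model outcomes on a held-out test set: (protected_group, selected_by_model)
outcomes = [
    ("female", True), ("female", False), ("female", False), ("female", False),
    ("male", True), ("male", True), ("male", False), ("male", True),
]

totals, selected = defaultdict(int), defaultdict(int)
for group, was_selected in outcomes:
    totals[group] += 1
    selected[group] += int(was_selected)

rates = {group: selected[group] / totals[group] for group in totals}
print("Selection rates by group:", rates)

# Informal 'four-fifths rule': flag any group selected at under 80% of the highest rate
best = max(rates.values())
for group, rate in rates.items():
    if rate < 0.8 * best:
        print(f"Potential adverse impact: {group} selected at {rate:.0%} vs best rate {best:.0%}")
```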

In summary

AI is complex and fast-changing. Arguably the governance around the use of personal data is having to catch up with the technology. Even where these models seem mysterious and difficult to understand, a lack of explanation for how they work is not acceptable.

In future, clearer governance processes will have to develop to understand the risks and consider ways of mitigating them, so that data subjects are not disadvantaged.

Data Retention Guide

Data retention tools, tips and templates

This comprehensive guide takes you through the key steps and considerations when approaching data retention. Whether you’re starting out or reviewing your retention policy and schedules, we hope this guide will support your work.

The guide, first published in June 2020, was developed and written by data protection specialists from a broad range of organisations and sectors. A huge thank you to all those who made it possible.

Suppliers – why your contracts and security are important

Processors and controllers are both accountable

Do you provide a service to clients and handle your clients’ personal data? If you’re acting as a processor, a recent GDPR fine serves as a helpful reminder to be sure to have all your ducks in a row.

There’s a clear warning here: you shouldn’t just assume the contracts your clients ask you to sign are okay, nor can you just say you have robust security measures in place – you actually have to have them!

In this recent case a software publisher, acting as a processor for their clients, was fined 1.5 million Euros by the French regulator (CNIL) following a data breach involving sensitive health data.

It was found data was exfiltrated by unauthorised parties from a poorly protected server. In a nutshell the key findings were:

  • Significant gaps in the processor’s security processes
  • Contractual documentation which failed to include mandatory obligations required under Article 28 of GDPR.

It’s worth noting the fine was based on both these counts. The ruling makes it clear processors should be wary of relying on their clients to make sure contractual terms are up to scratch. It’s the responsibility of both parties.

Here’s a quick recap on how suppliers can minimise their risks.

Getting the relationship clear

The most important first step is to establish the relationship between your company and another.

  • Are you handling a client’s data on their behalf, under their instruction, to provide a service to them?
  • Are you acting as controller, clearly determining how the personal data will be used for your own purpose(s)?
  • Are you both? i.e. acting as a controller in certain circumstances, but a processor for specific services you provide to clients.


What are the contractual requirements?

Once you’re clear you are a processor, acting under your client’s instructions, the law states your arrangements with clients must be covered by a binding agreement. EU and UK GDPR set out specific provisions which must be written into such contracts. In brief these are as follows:

1. Types of personal data & categories of data subject

The contract needs to specify what types of personal data you’ll be handling. It should also include details of whether this data relates to your client’s employees, patients, customers, and so forth.

2. Nature, purpose, duration of processing

The contract should describe the nature of the service(s) you provide, what purpose(s) this serves and the term of the contract. The agreement should cover instructions from your client of what you are permitted to do with their data.

3. The rights and duties of each party

The obligations of both parties should be clearly defined. For example, the client’s obligation to have a lawful basis for processing, its responsibility to fulfil individual privacy rights and your commitment as a supplier to not use your client’s data for any other purpose.

4. Technical and organisational measures

As a supplier you need to provide sufficient guarantees that you will implement appropriate technical and organisational measures to meet the requirements of UK/EU GDPR.

5. Sub-processors

If you engage other companies (‘sub processors’) to support you in delivering your services, you’ll need specific or general written authorisation from your client(s). If you make any changes to which sub-processors you use (including software providers), you’ll need to tell your client and give them the opportunity to object. Contractual terms should stipulate that you are accountable for your sub-processors.

6. International transfers

If relevant, the agreement should include details and provisions for any transfers of personal data to a third country – for example, if you are based in the UK, a transfer to any other country. This would include details of any sub-processors based outside the UK. A transfer is often associated with the act of sending or transmitting personal data from one country to another, but it should be noted the definition also covers cases where personal data is made ‘available’ – in other words, where it can be accessed in a third country.

7. Duty of confidentiality

There must be a confidentiality clause, which commits you to ensuring any of your staff authorised to access the client’s data are committed to a duty of confidentiality or are under a statutory obligation of confidentiality.

8. Assisting your clients

The contract should cover your commitment to assisting your clients, where necessary, with handling individual privacy rights, handling data breaches and conducting data protection impact assessments.

9. Return or destruction of data

It should be clear what happens to the client’s data when the contract ends. Does the client want you to return the data or destroy it?

10. Audits and inspections

As a processor you must agree to make available all information necessary to demonstrate your compliance and agree to audits, including inspections by your client or their authorised auditor.

Processors have obligations

This recent CNIL fine shows you can’t just sign a contract, sit back and relax.

As a processor you’re responsible for your sub-processors, data transfers, staff training and confidentiality, assisting your clients when necessary and so forth. You have to be sure to implement the technical and organisational measures you said you would to protect your clients’ data.

While some clients will ask you to jump through multiple hoops as part of their due diligence process, making you clearly demonstrate your security measures are robust, others may not be so picky. But that doesn’t release you from your responsibilities.

The law and this recent fine make it clear processors can be held liable. In the event of a breach, your contractual arrangements and internal practices could come under rigorous scrutiny.

Data Subject Access Request Guide

Being prepared and handling DSARs

Handling Data Subject Access Requests can be complex, costly and time-consuming. How do you make sure you’re on the front foot, with adequate resources, understanding and the technical capability to respond within a tight legal timeframe?


This guide aims to take you through the key steps to consider, such as…

  • Being prepared
  • Retrieving the personal data
  • Balancing complex requests
  • Applying redactions & exemptions
  • How technology can help

Is your Privacy Notice complete?

April 2022

A GDPR fine reveals gaps in necessary privacy information

A core GDPR theme is transparency; being upfront and open about how people’s personal information is collected and used. People have a fundamental right to be informed and one of the key ways organisations can do this is with easily accessible privacy notices.

Four years ago, in the run up to GDPR enforcement, many businesses rushed to make sure their privacy notices met the enhanced and specific requirements.

  • When did you last review yours?
  • Have your business activities changed in recent years?
  • Are you sure you’ve got everything covered?

It can be easy to think nobody actually reads our privacy notices, but some do, and a regulator most definitely would. A recent 725,000 Euro fine for a lack of transparency shows how it can come back to haunt you if you’ve missed vital aspects out, or not been as clear as you could have been.

This is an area in which some major charities were found wanting before GDPR was even enforced. Back in 2017 the Information Commissioner’s Office (ICO) issued a series of fines, and a key finding was the charities had failed to tell people about activities such as wealth-screening and appending telephone numbers.

The GDPR fine

Fast forward to 2022 and a recent fine against Klarna Bank AB, by the Swedish Data Protection Authority (IMY), reveals a failure to give customers necessary privacy information.

What necessary information did the bank not provide?

  • Purposes and lawful basis for processing
    It was found for one of the bank’s services Klarna did not provide information on the purpose(s) for which it was processing personal data and the lawful basis/bases it was relying on.
  • Recipients data is shared with
    It was found incomplete and misleading information was provided about the other companies the bank shared personal data with.
  • International transfers
    Information was not given on which countries outside the EU/EEA personal data was transferred to. There was also no information about the safeguards which might apply to such transfers.
  • Individual rights
    Incomplete information was provided about people’s privacy rights, such as the right to erasure, data portability and the right to object.

In conclusion it was found Klarna had failed to fulfil the basic principle of transparency and people’s right to information.

Privacy notice checklist

As a reminder for us all, here are the key points which should be covered in privacy notices. This checklist is based on Article 13 of UK/EU GDPR and ICO guidance.

The 7 essential elements

  1. Name and contact details of your organisation
  2. Purposes of processing – explain each different purpose you use people’s personal information for.
  3. Lawful basis for processing – explain the lawful basis you rely on to collect and use people’s personal data.
  4. Data retention – tell people how long you envisage keeping personal data for, or at least the criteria used to decide retention periods.
  5. Privacy rights – tell people what their privacy rights are and how they can exercise them. The right of access, erasure, objection, rectification, data portability, restriction.
  6. Right to withdraw consent – tell people they can withdraw their consent at any time, where this is the lawful basis you are relying on. It should be as easy to withdraw consent as it is to give it and you should tell people how they can withdraw their consent.
  7. Right to lodge a complaint – tell people they have the right to complain to a supervisory authority, for example the Information Commissioner’s Office in the UK.

7 more points to include, if relevant for your business

Where applicable you’re also required to provide the following details:

  1. DPO – Provide contact details of your Data Protection Officer (if you have appointed one)
  2. Data Protection Representative – If you are based outside the EU/UK, but you offer services to, or monitor the behaviour of, people based in the EU/UK, you should have a Data Protection Representative and provide contact details for them.
  3. Legitimate Interests – Explain which purposes you rely on legitimate interests for.
  4. Recipients, or categories of recipients – Provide details of who you’ll share people’s personal data with. This includes suppliers acting as processors, handling data on your behalf. ICO guidance states you can provide specific names, or at least the categories of organisation they fall within.
  5. International Transfers – Inform people if you transfer their personal data to any countries outside the UK (or if based in the EU, outside the EU). Explain whether transfers are based on an adequacy decision. If not provide a description of other safeguards in place, such as Standard Contractual Clauses.
  6. Automated decision-making, including profiling – Tell people if you make solely automated decisions, including profiling, that may have a legal or similarly significant effect on individuals. Meaningful information should be provided about the logic involved, the significance and the envisaged consequences.
  7. Statutory/contractual obligations – Let people know if you are required to collect their data by law or under contract, and the consequences should they not provide necessary information.

In addition to the above there are some other best practice points, such as indicating when the privacy notice was last updated and offering further assurances surrounding how personal data is protected.

Furthermore, if you collect details about people from another source – in other words, not directly from them – you should make sure you tell them you are handling their personal data and provide the relevant privacy information.

This case serves as a reminder that we need to regularly review our privacy notices. Put very simply, the law says there should be no surprises about how we’re using people’s personal data.

Our privacy notice may be the least clicked link on our websites, but it’s not just regulators and people like me who read them. It’s not unusual for businesses, as part of their data protection due diligence when considering working with other companies, to take a peek at privacy notices to check they look relatively in order.