The value of risk assessments in the world of data protection compliance
In the world of data protection, we have grown used to, or even grown tired of, the requirement to carry out a Data Protection Impact Assessment (DPIA), or a Privacy Impact Assessment (PIA) as it is called in some jurisdictions.
What are DPIAs and PIAs?
They are processes that help assess the privacy risks to individuals arising from the collection, use and disclosure of personal information. They identify those risks, improve transparency and promote best practice.
In a report by Trilateral Research & Consulting, commissioned by the ICO in 2013, it was recommended that “Ensuring the “buy-in” of the most senior people within the organisation is a necessary pre-condition for a successful integration of privacy risks and PIA into the organisation’s existing processes. PIA processes need to be connected with the development of privacy awareness and culture within the company. Companies need to devise effective communication and training strategies to sustain a change in the mindsets of, and in the development of new skills for, project managers. The organisation needs to deliver a clear message to all project managers that the PIA process must be followed and that PIAs are an organisational requirement. Simplicity is the key to achieve full implementation and adoption of internal PIA guidelines and processes.”
The GDPR and guidance from Data Protection Authorities make it clear that projects which may require a DPIA include:
- A new IT system for storing and accessing personal data
- Using existing data for a new and unexpected purpose
- A new database acquisition
- Corporate restructuring
- Monitoring in the workplace
A DPIA is mandatory in the following cases:
- Systematic and extensive evaluation of personal aspects of natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the individual or similarly significantly affect the individual
- Processing on a large scale of special categories of data or data relating to criminal offences
- Systematic monitoring of publicly accessible areas on a large scale
Some data protection authorities have published guidance on how and when to use a DPIA effectively. The DPIA process is best broken down into several distinct phases:
- Identify the need for a DPIA
- Describe the information flows
- Identify privacy risks
- Identify privacy solutions
- Record the outcomes and obtain sign-off
- Integrate the outcomes of the DPIA into the project plan
But it is not as simple as set out above.
My experience is that if a DPIA is to work as a risk management tool, considered at the outset of a project, then almost every project or new processing activity needs a pre-DPIA screening process. This at least flags whether a full DPIA is needed and highlights any areas of risk.
These risks may not only relate to possible infringements of fundamental rights but also to business and reputational risks and infringements of other laws.
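As a rough illustration, a pre-DPIA screening can be as simple as a short structured questionnaire whose answers determine whether a full DPIA is triggered. Below is a minimal sketch in Python; the question set, field names and triage logic are my own assumptions, loosely modelled on the Article 35(3) triggers and the project types listed above, not an official checklist.

```python
from dataclasses import dataclass


@dataclass
class PreDpiaScreening:
    """Hypothetical pre-DPIA screening questionnaire (illustrative only)."""
    # Article 35(3) triggers: any one of these mandates a full DPIA
    automated_decisions_with_legal_effect: bool
    large_scale_special_category_data: bool
    systematic_public_monitoring: bool
    # Softer indicators drawn from the project types listed above
    new_it_system_for_personal_data: bool
    new_or_unexpected_purpose: bool
    workplace_monitoring: bool

    def full_dpia_required(self) -> bool:
        """A full DPIA is mandatory if any Article 35(3) trigger applies."""
        return (
            self.automated_decisions_with_legal_effect
            or self.large_scale_special_category_data
            or self.systematic_public_monitoring
        )

    def risk_flags(self) -> list[str]:
        """Non-mandatory indicators that still warrant a closer look."""
        flags = []
        if self.new_it_system_for_personal_data:
            flags.append("new IT system storing personal data")
        if self.new_or_unexpected_purpose:
            flags.append("existing data used for a new purpose")
        if self.workplace_monitoring:
            flags.append("monitoring in the workplace")
        return flags


screening = PreDpiaScreening(
    automated_decisions_with_legal_effect=False,
    large_scale_special_category_data=True,
    systematic_public_monitoring=False,
    new_it_system_for_personal_data=True,
    new_or_unexpected_purpose=False,
    workplace_monitoring=False,
)
print(screening.full_dpia_required())  # True: special category data at scale
print(screening.risk_flags())          # ['new IT system storing personal data']
```

In practice the screening usually lives in a form or workflow tool rather than in code, but encoding it this way makes the triage rules explicit and auditable.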
Assuming a full DPIA is needed, it is not long before we are assessing the lawful grounds for processing; if we are relying on Legitimate Interests, we also need to carry out a Legitimate Interests Assessment.
Legitimate Interests Assessments (LIAs) – the “balancing test”
An essential part of the concept of Legitimate Interests is the balance between the interests of the Controller and the rights and freedoms of the individual:
‘processing is necessary for the purposes of the legitimate interests pursued by the controller or by a Third Party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of Personal Data, in particular where the data subject is a child.’ GDPR Article 6(1)(f)
If a Controller wishes to rely on Legitimate Interests for processing Personal Data it must carry out an appropriate assessment, called a Legitimate Interests Assessment, or LIA. When carrying out an LIA, the Controller must balance its right to process the Personal Data against the individuals’ data protection rights.
In certain circumstances an LIA may be straightforward. However, under the accountability provisions of the GDPR, the Controller must maintain a written record showing that it has carried out an LIA and the reasons why it concluded that the balancing test was met.
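Because the accountability provisions require a written record, some teams keep the LIA as structured data rather than free text. The following is a minimal sketch assuming a conventional three-part LIA (purpose test, necessity test, balancing test); the field names and sample values are hypothetical, not prescribed by the GDPR or any regulator.

```python
from dataclasses import dataclass
from datetime import date


@dataclass(frozen=True)
class LegitimateInterestsAssessment:
    """Illustrative written record of an LIA; field names are assumptions."""
    processing_activity: str
    legitimate_interest: str      # purpose test: what interest is pursued?
    necessity_rationale: str      # necessity test: why is the processing needed?
    balancing_rationale: str      # balancing test: why rights are not overridden
    involves_children: bool       # children's data weighs against the controller
    balancing_test_met: bool
    assessed_by: str
    assessed_on: date


lia = LegitimateInterestsAssessment(
    processing_activity="Fraud screening of customer orders",
    legitimate_interest="Preventing fraud against the business and its customers",
    necessity_rationale="No less intrusive means achieves comparable detection",
    balancing_rationale="Data minimised; customers reasonably expect fraud checks",
    involves_children=False,
    balancing_test_met=True,
    assessed_by="Data Protection Officer",
    assessed_on=date(2024, 5, 1),
)
# The record itself is the accountability artefact: retain it, together with
# the reasons for concluding the balancing test was met, to show a regulator.
```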
International Data Transfer Risk Assessments
In so many projects and data-sharing activities we find that personal data is being transferred internationally. In that case the EDPB guidance on transfer risk assessments must be followed; for Controllers in the UK, the ICO’s guidance applies.
The six steps
Note that, in order to meet the GDPR’s accountability requirements, each of these steps would need to be documented, and the documentation provided to the supervisory authorities on request.
Step 1: Know your transfers
Understand what data you are transferring outside the EEA and/or UK, including by way of remote access. This is perhaps fairly self-evident, but it can be challenging when it comes to onward transfers by processors (to sub-processors, or even sub-sub-processors).
Step 2: Identify your transfer tool(s)
Identify what lawful mechanism you are relying on to transfer the data.
Step 3: Assess whether the transfer mechanism is effective in practice
Now we come to the crucial question: in practice, is the transferred personal data afforded a level of protection in the third country that is essentially equivalent to that guaranteed in the EEA/UK?
The EDPB recommends considering multiple aspects of the third country’s legal system, but in particular the rules granting public authorities rights of access to data. Most countries allow for some form of access for law enforcement and national security, and so the assessment should focus on whether those laws are limited to what is necessary and proportionate in a democratic society.
If, after this assessment, you decide your transfer mechanism ensures an equivalent level of protection, you can stop there. If, however, you decide that the local law does impinge on the protection afforded by your transfer mechanism, you must proceed to Step 4.
Step 4: Adopt supplementary measures
The EDPB separates potential supplementary measures into three categories: technical, contractual and organisational.
Step 5: Procedural steps if you have identified supplementary measures
This step covers the formal steps required to adopt your supplementary measures and may, for example, lead you to impose regular audits on the importing party.
Step 6: Re-evaluate at appropriate intervals
Monitor developments in the recipient country which could impact your initial assessment. The obligations on the data importer under solutions like the EU Standard Contractual Clauses should help here, as the importer is required to inform the data exporter of any change of law which impacts its ability to comply with the SCCs.
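Since each of the six steps must be documented, one pragmatic approach is to keep a per-transfer record that mirrors them. The sketch below is a hypothetical Python structure; the fields and the simple decision rule are illustrative assumptions, not EDPB or ICO wording.

```python
from dataclasses import dataclass, field


@dataclass
class TransferRiskAssessment:
    """One record per transfer, mirroring the six steps (illustrative)."""
    # Step 1: know your transfers
    data_categories: list[str]
    destination_country: str
    onward_transfers: list[str]        # sub-processors, sub-sub-processors
    # Step 2: identify your transfer tool
    transfer_tool: str                 # e.g. "EU SCCs", "adequacy decision"
    # Step 3: is the tool effective in practice?
    local_law_undermines_tool: bool
    # Step 4: supplementary measures (empty if none are needed)
    technical_measures: list[str] = field(default_factory=list)
    contractual_measures: list[str] = field(default_factory=list)
    organisational_measures: list[str] = field(default_factory=list)
    # Step 6: re-evaluate at appropriate intervals
    review_interval: str = "annual"

    def transfer_may_proceed(self) -> bool:
        """If the tool holds up in practice, no supplementary measures are
        needed; otherwise at least one measure must be in place."""
        if not self.local_law_undermines_tool:
            return True
        return bool(
            self.technical_measures
            or self.contractual_measures
            or self.organisational_measures
        )


tra = TransferRiskAssessment(
    data_categories=["customer contact details"],
    destination_country="US",
    onward_transfers=["cloud hosting sub-processor"],
    transfer_tool="EU SCCs",
    local_law_undermines_tool=True,
    technical_measures=["encryption in transit and at rest, keys held in EEA"],
)
print(tra.transfer_may_proceed())  # True, subject to Step 5 procedural steps
```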
AI, analytics and new technologies
The EU AI Act is intended to apply to any business that places AI on, or uses AI in, the EU market, and so is extra-territorial in its reach. More than that, the AI Act will integrate with and co-exist alongside existing legislation such as the General Data Protection Regulation, the Digital Services Act and the draft Cyber Resilience Act.
The use of new technologies such as smart devices, internet of things and artificial intelligence, coupled with the economic and humanitarian uses of big data analytics, means that there has to be a balance between the acquisition of personal data and the rights of citizens.
Beyond the GDPR, PECR, the Digital Services Act and so on, assessing your supply chain is more important now than ever, particularly as we rely so much on international suppliers and distributors as well as physical and digital supply chains. We have learned to address issues in the supply chain, such as bribery, competition, modern slavery, and intellectual property; however, more recently we have had to consider geopolitical issues, import and export controls, and other compliance and ethics issues. Now, in 2024, we must also consider the environmental sustainability, cyber resilience, digital safety, and accessibility of the physical products and digital services that we provide.
Harmful Design in Digital Markets
A position paper on Harmful Design in Digital Markets by the ICO and the CMA is targeted at firms that deploy design practices in digital markets (such as on websites or other online services), as well as at product and UX designers who create online interfaces for firms. It provides:
- an overview of how design choices online can lead to data protection, consumer and competition harms, and the relevant laws regulated by the ICO and CMA that could be infringed by these practices; and
- practical examples of design practices that are potentially harmful under the two regulators’ respective regimes when they are used to present choices about personal data processing. These practices are “harmful nudges and sludge”, “confirmshaming”, “biased framing”, “bundled consent” and “default settings”.
It now requires us to assess how we manage Data Protection by Design and how we respect consumer choices. Yet another assessment to minimise potential risks!
Nearly six years on from the General Data Protection Regulation, we now face a growing list of assessments that we need to carry out: Legitimate Interests Assessments, Transfer Risk Assessments, Privacy by Design Assessments, Accessibility Assessments, Children’s Code compliance, and now Online Safety, AI and Cyber Resilience… and the list goes on. Have we reached the point where we need an Assessments Handbook that incorporates the various assessments I have outlined and ensures they integrate with each organisation’s overall risk management policy?
Used appropriately, I find that these assessments really do manage risk, and not only protect the rights of individuals but also protect the business from reputational and brand damage. Sometimes the use of a risk assessment at the start of, or even at an early stage of, a project can act as a “Stop” sign and cause the project team and compliance team to say “just because we can doesn’t always mean we should”.