UK Data Reform – key changes ahead

March 2025

What data protection teams need to know

Plans to reform the UK’s data laws are making speedy progress through Parliament, with the Data (Use & Access) Bill expected to be passed in April or May.

When enacted, the new law will usher in significant amendments to the Data Protection Act 2018, UK GDPR and the Privacy & Electronic Communications Regulations (PECR), as well as measures which go beyond the realms of data protection and ePrivacy.

Controversial plans to amend UK GDPR’s accountability obligations, led by the previous Conservative Government, are not included. So, requirements in relation to Data Protection Officers, Data Protection Impact Assessments and Records of Processing Activities remain the same.

Some new provisions are likely to make data protection compliance efforts slightly easier, although others will impose increased obligations. Here’s our summary of some key changes ahead, with the caveat there’s still time for further amendments.

Individual privacy rights

New right to complain

People will have the right to raise complaints related to the use of their personal data. This will require controllers to make sure they have clear procedures to facilitate complaints, for instance by providing a complaint form. Complaints will require a response within 30 days. Alongside this, organisations may also be obliged to notify the ICO of the number of privacy-related complaints they receive during a specified time period.

In practice this means individuals will first have to seek a resolution directly with an organisation, before escalation to the regulator. This is aimed in part at reducing the volume of complaints the ICO receives.
Some sectors, such as financial services and those who receive FOI requests, will already have complaints procedures in place to meet other legal obligations. For others, these procedures will need to be established.

It’s likely privacy notices will need to be updated to reflect this change. If notification to the ICO of complaint volumes is required, this raises questions about how complaints are categorised and what additional records organisations will be required to keep.

Timescales and seeking clarification

Amendments will clarify the time period for compliance with privacy rights requests. The clock does not start until the organisation is satisfied the requester is who they say they are (i.e. proof of identity has been received). If an organisation reasonably requests further information to clarify a request, the timescale for responding can be paused (i.e. the ‘clock stops’) until this information is provided. These changes are unlikely to have much operational impact, as they simply provide statutory footing to existing ICO guidance on this subject.
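To illustrate the ‘clock stop’ in practice, here’s a minimal sketch in Python. It assumes a one-calendar-month response window counted from the point identity is confirmed, with any days spent awaiting clarification added back onto the deadline; the function names, the simplified month arithmetic and the single pause are illustrative assumptions, not wording from the Bill.

    from datetime import date, timedelta
    from typing import Optional

    def add_calendar_month(start: date) -> date:
        """Roughly one calendar month later (simplified; ignores month-end edge cases)."""
        year = start.year + (start.month // 12)
        month = start.month % 12 + 1
        return date(year, month, min(start.day, 28))

    def dsar_due_date(identity_confirmed: date,
                      clarification_sent: Optional[date] = None,
                      clarification_received: Optional[date] = None) -> date:
        """Illustrative due date for responding to a subject access request.

        The clock starts only once the requester's identity is confirmed.
        If clarification was reasonably requested, the days spent waiting
        for it are added back onto the deadline (the 'clock stop').
        """
        due = add_calendar_month(identity_confirmed)
        if clarification_sent and clarification_received:
            paused = (clarification_received - clarification_sent).days
            due += timedelta(days=max(paused, 0))
        return due

    # Identity confirmed 1 March; clarification awaited for 10 days -> due 11 April
    print(dsar_due_date(date(2025, 3, 1), date(2025, 3, 5), date(2025, 3, 15)))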

Reasonable and proportionate searches

It’s confirmed organisations should conduct a “reasonable and proportionate” search for personal data when responding to Data Subject Access Requests (DSARs). Again, this gives current ICO guidance a statutory footing, and may prove helpful for organisations handling particularly demanding requests.

Court procedures

Where there’s a legal dispute over the information provided (or not provided) in response to a DSAR, a court will be able to request organisations make such information available for the court to inspect and assess. This means organisations will need to make sure they clearly document non-disclosure decisions, including their justifications. This is something we’d strongly advise doing already.

Right to be informed

The obligation to provide privacy information to individuals (i.e. under Articles 13 and 14 of UK GDPR) will not apply if providing this information “is impossible or would involve disproportionate effort”. This is most likely to be particularly relevant where organisations have gathered personal data indirectly, i.e. not directly from the individuals. This was a point of contention in the Experian vs ICO case, where Experian argued it would be disproportionate effort to notify and provide privacy information to the millions of people whose data they process from the Edited Electoral Roll.

Legitimate Interests

Direct marketing

Legitimate interests will be confirmed in law as an acceptable lawful basis where necessary for direct marketing purposes. While there are concerns in some quarters this will lead to more ‘spam’ marketing, I’d stress the direct marketing rules under PECR will still apply, so legitimate interests will remain an option only when the law doesn’t require consent.

Recognised legitimate interests

The concept of ‘recognised legitimate interests’ is to be introduced, whereby organisations will not need to conduct a balancing test (i.e. a Legitimate Interests Assessment) when relying on this lawful basis for certain purposes. The list of recognised legitimate interests includes the following (and may be expanded):
  • Disclosures to public bodies, where it is asserted the personal data is necessary to fulfil a public function.
  • Disclosures for national or public security, defence purposes, or in emergencies.
  • Disclosures for the prevention or detection of crime, and for safeguarding vulnerable individuals.

International Data Transfers

There are amendments to risk assessment requirements for international data transfers. Currently, where there’s no ‘adequacy’ decision for the destination country, organisations need to undertake a Transfer Risk Assessment. Moving forward, organisations transferring data overseas will need to “reasonably and proportionately” consider if the data protection standards in the destination country will be materially lower than those in the UK. This gives potential room to streamline assessment procedures, especially to reduce the burden for low-risk transfers.

Reforms to UK data laws will be scrutinised by the EU Commission when it reviews its adequacy decisions for the UK. These currently allow for the free flow of personal data between the EEA and UK, without the need for additional risk assessments or safeguard measures. The EC review of these decisions was due in June this year, but has been delayed until December. The general consensus is there’s hopefully nothing here radical enough to scare the horses, and UK adequacy will be renewed. Nonetheless, this is one to watch.

Special Category Data

A mechanism is included allowing for future introduction of newly defined special categories of personal data. An example given is ‘neurodata’, which is information gathered from the human brain and/or from the nervous system. As the requirements for processing special category data are restricted under UK GDPR, introducing new types has the potential to lead to significant implications in some sectors.

Automated decision-making

A noteworthy amendment is to be made to Article 22 of UK GDPR which currently places strict restrictions on automated decision-making (including profiling) which results in legal or similar significant effects. This will be relaxed, only applying to automated decisions using special category data. With any other personal data, there will be a requirement to put in place certain safeguards, such as giving individuals the ability to contest decisions and requiring human intervention.

This change will give organisations more flexibility to make automated decisions using ‘normal’ personal data, for example when utilising AI systems. However, there are concerns it could have a negative impact on people’s rights. This also represents a marked distinction between the UK and EU approaches, which may be a key consideration in the EU’s review of UK adequacy.

Steve Wood, Founder of PrivacyX Consulting and former UK Deputy Information Commissioner says: “This creates a real importance on the Code that will be produced by the ICO, covering how the safeguards should be applied in practice. A current priority for the ICO is use of AI in recruitment and this is an emerging area of risk, including the use of AI in fire and hire decisions in the gig economy. Time will tell whether it was premature to remove the precautionary approach of Article 22 when the implications of using AI for automated decision making are still being assessed.”

‘High risk’ AI decisions

People will have the right to request information where a decision is either solely, or in part, based on automated processing including AI and machine learning, and has a legal or similar significant effect on them. Controllers will be required to provide an explanation of the criteria used to reach the decision along with a description of the key factors (or features) which most significantly influenced the decision. Individuals will be able to request human review or details of how to appeal the decision.
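As a rough illustration of the kind of record that could underpin such an explanation, here’s a hedged sketch in Python; the field names (criteria, key factors, review contact, appeal route) are assumptions about what might need to be captured, not requirements taken from the Bill.

    from dataclasses import dataclass

    @dataclass
    class AutomatedDecisionExplanation:
        """Illustrative record of how a significant automated decision was reached."""
        decision_id: str
        outcome: str                   # e.g. "credit application declined"
        criteria: list[str]            # the rules or model criteria applied
        key_factors: dict[str, float]  # features that most influenced the decision
        human_review_contact: str      # how to request human intervention
        appeal_route: str              # how to contest the decision

    example = AutomatedDecisionExplanation(
        decision_id="APP-2025-0042",
        outcome="credit application declined",
        criteria=["affordability threshold", "length of credit history"],
        key_factors={"debt_to_income_ratio": 0.62, "missed_payments_12m": 0.31},
        human_review_contact="reviews@example.com",
        appeal_route="written appeal within 21 days",
    )
    print(example.key_factors)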

Data protection by design to protect children

Amendments to existing law make specific reference to additional protections for children (anyone under the age of 18). When assessing appropriate ‘technical and organisational’ measures in relation to online services likely to be accessed by children, organisations will be legally obliged to take account of how children can best be protected, recognising that children merit additional protection and have different needs at different ages and stages of development. Such measures strengthen the need to adhere to the UK Children’s Code.

Charities and the marketing ‘soft opt-in’

The use of the ‘soft opt-in’ exemption to consent for electronic marketing is to be extended to charities. This means charities will be able to provide people with an ‘opt-out’ mechanism rather than an ‘opt-in’ to marketing emails (and/or SMS), as long as the following conditions are met:

  • The sole purpose of the direct marketing is for the charity’s own charitable purpose(s)
  • Contact details were collected when the individual:
    a) expressed an interest in the charity’s purpose(s); or
    b) offered or provided support to further the charity’s purpose(s).
  • An opportunity to refuse/opt-out is given at the point of collection, and in every subsequent communication.
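A minimal sketch in Python of how these conditions might be checked before a send; the field names are invented for illustration, the ‘sole charitable purpose’ condition relates to the message itself and is represented here as a simple flag, and none of this replaces a proper legal assessment.

    def soft_opt_in_applies(contact: dict, message_is_solely_charitable: bool) -> bool:
        """Illustrative check of the charity 'soft opt-in' conditions.

        All conditions must hold; otherwise consent is needed before sending
        marketing emails or SMS. Every message sent must also carry an opt-out.
        """
        collected_in_right_context = (
            contact.get("expressed_interest", False)
            or contact.get("offered_or_provided_support", False)
        )
        opt_out_offered_at_collection = contact.get("opt_out_offered_at_collection", False)
        has_not_opted_out = not contact.get("opted_out", False)
        return (message_is_solely_charitable
                and collected_in_right_context
                and opt_out_offered_at_collection
                and has_not_opted_out)

    print(soft_opt_in_applies(
        {"expressed_interest": True, "opt_out_offered_at_collection": True, "opted_out": False},
        message_is_solely_charitable=True,
    ))  # True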

We’ve written about the pros and cons of switching to the ‘soft opt-in’ here.

PECR Fines

Fines for infringements of the Privacy & Electronic Communications Regulations, which govern direct marketing and cookies, are set to significantly increase. Currently the maximum fine under PECR is capped at £500k, but the limits will be brought in line with the much more substantial fines which can be levied under UK GDPR. Reckless disregard for marketing and cookie rules is about to get more costly.

Spam emails and texts

What constitutes ‘spam’ is to be extended to include emails and text messages which are sent but not received by anyone. This will mean the regulator will be able to consider much larger volumes in any enforcement action, which may result in much higher fines – spammers beware!

Cookies & similar technologies

Exemptions are set to be introduced from the requirement to collect consent for certain types of cookies and similar technologies, as long as a clear opportunity to opt-out is provided. This will be permitted for purposes such as website analytics and optimising content. I envisage much reconfiguring of the array of website consent management platforms which have been implemented in recent years. But remember, targeting/advertising cookies (including social media targeting pixels) will still need consent.
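To illustrate how a consent manager’s logic might shift under such an exemption, here’s a minimal sketch in Python; the category names and the opt-out model for analytics are assumptions based on the summary above, and the detail will depend on the final legislation and regulations.

    def permitted_cookie_purposes(choices: dict[str, bool]) -> list[str]:
        """Illustrative gate deciding which cookie purposes may run for a visitor.

        - 'essential' never needs consent
        - 'analytics' runs unless the visitor has opted out (the proposed exemption)
        - 'advertising' (including targeting pixels) still needs prior consent
        """
        allowed = ["essential"]
        if not choices.get("analytics_opt_out", False):
            allowed.append("analytics")
        if choices.get("advertising_consent", False):
            allowed.append("advertising")
        return allowed

    print(permitted_cookie_purposes({"analytics_opt_out": False, "advertising_consent": False}))
    # ['essential', 'analytics']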

Alongside these changes the ICO is reviewing PECR consent requirements to “enable a shift towards privacy-preserving advertising models”.  This autumn a statement is expected identifying ‘low-risk’ advertising activities which in the ICO’s view are unlikely to cause harm or trigger enforcement action. You can read more about this in the ICO’s package of measures to drive economic growth.

Research

Purpose limitation and provision of privacy information

Currently, UK GDPR makes it tricky to reuse personal data for new purposes, yet research projects can often move into areas which weren’t anticipated when data was originally collected. A new exemption is to be introduced in relation to the provision of privacy information. Amendments are also set to be made to the purpose limitation principle, so further processing for ‘RAS purposes’ can be treated as compatible with the original purpose. Both these changes are subject to ‘appropriate safeguards’. (‘RAS purposes’ covers processing for scientific and historic research, archiving in the public interest, and statistical purposes).

Scientific research

The definition of ‘scientific research’ is to be clarified and will explicitly state research can be a commercial or non-commercial activity. Consent for scientific research is to be adapted, in part driven by a desire to make it easier for personal data collected for specific research to be reused for other scientific research purposes.

Commenting on these changes Ellie Blore, Data Protection Officer at Best Companies says; “The aims are to provide greater flexibility for commercial research and innovation. It expands the definition of ‘Scientific Research’ to include certain privately funded and commercial research activities, meaning that some private AI training and research will now be classified under Scientific Research. Furthermore, secondary processing of data for Scientific Research and Development purposes will be considered compatible with the original purpose of data collection, provided the appropriate safeguards are in place. There are exemptions added here, and this will undoubtedly be an area to watch as the Secretary of State will have the power to further vary those safeguards.”

Smart data schemes

Provisions are being introduced to support the growth of new ‘smart data schemes’. The right to portability under UK GDPR currently allows individuals to obtain and reuse their personal data. Moving forward, this will be expanded to allow consumers to request their data is directly shared with authorised and regulated third parties. This will be underpinned by a framework with data security at its core. It’s hoped this will allow for the growth of smart data schemes, enabling data sharing in areas such as energy, telecoms, mortgages and insurance.

Healthcare information

Ever been to hospital and found your GP has no record of your treatment, or the hospital can’t access your GP’s notes? The government is hoping data reform will pave the way for a more consistent approach to information standards and technology infrastructure, so systems can ‘talk’ to each other. For example, allowing hospitals, GP surgeries, social care services, and ambulance services to have real-time access to information such as patient appointments, tests, and pre-existing conditions.

Department Board Appointments

A new measure is to be introduced requiring digital leaders to be represented at executive level within Government departments and other bodies, such as NHS Trusts. At least one of the following roles will need to be appointed to a departmental board or equivalent body: a Chief Information Officer, Chief Technology Officer, Chief Digital Information Officer, Service Transformation Leader or other equivalent role.

Digital verification services

The aim is to create a framework for trusted digital verification services, moving the country away from paper-based and in-person tasks. For example, proposals allow for digital verification services aimed at simplifying processes such as registering births and deaths, starting a new job and renting a home.

New Information Commission

The Information Commissioner’s Office is set to be replaced by an Information Commission. This is to be structured in a similar way to the FCA, OFCOM and the CMA, as a body corporate with an appointed Chief Executive. There’s also provision for the Government to have considerable influence over the operations of the new Commission.

In summary, reform of UK data law has its critics. Among other matters they fear a watering down of people’s rights and an increased ability for personal data to be shared, perhaps recklessly, with and within the public sector. However, the changes are not overly radical, having varying degrees of impact depending on your sector and organisation’s core activities.

Chris Combemale, Director of Policy and Public Affairs at the Data & Marketing Association, welcomes the changes ahead; “The DMA strongly supports the DUA Bill and has worked tirelessly for almost five years to achieve reforms that balance innovation and privacy in accordance with the principles laid out in recital 4 of GDPR. We particularly welcome the greater certainty on the use of legitimate interests as a lawful basis for direct marketing, the extension of the email soft opt-in to charities, exemptions to consent for some types of cookies, greater clarity in Article 22 for automated decision making and the obligation for the ICO to consider innovation and competition alongside privacy.”

Privacy X Consulting’s Steve Wood doesn’t believe the impact will be hugely significant; “The DUA Bill represents an evolution of UK GDPR that should not drive many changes for multi-national companies’ DP governance, which is likely to remain focused around the EU GDPR standard. The more interesting opportunities may lie in the confidence that is provided to the take up of federated digital identity by the statutory underpinning for the Trust Framework and opportunities for data intermediary businesses in relation to the Smart Data provisions.”

UPCOMING ONLINE EVENT – UNWRAPPING UK DATA REFORM

Join a great line-up of speakers on 29 April who’ll be discussing the changes under the DUA Bill and taking your questions.

ICO cookie action and ‘consent or pay’ guidance

January 2025

There’s been a flurry of activity recently in relation to cookies and similar technologies; a key strategic area for the ICO in 2025.

A review of cookie usage on the UK’s top 1,000 websites has been announced, new guidance on ‘consent or pay’ models has been published, and the ICO has published draft ‘guidance on the use of storage and access technologies’ (previously known as ‘cookies guidance’).

Meanwhile the Data (Use & Access) Bill is progressing through Parliament and may pave the way for a more relaxed approach to first party analytics cookies.

1,000 websites in the regulatory spotlight

With a focus on the ‘uncontrolled tracking’ of users, the ICO is set to review the compliance of the top 1,000 websites. This marks a significant expansion of the regulator’s proactive reviews, having already assessed the top 200 websites. That resulted in communications to 134 of those organisations (i.e. 2 out of every 3), setting out specific regulatory expectations. One reprimand was issued for ‘unlawfully processing people’s data through advertising cookies without their consent’. You can read more about the reprimand here and our five steps to cookie compliance.

Stephen Almond, ICO Executive Director of Regulatory Risk says; “Our ambition is to ensure everybody has meaningful choice over how they are tracked online.”

What does this mean in practice?

Along with not deploying non-essential cookies on users’ devices if they haven’t actively given their consent, the ICO is stressing organisations must make it as easy for users to ‘reject all’ as it is to ‘accept all’.

ICO action in this area has seen a number of organisations (but certainly not all) implement consent management platforms to provide users with choices. Whether CMPs are actually configured correctly is another matter.

Organisations should be mindful of this regulatory attention, and could also be the target of cookie compensation demands from individuals.

Are ‘consent or pay’ models okay?

The ICO’s focus on cookies has seen some websites move to a ‘consent or pay’ model. This model means access to online content or services is dependent on users either consenting to being tracked for advertising purposes (using cookies or similar technologies), or paying for access without being tracked.

This move has caused some controversy with questions over whether it can really be a fair choice. Let’s look at how it has come about, and I’ll use news content publishers as an example.

I remember the days when I had to buy a physical copy of a newspaper. Then everything went online, and we could all access this content for ‘free’. However, the publishers still needed a way to fund (and, dare I say, profit from) all this content they paid journalists to write. This has primarily been funded by running ads which are targeted to optimise the user experience and revenues, using cookies and other tracking technologies.

When a US senator asked Mark Zuckerberg how Facebook remained free, you may recall he famously and simply answered; “We run ads”. And the same could be said for publishers and other website operators.

With websites pushed to make sure their activities comply with data protection and ePrivacy rules, we’ve seen more sites providing a “Reject all” button. However, if users increasingly click “Reject all” alternative approaches are needed to fill a not insignificant revenue hole.

This led some publishers to introduce a full paywall. For example, for quite a while users had to pay to read most articles on the Telegraph or Independent’s websites. Other forms of advertising have been, and are being, experimented with, such as ‘contextual advertising’, which doesn’t rely on cookies or similar tech. However, there remain concerns that these alternatives are not yet (and may never be) as profitable as cookies. Please also see Life after cookies

Hence the emergence of the ‘consent or pay’ model which has been adopted by some website operators in the UK and elsewhere in Europe – notably news publishers.

The ICO’s take on ‘consent or pay’

The ICO’s new guidance states: “‘Consent or pay’ models can be compliant with data protection law if you can demonstrate that people can freely give their consent and the models meet the other requirements set out in the law.”

The guidance makes it clear the right to the protection of personal data needs to be balanced against other rights, such as the right to conduct business. We may have got used to lots of free news content, online games, and other free services, but the ICO recognises organisations should be able to monetise products, and there is no obligation for providers of online services to offer their services for free.

However, the ICO says any decision to adopt the ‘consent or pay’ model must be assessed and documented to make sure it is compliant with the UK GDPR and the Privacy and Electronic Communications Regulations (PECR). Businesses need to be able to justify their approach.

Four key factors are set out in the guidance to support this assessment. These are:

Power imbalance: Is there a clear power imbalance between you and the people using your product or service? It’s unlikely that people can freely give their consent if they have no realistic choice about whether or not to use the service. You should especially consider existing users of your product or service under this factor.

Appropriate fee: Have you set an appropriate fee for accessing your service without personalised advertising? It’s unlikely that people can freely give their consent if your fee is inappropriately high, making it an unrealistic choice.

Equivalence: Is your core service broadly equivalent in the products and services offered where people consent to personalised advertising and where people pay to avoid personalised advertising? You can include additional perks or features in either service, however you should provide an equivalent core service across all options to ensure that people have a free choice.

Privacy by design: Do you present the choices equally to people, with clear, understandable information about what each choice means and what they involve? People cannot freely give their consent if they are uninformed about the available options or have their choice influenced by harmful design practices.

Any business looking to use the ‘consent or pay’ model would be wise to read the Consent or Pay Guidance in detail.

It’s worth noting this model has also been scrutinised by EU Data Protection Authorities and is the subject of complaints by privacy rights groups. See the European Data Protection Board Opinion on Consent or Pay.

Draft guidance on the use of storage and access technologies

The ICO has also published draft Guidance on the use of storage and access technologies, which is open to consultation until 14 March 2025. This builds on the regulator’s previous ‘cookies guidance’, and has clearly been deliberately renamed to reflect the range of storage and access technologies which are in widespread use, alongside cookies. The aim is to give providers of online services a deeper understanding of how PECR, and where relevant, data protection law applies to the use of storage and access technologies.

In brief, PECR applies to any technology which stores information, or accesses information stored on a subscriber/user’s terminal equipment. The ICO says this includes but is not limited to; cookies, tracking pixels, link decoration and navigational tracking, web storage, fingerprinting techniques, and scripts and tags. In a nutshell, the rules are:

  • You must tell users about any storage and access technologies you use, including explaining what they do.
  • You must collect prior consent unless an exemption applies, and such consent must meet the UK GDPR standard.
  • For the ‘communication’ exemption to apply, the transmission of the communication must be impossible without the particular storage and access technology.
  • For the ‘strictly necessary’ exemption to apply, the purpose of storage or access must be essential to provide the service requested.
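As a rough translation of these rules into a decision helper, here’s a hedged sketch in Python; the flags are invented for illustration, and a real assessment would still need to document why an exemption applies and ensure users are told about the technologies in use.

    def consent_required(strictly_necessary_for_requested_service: bool,
                         needed_to_transmit_communication: bool) -> bool:
        """Illustrative PECR check for a storage or access technology.

        Prior, UK GDPR-standard consent is needed unless the 'strictly
        necessary' or 'communication' exemption applies. Users must still
        be told what the technology does, even when it is exempt.
        """
        exempt = strictly_necessary_for_requested_service or needed_to_transmit_communication
        return not exempt

    # A session cookie keeping a shopping basket alive: strictly necessary
    print(consent_required(True, False))   # False - no consent needed

    # An advertising pixel: no exemption applies
    print(consent_required(False, False))  # True - consent needed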

It’s also worth noting the ICO says any UK-based organisation, even if they host online services overseas, will need to comply with PECR.

In conclusion, every business with an online presence needs to carefully consider how they are using cookies and similar technologies, and requesting consent from users. Web and app developers in particular need to be aware of the regulatory landscape and have a good understanding of the rules.

Data Protection Impact Assessments for Agile projects

November 2023

How to assess risks when a project has multiple phases

Agile methodology is a project management framework comprising several dynamic phases, known as ‘sprints’. Many organisations use Agile for software & technology development projects, which often involve the processing of personal data.

From a data protection perspective, Agile (and indeed other multi-stage projects) present some challenges. The full scope of data processing is often unclear at the start of a project. The team are focussed on sprint one, then sprint two, and so on. So how do you get Privacy by Design embedded into an Agile project?

Conducting a Data Protection Impact Assessment (DPIA) is a legal requirement under data protection law for certain projects. Even when a DPIA is not mandatory it’s a good idea to consider the privacy impacts of any new processing.

Looking at a project through a privacy lens at an early stage can act as a ‘warning light’, highlighting potential risks before they materialise and when measures can still be easily put in place to reduce the risks.

If your organisation uses Agile, it’s likely you’ll need to adapt your DPIA process to work for Agile projects. Understand the overall objectives and direction of travel to get a handle on how data use will evolve and what risks might be involved.

Working together to overcome challenges

It’s important all areas of the business collaborate to make sure projects can proceed at pace, without unnecessary delays. Compliance requirements must be built into Agile plans alongside other business requirements – just as ‘Privacy by Design’ intended.

Those with data protection responsibilities need project management teams to engage with them at an early stage, to explore the likely scope of processing and start to identify any potential privacy risks, while there’s still time to influence solution design.

This isn’t always easy. Given the fluid nature of Agile, which is its great strength, there is often very limited documentation available for review to aid Compliance assessments.

Privacy questions often can’t be answered at the start – there may be many unknowns. So it’s key to agree what types of data will be used, for what purposes, and when more information will be available for the DPIA – crucially, before designs are finalised. Timings for assessment need to be aligned to the appropriate sprints.

As many companies have found, embedding privacy awareness into the company culture is a big challenge, and ensuring Data Protection by Design is a key consideration for tech teams at the outset is an ongoing task.

Example: data warehouse

Organisations with legacy data systems might want to build a data warehouse / data lake to bring disparate data silos together under one roof, gain new insights and drive new activity. It’s important to assess any privacy impacts this new processing creates.

Using Agile, new capabilities may be created over several development phases. So it’s important to conduct an initial assessment at the start, but also to stay close as the project evolves and be ready to collaborate again, in line with sprint timings – before data is transferred or new solutions are created.

Top tips for ‘Agile’ DPIAs

Here are my top tips for a fluid DPIA process;

1. DPIA training & guidance – make sure relevant teams, especially IT, Development and Procurement, all know what a DPIA is (in simple layman’s terms) and why it’s important. They need to recognise the benefits of including privacy in scope from the start (i.e. ‘by Design’).

2. Initial screening – develop a quick-fire set of questions for the business owner or project lead, which will give the key information you need, such as

  • the likely personal data being used
  • any special category data, children’s data or vulnerable people’s data
  • the purposes of processing
  • security measures… and so on

Once it has been identified there is personal data involved you can start assessing the potential risks, if any. As odd as this may sound, it is not uncommon for tech teams to be unsure at the beginning of a project if personal data (as defined under GDPR to include personal identifiers) will in fact be involved.
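Here’s a minimal sketch in Python of what such quick-fire screening might look like as a structured triage; the questions and the escalation rule are illustrative assumptions drawn from the list above, not a prescribed format.

    SCREENING_QUESTIONS = {
        "personal_data_involved": "Will any personal data (including identifiers) be used?",
        "special_category_data": "Any special category, children's or vulnerable people's data?",
        "new_purposes": "Will data be used for purposes it wasn't originally collected for?",
        "large_scale_or_monitoring": "Large-scale processing or systematic monitoring?",
        "security_measures_undefined": "Are the security measures still undefined?",
    }

    def escalate_to_dpia(answers: dict[str, bool]) -> bool:
        """Illustrative triage: if personal data is involved and any other
        risk flag is raised, move on to the fuller assessment (step 3)."""
        if not answers.get("personal_data_involved", False):
            return False
        return any(v for k, v in answers.items() if k != "personal_data_involved")

    print(escalate_to_dpia({
        "personal_data_involved": True,
        "special_category_data": False,
        "new_purposes": True,
    }))  # True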

3. DPIA ‘Lite’ – if there are potential risks, develop a series of questions to evaluate compliance against the core data protection principles of the GDPR.

The Agile environment can prove challenging but also rewarding. Adopting a flexible DPIA process which works in harmony with Agile is a positive step forward for innovative companies, allowing your business to develop new solutions while protecting individuals from data protection risks, as well as protecting your business from any possible reputational damage.

Call for ban on use of live facial recognition

October 2023

Live facial recognition is being used by UK police forces to track and catch criminals and may be used by retailers to crack down on shoplifting. Is live facial recognition a force for good or a dangerous intrusion on people’s privacy?

The announcement by the UK Government of plans for police to access passport photos to help catch criminals has led to a call for an immediate ban on live facial recognition surveillance.

The accuracy of the algorithms behind this technology are being questioned, as are the privacy implications. Where facial recognition is used, there needs to be a strong justification for its use and robust safeguards in place to protect people.

What is live facial recognition?

Live facial recognition (LFR) is a broad term used to describe technologies that identify, catalogue and track human faces. The technology can be used in many ways but probably the biggest topic of debate relates to the use of facial images captured via CCTV or photos which are processed via biometric identifiers.

These identifiers typically include the unique ratios between an individual’s facial features, such as their eyes, nose and mouth. These are matched to an existing biometric ‘watchlist’ to identify and track specific individuals.

Use of LFR by UK police forces

The Home Office says facial recognition has a ‘sound legal basis’, has already led to criminals being caught and could also help the police in searching for missing or vulnerable people.

Facial recognition cameras are being used to scan the faces of members of the public in specific locations. Currently UK police forces using the technology tell people in advance about when and where LFR will be deployed, with physical notices alerting people entering areas where it’s active.

However, the potential for police to be able to access a wider range of databases, such as passports, has led a cross-party group of politicians and privacy campaigners to say both police and private companies should ‘immediately stop’ their use of such surveillance, citing concerns about human rights and discrimination.

Silkie Carlo, Director of Big Brother Watch says; “This dangerously authoritarian technology has the potential to turn populations into walking ID cards in a constant police line-up.”

It’s worth noting in 2020 the Court of Appeal in the UK ruled South Wales Police’s use of facial recognition was unlawful.

Use of LFR by retailers

Some of the UK’s biggest supermarkets and retailers are also turning to face-scanning technology in a bid to combat a significant rise in shoplifting.

Earlier this year the ICO announced its findings from an investigation into the live facial recognition technology provided to the retail sector by the security firm Facewatch. The aim of the technology is to help businesses protect their customers, staff and stock.  People’s faces are scanned in real time as they enter a store and there’s an alert raised if a subject of interest has entered.

During its investigation the ICO raised concerns, including those surrounding the amount of personal data collected and the need to protect vulnerable people by making sure they don’t become a ‘subject of interest’. Based on information provided by Facewatch about improvements made, and ongoing improvements, the ICO concluded the company had a legitimate purpose for using people’s information for the detection and prevention of crime.

Collaboration between police and retailers

Ten of Britain’s largest retailers, including John Lewis, Next and Tesco, are set to fund a new police operation, Project Pegasus, under which police will run CCTV pictures of shoplifting incidents provided by the retailers against the Police National Database.

The risk of false positives

The use of Live Facial Recognition raises significant privacy and human rights concerns, such as when it is used to match faces to a database for policing and security purposes.

A 2019 study of facial recognition technology in the US by the National Institute of Standards and Technology (NIST) discovered that systems were far worse at identifying people of colour than white people. Whilst results were dependent on the algorithms used, NIST found that some facial-recognition software produced far higher rates of false positives for Black and Asian people than for white people, by a factor of 10 to 100 times.

NIST also found the algorithms were worse at identifying women than men. Clearly there are huge concerns to be addressed, brought into sharp focus now with the Black Lives Matter movement. Interestingly, there was no such dramatic difference in false positives in one-to-one matching between Asian and white faces for algorithms developed in Asia.

Privacy concerns

Any facial recognition technology capable of uniquely identifying an individual is likely to be processing biometric data (i.e. data which relates to the physical, physiological or behavioural characteristics of a person).

Biometric data falls under the definition of ‘special category’ data and is subject to strict rules. To compliantly process special category data in the UK or European Union, a lawful basis must be identified AND a condition must also be found in GDPR Article 9 to justify the processing. In the absence of explicit consent from the individual however, which is not practical in most LFR applications, it may be tricky to prove the processing meets Article 9 requirements.

Other privacy concerns include:

  • Lack of transparency – an intrusion into the private lives of members of the public who have not consented to and may not be aware of the collection or the purposes for which their images are being collected and used.
  • Misuse – images retrieved may potentially be used for other purposes in future.
  • Accuracy – inaccuracies inherent within LFR reference datasets or watchlists may result in false positives and the potential for inaccurate outcomes which may be seen as biased or discriminatory.
  • Automated decision-making – if decisions which may significantly affect individuals are based solely on the outcomes of live facial recognition.

Requirement to conduct a Data Protection Impact Assessment (DPIA)

A DPIA must be conducted before organisations or public bodies begin any type of processing that is likely to result in a ‘high risk’ to the rights and freedoms of individuals.

This requirement includes:

  • the use of systematic and extensive profiling with significant effects on individuals;
  • the processing of special category or criminal offence data on a large scale; and
  • the systematic monitoring of publicly accessible places on a large scale.

In our view, any planned use of LFR is very likely to fall under the requirement for the organisation or public body to conduct a DPIA in advance of commencing the activity and take appropriate steps to ensure people’s rights and freedoms are adequately protected.

So where does this leave us?

Police forces and other organisations using LFR technology need to properly assess their compliance with data protection law and guidance.

This includes how police watchlists are compiled, which images are used and for what purpose, which reference datasets they use, and how accurate and representative of the population these datasets are. The potential for false positives or discriminatory outcomes should be addressed.

Any organisation using LFR must be ready to demonstrate the necessity, proportionality and compliance of its use.

Meanwhile, across the Channel, members of the European Parliament have agreed to ban live facial recognition using AI in a draft of the EU’s Artificial Intelligence Act. Will the UK follow suit?

Suppliers – why your contracts and security are important

Processors and controllers are both accountable

Do you provide a service to clients and handle their personal data? If you’re acting as a processor, a recent GDPR fine serves as a helpful reminder to be sure to have all your ducks in a row.

There’s a clear warning you shouldn’t just assume the contracts your clients ask you to sign are okay, nor can you just say you have robust security measures in place, you actually have to have them!

In this recent case a software publisher, acting as a processor for their clients, was fined 1.5 million Euros by the French regulator (CNIL) following a data breach involving sensitive health data.

It was found data was exfiltrated by unauthorised parties from a poorly protected server. In a nutshell the key findings were:

  • Significant gaps in the processor’s security processes
  • Contractual documentation which failed to include mandatory obligations required under Article 28 of GDPR.

It’s worth noting the fine was based on both these counts. The ruling makes it clear processors should be wary of relying on their clients to make sure contractual terms are up to scratch. It’s the responsibility of both parties.

Here’s a quick recap on how suppliers can minimise their risks.

Getting the relationship clear

The most important first step is to establish the relationship between your company and another.

  • Are you handling a client’s data on their behalf, under their instruction, to provide a service to them?
  • Are you acting as controller, clearly determining how the personal data will be used for your own purpose(s)?
  • Are you both? i.e. acting as a controller in certain circumstances, but a processor for specific services you provide to clients.


What are the contractual requirements?

Once you’re clear you are a processor, acting under your client’s instructions, the law states your arrangements with clients must be covered by a binding agreement. EU and UK GDPR set out specific provisions which must be written into such contracts. In brief these are as follows:

1. Types of personal data & categories of data subject

The contract needs to specify what types of personal data you’ll be handling. It should also include details of whether this data relates to your client’s employees, patients, customers, and so forth.

2. Nature, purpose, duration of processing

The contract should describe the nature of the service(s) you provide, what purpose(s) this serves and the term of the contract. The agreement should cover instructions from your client of what you are permitted to do with their data.

3. The rights and duties of each party

The obligations of both parties should be clearly defined. For example, the client’s obligation to have a lawful basis for processing, its responsibility to fulfil individual privacy rights and your commitment as a supplier to not use your client’s data for any other purpose.

4. Technical and organisational measures

As a supplier you need to provide sufficient guarantees that you will implement proportionate technical and organisational measures to meet the requirements of UK/EU GDPR.

5. Sub-processors

If you engage other companies (‘sub processors’) to support you in delivering your services, you’ll need specific or general written authorisation from your client(s). If you make any changes to which sub-processors you use (including software providers), you’ll need to tell your client and give them the opportunity to object. Contractual terms should stipulate that you are accountable for your sub-processors.

6. International transfers

If relevant, the agreement should include details and provisions for any transfers of personal data to a third country. For example, if you are based in the UK, this covers a transfer to any other country, and would include details of any sub-processors based outside the UK. A transfer is often associated with the act of sending or transmitting personal data from one country to another. It should be noted the definition also covers cases where personal data is made ‘available’, in other words can be accessed in a third country.

7. Duty of confidentiality

There must be a confidentiality clause, which commits you to ensuring any of your staff authorised to access the client’s data are committed to a duty of confidentiality or are under a statutory obligation of confidentiality.

8. Assisting your clients

The contract should cover your commitment to assisting your clients, where necessary, with handling individual privacy rights, handling data breaches and conducting data protection impact assessments.

9. Return or destruction of data

It should be clear what happens to the client’s data when the contract ends. Does the client want you to return the data or destroy it?

10. Audits and inspections

As a processor you must agree to make available all information necessary to demonstrate your compliance and agree to audits, including inspections by your client or their authorised auditor.

Processors have obligations

This recent CNIL fine shows you can’t just sign a contract, sit back and relax.

As a processor you’re responsible for your sub-processors, data transfers, staff training and confidentiality, assisting your clients when necessary and so forth. You have to be sure to implement the technical and organisational measures you said you would to protect your client’s data.

While some clients will ask you to jump through multiple hoops as part of their due diligence process, making you clearly demonstrate your security measures are robust, others may not be so picky. But that doesn’t release you from your responsibilities.

The law and this recent fine make it clear processors can be held liable. In the event of a breach, your contractual arrangements and internal practices could come under rigorous scrutiny.

Overcoming the challenges of data retention

January 2022

Clearing out data you no longer need

How long should we keep our data? Sounds simple enough, but a question many businesses struggle with.

The UK GDPR tells us personal data should only be kept ‘as long as necessary for specified purposes’. So if your organisation is found to be storing data it doesn’t really need now, you could be subject to unwelcome scrutiny.

Perhaps the main risk here is if your business suffers a data breach. It could become far more serious if you couldn’t provide a suitable justification why you were still holding onto unnecessary data which was included in the breach. In effect, it means two violations of the law in one fell swoop! If you have to notify the individuals affected, what would you say?

Tackling the data we’re holding too long

This does require some thought and planning. As a pre-requisite, you’ll need to know what personal data your organisation holds and what purposes it’s being used for.

Creating a data retention policy is straightforward enough, but developing a record retention schedule can be more complex.

Most organisations use personal data for multiple purposes. You need to take account of each specific purpose and identify the appropriate lawful basis for that processing, before you consider an appropriate retention period. An up-to-date Record of Processing Activities can be a real asset here.

Deciding on suitable retention periods

Firstly, check if there’s a law which mandates how long certain data must be kept. Laws may dictate minimum or maximum retention periods.

For example, in the UK employment law requires data on ex-employees to be kept for at least 6 years after they leave the business. In certain situations the retention period may be longer. For example, let’s imagine you’re a building firm, your employees have come into contact with hazardous substances as part of their job, and you carry out health monitoring. The retention period for these records is much longer.

In many scenarios, however, there are no relevant laws which specify how long the data must be held. Examples include marketing, sales & account management records. In these situations organisations need to judge for themselves what an appropriate retention period should be, and be ready to justify their decision. Take a balanced and reasonable approach, based on your reasons for processing that data.

Deciding what period is ‘necessary’

Where there is no statutory requirement, we suggest speaking with internal data owners / relevant functions. The following questions should help you reach an appropriate decision on a period you can justify:

a. Are there any industry standards, guidelines or known good-practice guidelines?
b. Does the product lifecycle have an impact on retention?
c. What are the business drivers for retention? Are they justifiable?
d. What evidence is there that the data is needed for the proposed amount of time?
e. Is there potential for litigation if it’s kept too long (or deleted too soon)?
f. Is it necessary to keep personal data to handle complaints?

Don’t forget your processors (service providers)

Controllers who use service providers acting as data processors should make sure they provide clear contractual instructions about their data retention requirements.

Tell them the retention periods you need and give specific actions they should take when a retention period ends. For example, should they delete the data, return it to you or anonymise it? These may be listed in a data schedule, appended to the main contract or agreement.
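Here’s a hedged sketch in Python of how those instructions might be captured in a simple data schedule; the categories, periods and actions are invented examples, and actual periods must be justified in the way described above.

    # Illustrative data schedule: one entry per data category handed to the processor.
    RETENTION_SCHEDULE = [
        {
            "data_category": "ex-employee HR records",
            "purpose": "employment administration",
            "retention_period_years": 6,      # example of a statutory minimum
            "end_of_retention_action": "delete",
        },
        {
            "data_category": "marketing contact list",
            "purpose": "direct marketing",
            "retention_period_years": 2,      # business-justified and documented
            "end_of_retention_action": "anonymise",
        },
    ]

    for entry in RETENTION_SCHEDULE:
        print(f"{entry['data_category']}: keep for {entry['retention_period_years']} years, "
              f"then {entry['end_of_retention_action']}")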

Key takeaways

Data retention can be tackled effectively if you get key stakeholders across the business engaged and involved. Agree retention periods and get started on implementing them.

For more tips, tools and templates…

Why not download DPN’s Data Retention Guide.

 

Is working from home a security nightmare?

September 2021

Yes! Here’s our checklist of what to do and what to watch out for with your WFH teams.

I was on yet another Zoom call with my DPN colleagues the other day and we were baffled by some dreadful echoing of our voices. Everything I said bounced back at me.

We logged out, logged back in again but nothing changed. I turned my phone off – no change. Then I remembered that I was sitting in the kitchen with my Alexa turned on. When I unplugged Alexa, the echo disappeared.

That felt odd – we concluded that my Alexa was listening for instructions and so was listening into our call. That felt creepy!!

As we all work from home, this led to a discussion about whether we should put in place additional measures to maintain security over and above the work we had recently done to achieve Cyber Essentials.

The Cyber Essentials questionnaire doesn’t mention Alexa-style devices, or say much about the location of your workspace when you’re WFH.

With thanks to the ICO guidance and the Cyber Essentials documentation, here is our checklist for safely working from home.

1. Policies

Make sure you have policies and procedures in place which all your employees must adhere to. Make sure employees have read and understood the policies you’ve created. Even better, test them on it.

2. BYOD (Bring your own device)

Do decide whether employees can use their own devices. Some organisations have very firm “no personal devices” policies but some are more ambiguous. It is an inescapable fact that letting employees use their own devices is high risk; you’re mixing up business related apps and software with random stuff your employee may have downloaded from the web.

3. Network Access

Decide how employees are going to access business servers – is there a VPN in place? Do you need strong security protocols? It’s important to be proportionate with security measures. Obviously, a bank will feel different to a consultancy that handles very little personal data.

4. WFH in coffee shops/cafes

Does your employee ever work outside the home? In a café, for instance? Should you supply them with privacy screens for their devices? Have they been briefed on the importance of keeping their devices secure in a public space and never leaving them unattended?

5. The home environment

Does your WFH employee share their home with others? Are they using their personal broadband connection? If so, make sure they change the original passcode on the Wi-Fi to make access more secure. Can they lock their rooms or lock their devices away? Are there any Alexa style devices nearby?

In some instances, you may decide there is no circumstance under which an employee can work from home if the data they’re handling is too sensitive. Make sure you risk assess who can and cannot work at home and provide clear guidance.

6. 2FA and MFA

Where possible, enforce two-factor or multi-factor authentication. There is often a lot of resistance to this additional security but, where available, it should be mandatory.

7. Passwords

How about password length? I suspect a surprising number of people still use simple passwords like, say, “12345”. Passwords should be unique and complex, with a mixture of letters, numbers and symbols and, ideally, changes enforced on a regular basis.

Increasingly it makes sense to use a password manager to keep all your unique and complex passwords in one place. You still need one master password for that system, but at least that’s the only one you need to remember.
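As an illustration of the kind of minimum complexity rule a policy might set, here’s a small sketch in Python; the length threshold and character classes are assumptions, not a recommendation of any particular standard.

    import re

    def meets_password_policy(password: str, min_length: int = 12) -> bool:
        """Illustrative check: minimum length plus a mix of letters, numbers and symbols."""
        return (
            len(password) >= min_length
            and re.search(r"[A-Za-z]", password) is not None
            and re.search(r"\d", password) is not None
            and re.search(r"[^A-Za-z0-9]", password) is not None
        )

    print(meets_password_policy("12345"))              # False
    print(meets_password_policy("correct-horse-42!"))  # True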

8. Software updates

Are you able to update the user’s software remotely? If they’re using their own device, how do you ensure software is up to date? What safeguards are in place?

9. Cloud Storage

How are documents and files stored? Is there a cloud-based storage facility such as Sharepoint? How is this accessed and who controls the access? There are plenty of opportunities to inadvertently share a document with multiple people by allowing the sharing of links. Try not to let that happen.

10. Email

When using email, all the usual safeguards should apply when it comes to phishing attacks. The IT team should be carrying out tests on a regular basis and provide up to date training on what to watch out for.

Even though our cabinet ministers seem to do it, never ever use your personal email account for work related correspondence!!

How does this all add up?

If you do nothing else, consider the following actions:

  • Gain Cyber Essentials or Cyber Essentials Plus certification: Ensure that you’ve carried out the Cyber Essentials evaluation. It’s particularly important for small businesses, but large organisations have found it useful as well.
  • Conduct a DPIA: Carry out a Data Protection Impact Assessment. This can identify the circumstances under which people are working from home and introduce measures to mitigate the identified risks.
  • Create or bolster your Infosec policy: Create and maintain a robust and proportionate information security policy and ensure all employees are familiar with its contents. Maybe a short test would work well?