UK data regime change consultation: 12 highlights

September 2021

The Government’s consultation on UK data protection reform contains a number of sensible proposals to ease the burden on business. There are also a few surprises likely to raise eyebrows in Brussels. The headlines are:

  • The UK is not about to become the ‘Wild West’ for data, as some may have feared
  • Changes to both UK GDPR and the UK’s Privacy and Electronic Communications Regulations (PECR) look likely
  • A probable relaxation of several areas of UK GDPR, with a focus on outcomes rather than prescribed processes
  • Plans to increase fines under PECR to match those under GDPR, a clear warning to those flagrantly disregarding marketing rules
  • The consultation is a ‘direction of travel’ – nothing’s carved in stone. It’s business as usual for now

The Government’s overall aim is to drive economic growth and innovation and strengthen public trust in the use of data.

The way they want to achieve this is to alleviate some of the more prescriptive GDPR obligations on business, whilst retaining a robust data protection regime built largely on existing laws.

This approach is in keeping with the UK’s common law tradition, also used in Australia, New Zealand, Jamaica, Pakistan and Singapore (to name a few), as opposed to the statute law system used across Europe. Common law is viewed by its proponents as more flexible. It’s also why legal proceedings tend to move more quickly in UK courts than those in the EU.

It’s clear the UK Government hopes any changes will be compatible with EU equivalency, enabling the UK to retain adequacy.

Data regime proposals: 12 highlights

1. Accountability & Privacy Management Programmes (PMPs)

Changes to the accountability framework are proposed, with businesses expected to have a Privacy Management Programme in place. This approach to accountability is long-established in countries such as Australia, Canada and Singapore.

It’s argued this would allow organisations to implement a risk-based privacy programme based on the volume and sensitivity of personal data they handle, and the types of activities they’re involved in.

By doing this, the proposal seeks to do away with some of the accountability obligations under the current UK GDPR which may be considered more burdensome.

Organisations will still need to know where their data is and what it’s used for, apply lawful bases, implement robust security measures, manage suppliers, assess privacy risks and fulfil privacy rights. But there could be more flexibility and control over how you achieve this.

This doesn’t mean ripping up all the hard work you’ve done to comply with GDPR.

When the dust has settled, many organisations may choose to stick with the tried and tested framework they’ve already established. Others may jump on the opportunity to adapt their approach.

And let’s not forget, UK businesses operating in Europe will still be governed by EU GDPR.

2. No mandatory Data Protection Officers

The consultation proposes removing the mandatory requirement to appoint a DPO.

Under GDPR, a DPO must be appointed by public authorities, and by commercial organisations meeting specific criteria. GDPR also sets out requirements and responsibilities for the role.

It’s proposed the requirement for a DPO is replaced with a requirement to designate a suitable individual (or individuals) responsible for overseeing compliance. However, the new law wouldn’t lay down specific requirements & obligations for this role.

3. No mandatory requirement for Data Protection Impact Assessments 

Currently, GDPR makes a DPIA mandatory for high-risk activities. It also sets out core elements such an assessment must include.

Furthermore, it requires supervisory authorities to establish a list of processing operations which definitely require a DPIA.  This led authorities, including the UK’s ICO, to dutifully publish lists of where DPIAs would be considered mandatory, as well as best practice.

The Government is proposing removing this mandatory requirement, although this won’t mean throwing out screening questionnaires and DPIA templates, which are often very useful.

The onus would be on organisations to take a proportionate and risk-based decision on when they consider it appropriate to carry out impact assessments and how they go about this.

4. More flexible record keeping

Completing and maintaining up-to-date records, known as Records of Processing Activities (RoPA), has been one of the more onerous aspects of GDPR.

Again, current law and guidance is prescriptive about record-keeping requirements – although small and medium-sized organisations (those with fewer than 250 employees) are exempt from this.

It’s proposed a more flexible model for record keeping is introduced.

Maintaining a central record of what personal data you hold, what it’s used for, where it’s stored and who it’s shared with is a sensible and valuable asset for any organisation. Many feel such records are vital to effective data risk management.

So again, you don’t need to rip up your current RoPA, but you may soon be allowed to adapt your record keeping to suit your business and perhaps make your records easier to maintain.
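To make this concrete, a central record can be as simple as a structured list of entries. The sketch below is purely illustrative – the field names are our own, not taken from the ICO’s RoPA template – but it captures the essentials mentioned above: what personal data you hold, what it’s used for, where it’s stored and who it’s shared with.

```python
from dataclasses import dataclass, field

@dataclass
class RopaEntry:
    """One illustrative row in a simple Record of Processing Activities."""
    activity: str          # e.g. "Email marketing"
    data_categories: list  # what personal data is held
    purpose: str           # what it's used for
    lawful_basis: str      # e.g. "consent", "legitimate interests"
    storage_location: str  # where it's stored
    shared_with: list = field(default_factory=list)  # who it's shared with

# A register is then just a list of entries you can filter and report on
register = [
    RopaEntry("Email marketing", ["name", "email address"], "promotions",
              "consent", "CRM platform", ["email service provider"]),
]

# e.g. pull out every activity where data leaves the organisation
shared_activities = [e.activity for e in register if e.shared_with]
```

Keeping the record in a structured form like this (rather than free text) makes it far easier to filter, audit and keep up to date.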

5. Data breach notification threshold changes

It’s clear GDPR has led to data protection authorities being inundated with data breach reports. The ICO, for one, has highlighted a substantial amount of over-reporting.

This isn’t surprising when there’s a legal obligation for organisations to report a personal data breach if it is likely to represent a ‘risk’ to individuals.

It’s proposed organisations would only need to report a personal data breach where the risk to the individual is ‘material’. The ICO would be encouraged to produce clear guidance and examples of what would be a ‘non-material’ risk, and what would or would not be considered a reportable breach.

6. Data Subject Access Requests changes

The stated purpose of a subject access request is to give individuals access to a copy of their personal data so they can ‘be aware and verify the lawfulness of processing’ (although many organisations might question whether this is why some submit requests).

The consultation recognises the burden responding to DSARs places on organisations, especially smaller businesses which often lack the resources to handle them.

The possibility of charging a nominal fee could be reintroduced. It’s also proposed the threshold for judging when a request may be vexatious / manifestly unfounded is amended.

7. Cookies

Headlines surrounding UK data reform usually focus on ending the barrage of cookie pop-ups. The consultation proposes two main options:

  • Permitting organisations to use analytics cookies and similar technologies without the user’s consent. In other words, treating them in the same way as ‘strictly necessary’ cookies. It’s worth noting that this proposal is included in the most recent EU ePrivacy draft. (It’s accepted further safeguards would be required to ensure this had a negligible impact on user privacy and any risk of harm. It would also not absolve organisations from providing clear and comprehensive information about cookies and similar technologies).


  • Permitting organisations to store information on, or collect information from, a user’s device without their consent for other limited purposes. An example given is that this could include processing necessary for the legitimate interests of controllers where the impact on privacy is likely to be minimal.

The Government says it is keen to hear feedback on the most appropriate approach.

8. Legitimate Interests

There’s a proposal to create an exhaustive list of legitimate interests which organisations could rely on without needing to conduct the balancing test, i.e. no Legitimate Interest Assessment (LIA) required.

The following are some of the examples given:

  • ensuring bias monitoring, detection and correction in AI systems
  • statutory public communications and public health & safety messages by non-public bodies
  • network security
  • internal research and development projects

Where an activity is not on the list, we’re assuming assessments using the current 3-step test would still be needed.

9. Extended use of the ‘soft opt-in’

PECR currently permits email and SMS marketing messages where consent has been given, or for existing customers only, when the soft opt-in requirements are met.

This exemption from consent for existing customers is currently only available to commercial organisations. It’s proposed it could be extended to other organisations, such as political parties and charities.

This could be great news for charities, but could it lead to a deluge of unwanted messages from political parties?

10. Research purposes

The Government wants to simplify the use of personal data for research, with a specific focus on scientific research.

Considerations include establishing new lawful grounds for research (subject to ‘suitable safeguards’) and incorporating a clear definition of ‘scientific research’.

11. Artificial intelligence

It’s proposed certain automated decision-making should be permitted without human oversight.

GDPR prohibits this unless necessary for a contract with an individual, authorised by law or based on explicit consent. The consultation suggests Article 22 is scrapped.

The aim is to ‘deliver more agile, effective and efficient public services and further strengthen the UK’s position as a science and technology superpower’.

It’s hoped this can be achieved by developing a safe regulatory space for responsible AI development, testing and training which allows greater freedom to experiment.

In the consultation press release, an AI partnership between Moorfields Eye Hospital and the University College London Institute of Ophthalmology is highlighted. Researchers have trained machine-learning technology to identify signs of eye disease more successfully than clinicians can.

This is cited as a clear example of the type of data use which should be encouraged, not hindered by law.

12. Reform of the ICO

The Government wants to assert greater control over the UK’s data protection regulator, the Information Commissioner’s Office.

They propose to introduce a new statutory framework setting out the ICO’s strategic objectives and duties, plus a power for the Secretary of State for DCMS to prepare a statement of strategic priorities to inform how the ICO sets its own regulatory priorities.

This would bring the ICO into line with other UK regulators such as Ofcom, Ofwat and Ofgem.

The proposals also include introducing a new overarching objective for the ICO, in addition to its other functions, tasks and duties, with two key elements:

  • Upholding data rights and safeguarding personal data from misuse
  • Encouraging trustworthy and responsible data use, to uphold the public’s trust and confidence in use of personal data


Yes, a shake-up of UK data laws and enforcement is on the horizon, but the final outcome remains unknown, and a healthy debate will surely follow.

The consultation closes on 19th November 2021, and there will undoubtedly be some time before any changes become law.

For the time being it’s business as usual, but this document gives us a clear idea of what the future might look like.

Meanwhile, the EU will be keeping a very close eye on developments, and it’s possible the UK could be deemed to be going a step too far – it’s easy to see EC adequacy decisions being held over the UK Government like the Sword of Damocles.

The UK Government’s objective is to give organisations more control and flexibility around data protection management within a less burdensome regime, which supports the data economy and drives innovation.

In some ways, it could even be seen as a move towards giving organisations who don’t take data protection seriously more rope to hang themselves with.

The full consultation document is worth a read and can be found HERE.

Simon Blanchard, Phil Donn & Julia Porter – September 2021

Is working from home a security nightmare?

September 2021

Yes! Here’s our checklist of what to do and watch out for with your WFH teams.

I was on yet another Zoom call with my DPN colleagues the other day and we were baffled by some dreadful echoing of our voices. Everything I said bounced back at me.

We logged out, logged back in again but nothing changed. I turned my phone off – no change. Then I remembered that I was sitting in the kitchen with my Alexa turned on. When I unplugged Alexa, the echo disappeared.

That felt odd – we concluded my Alexa was listening for instructions and so was listening in on our call. Creepy!!

As we all work from home, this led to a discussion about whether we should put in place additional measures to maintain security over and above the work we had recently done to achieve Cyber Essentials.

The Cyber Essentials questionnaire doesn’t mention Alexa-style devices, or say much about the location of your workspace when you’re WFH.

With thanks to the ICO guidance and the Cyber Essentials documentation, here is our checklist for safely working from home.

1. Policies

Make sure you have policies and procedures in place which all your employees must adhere to. Make sure employees have read and understood the policies you’ve created. Even better, test them on it.

2. BYOD (Bring your own device)

Do decide whether employees can use their own devices. Some organisations have very firm “no personal devices” policies but some are more ambiguous. It is an inescapable fact that letting employees use their own devices is high risk; you’re mixing up business related apps and software with random stuff your employee may have downloaded from the web.

3. Network Access

Decide how employees are going to access business servers – is there a VPN in place? Do you need strong security protocols? It’s important to be proportionate with security measures; obviously, a bank will feel different to a consultancy that handles little sensitive data.

4. WFH in coffee shops/cafes

Does your employee ever work outside the home – in a café, for instance? Should you supply them with privacy screens for their devices? Have they been briefed on the importance of keeping their devices secure in a public space and never leaving them unattended?

5. The home environment

Does your WFH employee share their home with others? Are they using their personal broadband connection? If so, make sure they change the original passcode on the Wi-Fi to make access more secure. Can they lock their rooms or lock their devices away? Are there any Alexa-style devices nearby?

In some instances, you may decide there is no circumstance under which an employee can work from home if the data they’re handling is too sensitive. Make sure you risk assess who can and cannot work at home and provide clear guidance.

6. 2FA and MFA

Where possible, enforce two-factor or multi-factor authentication. There is often a lot of resistance to this additional security but, where available, it should be mandatory.
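For the curious, the rolling codes generated by authenticator apps aren’t magic: most implement the standard TOTP algorithm (RFC 6238, built on the HOTP construction in RFC 4226). This is a minimal sketch using only the Python standard library – illustrative, not a production implementation.

```python
import base64
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based OTP - the rolling code an authenticator app shows."""
    key = base64.b32decode(secret_b32.upper())
    return hotp(key, int(time.time()) // interval, digits)
```

Because the code is derived from a shared secret plus the current time, a stolen password alone is useless to an attacker – which is exactly why enforcing a second factor is worth the friction.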

7. Passwords

Consider password length too – I suspect a surprising number of people still use simple passwords like, say, “12345”. Passwords should be unique and complex, with a mixture of letters, numbers and symbols, and ideally changed at regular intervals.

Increasingly it makes sense to use a password manager to keep all your unique and complex passwords in one place. You still need one master password for that system, but at least that’s the only one you need to remember.
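Rules like the ones above are easy to automate. Here’s a small, illustrative Python check – the minimum length and the specific checks are our own assumptions, not a formal standard – that flags passwords failing a length-and-variety policy:

```python
import string

def password_issues(pw: str, min_length: int = 12) -> list:
    """Return a list of policy problems; an empty list means the password passes."""
    issues = []
    if len(pw) < min_length:
        issues.append("too short")
    if not any(c.islower() for c in pw):
        issues.append("no lowercase letter")
    if not any(c.isupper() for c in pw):
        issues.append("no uppercase letter")
    if not any(c.isdigit() for c in pw):
        issues.append("no digit")
    if not any(c in string.punctuation for c in pw):
        issues.append("no symbol")
    return issues

print(password_issues("12345"))              # fails on almost every rule
print(password_issues("Str0ng&Unique-Pass")) # passes: []
```

A check like this is best used at the point a password is set, so weak choices are rejected before they ever reach a live system.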

8. Software updates

Are you able to update the user’s software remotely? If they’re using their own device, how do you ensure software is up to date? What safeguards are in place?

9. Cloud Storage

How are documents and files stored? Is there a cloud-based storage facility such as SharePoint? How is this accessed and who controls the access? There are plenty of opportunities to inadvertently share a document with multiple people by allowing the sharing of links. Try not to let that happen.

10. Email

When using email, all the usual safeguards should apply when it comes to phishing attacks. The IT team should be carrying out tests on a regular basis and providing up-to-date training on what to watch out for.

Even though our cabinet ministers seem to do it, never ever use your personal email account for work related correspondence!!

How does this all add up?

If you do nothing else, consider the following actions:

  • Gain Cyber Essentials or Cyber Essentials Plus certification: Ensure you’ve carried out the Cyber Essentials evaluation. It’s particularly important for small businesses, but large organisations have found it useful as well.
  • Conduct a DPIA: Carry out a Data Protection Impact Assessment. This can identify the circumstances under which people are working from home and introduce measures to mitigate the identified risks.
  • Create or bolster your Infosec policy: Create and maintain a robust and proportionate information security policy and ensure all employees are familiar with its contents. Maybe a short test would work well?


Need some support? Our experienced team is on hand to give no-nonsense help and advice.  To find out more GET IN TOUCH

Artificial Intelligence – helping businesses address the privacy risks

August 2021

The use of artificial intelligence (AI) is increasing at great pace, driving valuable new benefits across all areas of business and society. We see its applications expanding across many areas of our daily lives – anything from social media through to self-driving and self-parking cars, and medical applications.

However, as with any new technology, there can be challenges too. How can we be sure we are protecting people from risk and potential harm when processing their personal data within AI systems?

As with any other use of personal data, businesses need to ensure they comply with core data protection principles when designing, developing or productionising AI systems which use personal data.

You may recall that in April 2021 the European Commission published its proposal for a new regulation harmonising the rules governing artificial intelligence.

The regulation of AI is a tricky balancing act. On one hand, there’s the desire not to hinder research, development and the adoption of new technologies that bring increasing societal benefits – but those exciting opportunities must be balanced against the need to protect individuals from any inherent risks.

So how can we strike the right balance?

AI privacy ‘toolkit’

The ICO have published an improved ‘beta’ version of their AI toolkit, which aims to help organisations using AI to better understand & assess data protection risks.

It’s targeted at two main audiences: those with a compliance focus, such as DPOs, general counsel, risk managers and senior management; and technology specialists, such as AI/ML developers, data scientists, software developers & engineers, and cybersecurity & IT risk managers.

So what is the toolkit?

It’s an Excel spreadsheet which maps key stages of the AI lifecycle against the data protection principles, highlighting relevant risks and giving practical steps you can take to assess, manage and mitigate risks.

It also provides suggestions on technical and organisational measures which could be adopted to tackle any risks. The toolkit focuses on four key stages of the AI lifecycle:

  • Business requirements and design
  • Data acquisition and preparation
  • Training and testing
  • Deployment and monitoring

The ICO have quite rightly recognised that the development of AI systems is not always a linear journey from A to B to C. One stage does not necessarily flow straight into another.

Therefore it will often be best to take a holistic approach and recognise you won’t have all the information available for assessment at the ‘product definition’ stage. Engagement from a DPO (or other privacy champion) will need to stretch across all stages of the AI lifecycle.

What kinds of risk are highlighted?

Quite a few actually, including:

  • Failure to adequately handle the rights of individuals
  • Failure to choose an appropriate lawful basis for the different stages of development
  • Issues with training data which could lead to negative impacts on individuals – such as discrimination, financial loss or other significant economic or social disadvantages
  • Lack of transparency regarding the processes, services and decisions made using AI
  • Unauthorised / unlawful processing, accidental loss, destruction or damage to personal data
  • Excessive collection or use of personal data
  • Lack of accountability or governance over the use of AI and the outcomes it gives

AI has become a real focus area for the ICO of late. The toolkit follows on the heels of their Guidance on AI and Data Protection, and their co-badged guidance with The Alan Turing Institute on Explaining Decisions Made With AI. This is all connected with their commitment to enable good data protection practice in AI.

Want to join the consultation?

The ICO are currently looking for organisations using their guidance, to better understand how it works in practice, make sure the Regulator keeps pace with emerging developments, and ensure guidance and toolkits are genuinely useful to businesses. You can give your feedback to the ICO here.

In summary

The use of AI is exciting and presents many opportunities and potential benefits, but it’s clearly not without its risks. More and more guidance is emerging to help organisations begin to adopt, or continue to expand, their use of AI. The clear message from the Regulator is that this activity must be handled carefully and data protection must be considered from the outset.

The ICO is keen to work with businesses to make sure its guidance is useful for organisations, so it can continue to support the increasing use of AI.


Is this all going in the right direction? We’d be delighted to hear your thoughts. Alternatively if you’d like data protection advice when designing and developing with AI, we can help. CONTACT US.


Cyber Essentials – gain peace of mind with your information security

August 2021

Data breaches are endemic, with 2,552 reported to the ICO between April and June 2021. Some of these are self-inflicted problems, whilst others are malicious attacks on an organisation.

For many small and medium-sized businesses, it’s not entirely obvious how they should address the increasing cyber security threats that present themselves on a virtually daily basis. The phishing emails and other attempts to access your networks feel like an endless war of attrition.

It therefore makes sense to ensure you’ve got some basic security arrangements in place to address the most obvious threats. This is where Cyber Essentials can help.

What is Cyber Essentials?

Launched in 2014, it is an information assurance scheme operated by the National Cyber Security Centre (NCSC). Today over 30,000 organisations are accredited.

The intention is to provide an assurance framework for organisations to carry out a simple review of their security arrangements. It also helps you to ensure that basic controls are introduced to protect networks/systems from threats from the internet. There are two versions:

1. Cyber Essentials

A self-assessment questionnaire which must be signed off by a company director and costs £300 plus VAT to have your answers independently verified.

2. Cyber Essentials Plus

Like Cyber Essentials, but with the addition of independent validation by an accredited third party. The cost will depend on the complexity of the organisation but is typically £1,400 plus VAT.

What does the scheme cover?

1. Use a firewall to secure your internet connection

  • Protect your internet connection with a firewall. A firewall is essentially a buffer between you and external networks.
  • It’s possible to have an organisational firewall for your company network as well as a personal firewall on your device. Make sure both are enabled.

2. Choose the most secure settings for your devices and software

  • Check your device settings and ensure they provide a high level of security.
  • Always password protect your devices and change any default passwords.
  • Wherever it’s available, always use Two-Factor Authentication or Multi-Factor Authentication on your accounts.

3. Control who has access to your data and services

  • Minimise the access levels for individual employees in an organisation. Only give them what they need based on their role.
  • Separate administrative accounts from accounts which are also using email or browsing the web to minimise the damage caused by an attack.
  • Only use software from official sources and control who can install software.

4. Protect yourself from viruses and other malware

  • Introduce anti-malware measures such as Windows Defender.
  • Create an allowed list of applications available to install on a device.
  • Use versions of software that support sandboxing and keep applications separate, with restricted access.

5. Keep your devices and software up to date

  • Make sure that all software is up to date with the most recent version – known as patching.
  • If software becomes obsolete/is no longer supported, upgrade to a more modern version.

Why bother?

  • Procurement: Increasingly, organisations are being asked about their security arrangements when bidding for work. Central government procurement already specifies that suppliers must be certified if they handle certain types of sensitive and personal information.
  • Insurance: Beyond that, insurance companies are starting to recognise that accreditation can have an impact on insurance premiums, although this isn’t yet quantified.
  • Peace of mind: Many data breaches and malicious attacks can be deflected using the basic security measures recommended by Cyber Essentials. The cost of the exercise is trivial compared to the cost and reputational damage if there has been a breach.

What next?

The National Cyber Security Centre has published a wide range of resources to help understand Cyber Essentials and become accredited. Take a look now and put your mind at rest by becoming accredited – Cyber Essentials Overview.


Data protection team over-stretched? Our experienced team can fill the gaps with our no-nonsense advice and support. For more information CONTACT US.

Getting to grips with Accountability

Accountability is a key principle underpinning GDPR and has become the foundation of successful data protection and privacy programmes.
It can, though, be difficult to know where to start and how to keep up the momentum.

Luckily the ICO has developed what I think is a great tool, and it’s just been updated to make it even more user friendly.

The Accountability Framework can really help DPOs and privacy teams. It takes less than an hour to complete – which sounds to me like an hour well spent!

When working with our clients I often find they benefit from help both to recognise their data compliance gaps and then to scope out practical solutions. Any help from the ICO to support businesses down this road should be encouraged.

The Framework focuses on helping you to assess the effectiveness of the measures you have in place to protect personal data, understand where your weaknesses lie and gain clarity on the areas you need to improve.

It’s aimed at senior management, DPOs and those with responsibility for records management and information security.

Ten core areas of accountability

The Framework identifies ten important areas organisations are accountable for.

1. Leadership and oversight
2. Policies and procedures
3. Training and awareness
4. Individuals’ rights
5. Transparency
6. Records of processing and lawful basis
7. Contracts and data sharing
8. Risks and data protection impact assessments
9. Records management and security
10. Breach response and monitoring.

Self-assessment tool and tracker

A vital part of the Framework is the self-assessment tool. It enables you to assess your level of compliance in each of the 10 core areas above.
For each area the Framework lays out the ICO’s expectations and asks you to rate how your organisation performs against key measures.

At the end you receive a report which grades your organisation’s performance on each area and helps you to:

  • understand your current compliance levels
  • identify gaps in your privacy programme
  • confirm the next steps you should take to improve accountability
  • communicate what support is needed from senior management to enhance compliance

If you want to go further, you can use the accountability tracker (provided in Excel) to record more detail and create an action plan, so you can track your progress over time.

You may also find this useful when you provide management information, e.g. to your Board and/or to other stakeholders.

Recent improvements to the Framework

After listening to feedback, the ICO has made changes to:

  • improve the Framework’s layout – for example, the 10 core topic areas have changed since the original version, making it easier to navigate
  • adjust the Accountability Tracker, so it complements people’s existing working practices

An example: training and awareness

The Framework provides practical ways in which you can meet the legal requirements. ‘Training and awareness’ is a great example.

The ICO expects organisations to provide appropriate data protection and information governance training for staff, including induction for new starters prior to accessing personal data and within one month of their start date.

The training must be relevant, accurate and up to date. Refresher training should be provided at regular intervals.

Specialised roles or functions with key data protection responsibilities should receive additional training and professional development, beyond the basic level.

Organisations should be able to demonstrate that staff understand the training, for example through assessments or surveys.

In addition, you should regularly raise organisational awareness of data protection, information governance and your data policies and procedures in meetings or staff forums and make it easy for staff to access the relevant material.

What next?

The ICO tells us the next steps for the Framework include adding real life case studies which aim to illustrate the innovative ways organisations can demonstrate their accountability.

They also plan to run online workshops to look at how they can adapt and improve the self-assessment tool to better meet business needs. You can register your interest here.

Help for small businesses too

The ICO reminds us that if you work for a smaller organisation you will most likely benefit from their existing resources, available on their SME hub.

For example, you should take a look at their assessment for small business owners and sole traders, and you may want to try the data protection self-assessment toolkit.


We can help – if you’d like support improving your business’s data protection programme and demonstrating accountability, from delivering practical and engaging training through to helping with Data Subject Access Requests, impact assessments or protecting against a data breach, GET IN TOUCH

Are we controller, or are we processor?

July 2021

(…and I’m on my knees looking for the answer, are we Controller, or are we Processor?)

In the five years since the final text of GDPR was published, deciding whether you are acting as a controller or a processor has been a contentious area for some businesses.

On paper the definitions may seem straightforward, but as ever the devil’s in the detail and interpretation.

I was interested to see a recent ICO enforcement notice which concluded a marketing company was acting as controller, despite classifying itself as a processor.

This case was pretty clear-cut; the company clearly used personal data it received from other companies for its own purposes and financial gain.

But the distinction can be more nuanced.

Many a debate (and disagreement) has been had between DPOs, lawyers and other privacy professionals when trying to classify the relationship between different parties.

It’s not unusual for it to be automatically assumed all suppliers providing a service are processors, but this isn’t necessarily the case. Sometimes joint controllership, or distinct controllers, is more appropriate.

Organisations often act as both, operating as controller for some processing tasks and as processor for others. Few companies will solely be a processor; most will be a controller for at least their own employee data, and often for their own marketing activities too.

So what does the law say a controller and processor are, and how should we interpret this?

The GDPR tells us a controller means ‘the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data’.

A processor means ‘a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller’.

There are some key questions which will help organisations reach a conclusion, such as:

  • Are we responsible for deciding the purposes for which personal data are processed?
  • Are we responsible for deciding how and what data is collected?
  • Do we decide the lawful basis for the processing tasks we carry out?
  • Do we make sure people are informed about the processing of their data?
  • Do we handle individual privacy rights, such as data subject access requests?
  • Is it us who’ll notify the Regulator / affected individuals in the event of a significant data breach?

And so on…
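As a quick self-check, the questions above could be turned into a simple script. This is purely an illustrative sketch – the function name, question texts and yes/no tally are my own, and the legal test is about the substance of the relationship, not a score.

```python
# Illustrative self-check based on the controller/processor questions above.
# A real determination must look at the substance of each processing activity.

CONTROLLER_QUESTIONS = [
    "Do we decide the purposes for which personal data are processed?",
    "Do we decide how and what data is collected?",
    "Do we decide the lawful basis for the processing tasks we carry out?",
    "Do we make sure people are informed about the processing of their data?",
    "Do we handle individual privacy rights, such as access requests?",
    "Would we notify the regulator / individuals after a significant breach?",
]

def classify(answers: list) -> str:
    """Return a rough indication from yes/no answers to the questions above."""
    yes = sum(bool(a) for a in answers)
    if yes == len(answers):
        return "controller"
    if yes == 0:
        return "processor (for this processing activity)"
    return "mixed - review each processing activity separately"

print(classify([True] * len(CONTROLLER_QUESTIONS)))  # controller
```

Remember the ICO’s point: how a contract labels you does not change the answer – what you actually do with the data does.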

If you’re answering ‘yes’, you’re a controller. And the ICO makes it clear it doesn’t matter if a contract describes you as a processor; “organisations that determine the purposes and means of processing will be controllers regardless of how they are described in any contract about processing services”.

Why it’s important to get this right

Controllers have a higher level of accountability: they must comply with all the data protection principles, and are also responsible for the compliance of their processors.

If you are a processor, you must only handle data on behalf of another organisation and under their instructions.

This means if you’re doing anything else with this data, for your own purposes, you can’t be a processor for those purposes. You must be the controller – at least for those purposes which were not instructed to you by another party.

Let’s be clear though, this doesn’t mean a processor can’t make some technical decisions about how personal data is processed.

Data protection law does not prevent processors providing added value services for their clients. But as a processor you must always process data in accordance with the controller’s instructions.

Processors also have a number of direct obligations under UK GDPR – such as the technical and organisational measures they use to protect personal data. A processor is also responsible for ensuring the compliance of any sub-processors it uses to fulfil its services to a controller.

If the relationship is controller to processor, you must make sure you have a suitable agreement in place which covers key data protection requirements.

Often overlooked is the need to have clear documented instructions from the controller. These instructions are often provided as an Annex to the main contract, so they can be updated if the processing changes.

What’s clear from the recent ICO ruling is that even if your contract says you are a processor, if you are in fact in control of the processing, that classification will be overturned.

In this case, the marketing company has been given three months to mend its ways. Actions required include notifying individuals that the company is processing their data, ceasing to process personal data where this is not possible, and making sure robust evidence of consent is retained.

The ICO doesn’t let us mark our own homework; it’s interested in what we do as opposed to what we say we do!

In July 2021 the European Data Protection Board published its adopted guidelines on the concepts of controller and processor.


Data protection team over-stretched? Ease the strain with our no-nonsense advice and support via our flexible Privacy Manager Service. Find out how our experienced team can help you. CONTACT US.

Children’s Code: deadline for complying looms

July 2021

For some businesses, protecting children’s online privacy is clearly relevant and complying with the code will already be firmly on the agenda. For others, however, it may not be so obvious.

The Children’s Code (formally known as the Age Appropriate Design Code) came into force on 2nd September 2020 and organisations were given a twelve-month transition period to comply.

What’s made clear is it’s up to you to assess whether your online services are ‘likely’ to be used by children. And if you think they aren’t, you should document why.

The scope of the code is pretty broad – it applies to relevant online services likely to be accessed by children. This means it’s worth checking the following:

  • What’s meant by a child?
  • What online services are covered?
  • What does ‘likely’ to be accessed by a child mean?

I’ve taken a look at how to try and answer these questions, but first a bit about what the code is and what it aims to achieve…

What is the Children’s Code?

It’s a statutory code of practice aimed at protecting children’s privacy when they’re online. The code introduces 15 standards organisations need to meet.

These standards are aimed at making sure online services safeguard children’s personal data. They’re not technical standards per se, instead focusing on principles and privacy features. To comply, it’s likely different technical solutions will need to be adopted by different services.

This UK code is a first, but the Information Commissioner’s Office believes it reflects the direction of travel across the world. Elizabeth Denham, the Information Commissioner, says:

“A generation from now we will all be astonished that there was ever a time when there wasn’t specific regulation to protect kids online. It will be as normal as putting on a seatbelt.

“This code makes clear that kids are not like adults online, and their data needs greater protections. We want children to be online, learning and playing and experiencing the world, but with the right protections in place.

“We do understand that companies, particularly small businesses, will need support to comply with the code and that’s why we have taken the decision to give businesses a year to prepare, and why we’re offering help and support.”

We all know (especially during the pandemic and seemingly endless lockdowns) children of all ages spend a lot of time online. There are real concerns every time they open an app or play a game, data may be gathered about them.

This data may then be used to encourage them to spend more time using a service or to tailor advertisements – this might be appropriate for adults but not for minors.

The ICO stresses the desire is not to restrict children from benefiting from the digital world, rather to make sure the amount of data collected about them, and its subsequent use is minimised.

All businesses which fall under the code’s scope need to show they’ve taken the necessary measures. Those who don’t comply, but should, are warned they will have difficulties demonstrating compliance should the ICO come knocking and regulatory action could be more likely.

How old is a child?

The code adopts the definition of a child under the UN Convention on the Rights of the Child (UNCRC), which is anyone under the age of 18. This means the code applies to online services likely to be accessed by older children aged 16 and 17, not just young children.

This shouldn’t be confused with the age of consent for a child, which for online services is 13 in the UK (but may vary across EU countries from 13 to 16).

What online services does the Children’s Code cover?

The code applies to what are termed ‘relevant information society services’. Put simply, Information Society Services (ISS) means online services. ISS, according to the code, are:

Essentially this means that most online services are ISS, including apps, programs and many websites including search engines, social media platforms, online messaging or internet based voice telephony services, online marketplaces, content streaming services (e.g. video, music or gaming services), online games, news or educational websites, and any websites offering other goods or services to users over the internet. Electronic services for controlling connected toys and other connected devices are also ISS.

It also doesn’t matter if your service is free. For example, a free online game or search engine funded by advertising falls under the definition of an ISS.

Also, not-for-profit apps and educational sites are in-scope. Small businesses with websites selling products online or offering an online-only service via the website are also in-scope.

Who does the Children’s Code NOT apply to?

  • Some public authority services – for example, an online public service which is not provided on a commercial basis, or a police force with an online service which processes personal data for law enforcement purposes.
  • Information about your business – if you operate a website which just provides details about your real-world business with no ability to buy products online or access a specific online service. An online booking form for an in-person appointment is also not an ISS.
  • General broadcast services – scheduled TV or radio programmes to a general audience broadcast over the internet that are not at the request of an individual. BUT on demand services are covered.
  • Preventative or counselling services – for example websites or apps which specifically provide online counselling or other preventative services to children. BUT more general health, fitness and wellbeing apps are covered.

How do you determine ‘likely’ to be accessed by a child?

If your business clearly aims its services at under 18s you’re fully in scope, but the code also covers services which may not be specifically ‘aimed or targeted’ at children, but are likely to be accessed by them.

The word ‘likely’ was deliberately used to make sure services children were using in reality weren’t excluded.

Organisations need to therefore judge whether their services are likely or not to be used by children. The code gives us some pointers here about assessing this:

  • Is the possibility of this happening more probable than not?
  • Is the nature and content of the service appealing to children, even if not intended for them (remember this includes older 16 and 17 year olds)?
  • Do you have measures in place to prevent children gaining access to an adult-only service?

Organisations are faced with a risk-based decision about whether it would be proportionate to conform with the code or not.

What about services ‘unlikely’ to be accessed by children?

There’s an important point here. The code states:

If you decide that your service is not likely to be accessed by children and that you are therefore not going to implement the code then you should document and support your reasons for your decision. You may wish to refer to market research, current evidence on user behaviour, the user base of similar or existing services and service types and testing of access restriction measures.

If you initially judge that the service is not likely to be accessed by children, but evidence later emerges that a significant number of children are in fact accessing your service, you will need to conform to the standards in this code or review your access restrictions if you do not think it is appropriate for children to use your service.

There’s a clear expectation organisations will need to evidence their decision that they do not need to conform with the code.

What are the 15 standards of the Children’s Code?

If you judge the code does apply to you, here is a top-level summary of the 15 standards:

1. Best interest of the child – you need to consider the needs of children using your service and how best to support those needs. The best interests of the child should be your primary consideration when designing and developing your online service.

2. Data Protection Impact Assessments (DPIAs) – these should be carried out to identify and minimise any data protection risks to children.

3. Age appropriate application – you need to assess the age range of your audience. The needs of children at different ages should be central to your design and development. You also need to make sure children are given an appropriate level of protection about how their information is used.

4. Transparency – GDPR specifically states privacy information must be easy for children to find and easy for them to understand. The code says you should consider bite-sized, ‘just in time’ notices, when you are collecting children’s data.

5. Detrimental use of data – you must not use children’s information in a way which would be detrimental to children’s physical or mental health and well-being. You should not use the data in a manner which would go against industry codes or practice, other regulatory provisions or Government advice.

6. Policies and community standards – if children provide you with their personal information when they join or access your service, you must uphold your own published T&Cs, policies and community standards. Children should be able to expect the service to operate in the way you say it will and for you to do what you say you are doing.

7. Default settings – privacy settings for children must be set to ‘high’ by default, unless you can demonstrate a compelling reason for taking a different stance. It’s not enough to allow children to activate high privacy settings, you need to provide these by default.

8. Data minimisation – you should only collect and keep the minimum amount of data about children that’s necessary to provide your service.

9. Data sharing – you should not share children’s data unless you can demonstrate a compelling reason to do so.

10. Geolocation – any geolocation privacy settings should be switched off by default. Children should have to actively change the default setting to allow their geolocation to be used (unless you can demonstrate a compelling reason for them to be switched on).

11. Parental controls – you need to make it clear to children if parental controls are in place and if they are being tracked or monitored by their parents. You should also provide information to parents about the child’s right to privacy under UNCRC.

12. Profiling – options which use profiling should be switched off by default (again, unless you can demonstrate a compelling reason for these to be switched on). This doesn’t mean profiling is banned, but if it is conducted you should make sure you have measures in place to protect children from harmful effects. If profiling is part of the service you are providing, you need to be sure it is completely necessary.

13. Nudge techniques – you should not use techniques which lead or encourage children to activate options which mean they give you more of their personal information or turn off privacy protections.

14. Connected toys and devices – you need to conform with the code if you sell toys or devices (such as talking teddy bears, fitness bands or home-hub interactive speakers) which collect personal data and transmit it via a network connection. If your electronic toys or devices do not connect to the internet and only store data in the device itself the code does not apply.

15. Online tools – it must be easy for children to exercise their data protection rights and report any concerns. The right to erasure is particularly relevant when it comes to children. Mechanisms used to help children exercise their rights should be age appropriate, prominent and easy to use.

The Code is a significant step-change in online safeguarding – it impacts many businesses which have never had to consider whether children were interacting with their online platforms or not.

As ever proportionate but effective tools and policies will be required to show you’ve given due regard to the code, especially by those businesses likely to attract younger users. Risk-based decisions will be required, based on evidence, adding to existing impact assessments.

The full code can be found here: The Children’s Code


Struggling with data protection? Ease the strain with our no-nonsense advice and support via our flexible Privacy Manager Service. Find out how our experienced team can help you. CONTACT US.


Adequacy and the new SCCs – what does it all mean?

Great news for businesses! The European Commission finally adopts adequacy decisions for data transfers. Alongside this, the long-awaited new EU Standard Contractual Clauses have been published. What does this all mean?

The European Commission has adopted two adequacy decisions concerning transfers of personal data between the UK and EU, under the General Data Protection Regulation (GDPR) and the Law Enforcement Directive (LED).

These agreements confirm the UK as having ‘adequate’ data protection for the transfer of personal data from the EU – thereby paving the way for lawful transfers between the EU and UK.

This is really helpful for UK businesses which rely on service providers or partners based in the EU – who would otherwise have needed to rely on other transfer mechanisms, such as Standard Contractual Clauses, to ensure data transfers to the UK are lawful.

Just in time too!

The news emerged on 28 June 2021 – only two days before expiration of the six month post-Brexit transition period under the UK-EU Trade and Cooperation Agreement.

There’s a caveat. These agreements are dependent on the UK’s legislative and regulatory environment for data. If the UK decides to go its own way with data protection laws – for example, by diverging from the GDPR – the EU could potentially withdraw adequacy.

Positive reactions

Unsurprisingly, reaction to this news has been overwhelmingly positive.

The ICO:

“This is a positive result for UK businesses and organisations. Approved adequacy means that businesses can continue to receive data from the EU without having to make any changes to their data protection practices. Adequacy is the best outcome as it means organisations can carry on with data protection as usual. And people will continue to enjoy the protections that their data will be used fairly, lawfully and transparently. The result is also a testament to the strength of the UK’s data protection regime.”

The UK Direct Marketing Association:

“A positive decision on data adequacy is a huge relief for thousands of businesses across the UK – over half of businesses surveyed by the DMA just before Brexit stated this was important for the future of their business. The government estimated that without adequacy the UK economy could lose up to £85 billion, so this announcement is a significant boost after a challenging year.”

Where do we stand for data transfers outside of the UK & EU?

Whilst the European Commission’s decision is indeed a terrific boost, UK businesses will still need to ensure their transfers to/from other areas outside the EU are lawful.

For many businesses, this will mean a continued reliance on SCCs in contracts with trading partners outside UK and the EU.

The European Commission has also recently published its final Implementing Decision adopting new Standard Contractual Clauses. They’ve been updated to:

  • align with the GDPR,
  • allow for more flexibility, depending on whether parties are processors or controllers,
  • address requirements following the Schrems II ruling of July 2020.

New SCCs are ready to use!

Organisations can start to use the new SCCs from 27 June 2021. The Commission has allowed for a transition period: exporters and importers can continue signing the existing SCCs for a further three months, until 27 September 2021. After that date, no new contracts can be signed using the existing SCCs.

Exporters and importers will then have until 27 December 2022 to replace contracts which use the current SCCs with the new SCCs. That’s unless the underlying processing operations change, in which case the new SCCs should be used from that point on.
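The transition timeline above can be sketched as a simple date check. This is an illustrative helper only (the function name and return strings are my own), and it ignores the caveat that a change in the underlying processing operations triggers the new SCCs immediately.

```python
from datetime import date

# Key dates from the transition period described above
OLD_SCCS_SIGNING_END = date(2021, 9, 27)    # last date to sign the old SCCs
REPLACEMENT_DEADLINE = date(2022, 12, 27)   # old-SCC contracts replaced by now

def scc_status(today: date, uses_old_sccs: bool) -> str:
    """Rough indication of where a contract stands in the SCC transition."""
    if not uses_old_sccs:
        return "on new SCCs - no action needed"
    if today <= OLD_SCCS_SIGNING_END:
        return "old SCCs still signable - plan migration"
    if today <= REPLACEMENT_DEADLINE:
        return "replace old SCCs by 27 December 2022"
    return "old SCCs expired - new SCCs required"

print(scc_status(date(2021, 8, 1), True))  # old SCCs still signable - plan migration
```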

What is different about these new SCCs?

There are several key differences you may wish to note.

1. Modular approach: Specific sets of clauses can be used for different types of transfers:

  • controller-to-controller,
  • controller-to-processor,
  • processor-to-processor,
  • processor-to-controller.

There is an option for more than two parties to join and use the clauses through the docking clause.

2. Identification of a competent supervisory authority: The new SCCs specify that the supervisory authority of the data exporter will be the competent supervisory authority. If the data exporter is not established in an EU member state, but falls within the scope of GDPR, the supervisory authority should be identified as follows:

  • if the exporter has an EU representative, the supervisory authority will be the one where the representative is established.
  • if the data exporter does not require an EU representative, the supervisory authority will be that of the Member State in which the data subjects whose personal data is transferred are located.

By entering into a contract incorporating the new SCCs, the data importer agrees to accept the authority of that supervisory authority, respond to its enquiries, comply with any measures it adopts, and submit to its audit regime.
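The determination rules above amount to a short decision sequence, which could be sketched as follows. The function and parameter names are hypothetical, for illustration only.

```python
from typing import Optional

def competent_authority(exporter_eu_state: Optional[str],
                        representative_eu_state: Optional[str],
                        data_subjects_eu_state: Optional[str]) -> str:
    """Return the Member State whose supervisory authority is competent."""
    if exporter_eu_state:          # exporter established in an EU member state
        return exporter_eu_state
    if representative_eu_state:    # non-EU exporter with an EU representative
        return representative_eu_state
    if data_subjects_eu_state:     # no representative required: where the
        return data_subjects_eu_state  # relevant data subjects are located
    raise ValueError("no competent supervisory authority identified")

print(competent_authority(None, "Ireland", None))  # Ireland
```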

3. Requirement to assess local laws: With a nod to the Schrems II ruling of July 2020, the new SCCs contain a warranty stating both parties have carried out an assessment of the local laws in the jurisdiction the personal data will be transferred to, and that they have no reason to believe those laws would prevent the importer from complying with its obligations under the clauses.

Additional guidance has been provided in Clause 18 (d) (12) around factors to take into account when giving this warranty. The parties will be required to document the assessment and make it available to a data protection supervisory authority on request.

4. Security measures: The new SCCs require that the technical and organisational measures (TOMs) adopted to safeguard the personal data transfers are described in specific terms in Annex II, clearly indicating which measures apply to each transfer.

5. No separate contractual measures required: Contracting using the new SCCs neatly avoids any requirement for controllers to impose separate contractual measures on a processor in order to comply with their obligations under Article 28 of GDPR.

6. Access by public authorities: Provisions are included which data importers will have to comply with if they receive a binding request from a public authority for disclosure of personal data transferred under new SCCs.

Don’t forget our Supplier Management Checklist

The DPN has published a 6-point supplier management checklist. This is designed to help controllers manage their suppliers – wherever in the world they are based. We hope you find it useful. You may also wish to view the recording of our recent webinar ‘How to avoid privacy errors with your suppliers’.

In summary…

At last we have some much-needed clarity on international transfers. But if your business needs to rely on SCCs, there could well be quite a bit of work to be done to bring your supplier contracts into line by December 2022.

For your reference, here are the links to the European Commission’s two adequacy decisions: