Suppliers – why your contracts and security are important

Processors and controllers are both accountable

Do you provide a service to clients and handle their personal data on their behalf? If you’re acting as a processor, a recent GDPR fine serves as a helpful reminder to make sure you have all your ducks in a row.

There’s a clear warning here: you shouldn’t just assume the contracts your clients ask you to sign are okay, nor can you simply claim you have robust security measures in place. You actually have to have them!

In this recent case a software publisher, acting as a processor for its clients, was fined 1.5 million euros by the French regulator (the CNIL) following a data breach involving sensitive health data.

It was found that data had been exfiltrated by unauthorised parties from a poorly protected server. In a nutshell, the key findings were:

  • Significant gaps in the processor’s security processes
  • Contractual documentation which failed to include mandatory obligations required under Article 28 of the GDPR.

It’s worth noting the fine was based on both these counts. The ruling makes it clear processors should be wary of relying on their clients to make sure contractual terms are up to scratch. It’s the responsibility of both parties.

Here’s a quick recap on how suppliers can minimise their risks.

Getting the relationship clear

The most important first step is to establish the nature of the relationship between your company and another organisation.

  • Are you handling a client’s data on their behalf, under their instruction, to provide a service to them?
  • Are you acting as controller, clearly determining how the personal data will be used for your own purpose(s)?
  • Are you both? In other words, acting as a controller in certain circumstances, but as a processor for specific services you provide to clients.

Are we controller or are we processor?

What are the contractual requirements?

Once you’re clear you are a processor, acting under your client’s instructions, the law states your arrangements with clients must be covered by a binding agreement. EU and UK GDPR set out specific provisions which must be written into such contracts. In brief these are as follows:

1. Types of personal data & categories of data subject

The contract needs to specify what types of personal data you’ll be handling. It should also include details of whether this data relates to your client’s employees, patients, customers, and so forth.

2. Nature, purpose, duration of processing

The contract should describe the nature of the service(s) you provide, what purpose(s) this serves and the term of the contract. The agreement should cover your client’s instructions about what you are permitted to do with their data.

3. The rights and duties of each party

The obligations of both parties should be clearly defined. For example, the client’s obligation to have a lawful basis for processing, its responsibility to fulfil individual privacy rights and your commitment as a supplier to not use your client’s data for any other purpose.

4. Technical and organisational measures

As a supplier you need to provide sufficient guarantees that you will implement proportionate technical and organisational measures to meet the requirements of the UK/EU GDPR.

5. Sub-processors

If you engage other companies (‘sub-processors’) to support you in delivering your services, you’ll need specific or general written authorisation from your client(s). If you make any changes to which sub-processors you use (including software providers), you’ll need to tell your client and give them the opportunity to object. Contractual terms should stipulate that you are accountable for your sub-processors.

6. International transfers

If relevant, the agreement should include details and provisions for any transfers of personal data to a third country – for example, if you are based in the UK, a transfer to any other country. This would include details of any sub-processors based outside the UK. A transfer is often associated with the act of sending or transmitting personal data from one country to another, but it should be noted the definition also covers cases where personal data is made ‘available’, in other words where it can be accessed from a third country.

7. Duty of confidentiality

There must be a confidentiality clause, committing you to ensure any of your staff authorised to access the client’s data are bound by a duty of confidentiality or are under a statutory obligation of confidentiality.

8. Assisting your clients

The contract should cover your commitment to assisting your clients, where necessary, with handling individual privacy rights, handling data breaches and conducting data protection impact assessments.

9. Return or destruction of data

It should be clear what happens to the client’s data when the contract ends. Does the client want you to return the data or destroy it?

10. Audits and inspections

As a processor you must agree to make available all information necessary to demonstrate your compliance and agree to audits, including inspections by your client or their authorised auditor.

Processors have obligations

This recent CNIL fine shows you can’t just sign a contract, sit back and relax.

As a processor you’re responsible for your sub-processors, data transfers, staff training and confidentiality, assisting your clients when necessary and so forth. You have to be sure to implement the technical and organisational measures you said you would to protect your client’s data.

While some clients will ask you to jump through multiple hoops as part of their due diligence process, making you clearly demonstrate your security measures are robust, others may not be so picky. But that doesn’t release you from your responsibilities.

The law and this recent fine make it clear processors can be held liable. In the event of a breach, your contractual arrangements and internal practices could come under rigorous scrutiny.

Is working from home a security nightmare?

September 2021

Yes! Here’s our checklist of what to do and what to watch out for with your WFH teams.

I was on yet another Zoom call with my DPN colleagues the other day and we were baffled by some dreadful echoing of our voices. Everything I said bounced back at me.

We logged out, logged back in again but nothing changed. I turned my phone off – no change. Then I remembered that I was sitting in the kitchen with my Alexa turned on. When I unplugged Alexa, the echo disappeared.

That felt odd – we concluded that my Alexa was listening for instructions and so was listening in on our call. That felt creepy!!

As we all work from home, this led to a discussion about whether we should put additional security measures in place, over and above the work we had recently done to achieve Cyber Essentials.

The Cyber Essentials questionnaire doesn’t mention Alexa-style devices, or say much about the location of your workspace when you’re WFH.

With thanks to the ICO guidance and the Cyber Essentials documentation, here is our checklist for safely working from home.

1. Policies

Make sure you have policies and procedures in place which all your employees must adhere to. Make sure employees have read and understood the policies you’ve created. Even better, test them on it.

2. BYOD (Bring your own device)

Do decide whether employees can use their own devices. Some organisations have very firm “no personal devices” policies, but others are more ambiguous. It is an inescapable fact that letting employees use their own devices is high risk: you’re mixing business-related apps and software with random stuff your employee may have downloaded from the web.

3. Network Access

Decide how employees are going to access business servers – is there a VPN in place? Do you need stronger security protocols? It’s important to be proportionate with security measures: obviously, a bank will feel different to a consultancy that handles little or no personal data.

4. WFH in coffee shops/cafes

Does your employee ever work outside the home – in a café, for instance? Should you supply them with privacy screens for their devices? Have they been briefed on the importance of keeping their devices secure in a public space and never leaving them unattended?

5. The home environment

Does your WFH employee share their home with others? Are they using their personal broadband connection? If so, make sure they change the default password on their Wi-Fi router to make access more secure. Can they lock their rooms or lock their devices away? Are there any Alexa-style devices nearby?

In some instances, you may decide the data someone handles is simply too sensitive for them to work from home at all. Make sure you risk assess who can and cannot work at home and provide clear guidance.

6. 2FA and MFA

Where possible, enforce two-factor or multi-factor authentication. There is often a lot of resistance to this additional layer of security but, where available, it should be mandatory.
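
For anyone curious about what’s actually happening when an app asks for a six-digit code, here’s a minimal sketch of time-based one-time passwords (TOTP) using the open-source pyotp library. It’s purely illustrative; in practice your identity provider or MFA app handles all of this for you.

    import pyotp

    # Illustrative only: your identity provider or MFA tool does this in practice.
    secret = pyotp.random_base32()   # shared once with the user's authenticator app
    totp = pyotp.TOTP(secret)        # generates time-based one-time passwords

    code = totp.now()                # the six-digit code the user would type in
    print("Current code:", code)
    print("Accepted?", totp.verify(code))  # server-side check; codes roll over every 30 seconds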

7. Passwords

How about password strength? I suspect a surprising number of people still use simple passwords like, say, “12345”. Passwords should be unique and complex, with a mixture of letters, numbers and symbols and, ideally, changed on a regular basis.

Increasingly it makes sense to use a password manager to keep all your unique and complex passwords in one place. You still need one master password for that system, but at least that’s the only one you need to remember.
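
As a rough illustration of what ‘unique and complex’ means in practice, here’s a short sketch using Python’s standard secrets module – in spirit, this is what a password manager’s generator does. The length and character set are example values only, not a recommendation from any particular standard.

    import secrets
    import string

    def generate_password(length: int = 16) -> str:
        """Build a random password from letters, numbers and symbols.
        The length and character set here are example values only."""
        alphabet = string.ascii_letters + string.digits + "!@#$%^&*-_"
        return "".join(secrets.choice(alphabet) for _ in range(length))

    print(generate_password())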

8. Software updates

Are you able to update the user’s software remotely? If they’re using their own device, how do you ensure software is up to date? What safeguards are in place?

9. Cloud Storage

How are documents and files stored? Is there a cloud-based storage facility such as SharePoint? How is it accessed and who controls the access? There are plenty of opportunities to inadvertently share a document with multiple people by allowing the sharing of links. Try not to let that happen.

10. Email

When using email, all the usual safeguards should apply when it comes to phishing attacks. The IT team should be carrying out tests on a regular basis and providing up-to-date training on what to watch out for.

Even though our cabinet ministers seem to do it, never ever use your personal email account for work-related correspondence!!

How does this all add up?

If you do nothing else, consider the following actions:

  • Gain Cyber Essentials or Cyber Essentials Plus certification: Ensure you’ve carried out the Cyber Essentials evaluation. It’s particularly important for small businesses, but large organisations have found it useful too.
  • Conduct a DPIA: Carry out a Data Protection Impact Assessment. This can identify the circumstances under which people are working from home and introduce measures to mitigate the identified risks.
  • Create or bolster your Infosec policy: Create and maintain a robust and proportionate information security policy and ensure all employees are familiar with its contents. Maybe a short test would work well?

The NHS “data grab” – should we celebrate or be fearful?

June 2021

A new GP data sharing scheme – called General Practice Data for Planning and Research – was due to go live on 1st July to replace the current GP Extraction Service (GPES).

On 8th June, following some significant negative feedback, NHS Digital announced that the launch would be put back until 1st September to give more time to explain what the project entails. We welcome this announcement.

This new system provides data sets for research and analysis to support public health initiatives and to research new treatments/cures.

The GPES system involved direct contact with GPs to provide patient data for research purposes. According to NHS Digital, GPES lacks transparency, is out of date and is no longer fit for purpose.

With this new system, patients are able to opt out of sharing their data, but an opt-out has in fact been available since 2013, so opting out, per se, is not really new news.

What does the new scheme do?

Under the new scheme NHS Digital will collect and centralise data on treatments, referrals and appointments over the past 10 years, alongside other sensitive records covering patients’ entire medical history.

This data is then available, on request, to suitably qualified research and analysis organisations who are able to demonstrate a justifiable legal basis for a specified legitimate purpose for processing that data.

The new service is designed to enable faster access to pseudonymised data on around 55 million patients for planners and researchers.

What has changed?

The previous scheme relied on requests for data for research and analysis being made directly to GP surgeries, which were required to make judgements about complex data requests.

By centralising this activity, the technical infrastructure, security arrangements and governance will be managed by NHS Digital.

What do we need to know?

  • Data is pseudonymised at source by GPs, meaning no directly identifying data is passed to NHS Digital. However, the data will include indirect identifiers which could lead to identification if very strict controls are not employed (a simplified sketch follows at the end of this list).
  • It is possible to opt out of sharing this data both at GP level and at NHS Digital level for any hospital-related data. The deadline for the GP opt-out is 23rd June, after which any data exported to the new system cannot be removed.
  • A new platform has been developed to replace the ageing system used for previous requests.
  • Through a centralised system of managing data, a more consistent and robust governance framework has been put in place to support this project. In the past, GPs made their own judgement about what to share and provided little visibility of who had actually requested what data.
  • Following the 2014 Partridge review, which was tasked with scrutinising the use of data for health and social care, the level of transparency around how data is processed through NHS Digital has improved.
  • As things stand, NHS Digital has a legal status which means it is an arm’s-length body that the government cannot directly control.
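
To make ‘pseudonymised at source’ concrete, here’s a deliberately simplified sketch. The field names and the keyed-hash approach are assumptions for illustration, not NHS Digital’s actual mechanism; the point is that the direct identifier disappears, while the indirect identifiers remain.

    import hashlib
    import hmac

    # Hypothetical illustration only: field names and the keyed-hash approach are
    # assumptions for this sketch, not NHS Digital's actual mechanism.
    SECRET_KEY = b"held-separately-by-the-pseudonymisation-service"

    def pseudonymise(record: dict) -> dict:
        """Replace the direct identifier (NHS number) with a keyed hash,
        while keeping the indirect identifiers mentioned above."""
        token = hmac.new(SECRET_KEY, record["nhs_number"].encode(), hashlib.sha256).hexdigest()
        return {
            "patient_token": token,                              # stable pseudonym
            "year_of_birth": record["year_of_birth"],            # indirect identifier
            "postcode_district": record["postcode"].split()[0],  # indirect identifier
            "diagnosis_code": record["diagnosis_code"],
        }

    example = {
        "nhs_number": "943 476 5919",   # invented example number
        "year_of_birth": 1948,
        "postcode": "SW1A 1AA",
        "diagnosis_code": "E11",        # invented clinical code for illustration
    }
    print(pseudonymise(example))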

Why is this scheme considered important?

  • This data has previously supported the vaccine rollout by identifying specific target groups.
  • It provides the ability to carry out research to understand the causes and consequences of common and rare diseases.
  • It supports planning and commissioning of health services nationally and regionally.
  • It will provide local planning support for prioritising care delivery and investment.

Should we be worried?

  • This is the most sensitive of sensitive health data with a complete history from your GP being shared – it’s hardly surprising that people are concerned about it.
  • If sufficient controls are not put in place, there is a risk the identity of individuals could be revealed by querying this data using the indirect identifiers or by combining data sets (a simplified illustration follows after this list).
  • There is always a danger of a catastrophic data breach, and scrutinising infosec arrangements is key.
  • The NHS has a poor track record on projects such as this, with the failure of the ‘care.data’ project in 2013. Some people have long memories.
  • There is a fear amongst some special interest groups such as Med Confidential and GPs themselves that, maybe, this data is being sold to third parties for commercial gain – clearly this concern has a political dimension.
  • It’s not clear whether the wider population understands what this project entails. It was due to go live in July and many worried it was being introduced with indecent haste. Interestingly, the NHS Digital board appears to have decided it doesn’t need to inform individuals about this new scheme, and that individuals need only opt out, not opt in.
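
And to show why ‘combining data sets’ is the worry, here’s an equally simplified sketch of a linkage attack using the indirect identifiers from the earlier example. Every value below is invented; controls such as query auditing, aggregation and trusted research environments exist precisely to prevent this.

    # Hypothetical illustration of a linkage attack; all data below is invented.

    pseudonymised_row = {
        "patient_token": "9f2a...",      # keyed hash, not directly identifying
        "year_of_birth": 1948,
        "postcode_district": "SW1A",
        "diagnosis_code": "E11",
    }

    # An openly available dataset (electoral roll, social media profile, a past leak...)
    public_dataset = [
        {"name": "A. Example", "year_of_birth": 1948, "postcode_district": "SW1A"},
        {"name": "B. Sample", "year_of_birth": 1982, "postcode_district": "M1"},
    ]

    matches = [
        person for person in public_dataset
        if person["year_of_birth"] == pseudonymised_row["year_of_birth"]
        and person["postcode_district"] == pseudonymised_row["postcode_district"]
    ]

    # If exactly one person shares those indirect identifiers, the 'pseudonymised'
    # health record has effectively been re-identified.
    if len(matches) == 1:
        print(f"{matches[0]['name']} appears to be the patient with diagnosis E11")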

What are the data protection and privacy implications?

  • Is it good enough that data is pseudonymised? Should it be anonymised? Or should researchers begin using synthetic data? I’d like to see NHS Digital be more transparent about how it will ensure people cannot be re-identified from their indirect identifiers.
  • A full dataset has not been published so, at this stage, it’s hard to ascertain how easy it would be to identify individuals.
  • Are there safeguards in place to ensure that such a rich dataset is not combined in other organisations for commercial gain?
  • Has anyone completed a DPIA (Data Protection Impact Assessment)? We assume that the ICO is heavily involved but the process has not been particularly transparent.
  • NHS Digital says they’ve introduced a robust governance regime and have involved a Caldicott Guardian (put simply the health data equivalent of a DPO) to ensure necessary processes are in place to protect this data.
  • Has NHS Digital fully considered Data Protection by Design and Default? In all honesty, I’d not heard of this programme until recently. Although the health professionals have been collaborating, I’m not sure patients have really been involved to any great extent. With such sensitive data, it’s not really surprising people are worried.
  • It appears there are safeguards in place to assess what data is required each time a request is made. A key data protection principle is data minimisation – i.e. make sure only data necessary for the job is shared. Having said that, data scientists may well argue they want everything to be able to identify unusual patterns or trends.
  • Data Processing Agreements will be put in place between NHS Digital and the research partner – although I’ve not seen them, the inference is there is a level of robustness around how data is handled and processed.
  • Another data protection principle is ‘lawfulness, fairness and transparency’ – the documentation related to this service is hard to follow. There is a simple video on YouTube explaining to patients what’s in it for them and what to expect. It’s had 4,380 views! Whether you believe there’s a conspiracy or not, the communication has been woeful!

In summary

Overall, there are plenty of very good reasons why this data set should be pooled – not least because this system is replacing an old one that’s about to fall over. Beyond that, there are obvious public health benefits which have not been fully realised in the past.

However, announcing this new scheme in a couple of blog posts in April does suggest to patients that it is being rushed out without proper scrutiny. Piggybacking on the ground gained by using data for COVID purposes makes sense, but explaining the benefits to patients is also absolutely necessary.

If you add an overlay of distrust in government actions you can see why there may be problems ahead. The tragedy is that a lot of this could have been avoided.

At the heart of data protection legislation is the notion of transparency and this is a perfect example of why it is so important. Investing more time and energy into explaining what will happen would have been repaid many times over. It feels very short sighted to not have made more effort.

 

Data protection team over-stretched or need some specialist support? Find out how we can help with no-nonsense practical privacy advice – Contact Us.

Brexit: Do you need an EU Representative?

December 2020

Amongst the current whirlwind of Brexit-related stuff – international data transfers, adequacy decisions and possible UK data regime divergence – it would be easy to overlook the GDPR requirements regarding appointing an EU representative.

As of 1st January 2021 organisations in the UK, like others based outside the European Economic Area (EEA), may fall under this obligation. Conversely, organisations based outside the UK may fall under a requirement to have a UK representative.

Do you need an EU representative?

If you’re based in the UK and:

  • offer goods and services to individuals in the EEA
    or
  • monitor individuals’ behaviour (where that behaviour takes place in the EEA)

And

  • you don’t have a branch, office or establishment in an EEA state

You’ll need to appoint an EU Representative.

What constitutes ‘Offering Goods and Services’?

The European Data Protection Board (EDPB) guidelines on the territorial scope of the GDPR provide helpful pointers on whether you would be considered to be ‘offering goods and services’ to EU citizens.

The fact that your website is accessible to EU citizens isn’t, on its own, enough to trigger the requirement for an EU Representative. It needs to be ‘apparent or envisaged’ that your products and services are being offered to individuals in one or more EU member states.

Let’s take a look at what that means. Does your organisation:

  • describe products and services in the language of an EU member state?
  • offer prices in Euros?
  • actively run marketing and advertising campaigns targeting an EU country audience?
  • mention dedicated contact details to be reached from an EU country?
  • use any top-level domain names, such as .de or .eu?
  • describe travel instructions from one or more EU member states to where your service is provided?
  • mention clients/customers based in one or more EU states?
  • offer to deliver goods to EU member states?

Answering ‘Yes’ to one or more of the above means it’s likely you fall under the requirements of GDPR Article 27 to appoint an EU Rep.

You will not need to appoint a representative if:

  • you are a public authority
    or
  • your processing is only occasional, is of low risk to the data protection rights of individuals, and does not involve the large-scale use of special category or criminal offence data.

For example, here at the DPN we don’t need to appoint an EU Representative. Our website is clearly accessible to EU citizens, people can sign up for our newsletter or webinars from anywhere in the world, and we may do some consultancy work for an EU-based company. However, we are a small business and our answer to all the above questions is NO.

However, if you are actively targeting marketing or advertising campaigns at EU citizens, you are likely to fall under the requirement.

What does an EU Representative do?

So you’ve established you need an EU Representative. Before finding a company to provide this service, you need to know what their responsibilities are.

Your EU representative has the following core responsibilities:

  • co-operating with the EU supervisory authorities on your behalf
  • facilitating communications between EU citizens and your organisation
  • being accessible to individuals in all relevant member states (i.e. clearly mentioned in your privacy notice as the contact for EU citizens)
  • supporting you to manage your Record of Processing Activities (RoPA) in accordance with Article 30 of the GDPR.

A number of professional services have sprung up offering to be representatives, with Ireland proving a particularly popular location, not least because there are no language issues for UK companies.

However, you should be mindful that you need to pick a relevant country: if your clients/customers are primarily Italian, your representative should be based in Italy.

What about UK Representatives?

Under the UK GDPR (which will sit alongside an amended version of the UK DPA 2018) there will also be an obligation on organisations based outside the UK to appoint a UK representative if they have no office, branch or establishment in the UK and they:

  • offer goods and services to UK citizens
    or
  • monitor the behaviour of UK citizens

Again, if your processing of UK citizens’ data is only occasional, is of low risk to the data protection rights of individuals, and does not involve the large-scale use of special category or criminal offence data, the requirement for a UK Rep will not apply.

Finally, if you haven’t done so already, any UK organisation needs to update its policies and privacy notices to reflect that the UK will be outside the EU. You may also need to double-check any DPIAs and other assessments regarding international data transfers.

Also see the ICO’s Guidance on EU Representatives