Data Protection by Design: Part 2 – How to approach it

September 2020

How to implement Data Protection by Design 

Following my colleague Phil Donn’s popular article on Privacy by Design (Part 1), I’m delving into the detail of what to consider when you are developing new applications, products and services, and how to approach the assessment process.

Good privacy requires collaboration

As a reminder, Data Protection By Design requires organisations to embed data protection into the design of any new processing, such as an app, product or service, right from the start.

This implies the DPO or Privacy team need to work with any project team leading the development, from the outset. In practice, this means your teams need to highlight any plans at the earliest stages.

A crucial part of a data protection or privacy role is encouraging the wider business to approach you for your input into changes which have implications for privacy.

Building strong relationships with your Project and Development teams, as well as with your CISO or Information Security team, will really help you make a step change to embed data protection into the culture as well as the processes of the organisation.

What are the key privacy considerations for Data Protection by Design?

Here are some useful pointers when assessing data protection for new apps, services and products.

  • Purpose of processing – be very clear about the purpose(s) you are processing personal data for. Make sure these purposes are both lawful and carried out fairly. This is especially important where any special category data or other sensitive data may be used.
  • End-to-end security – how will data be secured both in transit (in and out of the app, service or product) and when it’s at rest?
  • Access controls – check access to data will be restricted only to those who need it for specific business purposes. And make sure the level of access (e.g. view, use, edit, and so on) is appropriate for each user group.
  • Minimisation – collect and use the minimum amounts of personal data required to achieve the desired outcomes.
  • Default settings – aim to agree proactive, not reactive, measures to protect the privacy of individuals, with the most privacy-friendly settings applied by default.
  • Data sharing – will personal data be shared with any third parties? If so, what will the lawful basis be for sharing this data?
  • Transparency – have we notified individuals of this new processing? (Remember, this may include employees as well as customers). If we’re using AI, can we explain the logic behind any decisions which may affect individuals? Have we told people their data will be shared?
  • Information rights – make sure processes are in place to handle information rights. For example, can data be accessed to respond to Subject Access Requests? Can data be erased or rectified?
  • Storage limitation – appropriate data retention periods should be set and adhered to. These need to take into account any laws which may apply. To find out more, see our Data Retention Guidance.
  • Monitoring – what monitoring will or needs to take place at each stage to ensure data is protected?
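To make a few of these checks concrete, here’s a minimal sketch of field-level minimisation, role-based access and a retention check. Everything in it – the roles, the fields, the six-year period – is an illustrative assumption for the sketch, not a recommendation; your own purposes and legal obligations determine the real values.

```python
from datetime import date, timedelta

# Illustrative policy: which fields each user group may see, and whether they
# may edit them. Roles and fields here are assumptions, not prescriptions.
ACCESS_POLICY = {
    "support_agent": {"fields": {"name", "email"}, "can_edit": False},
    "account_admin": {"fields": {"name", "email", "address"}, "can_edit": True},
}

RETENTION = timedelta(days=365 * 6)  # example retention period only

def minimised_view(record: dict, role: str) -> dict:
    """Return only the fields this role needs (minimisation + access control)."""
    allowed = ACCESS_POLICY[role]["fields"]
    return {k: v for k, v in record.items() if k in allowed}

def past_retention(collected_on: date, today: date) -> bool:
    """Flag records that have exceeded the agreed retention period."""
    return today - collected_on > RETENTION

record = {"name": "A. Example", "email": "a@example.com",
          "address": "1 High St", "collected_on": date(2014, 1, 1)}

print(minimised_view(record, "support_agent"))
print(past_retention(record["collected_on"], date(2020, 9, 1)))
```

The point of the sketch is that these decisions – who sees what, for how long – are design-time choices that can be written down and enforced, which is exactly what an assessment should pin down before development starts.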

The assessment process

If there’s likely to be high risk to individuals, you should carry out a Data Protection Impact Assessment. This should include an assessment covering the requirements above.

Many organisations use a set of screening questions to confirm if a DPIA is likely to be required and I would recommend this approach.

In most cases it will also be appropriate for the Project team to consult with their CISO or Information Security Team. It’s likely a Security Impact Assessment (SIA) will also need to be carried out.

In fact, adopting a joint set of screening questions which indicate if there’s a need for a security assessment and/or a DP assessment is even better!
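One way to implement a joint set of screening questions is a single questionnaire where each ‘yes’ answer flags the assessment(s) it triggers. A minimal sketch – the questions and the assessments they trigger are illustrative assumptions, and a real questionnaire would reflect your own risk criteria:

```python
# Hypothetical joint screening: each question maps to the assessments a
# 'yes' answer would trigger (DPIA, SIA or both).
SCREENING_QUESTIONS = [
    ("Will special category or other sensitive data be processed?", {"DPIA"}),
    ("Will personal data be shared with third parties?", {"DPIA", "SIA"}),
    ("Is new infrastructure or software being introduced?", {"SIA"}),
    ("Could automated decisions significantly affect individuals?", {"DPIA"}),
]

def assessments_needed(answers: list) -> set:
    """Union of the assessments triggered by any 'yes' answer."""
    needed = set()
    for (question, triggers), answer in zip(SCREENING_QUESTIONS, answers):
        if answer:
            needed |= triggers
    return needed

# Example: sensitive data plus new infrastructure flags both assessments.
print(assessments_needed([True, False, True, False]))
```

The benefit of a single questionnaire is that the Project team answers one set of questions once, and both the privacy and security teams are brought in whenever their assessment is flagged.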

Embrace the development lifecycle

The typical stages involved when developing a new app, product or service are:

Planning > Design > Development > Testing > Early life evaluation > Production

Sometimes these stages merge together; it’s not always clear where one ends and another begins, or they may run in parallel.

This can make the timing of a data protection assessment tricky, particularly if your business uses an Agile development methodology, where application design, development and testing happen rapidly in short ‘sprints’, typically of one or two weeks.

I find when Agile is used the answers to certain data protection questions are not necessarily available early on. Key decisions affecting the design may be deferred until later stages of the project. The final outcomes of the processing can be a moving target.

I always take the data protection assessment process for new developments step by step, engaging with the Project team as early as possible and starting with the privacy fundamentals.

For example, try to establish answers to the following questions:

  • What data will be used?
  • Will any new data be collected?
  • What are the purposes for processing?
  • What will the outcomes look like?
  • How will individuals be notified about any new processing?
  • Is the app, service or product likely to enable decisions to be made which could affect certain individuals?

An ongoing dialogue with the Project team is helpful. This can be scheduled in advance of key development sprints and any budget decisions which could affect development.

This way the more detailed data protection requirements can be assessed as the design evolves – enabling appropriate measures and controls to protect personal data to be agreed prior to development and before any investment decisions.

Let me give you an example…

I recently helped a client carry out a DPIA for a new application which aimed to improve efficiency by analysing operational workflow data, including certain data on the employees who carried out specific tasks.

When we started, the design was only partially known; it wasn’t yet agreed whether certain components were in or out of scope, let alone designed. Therefore data protection considerations such as data minimisation (including only the data necessary for the processing), appropriate access controls and specific retention periods had not been, and could not yet be, decided.

We worked through these items as the scope was agreed. I gave input as possible designs were considered, prior to development sprints. We gradually agreed and deployed appropriate measures and controls to protect the privacy of individuals.

Too often, in my experience, the privacy team is called in too late. This leads to frustration when privacy issues are raised in the later stages of a project. It can cause costly delays, or the privacy team is pushed into making hasty decisions. All of this is unnecessary if teams know to go to the privacy team from the outset.

It can take time and perseverance to get your colleagues on board and to help them understand the benefits of thinking about data protection from the start and throughout the lifecycle of projects. But once you do, your business operations run all the more smoothly.

 

Can we help? Our experienced team can support you with embedding Data Protection By Design into your organisation, or with specific assessments –  contact us

 

Data Protection by Design: Part 1 – The Basics

August 2020

Data Protection by Design and by Default – What does it mean? 

You might hear the terms ‘privacy by design’ and ‘data protection by design and by default’ being used when discussing data protection. We’re frequently told to think privacy first, by considering data protection at the outset of any project and embedding it into policies and processes.

That’s all very well, but what does ‘Data Protection by Design’ really mean (and why is it also called ‘Privacy by Design’)? Do you need to be concerned about it? And how do you approach it in practice?

When you delve into the detail, this stuff quickly becomes complex. I’m going to try and avoid ‘privacy speak’ and jargon as much as I can and give an overview of how it all started and where we are now.

What is Privacy/Data Protection by Design?

Data Protection by Design (and also ‘by Default’) are terms ushered in by GDPR.

But the concept’s not new; the roots lie in Privacy by Design which has been around for some time. The brains behind Privacy by Design is Ann Cavoukian (a former Information and Privacy Commissioner for the Canadian province of Ontario). The concept was officially recognised as an essential component of fundamental privacy protection in 2010.

Cavoukian’s approach led to a new way of integrating privacy into products, business processes and policies. At its core it’s all about incorporating privacy measures at the design stage of a project or policy, rather than bolting them on afterwards.

The basis of this approach is to allow businesses to protect data and privacy without compromising commercial effectiveness right from Day One. I’m sure practitioners in other fields, for example Health and Safety or HR, will be familiar with this approach too.

Privacy by Design is based on seven principles designed to embed privacy into a project’s lifecycle. For more detail, take a look at the IAPP’s Privacy by Design: The Foundational Principles.

Fast forward to GDPR…

In the past, Privacy by Design was considered a great approach to take and adopted by many businesses worldwide – but it wasn’t mandatory. What’s different now is GDPR has made it a legal requirement.

GDPR also gave us the new term Data Protection by Design and by Default. This means organisations who fall under the scope of GDPR are obliged to put appropriate technical and organisational measures in place. These are commonly referred to as TOMs.

ICO guidance explains that businesses have a general obligation to implement appropriate technical and organisational measures, to show they have considered and integrated the principles of data protection into their processing activities.

You need to make sure data protection principles, such as data minimisation and purpose limitation, are implemented effectively from the start. Crucially, such measures also need to focus on protecting people’s privacy rights.

The ICO has produced detailed guidance on the topic, to help you navigate how to consider data protection and privacy issues at the start of your projects, products and processes.

As an aside, this doesn’t mean everything grinding to a halt, claiming ‘I can’t do that because of GDPR’!

The more familiar you become with the basic principles, the easier it is to explain and incorporate them into your business. That’s not to say it’s always a piece of cake – sometimes it isn’t – but neither does it have to be the ball and chain some make it out to be.

Do you need to worry about this stuff?

There’s a short answer to this question – Yes! It’s a legal requirement under GDPR, though some organisations take it very seriously while others take a laxer approach.

How to make a start

This is a topic that can feel overwhelming to begin with. It’s common to think, “how on earth do I get everyone across our business to think about data protection and consider people’s privacy in everything we do?”

Here are a few tips on organisational measures:

  • Benefits – think about how this approach is good for business and for your employees. It’s not just about trying to avoid data breaches; it’s about being trustworthy and taking care over how you handle and use people’s information. Privacy can be a brand asset; it can save costs and improve the bottom line. Increasingly, organisations want to work with partners who can demonstrate sound privacy credentials. In many instances, some of the most sensitive data you handle will be that of your employees. You all have an interest in making sure you handle everyone’s personal data in a secure and private way.
  • Collaborate with InfoSec – the two disciplines of privacy and security are intrinsically linked. Businesses are most successful at protecting personal data when the InfoSec and Data Protection teams are joined up, working in tandem.
  • Innovation – gone are the days when data protection was the place where dreams went to die! Sure, there are checks and balances that need to be considered when a great idea has privacy risks. When this happens, it’s up to the data protection team to be as innovative as their colleagues in helping that idea flourish. You never know – your approach to privacy can add value to a project, not diminish its effectiveness.
  • Awareness – think about fresh ways to get the message across – data protection matters. This is a balancing act, because we wouldn’t want to scare people to the extent they worry about the slightest thing. Try to explain that once data protection principles are embedded, much of it is common sense.
  • DPIAs – data protection impact assessments are one of the most important tools in your data protection by design toolbox (you don’t have one?). DPIAs are like a fire alarm – are your developers busy creating the most fabulous app ever? The DPIA should alert them to issues which, if ignored, might be project-breaking to fix later. As an aside, many DPIA templates I’ve seen are unduly complex and impossible for most staff to even attempt. So, try and make this an easier process – jettison the jargon and ask straight-forward questions.
  • Data Governance – I apologise, this really is the dreariest of terms. Nonetheless, it’s seriously worth developing a governance framework across your business which sets out who is responsible, who is accountable for your data and how the data is used. It can help to make sure processes and policies are robust and kept up to date.
  • Training – there’s nothing more empowering than effective training; making sure your people understand data protection principles, what privacy risks might look like and understand how it’s relevant to their job. Once this stuff is explained simply and effectively, it’s amazing how quickly this falls into place.

There’s an old saying: “What’s the best way to eat an entire elephant?” The answer is, “by breaking it into pieces first.”

You know your business – all you need to do now is break down the data protection stuff into manageable chunks as you apply them to your projects. The first couple might be tricky, but after that? There’s no substitute for getting stuck in and applying the principles to real-world problems. And the good news is there’s plenty of advice, training, templates and guidance available.

Use of automated facial recognition by South Wales Police ruled ‘unlawful’

August 2020

The Court of Appeal has upheld a legal challenge against the use of automated facial recognition (AFR) technology by South Wales Police (SWP).

The appeal was brought by Ed Bridges from Cardiff, backed by the civil rights group Liberty.

The AFR technology in question uses cameras to scan faces within a crowd, then matches these images against a ‘Watch List’ (which can include images of suspects, missing people and persons of interest). This flags up potential matches to officers.

Mr Bridges argued his human rights were breached when his biometric data was analysed without his knowledge or consent.

Liberty’s barrister, Dan Squires QC, argued there were insufficient safeguards within the current laws to protect people from an arbitrary use of the technology, or to ensure its use is proportional.

The Court upheld three of the five specific points of appeal, finding that:

  • There was no clear guidance on where AFR Locate (the technology used) could be used or who could be put on a watchlist. The Court held this gave too broad a discretion to police officers to meet the standard required by law under Article 8 of the Human Rights Convention.
  • The Data Protection Impact Assessment (DPIA) carried out by South Wales Police was found to be ‘deficient’ because it was written on the basis that Article 8 of the Human Rights Convention was not infringed.
  • SWP did not take reasonable steps to find out if the software had a bias on racial or gender grounds.

 

This successful appeal followed the dismissal of the case at the Divisional Court on 4 September 2019 by two senior judges, who concluded that use of AFR technology was not unlawful.

Talking about the latest verdict, Mr Bridges commented:

“I’m delighted that the court has agreed that facial recognition clearly threatens our rights. This technology is an intrusive and discriminatory mass surveillance tool.

“For three years now, South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge. We should all be able to use our public spaces without being subjected to oppressive surveillance.”

SWP have confirmed that they do not seek to appeal against the Court of Appeal’s judgment.

What impact is this ruling on facial recognition likely to have?

The ruling’s impact will extend across other police forces. However, it may not prevent them from using AFR technologies in the future.

The judges commented that the benefits of AFR are “potentially great” and the intrusion into people’s privacy was “minor”. However, more care is clearly needed over how it’s used.

To move forward, police forces will need clearer, more detailed guidance. For example, the ruling indicates officers should document who they are looking for and what evidence they have that those targets are likely to be in the monitored area.

The England and Wales’ Surveillance Camera Commissioner, Tony Porter, suggested that the Home Office should update their Code of Practice.

It will be interesting to watch how this develops. The benefits clearly need to be carefully balanced with the privacy risks.