Body worn cameras and privacy implications

Is your use of body worn cameras reasonable and proportionate?

Wearable technologies such as body worn cameras (‘bodycams’) and other recording devices are often used for public safety and security purposes, most obviously by the police and other public services. They can also be used for recreational pursuits.

The use of these devices has increased significantly over recent years as the technology advances in leaps and bounds, and prices fall.

Bodycams can pose a real challenge from a data protection perspective due to their portability. When a bodycam is in use, it effectively becomes a mobile surveillance system which is highly likely to capture images or audio of individuals. This should be regarded as personal data. Businesses using these devices need to be mindful of relevant data protection requirements.

Bear in mind that when bodycams are combined with facial recognition technology to uniquely identify individuals (e.g. for safety and security purposes), the data protection concerns increase. If you are processing special category data, such as biometric data used within facial recognition systems, you need to identify a suitable condition under Article 9 of the UK GDPR.

Eight key privacy principles and obligations for bodycams

  1. Fair and lawful processing – you must be able to demonstrate your use of bodycams is both fair and lawful. Necessity must be carefully considered; the potential for bodycams to be intrusive means you face a relatively high threshold to demonstrate your use is genuinely necessary.
  2. Limited purposes & data minimisation – cameras should only record the minimum amount of personal information necessary for specified purposes.
  3. Transparency – did you clearly tell people before you began recording? Are they aware of their rights?
  4. Information security – Any recordings must be stored securely. If video footage is stored on the device itself, or on a memory stick (even temporarily) there’s an additional risk of loss or theft of personal data.
  5. Restricted access – clearly defined rules should be in place covering who can access recordings and for what purposes.
  6. Sharing – the disclosure of images and information should only take place when it’s necessary for specified purposes, or for law enforcement. And checks should be in place before disclosing to law enforcement or other agencies.
  7. Storage limitation – data on individuals (including video & audio) must be retained only for the minimum amount of time required, and then deleted.
  8. Individual rights – you must be able to respond appropriately to any privacy rights requests from individuals (such as the right of access or right to erasure).

Carrying out an impact assessment

In most situations it would be wise to conduct a Data Protection Impact Assessment (DPIA) to formally assess and document your approach and how you will meet the data protection obligations.

The DPIA will need to consider each of the relevant principles and evaluate whether the measures and controls in place are adequate to protect individuals whose personal data may be captured.

Other considerations

The ICO published ‘Guidance on video surveillance’ earlier in 2022. This covers the processing of personal data by video surveillance systems by both public and private sector organisations. Their scope for surveillance systems includes CCTV, ANPR, bodycams, drones (UAVs), facial recognition technology (FRT), and also dashcams and smart doorbell cameras. They pick up on many of the points I’ve summarised above.

The Biometrics and Surveillance Camera Commissioner published a draft update to its ‘Surveillance camera code of practice’ in August 2021, which is still out for consultation.

The draft code includes twelve guiding principles for surveillance system operators; however, as you may anticipate, many of these overlap with the privacy principles I’ve picked out above. I’ve highlighted some other areas covered in this draft code:

  • There must be clear responsibility and accountability for all surveillance camera system activities including images and information collected, held and used.
  • Clear rules, policies and procedures must be in place before a surveillance camera system is used, and these must be communicated to all who need to comply with them.
  • Surveillance camera system operators should consider any approved operational, technical and competency standards relevant to a system and its purpose and work to meet and maintain those standards.
  • There should be effective review and audit mechanisms to ensure legal requirements, policies and standards are complied with in practice, and regular reports should be published.
  • When the use of a surveillance camera system is in pursuit of a legitimate aim, and there’s a pressing need for its use, it should then be used in the most effective way to support public safety and law enforcement with the aim of processing images and information of evidential value.
  • Any information used to support a surveillance camera system which compares against a reference database for matching purposes (for example, ANPR or facial recognition) should be accurate and kept up to date.

I hope you found this article useful. If you’d like to know more please just drop us a line and arrange a chat.

Three Steps to Transparency Heaven

June 2022

A strategic approach to transparency

Transparency is enshrined in one of the key data protection principles: Principle (a) – lawfulness, fairness and transparency.

You must be clear, open and honest with people from the start about how you will use their personal data. 

There’s also a requirement to consider a data protection by design and default approach. Taking this approach properly requires some planning and clear communication between teams about which data is used for what.

It’s obvious that most companies can pull together a privacy notice. However, as with many things to do with GDPR, creating engaging communications which deliver the correct information in a digestible format appears easier said than done.

Recent fines related to lack of transparency

In May we saw a €4.2m fine for Uber from the Italian Data Protection Authority (the Garante) for data protection violations. Amongst other things, the privacy notice was found to be incorrect and incomplete: there was not enough detail on the purposes of processing, and data subjects’ rights had not been spelled out.

Earlier this year, Klarna Bank AB was fined by the Swedish Data Protection Authority (IMY) for lack of transparency. 

Be warned, the regulators are taking a look at these documents.

Step 1: Creating your Privacy Notice

Privacy notices have become rather formulaic since 2018 and my colleague Phil wrote a handy checklist of what must and should be included. Take note and have a look to see if you have ticked all the boxes. 

Step 2: Housekeeping your Privacy Notice

The privacy notice is a dynamic document. Keeping it up to date is important. 

  • New data processing activities: Make sure you’re made aware of new technology, new teams and new business processes, which may all generate new data processing activities that need to be covered in your notice. 
  • Record of Processing Activities: Create a routine to keep your RoPA up to date and ensure any changes are clearly flagged to the DP team.
  • Regulatory changes: Review any change in regulatory guidance. International data transfers are a perfect case in point where the guidance has changed. Changes may necessitate an update to your privacy notice.
  • Supplier due diligence: Review your supplier arrangements – are they carrying out new data processing activities which need to be captured in the notice? Are new suppliers in place, and have they been audited/reviewed?
  • Marketing innovations: Ask your marketing team about their plans as digital marketing developments move at breakneck speed. The use of AI for targeting and segmentation, innovations in digital advertising as well as the evolution of social media platforms all present privacy challenges. In addition you may need to inform consumers of material changes. 

Step 3: Breathing life into your Privacy Notice

It’s a marketing challenge to get people to pay attention to the privacy notice.

  • Use different communication methods – not everyone likes reading long screeds of text. Look at creative communication methods such as infographics, videos and cartoons to get the message across. Channel 4 and The Guardian are exemplars.
  • Use plain English – whenever you write it down, make sure it’s couched in terms your target audience will understand. Various reports place the average reading age at 8, 9 or 11. Plain English, short sentences and easy-to-understand words should be deployed to get your message across (a quick readability check is sketched after this list).
  • Include information tailored to different target audiences: Companies will sometimes carry out data processing for clients, for consumers and for employees. Trying to cram all this information into one document makes it nigh on impossible for anyone to understand what’s going on. Separate it out and clearly signal what’s relevant to each group.
  • Use layers of communication – the ICO advocates a layered approach to communicating complicated messages. If you create a thread through your messages from clear top-level headlines with clear links to additional information, there’s a higher chance of achieving better levels of comprehension.
  • Keep it short and sweet – having read some of the documents produced by corporates, I am struck by how repetitive they can be. Not only do you lose the will to live, but comprehension levels are low and confusion levels are high. All of which is rather unhelpful.
  • Be upfront and transparent – do not obfuscate and confuse your audience. Although it can feel scary to tell individuals what is happening with their personal data, audiences appreciate the openness when processing is explained clearly. They need to know what’s in it for them. 
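
If you want a rough, objective measure of readability, you can run your draft notice through a readability formula. Here’s a minimal sketch using the third-party Python package textstat (an assumption on my part – any readability tool would do); the grade threshold in the check is illustrative rather than any official benchmark.

```python
# Rough readability check for privacy notice copy.
# Assumes 'textstat' is installed (pip install textstat); the threshold is illustrative.
import textstat

notice_text = (
    "We collect your name and email address so we can send your order confirmation. "
    "We keep this information for two years, then delete it."
)

reading_ease = textstat.flesch_reading_ease(notice_text)   # higher = easier to read
grade_level = textstat.flesch_kincaid_grade(notice_text)   # approximate school grade

print(f"Flesch reading ease: {reading_ease:.0f}")
print(f"Flesch-Kincaid grade: {grade_level:.1f}")

if grade_level > 8:  # illustrative target for a general audience
    print("Consider shorter sentences and plainer words.")
```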

Overall, this is a major marketing challenge. Explaining how you use personal data is an important branding project which allows a company to reflect its values and its respect for its customers.

The marketing teams need to get close to their privacy colleagues and use their formidable communication skills to make these important data messages resonate and make sense.

Four years on from GDPR, now is a good time to take a look at your privacy notice to see if it needs a refresh.

 

Dark patterns: is your website tricking people?

April 2022

Why should we be concerned about dark patterns?

Do you ever feel like a website or app has been designed to manipulate you into doing things you really don’t want to do? I bet we all have. Welcome to the murky world of ‘dark patterns’.

This term was originally coined in 2010 by user experience specialist Harry Brignull, who defines dark patterns as “features of interface design crafted to trick people into doing things they might not want to do and which benefit the business”.

Whenever we use the internet, businesses are fighting for our attention and it’s often hard for them to cut through. And we often don’t have the time or inclination to read the small print; we just want to achieve what we set out to do.

Businesses can take advantage of this. Sometimes they make it difficult to do things which should, on the face of it, be really simple – like cancelling or saying no. They may try to lead you down a different path that suits their business better and leads to higher profits.

These practices are in the spotlight, and businesses could face more scrutiny in future. Sometimes dark patterns are deliberate, sometimes they may be accidental.

What are dark patterns?

There are many interpretations of dark patterns and many different examples. Here’s just a handful to give you a flavour – it’s by no means an exhaustive list.

  • ‘Roach Motel’ – this is where the user journey makes it easy to get into a situation, but hard to get out. (Perhaps ‘Hotel California’ might have been a better name?). For example, when it’s easy to sign up to a service, but very difficult to cancel it because it’s buried somewhere you wouldn’t think to look. And when you eventually find it, you still have to wade through several messages urging you not to cancel.
  • FOMO (Fear Of Missing Out) – this for example is when you’re hurried into making a purchase by ‘urgent’ messages showing a countdown clock or alert messages saying the offer will end imminently.
  • Overloading – this is when we’re confronted with a large number of requests, information, options or possibilities to prompt us to share more data. Or it could be used to prompt us to unintentionally allow our data to be handled in a way we’d never expect.
  • Skipping – this is where the interface or user experience is designed in such a way that we forget, or don’t think about, the data protection implications. Some cookie notices are designed this way.
  • Stirring – this affects the choices we make by appealing to our emotions or using visual cues. For example, using a certain colour for buttons you’d naturally click for routine actions – getting you into the habit of clicking on that colour. Then suddenly using that colour button for the paid for service, and making the free option you were after hard to spot.
  • Subliminal advertising – this is the use of images or sounds to influence our responses without us being consciously aware of it. This is banned in many countries as deceptive and unethical.

Social engineering?

Some argue the ‘big players’ in search and social media have been the worst culprits in the evolution and proliferation of dark patterns. For instance, the Netflix video ‘The Social Dilemma’ argued that Google and Facebook have teams of engineers mining behavioural data for insights on user psychology to help them evolve their interface and user experience.

The mountain of data harvested when we search, browse, like, comment, post and so on can be used against us, to drive us to behave the way they want us to – without us even realising. The rapid growth of AI could push this all to a whole new level if left unchecked.

The privacy challenge

Unsurprisingly there’s a massive cross-over between dark patterns and negative impacts on users’ privacy. The way user interfaces are designed can play a vital role in good or bad privacy.

In the EU, discussions about dark patterns (under the EU GDPR) tend to concentrate on how dark patterns can be used to manipulate people into giving their consent – and point out that consent is invalid if it’s obtained deceptively or not given freely.

Here are some specific privacy related examples.

  • Tricking you into installing an application you didn’t want, i.e. consent is not unambiguous or freely given.
  • When the default privacy settings are biased to push you in a certain direction. For example, on cookie notices where it’s much simpler to accept than object, and can take more clicks to object. Confusing language may also be used to manipulate behaviour.
  • ‘Privacy Zuckering’ is a term used for when you’re tricked into publicly sharing more information about yourself than you really intended to. It’s named after Facebook’s co-founder & CEO, but the practice isn’t unique to them – LinkedIn, for example, have been fined for this.
  • When an email unsubscribe link is hidden within other text.
  • Where more screen space is given to the options the business wants you to take, and less space to options the customer might prefer – e.g. a recurring subscription rather than a one-off purchase.

Should businesses avoid using dark patterns?

Many will argue YES! Data ethics is right at the heart of the debate. Businesses should ask themselves if what they are doing is fair and reasonable to try to encourage sales and if their practices could be seen as deceptive. Are they doing enough to protect their customers?

Here are just a few reasons to avoid using dark patterns:

  • They annoy your customers and damage their experience of your brand. A survey by Hubspot found 80% of respondents said they had stopped doing business with a company because of a poor customer experience. If your customers are dissatisfied, they can and will switch to another provider.
  • They could lead to higher website abandon rates.
  • Consent gathered by manipulating consumer behaviour is unlikely to meet the GDPR consent requirements, i.e. freely given, specific, informed and unambiguous. So your processing could turn out to be unlawful (a minimal sketch of checking these conditions follows this list).
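
Purely for illustration, here’s a minimal, hypothetical sketch of what checking a stored consent record against those conditions might look like. The field names and rules are assumptions made for the example, not a definitive test of GDPR-valid consent.

```python
# Hypothetical consent record check - field names and rules are illustrative only.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ConsentRecord:
    purpose: str                    # the specific purpose consent was given for
    granted_at: Optional[datetime]  # when the person actively opted in (None = never)
    pre_ticked: bool                # was the option selected by default?
    info_shown: bool                # was clear information presented first?

def consent_looks_valid(record: ConsentRecord) -> bool:
    """Reject consent that was pre-ticked, uninformed or never actively given."""
    return (
        record.granted_at is not None   # an affirmative act actually happened
        and not record.pre_ticked       # not 'consent' by default
        and record.info_shown           # informed
        and bool(record.purpose)        # tied to a specific, stated purpose
    )

record = ConsentRecord("email marketing", datetime(2022, 4, 1), pre_ticked=False, info_shown=True)
print(consent_looks_valid(record))  # True
```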

Can these effects happen by mistake?

Dark patterns aren’t always deliberate. They can arise due to loss of focus, short-sightedness, poorly trained AI models, or a number of other factors. However they are more likely to occur when designers are put under pressure to deliver on time for a launch date, particularly when commercial objectives are prioritised above all else.

Cliff Kuang, author of “User Friendly”, says there’s a tendency to make it easy for users to perform the tasks that suit the company’s preferred outcomes. The controls for limiting functionality or privacy controls can sometimes be an afterthought.

What can businesses do to prevent this?

In practice it’s not easy to strike the right balance. We want to provide helpful information to help our website/app users make decisions. It’s likely we want to ‘nudge’ them in the ‘right’ direction. But we should be careful we don’t do this in ways which confuse, mislead or hurry users into doing things they don’t really want to do.

It’s not like the web is unique in this aim (it’s just that we have a ton of data to help us). In supermarkets, you used to always see sweets displayed beside the checkout. A captive queuing audience, and if it didn’t work on you, a clear ‘nudge’ to your kids! But a practice now largely frowned upon.

So how can we do ‘good sales’ online without using manipulation or coercion? It’s all about finding a healthy balance.

Here are a few suggestions which might help your teams:

  • Train your product developers, designers & UX experts – not only in data protection but also in dark patterns and design ethics. In particular, help them recognise dark patterns and understand the negative impacts they can cause. Explain the principles of privacy by design and the conditions for valid consent.
  • Don’t allow business pressure and priorities to dictate over good ethics and privacy by design.
  • Remember data must always be collected and processed fairly and lawfully.

Can dark patterns be regulated?

The proliferation of dark patterns over recent years has largely been unrestricted by regulation.

In the UK & Europe, where UK & EU GDPR are in force, discussions about dark patterns have mostly gravitated around matters relating to consent – where that consent may have been gathered by manipulation and may not meet the required conditions.

In France, the CNIL (France’s data protection authority) has stressed the design of user interfaces is critical to help protect privacy. In 2019 CNIL took the view that consent gathered using dark patterns does not qualify as valid freely given consent.

Fast forward to 2022 and the European Data Protection Board (EDPB) has released guidelines: ‘Dark patterns in social media platform interfaces: How to recognise and avoid them’.

These guidelines offer examples, best practices and practical recommendations to designers and users of social media platforms, on how to assess and avoid dark patterns in social media interfaces which contravene GDPR requirements. The guidance also contains useful lessons for all websites and applications.

They remind us we should take into account the principles of fairness, transparency, data minimisation, accountability and purpose limitation, as well as the requirements of data protection by design and by default.

We anticipate EU regulation of dark patterns may soon be coming our way. The International Association of Privacy Professionals (IAPP) recently said, “Privacy and data protection regulators and lawmakers are increasingly focusing their attention on the impacts of so-called ‘dark patterns’ in technical design on user choice, privacy, data protection and commerce.”

Moves to tackle dark patterns in the US

The US Federal Trade Commission has indicated it’s giving serious attention to business use of dark patterns and has issued a complaint against Age of Learning for the use of dark patterns in its ABCmouse service.

Looking at state-led regulations, in California modifications to the CCPA have been proposed to tackle dark patterns. The Colorado Privacy Act also looks to address this topic.

What next?

It’s clear businesses should be mindful of dark patterns and consider taking an ethical stance to protect their customers and website users. Could your website development teams be intentionally or accidentally going too far? It’s good practice to train website and development teams so they can prevent dark patterns occurring, intentionally or by mistake.

What does the IKEA CCTV story tell us?

April 2022

Only set up video surveillance if underpinned by data protection by design and default

What happened?

Following an internal investigation, IKEA was forced to apologise for placing CCTV cameras in the ceiling voids above the staff bathroom facilities in its Peterborough depot. The cameras were discovered and removed in September 2021, but the investigation only concluded in late March 2022.

An IKEA spokesman said:

 “Whilst the intention at the time was to ensure the health and safety of co-workers, we understand the fact that colleagues were filmed unknowingly in these circumstances will have caused real concern, and for this we are sincerely sorry.”

The cameras were installed following “serious concerns about the use of drugs onsite, which, owing to the nature of work carried out at the site, could have very serious consequences for the safety of our co-workers”.

They had been sanctioned following “multiple attempts to address serious concerns about drug use, and the use of false urine samples as a way of disguising it”.

“The cameras placed within the voids were positioned only to record irregular activity in the ceiling voids,” he said.

“They were not intended to, and did not, record footage in the toilet cubicles themselves. However, as a result of ceiling tiles becoming dislodged, two cameras inadvertently recorded footage of the communal areas of two bathrooms for a period of time in 2017. The footage was not viewed at the time and was only recovered as part of these investigations.”

Apology and new ICO guidance

The key question raised by this incident is where to draw the line. When is it inappropriate to set up CCTV? In this instance, the company had concerns about drug misuse – but was that a good enough reason? I think a lot of us intuitively felt the answer was no. 

This apology conveniently coincides with the recent publication of new guidance on video surveillance from the ICO regarding the UK GDPR and the Data Protection Act 2018.

This guidance is not based on any changes in the legislation – more an update to provide greater clarity about what you should be considering.

Video surveillance definition

The ICO guidance includes all the following in a commercial setting:

  • Traditional CCTV
  • ANPR (automatic number plate recognition)
  • Body Worn Video (BWV)
  • Facial Recognition Technology (FRT)
  • Drones
  • Commercially available technologies such as smart doorbells and dashcams (not domestic settings)

Guidance for domestic use is slightly different.

Before setting up your video surveillance activity 

As part of the system setup, it’s important to create a record of the activities taking place. This should be included in the company RoPA (Record of Processing Activities).

As part of this exercise, you need to identify the following (a minimal sketch of recording these details follows the list):

  • the purpose of the lawful use of surveillance
  • the appropriate lawful basis for processing
  • the necessary and proportionate justification for any processing
  • any data-sharing agreements
  • the retention periods for any personal data
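
As a minimal illustration, the record of those details could be as simple as a small structured entry like the sketch below. The field names, types and values are assumptions for the example, not a prescribed RoPA format.

```python
# Illustrative RoPA-style entry for a surveillance system; fields are assumptions only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SurveillanceRopaEntry:
    purpose: str                       # why the surveillance is needed
    lawful_basis: str                  # e.g. legitimate interests, legal obligation
    justification: str                 # why the processing is necessary and proportionate
    data_sharing_agreements: List[str] = field(default_factory=list)
    retention_period_days: int = 30    # illustrative value - set per your own policy

entry = SurveillanceRopaEntry(
    purpose="Site security at warehouse entrances",
    lawful_basis="Legitimate interests",
    justification="Least intrusive option after considering alternatives",
    data_sharing_agreements=["Contracted security provider"],
    retention_period_days=30,
)
print(entry)
```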

 As with any activity relating to the processing of personal data, the organisation should take a data protection by design and default approach when setting up the surveillance system.

Before installing anything, you should also carry out a DPIA (Data Protection Impact Assessment) for any processing that’s likely to result in a high risk for individuals. This includes:

  • Processing special category data
  • Monitoring publicly accessible places on a large scale
  • Monitoring individuals at a workplace

A DPIA means you can identify any key risks as well as potential mitigation for managing these. You should assess whether the surveillance is appropriate in the circumstances.

In an employee context it’s important to consult with the workforce, consider their reasonable expectations and the potential impact on their rights and freedoms. One could speculate that IKEA may not have gone through that exercise.

Introducing video surveillance

Once the risk assessment and RoPA are completed, other areas of consideration include:

  • Surveillance material should be stored securely to prevent unauthorised access
  • Any data transmitted wirelessly or over the internet requires encryption to prevent interception
  • How easily data can be exported to fulfil DSARs
  • Ensuring adequate signage is in place to define the scope of what’s captured and used.

Additional considerations for Body Worn Video  

  • It’s more intrusive than CCTV so the privacy concerns are greater
  • Whether the data is stored centrally or on individual devices
  • What user access controls are required
  • Establishing device usage logs
  • Whether you want continuous or intermittent recording
  • Whether audio and video should be treated as two separate feeds

In any instance where video surveillance is in use, it’s paramount individuals are aware of the activity and understand how that data is being used.

Ransomware attack leads to £98k ICO fine

March 2022

Solicitors firm failed to implement ‘adequate technical and organisational measures’

Are you using Multi-Factor Authentication? Are patch updates installed promptly? Do you encrypt sensitive data?

Reports of cyber security incidents in the UK rose 20% in the last 6 months of 2021.

These figures from the ICO, combined with the heightened threat in the current climate, provide a stark warning to be alert.

The ICO says: “The attacks are becoming increasingly damaging and this trend is likely to continue. Malicious and criminal actors are finding new ways to pressure organisations to pay.”

Against this backdrop the ICO has issued a fine to a solicitors’ firm following a ransomware attack in 2020.

The organisation affected was Tuckers Solicitors LLP (“Tuckers”) which is described on its website as the UK’s leading criminal defence lawyers, specialising in criminal law, civil liberties and regulatory proceedings.

While each organisation will face varying risks, this case highlights some important points for us all.

Here’s a summary of what happened, the key findings and the steps we can all take. For increasing numbers of organisations this case will unfortunately sound all too familiar.

What happened?

On 24 August 2020 Tuckers realised parts of its IT system had become unavailable. Shortly afterwards, IT discovered a ransomware note.

  • Within 24 hours it was established the incident was a personal data breach and it was reported to the ICO.
  • The attacker, once inside Tuckers’ network, installed various tools which allowed for the creation of a user account. This account was used to encrypt a significant volume of data on an archive server within the network.
  • The attack led to the encryption of more than 900,000 files of which over 24,000 related to ‘court bundles’.
  • 60 of these bundles were exfiltrated by the attacker and released on the ‘dark web’. These compromised files included both personal data and special category data.
  • The attacker’s actions impacted the archive server and backups. Processing on other services and systems was not affected.
  • By 7 September 2020, Tuckers updated the ICO to say the servers had been moved to a new environment and the business was operating as normal. The compromised data was effectively permanently lost; however, material was still available in a management system unaffected by the attack.
  • Tuckers notified all but seven of the parties identifiable within the 60 court bundles which had been released; it did not have contact details for the remaining seven.

Neither Tuckers, nor third party investigators, were able to determine conclusively how the attacker was able to access the network in the first place. However, evidence was found of a known system vulnerability which could have been used either to access the network or to further exploit areas of Tuckers once inside the network.

What data was exfiltrated?

The data released on the ‘dark web’ included:

  • Basic identifiers
  • Health data
  • Economic and financial data
  • Criminal convictions
  • Data revealing racial or ethnic origin

This included medical files, witness statements and alleged crimes. It also related to ongoing criminal court and civil proceedings.

Tuckers explained to the Regulator that, based on its understanding, the personal data breach had not had any impact on the conduct or outcome of the relevant proceedings.

However, the highly sensitive nature of the data involved increased the risk and potential adverse impact on those affected.

Four key takeaways

The ICO makes it clear in its enforcement notice that primary culpability for the incident rests with the attacker. But clear infringements by Tuckers were found.

The Regulator says a lack of sufficient technical and organisational measures gave the attacker a weakness to exploit.

Takeaways from this case:

1) Multi-Factor Authentication (MFA)

Tuckers’ GDPR and Data Protection Policy required two-factor authentication, where available. It was found that Multi-Factor Authentication (MFA) was not used for its ‘remote access solution’.

The ICO says the use of MFA is a relatively low-cost preventative measure which Tuckers should have implemented.

The Regulator concluded the lack of MFA created a substantial risk of personal data on Tuckers’ systems being exposed to consequences such as this attack.

Takeaway: If you currently don’t use MFA, now would be a good time to implement it.
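
To make the idea concrete, here’s a minimal sketch of a time-based one-time password (TOTP), one common second factor. It assumes the third-party Python package pyotp is installed; in practice most organisations will simply enable MFA in their existing identity provider or remote-access product rather than build anything themselves.

```python
# Minimal TOTP sketch - assumes 'pyotp' is installed (pip install pyotp).
import pyotp

secret = pyotp.random_base32()   # stored securely against the user's account at enrolment
totp = pyotp.TOTP(secret)

print("Current one-time code:", totp.now())

# At login, a correct password alone is not enough: the submitted code must also verify.
submitted_code = totp.now()      # stands in for whatever the user types from their app
print("Second factor accepted:", totp.verify(submitted_code))
```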

2) Patch management

The case reveals a high-risk security patch was installed in June 2020, more than FOUR months after its release.

The ICO accepts the attacker could have exploited this vulnerability during the un-patched period.

Considering the highly sensitive nature of the personal data Tuckers were handling, the Regulator concludes they should not have been doing so in an infrastructure containing known critical vulnerabilities. In other words the patch should have been installed much sooner.

Takeaway: Make sure patches are installed promptly, especially where data is sensitive.

3) Encryption

During the investigation Tuckers informed the ICO the firm had not used encryption to protect data on the affected archive server.

While the Regulator accepts this may not have prevented the ransomware attack itself, it believes it would have mitigated some of the risks posed to the affected individuals.

Takeaway: Free, open-source encryption solutions are available. Alternatively, more sophisticated paid-for solutions exist for those handling more sensitive data.

It’s also worth checking you’re protecting archives to the same standard as other systems.
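
As a minimal sketch of what encrypting an archived file at rest might look like, the example below uses the open-source Python cryptography package (an assumption on my part; the filename is purely illustrative). Key management – where the key lives and who can read it – matters just as much as the encryption itself.

```python
# Minimal at-rest encryption sketch using Fernet symmetric encryption.
# Assumes 'cryptography' is installed (pip install cryptography); the filename is illustrative.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # keep this in a secrets manager, not alongside the data
fernet = Fernet(key)

with open("archive_bundle.zip", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

with open("archive_bundle.zip.enc", "wb") as f:
    f.write(ciphertext)

# Without the key, a stolen copy of the encrypted archive is unreadable.
original = fernet.decrypt(ciphertext)
```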

4) Retention

The enforcement notice reveals some ‘court bundles’ affected in the attack were being stored beyond the set 7-year retention period.

Takeaway: This again exposes a common issue for many organisations. Too often data is held longer than is necessary, which can increase the scale & impact of a data breach.
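
For illustration, a retention sweep can be as simple as the sketch below, which deletes archive files older than a set period. The directory, file pattern and seven-year period are assumptions for the example; in practice deletions should follow your documented retention schedule and be logged.

```python
# Illustrative retention sweep - directory, pattern and period are assumptions only.
from datetime import datetime, timedelta
from pathlib import Path

ARCHIVE_DIR = Path("/data/archive")      # illustrative location
RETENTION = timedelta(days=7 * 365)      # roughly seven years

cutoff = datetime.now() - RETENTION

for path in ARCHIVE_DIR.glob("*.zip"):
    modified = datetime.fromtimestamp(path.stat().st_mtime)
    if modified < cutoff:
        print(f"Deleting {path} (last modified {modified:%Y-%m-%d})")
        path.unlink()
```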

Our comprehensive Data Retention Guidance is packed with useful tools, templates and advice on tackling how long you keep personal data for.

What else can organisations do?

Clearly, we can’t be complacent and shouldn’t cut corners. We need to take all appropriate steps to protect personal data and avoid common pitfalls. Here are some useful resources to help you:

  • Cyber Essentials – The enforcement action notes that prior to the attack Tuckers was aware its security was not at the level of the NCSC Cyber Essentials. In October 2019, it was assessed against the ‘Cyber Essentials’ criteria and failed to meet crucial aspects of its requirements.

Cyber Essentials was launched in 2014 and is an information security assurance scheme operated by the National Cyber Security Centre. It helps to make sure you have the basic controls in place to protect networks and systems from threats.

Cyber Essentials – gain peace of mind with your information security
National Cyber Security Centre

  • ICO Ransomware guidance – The ICO has recently published guidance which covers security policies, access controls, vulnerability management, detection capabilities and much more.
  • DPN Data Breach Guide – Our practical guide covers how to be prepared, how to assess the risk and how to decide whether a breach should be reported or not.

You can read the full details of this case here: ICO Enforcement Action – Tuckers Solicitors LLP

Data Breach Guide

How to handle a data breach

The stakes are high when you suffer a data breach. The stats reveal breaches are endemic.

The ICO says reports of cyber attacks increased 20% in the last six months of 2021. The main cause of breaches remains non-cyber incidents. In a recent DPN survey 68% of respondents said they’d reported at least one breach in the past 12 months.

This white paper helps you to:
• Prepare
• Assess the risks
• Reach a decision on whether to report or not

Get your copy now…

How to get buy-in for DPIAs

February 2022

How do we get people engaged with Data Protection Impact Assessments?

DPIAs often get a bad rap. Privacy people often say their project managers and team leaders don’t understand them and don’t like them: they’re seen as too onerous, and they get started but often linger incomplete.

So, how do you get people in the business to understand and play along?

Let’s be clear – risk assessments (and a DPIA is one of these) can be one of the most useful tools in your data protection toolkit. Used properly, they can really help identify, assess and tackle risks before they even see the light of day.

When should you carry out a DPIA?

Just to recap, we need to conduct DPIAs where our projects, initiatives, system changes and so on are likely to represent a high risk to those whose data is involved. Note the emphasis on ‘high risk’. You’ll need to take account of the scope, type and manner of the proposed processing.

It’s not always easy to judge where this threshold falls, so some businesses end up carrying out far more DPIAs than needed, whilst others carry out too few. Fortunately the ICO has published examples of processing ‘likely to result in high risk’ to help you make this call.

Regulated sectors, such as financial services & telecoms, have more to think about and may adopt a cautious approach.

Engage with your teams

First rule of DPIA Club is… we MUST talk about it!

Build relationships with the people who ‘do new stuff’ with your data. The people who run development projects and the key stakeholders – such as heads of the main functions which process personal data across your business, e.g. Marketing, Operations, HR, etc. If you have a Procurement team, then target them too.

Ask what projects they have on the horizon. The aim is to make them aware of DPIA requirements and ask them to give you an early ‘heads up’ if they are looking to onboard a new service provider or indeed use data for an innovative new project.

Let them know tech projects and system migrations almost always involve some kind of personal data processing. They should be mindful of the potential for this to lead to privacy risks.

If they think about data protection from the outset it will save valuable time and money in the long run, and spare you unwelcome hiccups down the line. Give them examples of how things have gone wrong or could go wrong.

You could raise awareness across the business using your intranet, email reminders, posters, drop-in clinics … whatever it takes to get the message across.

Regular dialogue about upcoming technology projects, a DPIA screening form or (for larger businesses) a technology ‘gating’ process are all good ways to get a heads-up on new projects. These will help you quickly identify whether or not a DPIA is needed.
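
Purely as an illustration, a screening form can boil down to a handful of yes/no questions. The sketch below is hypothetical; the questions echo the kinds of ‘likely high risk’ indicators the ICO describes, but your own criteria should come from your screening form, not from this snippet.

```python
# Hypothetical DPIA screening check - questions and threshold are illustrative only.
SCREENING_QUESTIONS = {
    "special_category_data": "Will the project process special category or criminal offence data?",
    "large_scale_monitoring": "Will it monitor individuals or public places at scale?",
    "innovative_technology": "Does it use innovative technology (e.g. profiling, biometrics)?",
    "vulnerable_individuals": "Does it involve children or other vulnerable individuals?",
}

def dpia_needed(answers: dict) -> bool:
    """Flag a full DPIA if any high-risk indicator is answered 'yes'."""
    return any(answers.get(question, False) for question in SCREENING_QUESTIONS)

answers = {"special_category_data": False, "innovative_technology": True}
print("Full DPIA required:", dpia_needed(answers))
```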

Steve Priestly, Head of Data Protection (UK & MET), Travelex:

‘We place a key focus on highlighting to stakeholders of the benefits of early engagement in the DPIA process. Continual collaboration with your stakeholders is also key, understanding what they are trying to achieve. Lastly, ongoing DPIA education and awareness will help in the long-term to imbed a strong data privacy culture.’  

Use a good DPIA template

In my opinion too many businesses use complex and jargon-filled screening questionnaires and DPIA templates, which many people find hard to understand. They ask questions in ‘GDPR-talk’ which people find hard to grasp & answer and they often don’t really help people to identify what privacy risks actually look like.

Take a look at your DPIA template with fresh eyes. If you don’t like it use a better one, or adapt it to fit your business ways of working.


Be prepared for Agile working

So many development projects are Agile now, and this requires adapting your approach. You won’t get all the answers you need at the start. Stay close to the project as it evolves and be ready to roll your DPIA in line with scheduled sprints or scrums, but before data migrates. See: DPIAs – How to assess projects in an Agile environment.

DPIA approaches

It’s a good idea to keep tabs on how many data projects are in progress, how many lead to DPIAs and what the status of these is. This means you will know if you need to drum up more engagement or not.

Here are a couple of examples of the approaches taken by different businesses.

Use of technology tools

Stephen Baigrie, Managing Counsel, IT, Procurement & Privacy at Balfour Beatty:

“At Balfour Beatty we use an online privacy compliance platform to manage DPIAs and to enable early stakeholder engagement. We worked with our Group Data Protection Officer and Information Security team to formulate user-friendly assessment templates.

We use a pre-DPIA screening qualifier to help identify if a full DPIA is required and run a working group with Data Protection, Legal and Information Security stakeholders to track DPIAs and vendor due diligence matters.”

“Where appropriate, we adopt a self-service model for DPIA completion to help improve privacy awareness and seek to be agile by continuously improving and evolving our privacy processes.”

An integral part of the change governance process

Christopher Whitewood (CIPP/E, CIPM), Privacy & Data Protection Officer at Direct Line Group:

“We have mandated that a risk assessment must be conducted as part of our change governance process. Our DPIA is included as part of a single online risk assessment form which allows for an early risk assessment by Privacy, Security and Business Continuity Teams.”

“A simple approach allows business areas to fill out one form with a layered question set to determine where further investigation is needed. The online form has been adapted to consider any data ethical concerns at an early stage, but also has the added bonus of the scored risk assessment to form the basis to drive assurance activity.”

So to conclude, I hope this has given you some fresh ideas on how to engage with your colleagues about DPIAs. Good luck!

ICO Opinion on Ad Tech – Old wine in a new bottle?

December 2021

Does the ICO Opinion piece tell us anything new?

The ICO has published an “Opinion” which can be interpreted as a shot across the bows for any Ad Tech company planning to launch new targeting solutions for the post-third-party-cookie world. 

If these companies thought new targeting solutions would get waved through because they don’t involve third-party cookies, it’s clear that Google’s difficulties with their Sandbox solution say otherwise. 

Google is currently knee-deep in discussions with both the Competition and Markets Authority (CMA) and the ICO to come up with a targeting solution that is fair to consumers whilst also avoiding the accusation of being anti-competitive. 

In its Opinion the ICO sets out clear parameters for developing these solutions in a privacy-friendly manner. You won’t be too surprised to hear all the usual concerns being re-heated in this discussion. To quote the ICO:

  1. Engineer data protection requirements by default into the design of the initiative
  2. Offer users the choice of receiving adverts without tracking, profiling, or targeting based on personal data. 
  3. Be transparent about how and why personal data is processed across the ecosystem and who is responsible for that processing
  4. Articulate the specific purposes for processing personal data and demonstrate how this is fair, lawful, and transparent
  5. Address existing privacy risks and mitigate any new privacy risks that the proposals introduce

This opinion piece is the latest publication from the ICO in a relatively long-running piece of work on the use of cookies and similar technologies for the processing of personal data in online advertising. In their original report in 2019, the ICO reported a wide range of concerns with the following which needed to be rectified:

  • Legal requirements on cookie use;
  • Lawfulness, fairness, and transparency;
  • Security;
  • Controllership arrangements;
  • Data retention;
  • Risk assessments; and
  • Application of data protection by design principles. 

You can read the back story here

The state of play in 2021

Since the ICO started its investigations in 2019, the market has continued to develop new ways of targeting advertising that do not rely on third-party cookies. The net result is that the world has moved towards less intrusive ways of tracking, which has been welcomed by the ICO. Some examples include: 

  • With Google Chrome’s announcement re: cookies, there is an expectation that third-party cookies will be phased out by end of 2022. 
  • There have been increases in the transparency of online tracking – notably Apple’s “App Tracking Transparency” (ATT)
  • There are new mechanisms being developed to help individuals indicate their privacy preferences simply and effectively
  • Browser developers are introducing tracking prevention in their software. A notable example is the Google Privacy Sandbox, which will enable targeting with alternative technologies.

How should we interpret this opinion piece?

A lot of what has been included is information from the 2019 reports. In effect, it’s a summary of previous activities plus additional material to bring you up to date. Although it is a rather long piece, there is some clear guidance for the way forward for developers of new solutions. 

Furthermore, it is bluntly warning technology firms that they are in the ICO’s sights: 

“In general, the Commissioner’s view is that these developments are not yet sufficiently mature to assess in detail. They have not shown how they demonstrate participants’ compliance with the law, or how they result in better data protection outcomes compared to the existing ecosystem.” Source: ICO

Data protection by design is paramount – no excuses for non-compliance this time

The ICO opinion clearly flags to developers that they will accept no excuses for developing non-compliant solutions. In the past, there have been difficulties because the Ad Tech solutions have been in place for some time with the data protection guidance being retrofitted to an existing ecosystem. 

With the demise of third-party cookies and the advent of a variety of new solutions, there can be no excuse for failing to engineer privacy into the design of those solutions. 

It explicitly highlights the need to respect the interests, rights, and freedoms of individuals. Developers need to evidence that these considerations have been taken into account.  

Users must be given a real choice

In the first instance, users must be given the ability to receive adverts without tracking, profiling, or targeting based on personal data. There must be meaningful control, and developers must demonstrate that there is user choice throughout the data lifecycle. 

Accountability – show your homework

There is an expectation that there will be transparency around how and why personal data is processed and who is responsible for that processing. In the current ecosystem, this is largely impossible to achieve and there is no transparency across the supply chain. 

Articulate the purpose of processing data

Each new solution should describe the purpose of processing personal data and demonstrate how this is fair, lawful, and transparent. Can suppliers assess the necessity and proportionality of this processing? The 2019 report highlighted that the processing appeared excessive relative to the outcomes achieved. How will processors change their ways? 

Addressing risk and reducing harm

As a start, it’s important to articulate the privacy risks, likely through a DPIA, but also explain how those risks will be mitigated. The previous ICO reports indicated disappointment with the low volume of DPIAs produced by Ad Tech providers. This needs to change. 

To conclude with a useful developer checklist

The ICO provides a checklist of how to apply these principles in practice. You can probably jump to this section if you really want to know what is expected: 

  1. Demonstrate and explain the design choices.
  2. Be fair and transparent about the benefits.
  3. Minimise data collection and further processing.
  4. Protect users and give them meaningful control.
  5. Embed the principle of necessity and proportionality.
  6. Maintain lawfulness, risk assessments, and information rights.
  7. Consider the use of special category data.

The ICO is very clear that the industry must change. There is no appetite to approve solutions that fundamentally adopt the same flawed ways of working. There is also a clear acknowledgment that some solutions are potentially anti-competitive so a partnership with the CMA will continue. You have been warned!